Prosecution Insights
Last updated: April 19, 2026
Application No. 18/058,622

DETECTING AND MODIFYING OBJECT ATTRIBUTES

Status: Non-Final OA (§103)
Filed: Nov 23, 2022
Examiner: TILLERY, RASHAWN N
Art Unit: 2174
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Adobe Inc.
OA Round: 5 (Non-Final)
Grant Probability: 64% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 3y 10m
With Interview: 76%

Examiner Intelligence

Career Allow Rate: 64% (394 granted / 611 resolved; +9.5% vs TC avg)
Interview Lift: +11.6% (Moderate; based on resolved cases with interview)
Typical Timeline: 3y 10m average prosecution
Currently Pending: 32 applications
Career History: 643 total applications across all art units
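The headline figures above are simple arithmetic on the counts the card reports. A minimal sketch, assuming the dashboard rounds to whole percentage points and treats the interview lift as additive (both assumptions about the product's method, not documented behavior):

```python
# Reproduce the dashboard's headline numbers from its raw counts.
# Variable names are illustrative, not the product's API.
granted, resolved = 394, 611
allow_rate = granted / resolved              # career allow rate as a fraction
interview_lift = 11.6                        # percentage points, per the card

base_pct = round(allow_rate * 100)           # displayed "Grant Probability"
with_interview = round(allow_rate * 100 + interview_lift)  # "With Interview"
print(base_pct, with_interview)
```

Both displayed values (64% and 76%) fall out of this, which suggests the "with interview" figure is just the career rate plus the lift, rounded.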

Statute-Specific Performance

§101: 5.1% (-34.9% vs TC avg)
§103: 61.3% (+21.3% vs TC avg)
§102: 22.8% (-17.2% vs TC avg)
§112: 5.4% (-34.6% vs TC avg)

Tech Center averages are estimates based on career data from 611 resolved cases.
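One way to sanity-check the card above is to back the implied Tech Center average out of each statute's rate and delta. A short sketch (the subtraction is the only logic; the numbers come straight from the card):

```python
# For each statute: examiner rate minus the reported delta
# should recover the Tech Center average the delta was measured against.
rates = {
    "§101": (5.1, -34.9),
    "§103": (61.3, 21.3),
    "§102": (22.8, -17.2),
    "§112": (5.4, -34.6),
}
tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in rates.items()}
print(tc_avg)
```

Interestingly, every statute's implied average works out to 40.0, which suggests the deltas are measured against a single TC-wide baseline rather than per-statute averages — an inference from the arithmetic, not something the page states.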

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

1. This communication is responsive to the Amendment filed 12/30/2025.

2. Claims 1-20 are pending in this application. Claims 1, 9 and 17 are independent claims. In the instant Amendment, claims 1, 9 and 17 were amended. Claims 10-12 were previously withdrawn. This is a Non-Final action on the RCE filed 12/30/2025.

Claim Rejections - 35 USC § 103

3. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

4. Claim(s) 1-5 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Wainwright et al. ("Wainwright," US 8,125,492) in view of Nishizawa (JP 2014235553) and further in view of Cohen et al. ("Cohen," US 11,107,219).
Regarding claim 1, Wainwright discloses a computer-implemented method comprising: detecting a selection of an object (see fig 1, 102) portrayed in a digital image displayed within a graphical user interface of a client device (see col. 4, lines 2-34; e.g., “the user has interacted with the input device to select sphere object 102. In response to selecting sphere object 102, the user is presented with an option window 106”); providing, for display within the graphical user interface, an interactive window as an overlay over at least a portion of the digital image (see fig 1 where menu 106 is shown overlaying object 102); and populating the interactive window with an attribute menu having one or more object attribute indicators that indicate one or more attributes of the object (see fig 1 where “object sphere” menu item 112 and “radius” menu item 116 are shown); receiving, via the interactive window, an additional selection of an attribute from the one or more attributes (see fig 1; also see col. 4, lines 23-34; e.g., “in response to selecting object attribute option 112, an attribute option menu 114 is presented to the user within user interface window 100…In one embodiment, attribute option menu 114 is dynamically generated based on the current attributes that are associated with the selected object…the user has selected radius attribute option 116”). 
Wainwright does not expressly disclose providing, for display within the graphical user interface and in response to receiving the additional selection of the attribute, an additional interactive window for modifying the attribute as an additional overlay over a portion of the digital image; receiving a user interaction with the additional interactive window for modifying the attribute, the user interaction providing a change to the attribute; and modifying, in response to receiving the user interaction with the additional interactive window for modifying the attribute, the digital image by changing the attribute of the object in accordance with the user interaction by: extracting a boundary of the object to be modified; and modifying the attribute of the object by modifying an area within the boundary of the object to incorporate the change provided by the user interaction.

However, Nishizawa discloses providing, for display within the graphical user interface and in response to receiving the additional selection of the attribute, an additional interactive window for modifying the attribute as an additional overlay over a portion of the digital image (see fig 24, submenu 52B and slider 53; also see paragraphs [0073]-[0077]; e.g., "selects an adjustment item which is tapped in the submenu 52B.…window 53 for displaying the parameter adjustment scale and a slider corresponding to the adjustment item selected on the display unit"); receiving a user interaction with the additional interactive window for modifying the attribute, the user interaction providing a change to the attribute (e.g., "the position of the knob 53a is changed by a user operation, the "OK" button 53b is when tapped, increase or decrease the corresponding parameters according to the position of the knob 53a. Thus, the program executing section 31, the same memory area and processing performing image processing that reflects the above parameters increase or decrease with respect to the target image"); and modifying, in response to receiving a user interaction with the additional interactive window for modifying the attribute, the digital image by changing the attribute of the object in accordance with the user interaction (see paragraphs [0073]-[0077]; e.g., use slider to change color or brightness of target image, 51).

It would have been obvious to an artisan before the effective filing date of the present invention to include Nishizawa's teachings in Wainwright's user interface in an effort to provide a more user-friendly interface by simplifying input entry thereby saving user time.

Moreover, Cohen discloses extracting a boundary of the object to be modified; and modifying the attribute of the object by modifying an area within the boundary of the object to incorporate the change provided by the user interaction (see figs 11C and 11D where the tortoise is shown selected and modified). It would have been obvious to an artisan before the effective filing date of the present invention to include Cohen's teachings in Wainwright's user interface in an effort to provide a more user-friendly interface by simplifying input entry thereby saving user time.
Regarding claim 2, the modified Wainwright discloses wherein: populating the interactive window with an attribute menu having one or more object attribute indicators that indicate one or more attributes of the object comprises providing, within the interactive window, textual descriptions of the one or more attributes (see Wainwright, figs 1, 114 and fig 2, 214); and receiving the user interaction with the additional interactive window for modifying the attribute comprises receiving one or more user interactions with the additional interactive window to change a textual description of the attribute (see Nishizawa, fig 24, slider 53).

Regarding claim 3, the modified Wainwright discloses wherein providing, for display within the graphical user interface in response to receiving the additional selection of the attribute, the additional interactive window for modifying the attribute comprises providing, for display within the graphical user interface in response to receiving the additional selection of the attribute, a digital slider (see Nishizawa fig 24, submenu 52B and slider 53; also see paragraphs [0073]-[0077]; e.g., "selects an adjustment item which is tapped in the submenu 52B.…window 53 for displaying the parameter adjustment scale and a slider corresponding to the adjustment item selected on the display unit"); and receiving the one or more user interactions with the additional interactive window to change the textual description of the attribute comprises receiving, via the digital slider, the one or more user interactions entering text to change the textual description of the attribute (see Wainwright, figs 1, 114 and fig 2, 214). The modified Wainwright does not expressly disclose a digital keyboard.

However, Official Notice is taken that a digital keyboard and a digital slider are well-known alternatives for inputting data and therefore it would have been obvious to an artisan before the effective filing date of the present invention to replace Nishizawa's teachings of adjusting parameter values with a keyboard since doing so would yield a predictable result.

Regarding claim 4, Wainwright discloses wherein providing, for display within the graphical user interface in response to receiving the additional selection of the attribute, the additional interactive window for modifying the attribute comprises providing, for display within the graphical user interface in response to receiving the additional selection of the attribute, an alternative attribute menu comprising one or more alternatives for the attribute (see fig 1 where "object sphere" menu item 112 and "radius" menu item 116 are shown); and receiving the user interaction with the additional interactive window for modifying the attribute comprises receiving the user interaction selecting an alternative from the one or more alternatives (see fig 1; also see col. 4, lines 23-34; e.g., "in response to selecting object attribute option 112, an attribute option menu 114 is presented to the user within user interface window 100…In one embodiment, attribute option menu 114 is dynamically generated based on the current attributes that are associated with the selected object…the user has selected radius attribute option 116").
Regarding claim 5, Nishizawa discloses providing, for display within the graphical user interface in response to receiving the additional selection of the attribute, the additional interactive window for modifying the attribute comprises providing, for display within the graphical user interface in response to receiving the additional selection of the attribute, a slider bar having an interactive slider element that indicates a strength of appearance of the attribute of the object within the digital image; and receiving the user interaction with the additional interactive window for modifying the attribute comprises receiving the user interaction with the interactive slider element to modify the strength of appearance of the attribute of the object within the digital image (see paragraphs [0073]-[0077]; e.g., use slider to change color or brightness of target image, 51).

Regarding claim 8, Cohen discloses wherein modifying the attribute of the object by modifying the area within the boundary of the object comprises modifying the attribute of the object by using the boundary of the object as a boundary of the area being modified (see figs 11C and 11D where the tortoise is shown selected and modified).

5. Claim(s) 6, 9, 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over Wainwright and Nishizawa and Cohen in view of Chopra et al. ("Chopra," US 2021/0073267).

Regarding claim 6, the modified Wainwright does not expressly disclose further comprising determining the one or more attributes of the object using an attribute classification neural network, wherein modifying the digital image by changing the attribute of the object in accordance with the user interaction comprises modifying the digital image by changing the attribute of the object in accordance with the user interaction using an object modification neural network.
However, Chopra discloses determining the one or more attributes of the object using an attribute classification neural network (see paragraph [0023]; e.g., identify digital attributes using attribute classification neural network), wherein modifying the digital image by changing the attribute of the object in accordance with the user interaction comprises modifying the digital image by changing the attribute of the object in accordance with the user interaction using an object modification neural network (see paragraphs [0026] and [0028]; e.g., modify attributes based on neural network). It would have been obvious to an artisan before the effective filing date of the present invention to include Chopra's teachings in the modified Wainwright's user interface in an effort to simplify the identification of attributes of an object thereby saving user time and resources.

Claim 9 is similar in scope to claim 1 and is therefore rejected under similar rationale. Additionally, the modified Wainwright does not expressly disclose modifying using a neural network. However, Chopra discloses it is well-known in the art to modify objects using a neural network (see paragraphs [0026] and [0028]; e.g., modify attributes based on neural network). It would have been obvious to an artisan before the effective filing date of the present invention to include Chopra's teachings in the modified Wainwright's user interface in an effort to simplify the identification of attributes of an object thereby saving user time and resources.

Claim 13 is similar in scope to claim 3 and is therefore rejected under similar rationale.
Regarding claim 14, Nishizawa discloses providing, for display within the graphical user interface, the additional interactive window for modifying the attribute comprises providing, for display within the graphical user interface and in association with the attribute, the additional interactive window having a slider bar that includes an interactive slider element that indicates a strength of appearance of the attribute of the object within the digital image; and receiving the user interaction with the additional interactive window for modifying the attribute comprises receiving the user interaction with the interactive slider element to modify the strength of appearance of the attribute of the object within the digital image (see paragraphs [0073]-[0077]; e.g., use slider to change color or brightness of target image, 51).

Claim 15 is similar in scope to claim 4 and is therefore rejected under similar rationale.

Regarding claim 16, Nishizawa discloses wherein modifying the digital image by changing the attribute of the object in accordance with the user interaction comprises modifying the digital image by changing the attribute of the object within the graphical user interface to provide an updated representation of the object within the digital image (see paragraphs [0073]-[0077]; e.g., use slider to change color or brightness of target image, 51).

Claim 17 is similar in scope to claim 1 and is therefore rejected under similar rationale. Additionally, the modified Wainwright does not expressly disclose an attribute classification neural network; and determine, utilizing the attribute classification neural network and without user input, attributes for one or more objects portrayed in a digital image. However, Chopra discloses determine, utilizing the attribute classification neural network and without user input, attributes for one or more objects portrayed in a digital image (see paragraph [0023]; e.g., identify digital attributes using attribute classification neural network).
It would have been obvious to an artisan before the effective filing date of the present invention to include Chopra's teachings in the modified Wainwright's user interface in an effort to simplify the identification of attributes of an object thereby saving user time and resources.

Regarding claim 18, Nishizawa discloses wherein the at least one processor is configured to cause the system to: receive the one or more user interactions selecting the at least one attribute by receiving a plurality of user interactions selecting a plurality of attributes; and provide the visual representation to the change of the at least one attribute by providing a modified digital image for display within the graphical user interface, the modified digital image having the object with the plurality of attributes changed in accordance with the one or more additional user interactions with the additional interactive window (see paragraphs [0073]-[0077]; e.g., use slider to change color or brightness of target image, 51).

Regarding claim 19, Nishizawa discloses wherein the at least one processor is configured to cause the system to receive the one or more additional user interactions with the additional interactive window by receiving the one or more additional user interactions with at least one of an alternatives attribute menu, a slider bar, or a textual description provided for display in association with the at least one attribute (see claim 1 above where the slider is discussed).
Regarding claim 20, Nishizawa discloses populate the interactive window with the attribute menu having the one or more object attribute indicators that indicate the set of attributes of the object by populating the interactive window with the attribute menu having at least one object attribute indicator indicating a color of the object; provide, for display within the graphical user interface, the additional interactive window for modifying the at least one attribute comprises providing, for display within the graphical user interface, a slider bar having an interactive slider element indicating a color intensity of the color of the object within the digital image; and receive the one or more additional user interactions with the additional interactive window by receiving at least one user interaction with the interactive slider element to modify a color intensity of the color of the object within the digital image (see paragraphs [0073]-[0077]; e.g., use slider to change color or brightness of target image, 51).

6. Claim(s) 7 is rejected under 35 U.S.C. 103 as being unpatentable over Wainwright and Nishizawa and Cohen and Chopra further in view of Jin et al. ("Jin," US 11,238,362).

Regarding claim 7, the modified Wainwright does not expressly disclose wherein modifying the digital image utilizing the object modification neural network comprises modifying the digital image using a neural network that implements a visual-semantic embedding space that incorporates visual features and textual features. However, Jin discloses wherein modifying the digital image utilizing the object modification neural network comprises modifying the digital image using a neural network that implements a visual-semantic embedding space that incorporates visual features and textual features (see the Abstract; e.g., visual semantic embedding).
It would have been obvious to an artisan before the effective filing date of the present invention to include Jin's teachings in the modified Wainwright's user interface in an effort to simplify the identification of attributes of an object thereby saving user time and resources.

Response to Arguments

7. Applicant's arguments filed 12/30/2025 have been fully considered but they are not persuasive. Regarding Applicant's arguments concerning claims 1 and 3-4 regarding Nishizawa failing to disclose "receiving a user interaction with the additional interactive window for modifying the attribute, the user interaction providing a change to the attribute," the Examiner respectfully disagrees. Nishizawa clearly discloses "the position of the knob 53a is changed by a user operation, the "OK" button 53b is when tapped, increase or decrease the corresponding parameters according to the position of the knob 53a. Thus...the above parameters increase or decrease with respect to the target image" (see figs 81-85 and paragraphs [0073]-[0077]). Therefore, the rejection is maintained.

Conclusion

8. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Jarvis (US 10,515,160).

9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to RASHAWN N TILLERY whose telephone number is (571) 272-6480. The examiner can normally be reached M-F 9:00a - 5:30p. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, William L Bashore, can be reached on (571) 272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/RASHAWN N TILLERY/
Primary Examiner, Art Unit 2174

Prosecution Timeline

Nov 23, 2022 – Application Filed
Mar 16, 2024 – Non-Final Rejection (§103)
Jun 10, 2024 – Interview Requested
Jun 17, 2024 – Applicant Interview (Telephonic)
Jun 17, 2024 – Examiner Interview Summary
Jun 21, 2024 – Response Filed
Oct 03, 2024 – Final Rejection (§103)
Dec 09, 2024 – Interview Requested
Dec 20, 2024 – Applicant Interview (Telephonic)
Dec 20, 2024 – Examiner Interview Summary
Dec 31, 2024 – Request for Continued Examination
Jan 13, 2025 – Response after Non-Final Action
Feb 22, 2025 – Non-Final Rejection (§103)
May 01, 2025 – Interview Requested
May 08, 2025 – Applicant Interview (Telephonic)
May 08, 2025 – Examiner Interview Summary
May 09, 2025 – Interview Requested
May 19, 2025 – Response Filed
Aug 29, 2025 – Final Rejection (§103)
Nov 06, 2025 – Interview Requested
Dec 30, 2025 – Request for Continued Examination
Jan 16, 2026 – Response after Non-Final Action
Feb 21, 2026 – Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602701 – INTERACTIVE MAP INTERFACE INCORPORATING CUSTOMIZABLE GEOSPATIAL DATA (2y 5m to grant; granted Apr 14, 2026)
Patent 12547302 – PAGE PRESENTATION METHOD, DISPLAY SYSTEM AND STORAGE MEDIUM (2y 5m to grant; granted Feb 10, 2026)
Patent 12542871 – DATA PROCESSING METHOD AND APPARATUS, DEVICE, AND READABLE STORAGE MEDIUM (2y 5m to grant; granted Feb 03, 2026)
Patent 12536219 – DIGITAL CONTAINER FILE FOR MULTIMEDIA PRESENTATION (2y 5m to grant; granted Jan 27, 2026)
Patent 12524138 – METHOD AND APPARATUS FOR ADJUSTING POSITION OF VIRTUAL BUTTON, DEVICE, STORAGE MEDIUM, AND PROGRAM PRODUCT (2y 5m to grant; granted Jan 13, 2026)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 64%
With Interview: 76% (+11.6%)
Median Time to Grant: 3y 10m
PTA Risk: High

Based on 611 resolved cases by this examiner. Grant probability derived from career allow rate.
