Prosecution Insights
Last updated: April 19, 2026
Application No. 18/694,354

Object Manipulation in Graphical Environment

Non-Final OA — §102, §103, §112
Filed
Mar 21, 2024
Examiner
BLOOMQUIST, KEITH D
Art Unit
2171
Tech Center
2100 — Computer Architecture & Software
Assignee
Apple Inc.
OA Round
1 (Non-Final)
63%
Grant Probability
Moderate
1-2
OA Rounds
3y 0m
To Grant
83%
With Interview

Examiner Intelligence

Grants 63% of resolved cases
63%
Career Allow Rate
440 granted / 702 resolved
+7.7% vs TC avg
Strong +20% interview lift
+20.0%
Interview Lift
Allowance rate with an interview vs. without, across resolved cases that included an interview
Typical timeline
3y 0m
Avg Prosecution
49 currently pending
Career history
751
Total Applications
across all art units
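
The headline figures above can be sanity-checked directly from the raw counts. The sketch below is a minimal illustration that assumes the dashboard simply adds the reported interview lift to the career allow rate; the variable names and the additive model are assumptions for illustration, not the tool's documented methodology.

```python
# Minimal sketch: reproduce the examiner dashboard figures from raw counts.
# The additive interview-lift model is an assumption, not a documented formula.

granted = 440          # granted cases in the examiner's career data
resolved = 702         # resolved cases (granted + abandoned/other disposals)
pending = 49           # currently pending applications
interview_lift = 0.20  # reported lift in allowance when an interview is held

allow_rate = granted / resolved                       # ~0.627, shown as 63%
total_applications = resolved + pending               # 702 + 49 = 751
with_interview = min(allow_rate + interview_lift, 1)  # ~0.83, shown as 83%

print(f"Career allow rate: {allow_rate:.1%}")
print(f"Total applications: {total_applications}")
print(f"Estimated grant probability with interview: {with_interview:.1%}")
```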

Statute-Specific Performance

§101
7.9%
-32.1% vs TC avg
§103
59.7%
+19.7% vs TC avg
§102
21.1%
-18.9% vs TC avg
§112
7.7%
-32.3% vs TC avg
Deltas are measured against an estimated Tech Center average • Based on career data from 702 resolved cases

Office Action

§102 §103 §112
DETAILED ACTION

This action is responsive to the application filed 3/21/2024. Claims 1-20 are pending. Claims 1, 3-11, 13-16 and 18-20 have been amended by preliminary amendment, and Claims 21-40 are canceled.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 19 and 20 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which a joint inventor regards as the invention. Each of Claims 19 and 20 recites the limitation of determining a distance via “the one or more sensors.” There is insufficient antecedent basis for this limitation, as neither claim previously recites one or more sensors. Therefore, these claims are indefinite and appropriate correction is required.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 2, 4, 5, 10, 16-18 and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kin, et al., U.S. PGPUB No. 2019/0212827 (“Kin”).

With regard to Claim 1, Kin teaches a method comprising: at a device comprising one or more processors, non-transitory memory, and one or more sensors ([0136] describes storage on non-transitory media; [0057] describes an application executed by a processor; [0058] describes tracking the device and user hands using information from sensors): detecting a gesture being performed via a first object in association with a second object in a graphical environment ([0069] describes that an electronic display in a near-eye device can include a virtual recreation of a user’s hands, as well as other objects; [0074] describes that the system can recognize gestures of the user’s hand as well as a recognized object to which the gesture was related); determining, via the one or more sensors, a distance between a representation of the first object and the second object ([0092] describes that the distance between a gaze terminating at a particular object and the user’s hand can be determined); on a condition that the distance is greater than a threshold, displaying a change in the graphical environment according to the gesture and a determined gaze ([0118]-[0119] describe an example where a user gesture is detected within a threshold such that a user pinch gesture is detected when an object is between the user’s fingers; upon detecting that the pinching hand has moved past a threshold distance, an informational display is shown relating to the object which is the terminus of the user’s gaze); and on a condition that the distance is not greater than the threshold, displaying the change in the graphical environment according to the gesture and a projection of the representation of the first object on the second object ([0094]-[0095] describe that when the hand is within the threshold distance, such as identifying that an object is between the user’s fingers, and a gesture is input, a change to the object in the environment is carried out, such as changing a switch indicator and allowing a user to modify the switch position in the graphical environment).

Claim 20 recites the non-transitory memory recited in the method of Claim 1, and the claim is similarly rejected.

With regard to Claim 2, Kin teaches that the first object comprises an extremity. [0069] describes a representation of a user’s hand, and [0074] describes recognizing gestures of a user’s hand.

With regard to Claim 4, Kin teaches that the representation of the first object comprises an image of the first object. [0069] describes that the virtual environment includes a virtual recreation, using computer graphics, of the user’s hands.

With regard to Claim 5, Kin teaches that the first object is a physical object and the second object is a virtual object. [0069] describes that the representation of the user’s hands is in the environment along with virtual objects.

With regard to Claim 10, Kin teaches that the change in the graphical environment comprises a manipulation of the second object. [0094]-[0095] describe an example input through which a user gesture is used to manipulate a switch.

With regard to Claim 16, Kin teaches, on a condition that the distance is greater than the threshold, displaying the change in the graphical environment according to a gaze vector based on a gaze and an offset determined based on a position of the first object. [0118]-[0119] describe an example where a user gesture is detected within a threshold such that a user pinch gesture is detected when an object is between the user’s fingers; upon detecting that the pinching hand has moved past a threshold distance, an informational display is shown relating to the object which is the terminus of the user’s gaze.

With regard to Claim 17, Kin teaches displaying the change in the graphical environment at a location corresponding to an end portion of the first object. Fig. 9B shows that the change to the environment is displayed at a location where the fingers of the user’s hand are located.

With regard to Claim 18, Kin teaches that the device comprises a head-mountable device (HMD). [0027] describes that the artificial reality system may be implemented on a head-mounted device.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 3, 6, 15 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Kin, in view of Poulos, et al., U.S. PGPUB No. 2017/0363867 (“Poulos”).

With regard to Claim 3, Poulos teaches that the first object comprises an input device. [0025] describes a head-mounted device which models a physical space. [0028] describes that the location of a control device is also tracked. [0039] describes that a control device such as a baton can be represented in the display, altered based on proximity to various objects, and used to interact therewith. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.

With regard to Claim 6, Poulos teaches, on a condition that the distance is not greater than the threshold, displaying a virtual writing instrument. [0047] describes that the system can determine that the control device is within a predetermined proximity to a holographic object and modify the representation of the control device. [0061] describes that the device can be changed to a writing instrument such as a paintbrush, pencil, or pen. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.

With regard to Claim 15, Poulos teaches selecting a brush stroke type based on the distance between the representation of the first object and the second object. [0065] describes that a user can have a handheld baton, which is represented graphically in a virtual space. As the user varies the distance between the baton and a virtual painting surface object, the brush stroke can be changed in width, shade, density, or other visual characteristic. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.

With regard to Claim 19, Kin teaches a device comprising: one or more processors; a non-transitory memory; a display; an input device; and one or more programs stored in the non-transitory memory, which, when executed by the one or more processors ([0136] describes storage on non-transitory media; [0055] describes input devices; [0057] describes an application executed by a processor; [0058] describes tracking the device and user hands using information from sensors; Fig. 3 shows a near-eye display), cause the device to: detect a gesture being performed via a first object in association with a second object in a graphical environment ([0069] describes that an electronic display in a near-eye device can include a virtual recreation of a user’s hands, as well as other objects; [0074] describes that the system can recognize gestures of the user’s hand as well as a recognized object to which the gesture was related); determine, via the one or more sensors, a distance between a representation of the first object and the second object ([0092] describes that the distance between a gaze terminating at a particular object and the user’s hand can be determined); on a condition that the distance is greater than a threshold, display a change in the graphical environment according to the gesture and a determined gaze ([0118]-[0119] describe an example where a user gesture is detected within a threshold such that a user pinch gesture is detected when an object is between the user’s fingers; upon detecting that the pinching hand has moved past a threshold distance, an informational display is shown relating to the object which is the terminus of the user’s gaze); and on a condition that the distance is not greater than the threshold, display the change in the graphical environment according to the gesture and a projection of the representation of the first object on the second object ([0094]-[0095] describe that when the hand is within the threshold distance, such as identifying that an object is between the user’s fingers, and a gesture is input, a change to the object in the environment is carried out, such as changing a switch indicator and allowing a user to modify the switch position in the graphical environment). Kin does not teach an audio sensor; Poulos teaches at [0021] that the head-mounted device has sensors which include a microphone. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Poulos with Kin. One of skill in the art would have sought the combination to improve user experience by providing additional interactive features in a virtual environment, and by allowing dynamic response to user input in the environment.

Claims 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Kin, in view of Gebbie, et al., U.S. PGPUB No. 2019/0130656 (“Gebbie”).

With regard to Claim 7, Gebbie teaches that the change in the graphical environment comprises a creation of an annotation associated with the second object. [0027] describes that a user can annotate an object by moving a tool within a threshold of the object to initiate the annotation. [0023] describes that the tool can be a finger, handheld controller, gaze, etc. [0004] describes that the environment involves interaction with virtual objects in a virtual environment, and [0006] describes that the tool is a virtual tool. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Gebbie with Kin. One of skill in the art would have sought the combination to improve user experience and system utility by enabling additional collaboration tools such as annotations, thereby providing additional use cases and functionality to the virtual environment.

With regard to Claim 8, Gebbie teaches that the change in the graphical environment comprises a modification of an annotation associated with the second object. [0052] describes that a user can type content for an annotation, thereby modifying annotation text while writing content of the annotation. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Gebbie with Kin. One of skill in the art would have sought the combination to improve user experience and system utility by enabling additional collaboration tools such as annotations, thereby providing additional use cases and functionality to the virtual environment.

With regard to Claim 9, Gebbie teaches that the change in the graphical environment comprises a removal of an annotation associated with the second object. [0028] describes that users are able to undo any attachment or drawing on a virtual object. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Gebbie with Kin. One of skill in the art would have sought the combination to improve user experience and system utility by enabling additional collaboration tools such as annotations, thereby providing additional use cases and functionality to the virtual environment.

Claims 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Kin, in view of Duheille, U.S. PGPUB No. 2014/0055385 (“Duheille”).

With regard to Claim 11, Duheille teaches applying a scale factor to the gesture. [0098] describes that gestures can scale with dimensions of objects displayed to a user. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin to improve user experience by providing a more intuitive interface experience.

With regard to Claim 12, Kin, in view of Duheille, teaches selecting the scale factor based on the distance between the representation of the first object and the second object. Duheille teaches at [0099] that a gesture can be scaled with the distance of the user’s hand from the object being manipulated using the gesture. Kin teaches the use of a representation of a hand and a second object at [0069]. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin to improve user experience by providing a more intuitive interface experience.

With regard to Claim 13, Duheille teaches selecting the scale factor based on a size of the second object. [0098] describes that gestures can scale with the dimensions of a displayed object. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin to improve user experience by providing a more intuitive interface experience.

With regard to Claim 14, Duheille teaches selecting the scale factor based on an input. [0098] describes that the gesture scale can be determined for specific objects, thereby selecting a scale factor based on which object a user provides input to. It would have been obvious to one of ordinary skill in the art at the time this application was filed to combine Duheille with Kin. [0004] of Duheille describes that gesture-based interfaces are not necessarily intuitive to users. Therefore, one of skill in the art would have sought to combine the gesture scaling described in Duheille with Kin to improve user experience by providing a more intuitive interface experience.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KEITH D BLOOMQUIST, whose telephone number is (571) 270-7718. The examiner can normally be reached M-F, 8:30-5 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kieu Vu, can be reached at 571-272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KEITH D BLOOMQUIST/
Primary Examiner, Art Unit 2171
1/22/2026
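
Independent Claim 1 (and its device counterpart, Claim 19) turns on a simple distance-threshold branch: the gesture drives the displayed change against the user's gaze when the hand representation is far from the target object, and against a projection of that representation when it is close. The sketch below illustrates only that branching structure; every name and value in it is hypothetical, and it is not drawn from the application's specification or from the Kin reference.

```python
# Illustrative sketch (hypothetical names) of the conditional logic recited in
# Claim 1: same gesture, but the targeting input switches between gaze and
# projection based on a distance threshold. Not the applicant's or Kin's code.

from dataclasses import dataclass

@dataclass
class GestureEvent:
    kind: str        # e.g. "pinch"
    distance: float  # sensor-derived distance between the hand representation
                     # and the second object (meters)

DISTANCE_THRESHOLD = 0.3  # arbitrary value chosen for illustration only

def apply_change(event: GestureEvent, gaze_target: str, projected_target: str) -> str:
    """Select the manipulation target for the gesture per Claim 1's two branches."""
    if event.distance > DISTANCE_THRESHOLD:
        # Far case: the change follows the gesture and the determined gaze.
        return f"apply {event.kind} to {gaze_target} (gaze-directed)"
    # Near case: the change follows the gesture and the projection of the
    # first object's representation onto the second object.
    return f"apply {event.kind} to {projected_target} (projection-directed)"

print(apply_change(GestureEvent("pinch", 0.6), "light switch", "light switch"))
print(apply_change(GestureEvent("pinch", 0.1), "light switch", "light switch"))
```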

Prosecution Timeline

Mar 21, 2024
Application Filed
Jan 22, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications involving similar technology granted by the same examiner

Patent 12602941
SYSTEM AND METHOD FOR IDENTIFYING ATYPICAL EVENTS AND GENERATING AN ALERT USING DEEP LEARNING MODEL
2y 5m to grant Granted Apr 14, 2026
Patent 12602082
Electronic Devices with Translating Flexible Displays and Corresponding Methods for Providing Haptic Feedback
2y 5m to grant Granted Apr 14, 2026
Patent 12590442
CONTROL SYSTEM AND CONTROL METHOD FOR WORK MACHINE
2y 5m to grant Granted Mar 31, 2026
Patent 12578205
METHOD AND SYSTEM FOR AUTOMATICALLY GENERATING A MAP OF AN INDOOR SPACE
2y 5m to grant Granted Mar 17, 2026
Patent 12570413
UNIFIED DATA LIBRARY, FLYING OBJECT COPING SYSTEM, FLYING PATH PREDICTION METHOD, AND COMMUNICATION ROUTE SEARCH METHOD FOR ACCURATELY PREDICTING A PATH OF A FLYING OBJECT
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

1-2
Expected OA Rounds
63%
Grant Probability
83%
With Interview (+20.0%)
3y 0m
Median Time to Grant
Low
PTA Risk
Based on 702 resolved cases by this examiner. Grant probability derived from career allow rate.
