Prosecution Insights
Last updated: April 19, 2026
Application No. 18/823,516

Pinch Compensation for Markup

Final Rejection §103
Filed: Sep 03, 2024
Examiner: GUPTA, PARUL H
Art Unit: 2627
Tech Center: 2600 (Communications)
Assignee: Apple Inc.
OA Round: 2 (Final)

Grant Probability: 61% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 2y 11m
Grant Probability With Interview: 94%

Examiner Intelligence

Career Allow Rate: 61% (375 granted / 617 resolved; -1.2% vs TC avg)
Interview Lift: +33.0% (strong)
Typical Timeline: 2y 11m avg prosecution; 14 applications currently pending
Career History: 631 total applications across all art units
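The headline figures above hang together arithmetically; a quick check, assuming the interview lift is the percentage-point gap between the with-interview grant rate and the career allow rate:

```python
# Sanity check of the dashboard figures; 375/617 and the 94%
# with-interview rate are taken from the panels above, and the
# lift-in-percentage-points interpretation is an assumption.
granted, resolved = 375, 617

allow_rate = 100 * granted / resolved      # 60.78..., displayed as 61%
with_interview = 94                        # grant rate when an interview is held
interview_lift = with_interview - round(allow_rate)  # matches the "+33.0%" card

print(round(allow_rate), interview_lift)
```

With rounding to whole percentages, 61% + 33 points reproduces the 94% with-interview figure shown in the header.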

Statute-Specific Performance

§101: 1.9% (-38.1% vs TC avg)
§103: 71.3% (+31.3% vs TC avg)
§102: 15.2% (-24.8% vs TC avg)
§112: 6.4% (-33.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 617 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Li et al., US Patent Publication 2021/0333884, in view of Krivoruchko et al., US Patent Publication 2024/0256049.

Regarding independent claim 1, Li et al.
teaches a method comprising: obtaining hand tracking data for a hand based on one or more camera frames (paragraph 0066 explains how the hand detection and tracking subsystem 516 tracks the hand based on a video frame and subsequent frames from a camera of the gesture-controlled device 100); detecting, at a first frame, a change in motion characteristics of the hand satisfies a threshold change in motion characteristics (as given in paragraph 0066 where changes in location of hand and threshold to be determined change are detected by the hand detection and tracking subsystem 516) based on the hand tracking data for the hand (paragraph 0066 explains the use of the hand detection and tracking subsystem 516 and allowing changes in the location of the hand within subsequent frames to be interpreted as corresponding changes to the location of the hand); detecting, at a second frame, a change in hand gesture status (changes to the location of the hand that are the movement of the hand as given in paragraph 0066) corresponding to a hand gesture associated with a user input action (paragraph 0059 explains that the hand movements are classified as hand gestures that are used as command inputs); and in response to the determination, adjusting a hand location for a user input action in accordance with a hand location at the first frame (paragraph 0066 explains how the hand location is determined based on the actions and gestures of the user and adjusted appropriately). Although implied, Li et al. does not specify the method determining that the second frame is within a threshold time period of the first frame based on the hand tracking data for the hand. Krivoruchko et al. 
teaches the method determining that the second frame is within a threshold time period of the first frame (paragraph 0174 describes the pinch gesture as being changes in hand location in frames within threshold times of each other) based on the hand tracking data for the hand (paragraph 0154 explains that the hand location is determined by the image sensors 314, which are the hand-tracking camera). It would have been obvious to one of ordinary skill in the art before the effective filing date to define the pinch gesture based on time thresholds as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to determine if the gesture is an immediate gesture and to differentiate between different gestures based on the timing (paragraph 0174 of Krivoruchko et al.).

Regarding claim 2, Krivoruchko et al. further teaches the method of claim 1, wherein the user input action is a markup action comprising adding or removing a markup on a user interface in accordance with the change in motion characteristics (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture; paragraphs 0212 and 0274 discuss adding content in a markup mode; and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture).

Regarding claim 3, Li et al. teaches the method of claim 2, wherein the first frame is captured before the second frame, and wherein the change in hand gesture status comprises a transition from a pinch to an unpinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al. does not specify the method further comprising, in response to the determination: removing markup on the user interface from a location on the user interface associated with a hand location in the first frame to a location on the user interface associated with the hand location in the second frame. Krivoruchko et al.
teaches the method further comprising, in response to the determination: removing markup on the user interface from a location on the user interface associated with a hand location in the first frame to a location on the user interface associated with the hand location in the second frame (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 4, Li et al. teaches the method of claim 2, wherein the first frame is captured after the second frame, and wherein the change in hand gesture status comprises a transition from an unpinch to a pinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al. does not specify the method further comprising, in response to the determination: removing markup on the user interface from a location on the user interface associated with the hand location in the first frame to a location on the user interface associated with the hand location in the second frame. Krivoruchko et al.
teaches the method further comprising, in response to the determination: removing markup on the user interface from a location on the user interface associated with the hand location in the first frame to a location on the user interface associated with the hand location in the second frame (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 5, Li et al. teaches the method of claim 4, and obtaining historic hand locations for frames captured between the first frame and the second frame (paragraph 0073 explains the capture of historic hand locations at all of the different time points). Li et al. does not specify wherein removing markup comprises: removing markup on the user interface in accordance with the historic hand locations. Krivoruchko et al. teaches wherein removing markup comprises: removing markup on the user interface in accordance with the historic hand locations (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other, i.e., historic hand locations in other frames). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al.
The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 6, Li et al. teaches the method of claim 2, wherein the first frame is captured after the second frame, and wherein the change in hand gesture status comprises a transition from a pinch to an unpinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al. does not specify the method further comprising, in response to the determination: adding markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame. Krivoruchko et al. teaches the method further comprising, in response to the determination: adding markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame (paragraphs 0212 and 0274 discuss adding content in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 7, Li et al.
teaches the method of claim 2, wherein the first frame is captured before the second frame, and wherein the change in hand gesture status comprises a transition from an unpinch to a pinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al. does not specify the method further comprising, in response to the determination: adding markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame. Krivoruchko et al. teaches the method further comprising, in response to the determination: adding markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame (paragraphs 0212 and 0274 discuss adding content in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 8, Li et al. teaches the method of claim 7, and obtaining historic hand locations for frames captured between the first frame and the second frame (paragraph 0073 explains the capture of historic hand locations at all of the different time points). Li et al.
does not specify wherein adding markup comprises: adding the markup on the user interface in accordance with the historic hand locations. Krivoruchko et al. teaches wherein adding markup comprises: adding the markup on the user interface in accordance with the historic hand locations (paragraphs 0212 and 0274 discuss adding content in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other, i.e., historic hand locations in other frames). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding independent claim 9, Li et al.
teaches a non-transitory computer readable medium comprising computer readable code executable by one or more processors (paragraphs 0023, 0055, 0099, and 0173 describe the use) to: obtain hand tracking data for a hand based on one or more camera frames (paragraph 0066 explains how the hand detection and tracking subsystem 516 tracks the hand based on a video frame and subsequent frames from a camera of the gesture-controlled device 100); detect, at a first frame, a change in motion characteristics of the hand satisfies a threshold change in motion characteristics (as given in paragraph 0066 where changes in location of hand and threshold to be determined change are detected by the hand detection and tracking subsystem 516) based on the hand tracking data for the hand (paragraph 0066 explains the use of the hand detection and tracking subsystem 516 and allowing changes in the location of the hand within subsequent frames to be interpreted as corresponding changes to the location of the hand); detect, at a second frame, a change in hand gesture status (changes to the location of the hand that are the movement of the hand as given in paragraph 0066) corresponding to a hand gesture associated with a user input action (paragraph 0059 explains that the hand movements are classified as hand gestures that are used as command inputs); and in response to the determination, adjust a user input action in accordance with a hand location at the first frame (paragraph 0066 explains how the hand location is determined based on the actions and gestures of the user and adjusted appropriately). Although implied, Li et al. does not specify the medium comprising computer readable code executable by one or more processors to determine that the second frame is within a threshold time period of the first frame based on the hand tracking data for the hand. Krivoruchko et al. 
teaches the medium comprising computer readable code executable by one or more processors (as given in paragraph 0052) to determine that the second frame is within a threshold time period of the first frame (paragraph 0174 describes the pinch gesture as being changes in hand location in frames within threshold times of each other) based on the hand tracking data for the hand (paragraph 0154 explains that the hand location is determined by the image sensors 314, which are the hand-tracking camera). It would have been obvious to one of ordinary skill in the art before the effective filing date to define the pinch gesture based on time thresholds as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to determine if the gesture is an immediate gesture and to differentiate between different gestures based on the timing (paragraph 0174 of Krivoruchko et al.).

Regarding claim 10, Krivoruchko et al. further teaches the non-transitory computer readable medium of claim 9, wherein the user input action is a markup action comprising adding or removing a markup on a user interface in accordance with the change in motion characteristics (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture; paragraphs 0212 and 0274 discuss adding content in a markup mode; and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture).

Regarding claim 11, Li et al. teaches the non-transitory computer readable medium of claim 10, wherein the first frame is captured after the second frame, and wherein the change in hand gesture status comprises a transition from an unpinch to a pinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al.
does not specify the medium further comprising computer readable code to, in response to the determination: remove markup on the user interface from a location on the user interface associated with the hand location in the first frame to a location on the user interface associated with the hand location in the second frame. Krivoruchko et al. teaches the medium further comprising computer readable code to, in response to the determination: remove markup on the user interface from a location on the user interface associated with the hand location in the first frame to a location on the user interface associated with the hand location in the second frame (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 12, Li et al. teaches the non-transitory computer readable medium of claim 10, wherein the first frame is captured after the second frame, and wherein the change in hand gesture status comprises a transition from a pinch to an unpinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al. does not specify the medium further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame. Krivoruchko et al.
teaches the medium further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame (paragraphs 0212 and 0274 discuss adding content in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 13, Li et al. teaches the non-transitory computer readable medium of claim 10, wherein the first frame is captured before the second frame, and wherein the change in hand gesture status comprises a transition from an unpinch to a pinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al. does not specify the medium further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame. Krivoruchko et al.
teaches the medium further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame (paragraphs 0212 and 0274 discuss adding content in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 14, Li et al. teaches the non-transitory computer readable medium of claim 13, and to obtain historic hand locations for frames captured between the first frame and the second frame (paragraph 0073 explains the capture of historic hand locations at all of the different time points). Li et al. does not specify the medium wherein the computer readable code to add markup comprises computer readable code to: add the markup on the user interface in accordance with the historic hand locations. Krivoruchko et al.
teaches the medium wherein the computer readable code to add markup comprises computer readable code to: add the markup on the user interface in accordance with the historic hand locations (paragraphs 0212 and 0274 discuss adding content in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other, i.e., historic hand locations in other frames). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding independent claim 15, Li et al. teaches a system comprising: one or more processors (paragraphs 0023, 0055, 0099, and 0173 describe the use); and one or more computer readable media comprising computer readable code executable by the one or more processors (paragraphs 0023, 0055, 0099, and 0173 describe the use) to: obtain hand tracking data for a hand based on one or more camera frames (paragraph 0066 explains how the hand detection and tracking subsystem 516 tracks the hand based on a video frame and subsequent frames from a camera of the gesture-controlled device 100); detect, at a first frame, a change in motion characteristics of the hand satisfies a threshold change in motion characteristics (as given in paragraph 0066 where changes in location of hand and threshold to be determined change are detected by the hand detection and tracking subsystem 516) based on the hand tracking data for the hand (paragraph 0066 explains the use of the hand detection and tracking subsystem 516 and allowing changes in the location of the hand within subsequent frames to be interpreted as corresponding changes to the location of the hand); detect, at a second frame, a change in hand gesture status (changes to the location of the hand that are the movement of the hand as given in paragraph 0066) corresponding to a hand gesture associated with a user input action (paragraph 0059 explains that the hand movements are classified as hand gestures that are used as command inputs); and in response to the determination, adjust a user input action in accordance with a hand location at the first frame (paragraph 0066 explains how the hand location is determined based on the actions and gestures of the user and adjusted appropriately). Although implied, Li et al. does not specify the system comprising computer readable code executable by one or more processors to determine that the second frame is within a threshold time period of the first frame based on the hand tracking data for the hand. Krivoruchko et al. teaches the system comprising computer readable code executable by one or more processors (as given in paragraph 0052) to determine that the second frame is within a threshold time period of the first frame (paragraph 0174 describes the pinch gesture as being changes in hand location in frames within threshold times of each other) based on the hand tracking data for the hand (paragraph 0154 explains that the hand location is determined by the image sensors 314, which are the hand-tracking camera). It would have been obvious to one of ordinary skill in the art before the effective filing date to define the pinch gesture based on time thresholds as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to determine if the gesture is an immediate gesture and to differentiate between different gestures based on the timing (paragraph 0174 of Krivoruchko et al.).

Regarding claim 16, Krivoruchko et al.
further teaches the system of claim 15, wherein the user input action is a markup action comprising adding or removing a markup on a user interface in accordance with the change in motion characteristics (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture; paragraphs 0212 and 0274 discuss adding content in a markup mode; and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture).

Regarding claim 17, Li et al. teaches the system of claim 16, wherein the first frame is captured before the second frame, and wherein the change in hand gesture status comprises a transition from a pinch to an unpinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al. does not specify the system further comprising computer readable code to, in response to the determination: remove markup on the user interface from a location on the user interface associated with a hand location in the first frame to a location on the user interface associated with the hand location in the second frame. Krivoruchko et al. teaches the system further comprising computer readable code to, in response to the determination: remove markup on the user interface from a location on the user interface associated with a hand location in the first frame to a location on the user interface associated with the hand location in the second frame (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al.
The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 18, Li et al. teaches the system of claim 16, and to obtain historic hand locations for frames captured between the first frame and the second frame (paragraph 0073 explains the capture of historic hand locations at all of the different time points). Li et al. does not specify the system wherein the computer readable code to remove the markup comprises computer readable code to: remove markup on the user interface in accordance with the historic hand locations. Krivoruchko et al. teaches the system wherein the computer readable code to remove the markup comprises computer readable code to: remove markup on the user interface in accordance with the historic hand locations (paragraph 0236 describes removing the markup of the line 766 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other, i.e., historic hand locations in other frames). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 19, Li et al. teaches the system of claim 16, wherein the first frame is captured after the second frame, and wherein the change in hand gesture status comprises a transition from a pinch to an unpinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al.
does not specify the system further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame. Krivoruchko et al. teaches the system further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame (paragraphs 0212 and 0274 discuss adding content in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of the line 760 based on the pinch gesture, which is described in paragraph 0174 as being changes in hand location in frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Regarding claim 20, Li et al. teaches the system of claim 16, wherein the first frame is captured before the second frame, and wherein the change in hand gesture status comprises a transition from an unpinch to a pinch (paragraph 0065 explains the gesture as going between the pinch open shape 36 and the pinch closed shape 38). Li et al.
does not specify the system further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame. Krivoruchko et al. teaches the system further comprising computer readable code to, in response to the determination: add markup on the user interface between a location on the user interface associated with the hand location in the second frame and a location on the user interface associated with the hand location in the first frame (paragraphs 0212 and 0274 discuss adding markup in a markup mode, and paragraphs 0233 and 0235 describe adding the markup of line 760 based on the pinch gesture, which paragraph 0174 describes as changes in hand location across frames within threshold times of each other). It would have been obvious to one of ordinary skill in the art before the effective filing date to adjust markup based on gesture as taught by Krivoruchko et al. in the system of Li et al. The rationale to combine would be to reduce the number, extent, and/or the nature of the inputs from a user and produce a more efficient human-machine interface (paragraph 0007 of Krivoruchko et al.).

Response to Arguments

Applicant's arguments filed 3/2/26 have been fully considered, but they are not persuasive. Applicant contends that Li is silent regarding a change in hand gesture status based on the hand gesture because there is no discussion of the threshold change in motion characteristics being satisfied to yield an adjustment. The examiner disagrees. Paragraph 0066 explains that the hand is tracked and defined relative to the other objects in the frame, and Krivoruchko et al. is cited for teaching the threshold, as explained above.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. 
The closest prior art is made of record in the attached notice of references cited.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PARUL H GUPTA, whose telephone number is (571) 272-5260. The examiner can normally be reached Monday through Friday, 10 AM to 7 PM.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ke Xiao, can be reached at 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. 
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PARUL H GUPTA/
Primary Examiner, Art Unit 2627
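The mechanism disputed in claims 18-20 (adding markup while a pinch is held, then retroactively removing markup laid down during the lag between an actual unpinch and its detection, keyed by historic hand locations) can be sketched in a few lines. This is a minimal illustration only: the `MarkupTracker` class, the `lag_frames` parameter, and the frame representation are hypothetical and are not drawn from the application or the cited references.

```python
class MarkupTracker:
    """Sketch of pinch-driven markup with lag compensation.

    All names are hypothetical; ``lag_frames`` models the assumed number
    of frames by which gesture-state detection trails the actual gesture.
    """

    def __init__(self, lag_frames=2):
        self.lag_frames = lag_frames
        self.history = []    # historic hand locations, one (x, y) per frame
        self.markup = []     # committed segments: ((x1, y1), (x2, y2))
        self.was_pinched = False

    def process_frame(self, location, pinched):
        self.history.append(location)
        if pinched and len(self.history) >= 2:
            # While pinched (including on an unpinch -> pinch transition),
            # add markup between the prior frame's location and this one.
            self.markup.append((self.history[-2], self.history[-1]))
        if self.was_pinched and not pinched:
            # Pinch -> unpinch detected late: remove markup drawn for frames
            # captured between the actual release and its detection, identified
            # via the historic hand locations recorded for those frames.
            stale = set(self.history[-(self.lag_frames + 1):])
            self.markup = [seg for seg in self.markup if seg[1] not in stale]
        self.was_pinched = pinched
```

Under this sketch, feeding a sequence in which the unpinch is reported two frames late leaves only the segments drawn up to the actual release; the trailing segments are pruned when the transition is finally observed.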

Prosecution Timeline

Sep 03, 2024
Application Filed
Oct 23, 2025
Non-Final Rejection — §103
Mar 02, 2026
Response Filed
Mar 19, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12593588
DISPLAY SUBSTRATE
2y 5m to grant · Granted Mar 31, 2026
Patent 12585342
WRIST-WORN DEVICE CONTROL METHOD, RELATED SYSTEM, AND STORAGE MEDIUM
2y 5m to grant · Granted Mar 24, 2026
Patent 12578913
DISPLAY METHOD, ELECTRONIC DEVICE, AND SYSTEM
2y 5m to grant · Granted Mar 17, 2026
Patent 12579953
DISPLAY APPARATUS, CONTROL MODULE THEREOF AND DRIVE METHOD THEREFOR
2y 5m to grant · Granted Mar 17, 2026
Patent 12579941
PIXEL DRIVING CIRCUIT AND DISPLAY PANEL
2y 5m to grant · Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
61%
Grant Probability
94%
With Interview (+33.0%)
2y 11m
Median Time to Grant
Moderate
PTA Risk
Based on 617 resolved cases by this examiner. Grant probability derived from career allow rate.
