Prosecution Insights
Last updated: April 19, 2026
Application No. 18/629,712

Application Programming Interfaces for Gesture Operations

Non-Final OA (§101, §103)
Filed: Apr 08, 2024
Examiner: DAO, TUAN C.
Art Unit: 2198
Tech Center: 2100 — Computer Architecture & Software
Assignee: Apple Inc.
OA Round: 1 (Non-Final)
Grant Probability: 82% (Favorable)
OA Rounds: 1-2
To Grant: 3y 1m
With Interview: 98%

Examiner Intelligence

Grants 82% — above average
Career Allow Rate: 82% (642 granted / 782 resolved ≈ 82.1%), +27.1% vs TC avg
Interview Lift: strong, +15.6% among resolved cases with interview
Typical Timeline: 3y 1m average prosecution; 38 applications currently pending
Career History: 820 total applications across all art units

Statute-Specific Performance

§101: 18.3% (-21.7% vs TC avg)
§103: 51.8% (+11.8% vs TC avg)
§102: 18.6% (-21.4% vs TC avg)
§112: 5.3% (-34.7% vs TC avg)
Compared against Tech Center average estimates • Based on career data from 782 resolved cases

Office Action

§101 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

DETAILED ACTION

The instant application, Application No. 18/629,712, filed on 04/08/2024, is presented for examination.

Examiner Notes

The examiner cites particular columns and line numbers in the references as applied to the claims below for the applicant's convenience. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. In preparing responses, the applicant is respectfully requested to consider each reference in its entirety as potentially teaching all or part of the claimed invention, as well as the context of each passage as taught by the prior art or discussed by the examiner.

Drawings

The applicant's drawings are acceptable for examination purposes.

Information Disclosure Statement

As required by M.P.E.P. 609, the applicant's Information Disclosure Statement submissions dated 07/16/2025, 05/07/2024, and 05/06/2024 are acknowledged, and the cited references have been considered in the examination of the claims now pending.

Specification Objections

The disclosure is objected to because of the following informality: under the "Related Applications" section, the status of U.S. Patent Application No. 17/945,962, which has since issued as a patent, needs to be updated. Appropriate correction is required.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

"Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title."

Claim 17 is rejected under 35 U.S.C. 101 as directed to non-statutory subject matter. Claim 17 recites "[a] computer-readable storage medium". Under precedential case law, the scope of a recited "computer-readable storage medium" encompasses transitory media such as signals or carrier waves where, as here, the specification does not limit the medium to non-transitory forms. See Ex parte Mewherter, 107 USPQ2d 1857, 1862 (PTAB 2013) (precedential) (holding a recited machine-readable storage medium ineligible under 35 U.S.C. § 101 because it encompassed transitory media). The examiner respectfully suggests amending the claim to recite either "a non-transitory computer-readable storage medium" or "a computer-readable storage device" (emphasis added) to make the claim statutory under 35 USC 101. Claims 18-20 are rejected for the same reasons.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

"A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made."

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103, are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention, in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1, 6, 9, 14 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over US 2005/0024341 to Gillespie et al. ("Gillespie") in view of US 2002/0015064 to Robotham et al. ("Robotham").

As per claim 1, Gillespie discloses a method comprising, at a computer system with one or more display devices (FIGs. 1-2) and one or more input devices (FIGs. 1-2: mouse, pen, keyboard, input sensors, etc.):

displaying, via the one or more display devices, a user interface of a software application that includes one or more views (FIGs. 4 and 16; paragraphs 0034, 0046, 0048, and 0117-0120: "In the iconic mode, the screen displays an image that includes a number of small icons such as pictures or buttons.");

detecting, via the one or more input devices, a user input that comprises one or more input points directed to a respective view of the one or more views (FIGs. 4-5, 8, 13B and 16; paragraphs 0053-0054, 0057, and 0067-0070: "finger motion gestures, voice commands, foot switches, retinal gaze tracking, etc." → gestures applied to icons, buttons, keys, etc.);

in response to detecting the user input, transferring an input start event function call (FIGs. 4 and 16; paragraphs 0052-0054: "These icons suggest a natural action that could be taken when the user taps on the icons, such as opening the associated e-mail reading or appointment scheduling software. Because these icons are located nearer the center of the touch sensing area and could easily be tapped by accident, icons 416 and 418 may be made sensitive to finger taps only when they have been activated by some separate means such as pressing a special function key on keyboard 104." → tapping the e-mail application icon → opening the e-mail application to read an e-mail) through an application programming interface to software associated with the respective view (FIGs. 4 and 16; paragraphs 0117-0120: API 1624);

detecting, via the one or more input devices, a change in the user input, the change corresponding to input movement relative to the respective view (FIGs. 4 and 16; paragraphs 0048 and 0054: "Icon 424 includes a visual slider and 'thumb.' The position of the thumb on the slider reflects the current volume setting. When the touch screen is in the activated state, finger motions within the volume control region can move the thumb to a different location on the slider to adjust the volume level."); and

in response to detecting the change in the user input (FIGs. 4 and 16; paragraphs 0048 and 0054, quoted above), transferring an input changed event function call through the application programming interface to software associated with the respective view (FIGs. 4 and 16; paragraphs 0120-0122: "Software applications running on the computer, represented in FIG. 16 by software applications 1640, 1642, and 1644 in application layer 1606, can use API 1624 to obtain special access to the touch screen. API 1624 exports a variety of touch pad and touch screen commands to the applications in application layer 1606. These commands include requests for information about finger and 'mouse' button activities on the touch sensor, as well as requests to override the cursor motion normally conveyed to pointing device driver 1622 with different cursor motion generated by the application based on finger movements. The API commands also include requests to display or update an icon on the iconic screen image, or to display or update a full-screen auxiliary or pop-up image.").

Gillespie thus discloses transferring event function calls through the application programming interface to software associated with the respective view (FIGs. 4 and 16; paragraphs 0120-0122, quoted above). However, Gillespie does not explicitly disclose that the event function call is an input end function call, or transferring an input end function call after transferring the input changed event function call.

Robotham discloses an input end function call (FIGs. 5-6; paragraphs 0463, 0470-0474, 0476, and 0554: the "end current gesture" function (left mouse button up or pen up) → added to the list of events that is processed) and, after transferring the input changed event function call (FIG. 5, step "change input mode?" 5-3: YES; paragraphs 0342, 0343, 0458, 0461, 0463 and 0476: "If the current gesture should be ended (or pending gesture should be processed), then the 'end current gesture' function (5-6) is performed. This function is further described below in the section 'Ending the Current Gesture'." → gestures are sequences of events → events are added to a list that is processed), transferring the input end function call (FIGs. 5-6; paragraphs 0463, 0470-0474, 0476, and 0554).

It would have been obvious to a person having ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teaching of Robotham with that of Gillespie, because an end event is any event that the client software recognizes as ending the current gesture; for example, a left mouse-button-up or pen-up is commonly used as a selection end event (Robotham, paragraph 0554).

As per claim 6, Gillespie discloses that the user input comprises a plurality of input points (FIGs. 4-5, 8, 13B and 16; paragraphs 0053-0054, 0057, and 0067-0070, quoted above) touching a touch device of the computer system (FIGs. 1-2), and the method includes: in response to detecting the user input, transferring a gesture start event function call (FIGs. 4 and 16; paragraphs 0052-0054, quoted above) through the application programming interface to the software application (FIGs. 4 and 16; paragraphs 0117-0120: API 1624); and, in response to detecting the change in the user input (FIGs. 4 and 16; paragraphs 0048 and 0054, quoted above), transferring a gesture changed event function call through the application programming interface to the software application (FIGs. 4 and 16; paragraphs 0120-0122, quoted above). However, Gillespie does not explicitly disclose that the gesture function call is a gesture end function call, or transferring a gesture end function call after transferring the gesture changed event function call. Robotham discloses a gesture end function call (FIGs. 5-6; paragraphs 0463, 0470-0474, 0476, and 0554, quoted above) and, after transferring the gesture changed event function call (FIG. 5; paragraphs 0342, 0343, 0458, 0461, 0463 and 0476, quoted above), transferring the gesture end function call. It would have been obvious to combine Robotham with Gillespie for the reasons given above for claim 1 (Robotham, paragraph 0554).

As per claims 9 and 14, these are system claims reciting the same limitations as claims 1 and 6, respectively, and are rejected for the same reasons. As per claim 17, it is a medium claim reciting the same limitations as claim 1 and is rejected for the same reasons.

Claims 2-3, 7-8, 10-11, 15-16 and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Gillespie in view of Robotham, as applied to claims 1, 9 and 17, and further in view of US 2009/0135162 to Van De Wijdeven et al. ("Van De").

As per claim 2, Gillespie discloses the input start event function (FIGs. 4 and 16; paragraphs 0052-0054, quoted above) and the input changed event function (FIGs. 4 and 16; paragraphs 0120-0122, quoted above). However, Gillespie does not explicitly disclose that the input start event function call includes a first list of two or more input points detected at a first time, and that the input changed event function call includes a second list of two or more input points detected at a second time that is later than the first time. Van De discloses these limitations (paragraphs 0171 and 0175-0176: the initial positions of two fingers touching the screen are detected and recorded, and the changed positions of the two fingers (zoom in/zoom out) are detected and recorded). It would have been obvious to combine Van De with Gillespie and Robotham because the gesture "can for instance be interpreted as 'enlarge the window on screen to this new size relative to the starting point (of the gestures)' in a desktop environment or 'zoom in on this picture on the position of the starting point (of the gesture), with the zoom factor relative to the distance both fingers have traveled across the screen' in a picture viewer application" (Van De, paragraph 0176).

As per claim 3, Gillespie does not explicitly disclose, in response to detecting the change in the user input, generating a zoom-to-scale setting for a view of the software application displayed via the one or more display devices. Van De discloses this limitation (paragraphs 0171 and 0175-0176: the zoom factor (scale setting) is relative to the distance both fingers have traveled across the screen). It would have been obvious to combine Van De with Gillespie and Robotham for the reasons given above for claim 2 (Van De, paragraph 0176).

As per claim 7, Gillespie does not explicitly disclose determining that the user input corresponds to a gesture, wherein the gesture start event function call indicates that the gesture has started, the gesture changed event function call indicates that the gesture has changed, and the gesture end function call indicates that the gesture has ended. Robotham discloses that the gesture end function call indicates that the gesture has ended (FIGs. 5-6; paragraphs 0463, 0470-0474, 0476, and 0554: the "end current gesture" function (left mouse button up or pen up)). Van De discloses that the gesture start event function call indicates that the gesture has started, and that the gesture changed event function call indicates that the gesture has changed (paragraphs 0171 and 0175-0176, cited above). It would have been obvious to combine Robotham and Van De with Gillespie for the reasons given above for claims 1 and 2.

As per claim 8, Gillespie discloses that transferring the gesture start event function call includes transferring it to a control associated with the respective view, and that transferring the gesture changed event function call includes transferring it to the control associated with the respective view (FIGs. 4 and 16; paragraphs 0120-0122, quoted above). However, Gillespie does not explicitly disclose that the gesture function call is a gesture end function call. Robotham discloses a gesture end function call (FIGs. 5-6; paragraphs 0463, 0470-0474, 0476, and 0554, quoted above), and it would have been obvious to combine Robotham with Gillespie for the reasons given above for claim 1 (Robotham, paragraph 0554).

As per claims 10, 11, 15 and 16, these are system claims reciting the same limitations as claims 2, 3, 7 and 8, respectively, and are rejected for the same reasons. As per claims 18 and 19, these are medium claims reciting the same limitations as claims 2 and 3, respectively, and are rejected for the same reasons.

Claims 4, 12 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Gillespie in view of Robotham, as applied to claims 1, 9 and 17, and further in view of US 2008/0028325 to Ferren et al. ("Ferren").

As per claim 4, Gillespie does not explicitly disclose that the detected change in the user input comprises a rotation to invoke a gesture event function call that initiates a rotation transform on a view of the software application associated with the user input. Ferren discloses this limitation (paragraphs 0031 and 0035). It would have been obvious to combine Ferren with Gillespie and Robotham for the purpose of employing gestures to manipulate the visual data displayed on the gesture collaboration display 50 (Ferren, paragraph 0031). As per claims 12 and 20, these are a system claim and a medium claim, respectively, reciting the same limitations as claim 4, and are rejected for the same reasons.

Claims 5 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Gillespie in view of Robotham and Ferren, as applied to claims 4 and 12, and further in view of US 2004/0130550 to Blanco et al. ("Blanco").

As per claim 5, Gillespie does not explicitly disclose that the rotation transform sets a start rotation angle and an end rotation angle. Blanco discloses this limitation (paragraph 0138). It would have been obvious to combine Blanco with Gillespie, Robotham, and Ferren because, once the current interval is determined via the current time, such an interpolation to determine the property value is a very straightforward, rapid calculation based on a simple formula using the values in the above table and the current time relative to the beginning time (Blanco, paragraph 0138). As per claim 13, it is a system claim reciting the same limitations as claim 5 and is rejected for the same reasons.

Conclusion

The following prior art, made of record but not relied upon, is cited to establish the level of skill in the applicant's art and those arts considered reasonably pertinent to the applicant's disclosure. See MPEP 707.05(c).

US 2007/0150902 to Meyer: the key property of "PTT" or "PTT-like" interfaces is a gesture indicating that the user has finished speaking, which transfers execution to the program; any interface implementing a user gesture that indicates "end of speech" is generically referred to as PTT or PTT-like.

US 2007/0039450 to Ohshima: if the user whose eye contact has already been made by the eye-movement recognition shows a predetermined gesture (sign) indicating the finish of the solo part performance.

US 2007/0002015 to Mohri: input of initialization information is ended by an initialization end gesture operation (e.g., the paper/stone operation is performed as an end operation); it is important that the linear distance connecting the first motion to the last motion be approximately constant and known, e.g., the moving distance of a series of motions from a fully extended hand until the hand reaches the body.

US 2006/0033701 to Wilson: due to the unique properties of flow fields, they can be employed to recognize gestures (e.g., hand gestures) and then apply these gestures to other useful applications such as navigational control.

US 2003/0182279 to Willows: the operator may alternately use a gesture when selecting "notification" 2106, thus entering an acceptance command along with punctuation associated with the gesture and terminating the session 800.

US 2003/0156756 to Gokturk: one or more of the following delimiter functions may be employed: (a) a specific hand gesture to delimit the beginning and/or end of a hand gesture; (b) a person stopping at the end of a gesture that is to be recognized; (c) a person creating a specific sound to delimit the beginning and/or end of a hand gesture; and (d) for computer applications and the like, the user inputting a specific key to delimit the beginning and/or end of the hand gesture.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tuan Dao, whose telephone number is (571) 270-3387. The examiner can normally be reached Monday to Friday from 9 am to 5 pm, and on alternate Fridays. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chat Do, can be reached at (571) 272-3721. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.

/TUAN C DAO/
Primary Examiner, Art Unit 2193
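
The rejected claims describe a three-phase event protocol: a start event when input begins, changed events as the input moves, and an end event when it finishes, each delivered as a function call through an API to the software owning the view. The TypeScript sketch below is a minimal illustration of that flow for readers less familiar with the claim language. Every name in it (GestureAPI, InputPoint, the handler methods) is a hypothetical stand-in, not the application's actual claimed API, Gillespie's API 1624, or Robotham's gesture functions.

```typescript
// Hypothetical sketch of the claimed start/changed/end event flow.
// All identifiers are illustrative assumptions, not the patented API.

interface InputPoint {
  x: number;
  y: number;
  id: number; // distinguishes concurrent touches
}

// Software associated with a view registers these callbacks; the system
// "transfers each event function call" through the API to that software.
interface GestureEventHandlers {
  onInputStart(points: InputPoint[]): void;   // input start event
  onInputChanged(points: InputPoint[]): void; // input movement relative to the view
  onInputEnd(points: InputPoint[]): void;     // input end event (e.g., finger lift)
}

class GestureAPI {
  private handlers = new Map<string, GestureEventHandlers>();

  register(viewId: string, h: GestureEventHandlers): void {
    this.handlers.set(viewId, h);
  }

  // Called by the input subsystem as the user input progresses.
  dispatch(viewId: string, phase: "start" | "changed" | "end", points: InputPoint[]): void {
    const h = this.handlers.get(viewId);
    if (!h) return;
    if (phase === "start") h.onInputStart(points);
    else if (phase === "changed") h.onInputChanged(points);
    else h.onInputEnd(points);
  }
}

// Usage mirroring the claimed sequence: start -> changed -> end.
const api = new GestureAPI();
api.register("photoView", {
  onInputStart: (pts) => console.log("gesture started", pts),
  onInputChanged: (pts) => console.log("gesture changed", pts),
  onInputEnd: (pts) => console.log("gesture ended", pts),
});
api.dispatch("photoView", "start", [{ x: 10, y: 10, id: 0 }]);
api.dispatch("photoView", "changed", [{ x: 15, y: 12, id: 0 }]);
api.dispatch("photoView", "end", [{ x: 15, y: 12, id: 0 }]);
```

In the examiner's mapping above, Gillespie is read as supplying the start and changed dispatch and Robotham the end-of-gesture call; the sketch simply shows why all three phases naturally belong to one API surface, which is the thrust of the §103 combination.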

Prosecution Timeline

Apr 08, 2024 · Application Filed
Apr 25, 2025 · Response after Non-Final Action
Oct 30, 2025 · Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications with similar technology granted by the same examiner

Patent 12602257 · ELECTRONIC DEVICE AND OPERATING METHOD WITH MODEL CO-LOCATION
2y 5m to grant · Granted Apr 14, 2026
Patent 12566648 · METHOD OF PROCESSING AGREEMENT TASK
2y 5m to grant · Granted Mar 03, 2026
Patent 12566627 · PREDICTING THE NEXT BEST COMPRESSOR IN A STREAM DATA PLATFORM
2y 5m to grant · Granted Mar 03, 2026
Patent 12561173 · METHOD FOR DATA PROCESSING AND APPARATUS, AND ELECTRONIC DEVICE
2y 5m to grant · Granted Feb 24, 2026
Patent 12561591 · CLASSIFICATION AND TRANSFORMATION OF SEQUENTIAL EVENT DATA
2y 5m to grant · Granted Feb 24, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 82%
With Interview: 98% (+15.6%)
Median Time to Grant: 3y 1m
PTA Risk: Low
Based on 782 resolved cases by this examiner. Grant probability derived from career allow rate.
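
As a sanity check on the figures above, here is a minimal sketch of one plausible derivation. The additive interview-lift formula is an assumption; only the raw counts (642 granted of 782 resolved) and the +15.6% lift come from this page.

```typescript
// Hypothetical reconstruction of the dashboard's headline numbers.
// Inputs are taken from this page; the additive-lift formula is assumed.
const granted = 642;
const resolved = 782;

const allowRate = granted / resolved;                         // 0.821 -> "82%"
const interviewLift = 0.156;                                  // reported lift
const withInterview = Math.min(1, allowRate + interviewLift); // 0.977 -> "98%"

console.log(`Grant probability: ${(allowRate * 100).toFixed(0)}%`);     // 82%
console.log(`With interview:    ${(withInterview * 100).toFixed(0)}%`); // 98%
```

Under that assumption, the rounded outputs reproduce the displayed 82% and 98% exactly.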
