Prosecution Insights
Last updated: April 19, 2026
Application No. 18/677,150

TECHNIQUES FOR SELECTING TEXT

Final Rejection: §102, §103, §112
Filed: May 29, 2024
Examiner: HOANG, AMY P.
Art Unit: 2143
Tech Center: 2100 (Computer Architecture & Software)
Assignee: Apple Inc.
OA Round: 2 (Final)
Grant Probability: 70% (Favorable)
OA Rounds: 3-4
To Grant: 3y 3m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% (above average; 163 granted / 232 resolved; +15.3% vs TC avg)
Interview Lift: +64.2% (resolved cases with interview)
Avg Prosecution: 3y 3m (typical timeline); 31 applications currently pending
Total Applications: 263 (across all art units)
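A quick sanity check of the headline numbers: the allow rate follows directly from the counts above, and the interview lift is a relative change in allow rate. A minimal sketch in Python, assuming those definitions; the non-interview baseline rate used below is illustrative, not taken from the report:

```python
# Career allow rate from the reported counts.
granted, resolved = 163, 232
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # reported as 70%

# Interview lift = relative improvement in allow rate for interviewed cases.
# The baseline below is an assumed value, chosen only to illustrate the math.
baseline_rate = 0.55
interviewed_rate = baseline_rate * (1 + 0.642)  # consistent with the +64.2% lift
lift = interviewed_rate / baseline_rate - 1
print(f"Interview lift: {lift:+.1%}")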

Statute-Specific Performance

§101: 15.9% (-24.1% vs TC avg)
§103: 46.0% (+6.0% vs TC avg)
§102: 17.0% (-23.0% vs TC avg)
§112: 13.4% (-26.6% vs TC avg)
Tech Center average values are estimates • Based on career data from 232 resolved cases
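The per-statute deltas are simple differences against the Tech Center estimates. A small sketch, assuming delta = examiner rate minus TC average (figures taken from the table above); back-solving shows all four statutes imply the same TC baseline:

```python
# Per-statute allowance rates and deltas vs. the Tech Center average.
examiner_rate = {"101": 0.159, "103": 0.460, "102": 0.170, "112": 0.134}
delta_vs_tc = {"101": -0.241, "103": 0.060, "102": -0.230, "112": -0.266}

for statute, rate in examiner_rate.items():
    tc_avg = rate - delta_vs_tc[statute]  # implied Tech Center average
    print(f"§{statute}: {rate:.1%} ({delta_vs_tc[statute]:+.1%} vs TC avg {tc_avg:.1%})")
```

Each implied average comes out to 40.0%, consistent with a single TC-wide baseline behind all four comparisons.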

Office Action

Rejection bases: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The Amendment filed on 12/15/2025 has been entered. Claim 9 is canceled. Claims 13-28 are added. Claims 1-8 and 10-28 remain pending in the application.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

Claims 1-8 and 10-28 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for pre-AIA the inventor(s), at the time the application was filed, had possession of the claimed invention.
Claims 1, 11 and 12 recite “in accordance with a determination that the second touch location is in a first direction relative to the first touch location, moving the focus indicator to a second indicator location that has a first spatial relationship to the second touch location, wherein moving the focus indicator to the second indicator location that has the first spatial relationship to the second touch location includes moving the focus indicator by an amount that is less than a threshold distance from the second touch location; and in accordance with a determination that the second touch location is in a second direction relative to the first touch location, moving the focus indicator to a third indicator location that has a second spatial relationship, different from the first spatial relationship, to the second touch location, wherein moving the focus indicator to the third indicator location that has the second spatial relationship to the second touch location includes moving the focus indicator by an amount that is greater than the threshold distance from the second touch location;”.

In the Remarks filed on 12/15/2025, Applicant stated that support for the amendment can be found at least in paragraphs [0182]-[0186], [0213], and [0214], as well as at FIGS. 6A-6D of the specification as originally filed. Below are Figs. 6A-6D and [0183]-[0185] from the Specification:

[Figs. 6A-6D: greyscale image (media_image1.png) omitted from this extract]

[0183] FIG. 6B depicts electronic device 600 displaying user interface 604 via touch-sensitive display device 602 at a second time after the first time. In FIG. 6B, user input 610 has moved horizontally to the right relative to where user input 610 was located in FIG. 6A. With such movement, insertion marker 608 is displayed in an area corresponding to user input 610.
In some examples, insertion marker 608 is not displayed while user input 610 is detected (e.g., due to insertion marker 608 not needing to be displayed because a user's finger will be covering where insertion marker 608 would be displayed). While only illustrated as moving to the right, in some examples, insertion marker 608 would remain within the area corresponding to user input 610 when user input 610 is moved to the left and when moved up or down below a threshold.

[0184] FIG. 6C depicts electronic device 600 displaying user interface 604 via touch-sensitive display device 602 at a third time after the second time. In FIG. 6C, user input 610 has moved vertically relative to where user input 610 was located in FIG. 6B (e.g., in an upward direction). With such movement, insertion marker 608 is maintained at a location in words 606 (e.g., between the character “R” and the character “L”). Such functionality can be referred to as line stickiness, meaning that insertion marker 608 remains displayed on a line of text when user input has moved in a vertical direction below a threshold distance (in some examples, a non-zero threshold distance). In some examples, horizontal movement of insertion marker 608 in combination with vertical movement below the threshold distance causes insertion marker 608 to move horizontally in accordance with the horizontal movement.

[0185] FIG. 6D depicts electronic device 600 displaying user interface 604 via touch-sensitive display device 602 at a fourth time after the third time. In FIG. 6D, insertion marker 608 has been moved outside of an area determined to correspond to user input 610, to a location adjacent to the area determined to correspond to user input 610 (e.g., above the area determined to correspond to user input 610). In some examples, the area determined to correspond to user input 610 is determined based on a distance from a center point of user input 610.

Fig. 6B and [0183] seem to describe “in response to detecting the movement of the touch gesture to the second touch location: in accordance with a determination that the second touch location is in a first direction relative to the first touch location, moving the focus indicator to a second indicator location that has a first spatial relationship to the second touch location”. The Examiner has been unable to find any references to “wherein moving the focus indicator to the second indicator location that has the first spatial relationship to the second touch location includes moving the focus indicator by an amount that is less than a threshold distance from the second touch location”. [0183] discloses insertion marker 608 would remain within the area corresponding to user input 610 when user input 610 is moved to the left and when moved up or down below a threshold. It does not indicate the insertion marker 608 would move by an amount that is less than a threshold distance from the second touch location.

Fig. 6D and [0185] seem to describe “and in accordance with a determination that the second touch location is in a second direction relative to the first touch location, moving the focus indicator to a third indicator location that has a second spatial relationship, different from the first spatial relationship, to the second touch location”. The Examiner has been unable to find any references to “wherein moving the focus indicator to the third indicator location that has the second spatial relationship to the second touch location includes moving the focus indicator by an amount that is greater than the threshold distance from the second touch location;” [0185] discloses insertion marker 608 has been moved outside of an area determined to correspond to user input 610, to a location adjacent to the area determined to correspond to user input 610 (e.g., above the area determined to correspond to user input 610).
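For orientation only, the disputed limitation describes direction-dependent marker placement: roughly horizontal movement keeps the marker within a threshold distance of the touch, while sufficient vertical movement offsets it beyond that threshold (e.g., above the finger). A minimal Python sketch of that behavior; the function, threshold value, and offsets here are hypothetical illustrations, not taken from the claims or specification:

```python
# Hypothetical model of the claimed placement logic. All numbers are
# illustrative; coordinates are (x, y) with y increasing downward.
THRESHOLD = 20.0  # assumed threshold distance, in points

def place_marker(first_touch, second_touch):
    """Place the focus indicator based on the direction of movement."""
    dx = second_touch[0] - first_touch[0]
    dy = second_touch[1] - first_touch[1]
    if abs(dx) >= abs(dy):
        # First direction (horizontal): marker stays near the touch,
        # within the threshold distance of the second touch location.
        return (second_touch[0], second_touch[1] + THRESHOLD / 2)
    # Second direction (vertical): marker is offset above the touch by
    # more than the threshold, so the finger does not cover it.
    return (second_touch[0], second_touch[1] - 2 * THRESHOLD)

near = place_marker((100, 200), (160, 205))  # horizontal drag: offset < threshold
far = place_marker((100, 200), (105, 260))   # vertical drag: offset > threshold
```

Under this toy model, the first call leaves the marker 10 points from the touch (less than the threshold) and the second leaves it 40 points away (greater), mirroring the two “spatial relationship” branches the Examiner could not locate in the cited paragraphs.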
In some examples, the area determined to correspond to user input 610 is determined based on a distance from a center point of user input 610. It does not indicate the insertion marker 608 would move by an amount that is greater than the threshold distance from the second touch location.

For the purpose of examination, the Examiner will interpret the limitation “in accordance with a determination that the second touch location is in a first direction relative to the first touch location, moving the focus indicator to a second indicator location that has a first spatial relationship to the second touch location, wherein moving the focus indicator to the second indicator location that has the first spatial relationship to the second touch location includes moving the focus indicator by an amount that is less than a threshold distance from the second touch location; and in accordance with a determination that the second touch location is in a second direction relative to the first touch location, moving the focus indicator to a third indicator location that has a second spatial relationship, different from the first spatial relationship, to the second touch location, wherein moving the focus indicator to the third indicator location that has the second spatial relationship to the second touch location includes moving the focus indicator by an amount that is greater than the threshold distance from the second touch location;” as “in accordance with a determination that the second touch location is in a first direction relative to the first touch location, moving the focus indicator to a second indicator location that has a first spatial relationship to the second touch location; and in accordance with a determination that the second touch location is in a second direction relative to the first touch location, moving the focus indicator to a third indicator location that has a second spatial relationship, different from the first spatial relationship, to the second touch location” as disclosed in paragraphs [0212] and [0213] of the specification. Therefore, claims 1, 11 and 12 are rejected for containing subject matter which was not described in the specification. Claims 2-8, 10, and 13-28 are rejected for failing to cure the deficiency from their respective parent claims.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 11 and 12 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ording et al. (hereinafter Ording), US 20080259040 A1.

Regarding independent claim 1, Ording teaches an electronic device (Fig. 1, 100; [0026]), comprising: a touch-sensitive display device (Fig. 1, 112; [0026]); one or more processors (Fig. 1, 118; [0026]); and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for (Fig. 1, 102; [0031]): displaying, via the touch-sensitive display device, a focus indicator at a first indicator location on the touch-sensitive display device (Fig. 3, 302; [0037] An insertion marker is displayed on a touch screen (302). The insertion marker may be displayed in an application that includes text entry, such as a memo pad, email, or short message service (SMS) application.
In some embodiments, the insertion marker is displayed in a first area (for example, display tray 214) that also includes text entered by the user via a keyboard (for example, keyboard 210) that is located in a second area); while displaying the focus indicator at the first indicator location, detecting a touch gesture at a first touch location that corresponds to the focus indicator (Fig. 3, 304; [0037] A contact on the touch screen, formed by a finger, is detected (304). The finger contact forms a contact area on the touch screen); while continuing to detect the touch gesture on the touch-sensitive display device, detecting movement of the touch gesture to a second touch location that is different from the first touch location (Fig. 3, 308; [0038] A movement of the finger across the touch screen is detected (308)); in response to detecting the movement of the touch gesture to the second touch location: in accordance with a determination that the second touch location is in a first direction relative to the first touch location, moving the focus indicator to a second indicator location that has a first spatial relationship to the second touch location, wherein moving the focus indicator to the second indicator location that has the first spatial relationship to the second touch location includes moving the focus indicator by an amount that is less than a threshold distance from the second touch location (Fig. 3, 310; [0038] In response to the detected movement, the insertion marker and the insertion marker placement aid is moved in accordance with the detected movement (310). Both the insertion marker and the insertion marker placement aid are moved in the general direction of the detected movement. 
For example, if the detected movement is rightward, the insertion marker and the insertion marker placement aid are moved rightward); in accordance with a determination that the second touch location is in a second direction relative to the first touch location, moving the focus indicator to a third indicator location that has a second spatial relationship, different from the first spatial relationship, to the second touch location, wherein moving the focus indicator to the third indicator location that has the second spatial relationship to the second touch location includes moving the focus indicator by an amount that is greater than the threshold distance from the second touch location ([0039] The insertion marker and the insertion marker placement aid moves in accordance with any movement of the finger across the touch screen as long as the finger contact on the touch screen remains unbroken from when the finger contact on the touch screen is detected in block 304); while the focus indicator is at a fourth indicator location, detecting liftoff of the touch gesture; and in response to detecting the liftoff, maintaining display of the focus indicator at the fourth indicator location ([0039] If the contact with the touch screen is broken (and thus the contact with the touch screen is no longer detected), the insertion marker placement aid is removed from display and the insertion marker remains at its last position).

Regarding independent claim 11, it is a medium claim corresponding to the device of claim 1. Therefore, it is rejected for the same reason as claim 1 above.

Regarding independent claim 12, it is a method claim corresponding to the device of claim 1. Therefore, it is rejected for the same reason as claim 1 above.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. Claims 2-4, 10, 13-15, 20, 21-23 and 28 are rejected under 35 U.S.C. 103 as being unpatentable over Ording as applied in claim 1, 11 and 12 in view of CLARKSON et al. (hereinafter CLARKSON), US 20150346998 A1. Regarding dependent claim 2, Ording teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Ording does not explicitly teach wherein the touch-sensitive display has an upper edge and a lower edge, and wherein the one or more programs further include instructions for: in response to detecting the movement of the touch gesture to the second touch location: in accordance with a determination that the second touch location is in a third direction relative to the first touch location that is towards the upper edge of the touch-sensitive display, moving the focus indicator in the third direction to a respective indicator location that is a first predetermined distance closer to the upper edge of the touch-sensitive display than the second touch location. However, in the same field of endeavor, CLARKSON teaches wherein the touch-sensitive display has an upper edge and a lower edge, and ([0010] An example device 100 adapted for touch applications is illustrated in FIG. 1. The device 100 is shown comprising hardware elements that can be electrically coupled via a bus 105 (or may otherwise be in communication, as appropriate). 
The hardware elements may include one or more processors 110, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 115, which include at least a touch input device 116, and can further include without limitation a mouse, a keyboard, and/or the like; and one or more output devices 120, which include at least a display device 121, and can further include without limitation a speaker, a printer, and/or the like. The touch input device 116 and the display device 121 may be combined into a touchscreen; [0026] Example methods, apparatuses, or articles of manufacture presented herein may be implemented, in whole or in part, for use in or with mobile communication devices. As used herein, “mobile device,” “mobile communication device,” “hand-held device,” “tablets,” etc., or the plural form of such terms may be used interchangeably and may refer to any kind of special purpose computing platform or device that may communicate through wireless transmission or receipt of information over suitable communications networks according to one or more communication protocols, and that may from time to time have a position or location that changes. As a way of illustration, special purpose mobile communication devices, may include, for example, cellular telephones, satellite telephones, smart telephones, heat map or radio map generation tools or devices, observed signal parameter generation tools or devices, personal digital assistants (PDAs), laptop computers, personal entertainment systems, e-book readers, tablet personal computers (PC), personal audio or video devices, personal navigation units, or the like. 
Examiner notes that the touchscreens of these devices have an upper edge and a lower edge) wherein the one or more programs further include instructions for ([0013] The device 100 also can comprise software elements, shown as being currently located within the working memory 135, including an operating system 140, device drivers, executable libraries, and/or other code, such as one or more application programs 145, which may comprise or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed below might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods): in response to detecting the movement of the touch gesture to the second touch location: in accordance with a determination that the second touch location is in a third direction relative to the first touch location that is towards the upper edge of the touch-sensitive display, moving the focus indicator in the third direction to a respective indicator location that is a first predetermined distance closer to the upper edge of the touch-sensitive display than the second touch location ([0019] Referring to FIG. 3, an example method 300 for moving a text cursor to a desired location within a pre-existing text body with rotations and/or changes in the tilt of a finger on a touch input device 116 is shown. The user may first place the text cursor 315 at an initial location within the text body 305 by touching the initial location on the touch input device 116.
Of course, this operation may be omitted if a text cursor is already present within the text body and the user merely wishes to change its location. In this scenario, the user may simply place the finger where the existent text cursor is. Then, while keeping the finger in touch with the touch input device 116, the user may move the text cursor 315 to the desired location with rotations and changes in the tilt of the finger. A rotation of the finger may cause the text cursor 315 to move horizontally within the same line of the text, while a change in the tilt of the finger may cause the text cursor 315 to move vertically between the lines. For example, the user may move the text cursor 315 rightward within the same line of text by rotating the finger clockwise, and vice versa, and the use may move the text cursor 315 to the line directly above the line where the text cursor 315 is currently located by increasing the angle between the finger and the touch surface of the touch input device 116, and vice versa). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of moving a text cursor utilizing touch object orientation with a touch user interface as suggested in CLARKSON into Ording’s system because both of these systems are addressing moving the text cursor according to a touch movement. This modification would have been motivated by the desire to facilitate location selections within a text body on a touch user interface with greater precision than is possible with conventional touch inputs (CLARKSON, [0016]). Regarding dependent claim 3, the combination of Ording and CLARKSON teaches all the limitations as set forth in the rejection of claim 2 that is incorporated. 
CLARKSON further teaches wherein the one or more programs further include instructions for: while displaying the focus indicator at the third indicator location, detecting movement of the touch gesture in a fifth direction parallel to the upper edge of the touch-sensitive display to a third touch location; and in response to detecting movement of the touch gesture to the third touch location, moving the focus indicator in the fifth direction to a seventh indicator location that is the first predetermined distance closer to the upper edge of the touch-sensitive display than the third touch location ([0020] In another embodiment, the user may perform the rotation and/or change in the tilt of the finger operations within a predetermined area of the touch user interface. The user may place the text cursor at an initial location within a text body by a simple touch at the location. Alternatively, a text cursor may already be present and its location may be considered as the initial location. In yet another embodiment, the initial location of the text cursor may be supplied by the system, either randomly or at a predetermined location, after it has been detected that the user has started performing rotation and/or change in the tilt of the finger operations within the predetermined area of the touch user interface. The user may then move the text cursor to a desired location within the text body by performing the rotation and/or change in the tilt of the finger operations while keeping the finger in contact with the touch device within the predetermined area of the touch user interface. The text cursor may move in the same way in response to the user's operations as described above: a rotation of the finger may cause the text cursor to move horizontally within the same line of the text, while a change in the tilt of the finger may cause the text cursor to move vertically between the lines). 
Regarding dependent claim 4, Ording teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Ording does not explicitly teach wherein the touch-sensitive display has an upper edge and a lower edge, and wherein the one or more programs further include instructions for: in response to detecting the movement of the touch gesture to the second touch location: in accordance with a determination that the second touch location is in a fourth direction relative to the first touch location that is towards the lower edge of the touch-sensitive display and a determination that the second touch location is greater than a second predetermined distance from the first touch location, moving the focus indicator to a fifth indicator location that is along an axis perpendicular to the upper edge of the touch-sensitive display and that is a third predetermined distance closer to the upper edge than the second touch location; and in accordance with a determination that the second touch location is in the fourth direction and a determination that the second touch location is less than the second predetermined distance from the first touch location, moving the focus indicator in the fourth direction to a sixth indicator location that is less than the third predetermined distance closer to the upper edge than the second touch location. However, in the same field of endeavor, CLARKSON teaches wherein the touch-sensitive display has an upper edge and a lower edge ([0010] An example device 100 adapted for touch applications is illustrated in FIG. 1. The device 100 is shown comprising hardware elements that can be electrically coupled via a bus 105 (or may otherwise be in communication, as appropriate). 
The hardware elements may include one or more processors 110, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 115, which include at least a touch input device 116, and can further include without limitation a mouse, a keyboard, and/or the like; and one or more output devices 120, which include at least a display device 121, and can further include without limitation a speaker, a printer, and/or the like. The touch input device 116 and the display device 121 may be combined into a touchscreen; [0026] Example methods, apparatuses, or articles of manufacture presented herein may be implemented, in whole or in part, for use in or with mobile communication devices. As used herein, “mobile device,” “mobile communication device,” “hand-held device,” “tablets,” etc., or the plural form of such terms may be used interchangeably and may refer to any kind of special purpose computing platform or device that may communicate through wireless transmission or receipt of information over suitable communications networks according to one or more communication protocols, and that may from time to time have a position or location that changes. As a way of illustration, special purpose mobile communication devices, may include, for example, cellular telephones, satellite telephones, smart telephones, heat map or radio map generation tools or devices, observed signal parameter generation tools or devices, personal digital assistants (PDAs), laptop computers, personal entertainment systems, e-book readers, tablet personal computers (PC), personal audio or video devices, personal navigation units, or the like. 
Examiner notes that the touchscreens of these devices have an upper edge and a lower edge), and wherein the one or more programs further include instructions for ([0013] The device 100 also can comprise software elements, shown as being currently located within the working memory 135, including an operating system 140, device drivers, executable libraries, and/or other code, such as one or more application programs 145, which may comprise or may be designed to implement methods, and/or configure systems, provided by other embodiments, as described herein. Merely by way of example, one or more procedures described with respect to the method(s) discussed below might be implemented as code and/or instructions executable by a computer (and/or a processor within a computer); in an aspect, then, such code and/or instructions can be used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods): in response to detecting the movement of the touch gesture to the second touch location: in accordance with a determination that the second touch location is in a fourth direction relative to the first touch location that is towards the lower edge of the touch-sensitive display and a determination that the second touch location is greater than a second predetermined distance from the first touch location, moving the focus indicator to a fifth indicator location that is along an axis perpendicular to the upper edge of the touch-sensitive display and that is a third predetermined distance closer to the upper edge than the second touch location ([0019] Referring to FIG. 3, an example method 300 for moving a text cursor to a desired location within a pre-existing text body with rotations and/or changes in the tilt of a finger on a touch input device 116 is shown.
The user may first place the text cursor 315 at an initial location within the text body 305 by touching the initial location on the touch input device 116. Of course, this operation may be omitted if a text cursor is already present within the text body and the user merely wishes to change its location. In this scenario, the user may simply place the finger where the existent text cursor is. Then, while keeping the finger in touch with the touch input device 116, the user may move the text cursor 315 to the desired location with rotations and changes in the tilt of the finger. A rotation of the finger may cause the text cursor 315 to move horizontally within the same line of the text, while a change in the tilt of the finger may cause the text cursor 315 to move vertically between the lines. For example, the user may move the text cursor 315 rightward within the same line of text by rotating the finger clockwise, and vice versa, and the use may move the text cursor 315 to the line directly above the line where the text cursor 315 is currently located by increasing the angle between the finger and the touch surface of the touch input device 116, and vice versa; and in accordance with a determination that the second touch location is in the fourth direction and a determination that the second touch location is less than the second predetermined distance from the first touch location, moving the focus indicator in the fourth direction to a sixth indicator location that is less than the third predetermined distance closer to the upper edge than the second touch location ([0020] In another embodiment, the user may perform the rotation and/or change in the tilt of the finger operations within a predetermined area of the touch user interface. The user may place the text cursor at an initial location within a text body by a simple touch at the location. Alternatively, a text cursor may already be present and its location may be considered as the initial location. 
In yet another embodiment, the initial location of the text cursor may be supplied by the system, either randomly or at a predetermined location, after it has been detected that the user has started performing rotation and/or change in the tilt of the finger operations within the predetermined area of the touch user interface. The user may then move the text cursor to a desired location within the text body by performing the rotation and/or change in the tilt of the finger operations while keeping the finger in contact with the touch device within the predetermined area of the touch user interface. The text cursor may move in the same way in response to the user's operations as described above: a rotation of the finger may cause the text cursor to move horizontally within the same line of the text, while a change in the tilt of the finger may cause the text cursor to move vertically between the lines). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of moving a text cursor utilizing touch object orientation with a touch user interface as suggested in CLARKSON into Ording’s system because both of these systems are addressing moving the text cursor according to a touch movement. This modification would have been motivated by the desire to facilitate location selections within a text body on a touch user interface with greater precision than is possible with conventional touch inputs (CLARKSON, [0016]). Regarding dependent claim 10, Ording teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Ording does not explicitly teach wherein the touch-sensitive display has an upper edge and a lower edge wherein the third indicator location is closer to the upper edge of the touch-sensitive display than the second touch location. 
However, in the same field of endeavor, CLARKSON teaches wherein the touch-sensitive display has an upper edge and a lower edge ([0010] An example device 100 adapted for touch applications is illustrated in FIG. 1. The device 100 is shown comprising hardware elements that can be electrically coupled via a bus 105 (or may otherwise be in communication, as appropriate). The hardware elements may include one or more processors 110, including without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, and/or the like); one or more input devices 115, which include at least a touch input device 116, and can further include without limitation a mouse, a keyboard, and/or the like; and one or more output devices 120, which include at least a display device 121, and can further include without limitation a speaker, a printer, and/or the like. The touch input device 116 and the display device 121 may be combined into a touchscreen; [0026] Example methods, apparatuses, or articles of manufacture presented herein may be implemented, in whole or in part, for use in or with mobile communication devices. As used herein, “mobile device,” “mobile communication device,” “hand-held device,” “tablets,” etc., or the plural form of such terms may be used interchangeably and may refer to any kind of special purpose computing platform or device that may communicate through wireless transmission or receipt of information over suitable communications networks according to one or more communication protocols, and that may from time to time have a position or location that changes. 
As a way of illustration, special purpose mobile communication devices, may include, for example, cellular telephones, satellite telephones, smart telephones, heat map or radio map generation tools or devices, observed signal parameter generation tools or devices, personal digital assistants (PDAs), laptop computers, personal entertainment systems, e-book readers, tablet personal computers (PC), personal audio or video devices, personal navigation units, or the like. Examiner notes that the touchscreens of these devices have an upper edge and a lower edge) wherein the third indicator location is closer to the upper edge of the touch-sensitive display than the second touch location ([0019] Referring to FIG. 3, an example method 300 for moving a text cursor to a desired location within a pre-existing text body with rotations and/or changes in the tilt of a finger on a touch input device 116 is shown. The user may first place the text cursor 315 at an initial location within the text body 305 by touching the initial location on the touch input device 116. Of course, this operation may be omitted if a text cursor is already present within the text body and the user merely wishes to change its location. In this scenario, the user may simply place the finger where the existent text cursor is. Then, while keeping the finger in touch with the touch input device 116, the user may move the text cursor 315 to the desired location with rotations and changes in the tilt of the finger. A rotation of the finger may cause the text cursor 315 to move horizontally within the same line of the text, while a change in the tilt of the finger may cause the text cursor 315 to move vertically between the lines. 
For example, the user may move the text cursor 315 rightward within the same line of text by rotating the finger clockwise, and vice versa, and the user may move the text cursor 315 to the line directly above the line where the text cursor 315 is currently located by increasing the angle between the finger and the touch surface of the touch input device 116, and vice versa). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of moving a text cursor utilizing touch object orientation with a touch user interface as suggested in CLARKSON into Ording’s system because both of these systems are addressing moving the text cursor according to a touch movement. This modification would have been motivated by the desire to facilitate location selections within a text body on a touch user interface with greater precision than is possible with conventional touch inputs (CLARKSON, [0016]). Regarding dependent claim 13, it is a medium claim that corresponds to the device of claim 2. Therefore, it is rejected for the same reason as claim 2 above. Regarding dependent claim 14, it is a medium claim that corresponds to the device of claim 3. Therefore, it is rejected for the same reason as claim 3 above. Regarding dependent claim 15, it is a medium claim that corresponds to the device of claim 4. Therefore, it is rejected for the same reason as claim 4 above. Regarding dependent claim 20, it is a medium claim that corresponds to the device of claim 10. Therefore, it is rejected for the same reason as claim 10 above. Regarding dependent claim 21, it is a method claim that corresponds to the device of claim 2. Therefore, it is rejected for the same reason as claim 2 above. Regarding dependent claim 22, it is a method claim that corresponds to the device of claim 3. Therefore, it is rejected for the same reason as claim 3 above. 
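As an illustrative aside (not part of the Office Action or of either cited reference), the rotation/tilt cursor mapping that CLARKSON [0019] describes — clockwise rotation moves the cursor right within a line, increasing finger tilt moves it up a line — can be sketched in a few lines of Python. All function names and step thresholds below are hypothetical assumptions introduced only for illustration:

```python
# Hypothetical sketch of the CLARKSON [0019] cursor mapping. The step
# sizes are illustrative assumptions, not values from the reference.
ROTATION_STEP_DEG = 15.0  # degrees of finger rotation per one-column move
TILT_STEP_DEG = 10.0      # degrees of tilt change per one-line move

def move_cursor(line, col, delta_rotation_deg, delta_tilt_deg, text_lines):
    """Return the new (line, col) after a rotation/tilt gesture delta."""
    # Clockwise rotation (positive delta) moves right; counter-clockwise, left.
    col += int(delta_rotation_deg / ROTATION_STEP_DEG)
    # Increasing the finger/surface angle (positive delta) moves up one line.
    line -= int(delta_tilt_deg / TILT_STEP_DEG)
    # Clamp to the bounds of the text body.
    line = max(0, min(line, len(text_lines) - 1))
    col = max(0, min(col, len(text_lines[line])))
    return line, col
```

Under this sketch, a 30° clockwise rotation moves the cursor two characters right, and a 10° increase in tilt moves it one line up, matching the horizontal/vertical split the reference describes.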
Regarding dependent claim 23, it is a method claim that corresponds to the device of claim 4. Therefore, it is rejected for the same reason as claim 4 above. Regarding dependent claim 28, it is a method claim that corresponds to the device of claim 10. Therefore, it is rejected for the same reason as claim 10 above. Claims 5-6, 16-17 and 24-25 are rejected under 35 U.S.C. 103 as being unpatentable over Ording as applied to claims 1, 11 and 12 in view of THORSANDER (hereinafter THORSANDER), US 20130285928 A1. Regarding dependent claim 5, Ording teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Ording does not explicitly teach the touch-sensitive display has an upper edge and a lower edge; the first direction is parallel to the upper edge of the touch-sensitive display; the first spatial relationship includes being a first distance from the second touch location; and the one or more programs further include instructions for: while displaying the focus indicator at the second indicator location, detecting movement of the touch gesture in a sixth direction that is along an axis perpendicular to the upper edge of the touch-sensitive display to a fourth touch location; in response to detecting movement of the touch gesture to the fourth touch location, moving the focus indicator in the sixth direction to an eighth indicator location that is greater than the first distance from the fourth touch location. However, in the same field of endeavor, THORSANDER teaches the touch-sensitive display has an upper edge and a lower edge (Figs. 1-2, 118; [0022] The processor 102 interacts with other components, such as a Random Access Memory (RAM) 108, memory 110, a touch-sensitive display 118, one or more actuators 120, one or more force sensors 122, an auxiliary input/output (I/O) subsystem 124, a data port 126, a speaker 128, a microphone 130, short-range communications 132 and other device subsystems 134. 
The touch-sensitive display 118 includes a display 112 and touch sensors 114 that are coupled to at least one controller 116 that is utilized to interact with the processor 102. Input via a graphical user interface is provided via the touch-sensitive display 118. Information, such as text, characters, symbols, images, icons, and other items that may be displayed or rendered on a electronic device, is displayed on the touch-sensitive display 118 via the processor 102. The processor 102 may also interact with an accelerometer 136 that may be utilized to detect direction of gravitational forces or gravity-induced reaction forces); the first direction is parallel to the upper edge of the touch-sensitive display ([0065] One proposed solution is illustrated in FIG. 7C. On performing a touch interaction with the selection handle 730, the contents of the display may move 780 such that the text in close proximity to the touched selection handle 730 is not obscured by the touch input object 760. In this way, the user may be able to view the content just selected. Also, or instead of this movement 780, an extended selection handle 777 may appear. This extended selection handle 777 may provide a graphical link between the point of touch on the touch-sensitive display 118 and the corresponding end of a selection area 720. The touch point may not only be graphically coupled to the end of the selection area 720 by the extended selection handle 777, but may also be operatively coupled to it. Therefore, if the touch point moves (for example, because the user 760 performs a drag while still touching on the selection handle 777), the corresponding end of the selection area 720 may move as well); the first spatial relationship includes being a first distance from the second touch location ([0065] This extended selection handle 777 may provide a graphical link between the point of touch on the touch-sensitive display 118 and the corresponding end of a selection area 720. 
The touch point may not only be graphically coupled to the end of the selection area 720 by the extended selection handle 777, but may also be operatively coupled to it); and the one or more programs further include instructions for: while displaying the focus indicator at the second indicator location, detecting movement of the touch gesture in a sixth direction that is along an axis perpendicular to the upper edge of the touch-sensitive display to a fourth touch location ([0065] if the touch point moves (for example, because the user 760 performs a drag while still touching on the selection handle 777), the corresponding end of the selection area 720 may move as well); in response to detecting movement of the touch gesture to the fourth touch location, moving the focus indicator in the sixth direction to an eighth indicator location that is greater than the first distance from the fourth touch location ([0066] FIG. 8A shows a more detailed view of the extended selection handle 777. On this extended selection handle 777 there may be a touch portion 830 (also referred to as a `handle`), a neck portion 820 (also referred to as a `cursor neck`) and a content selection portion 810 (also referred to as a `content selection portion`). The touch portion 830 may be the portion of the selection handle 777 that responds to user input and can be touched and dragged to cause the rest of the selection handle 777 to be moved. If a touch input is used to drag the selection handle 777, the touch portion may remain coupled to the location on the display corresponding to the detected touch location such that it always remains under the user's finger as the selection handle 777 is moved; [0067] The content selection portion 810 may be coupled to a selection area 720. For example, as shown in FIG. 8B, the content selection portion 810 is coupled to a start end of the selection area 720 such that as the selection handle 777 moves, as does the start of the selection area 720. 
This may be represented graphically in a different way, such as shown in FIG. 8C, where the content selection portion 810 is also coupled to the start end of the selection area 720, but is displayed to reach the top left portion of the selection area 720. Functionally, there may be no difference between the two selection handles 777 shown in FIG. 8B and 8C; [0068] The neck portion 820 graphically connects the touch portion 830 to the content selection portion 810. While the touch portion 830 may be obscured by a user's touch, the user may be able to see the neck portion 820 extending from the touch portion 830 (under the user's finger) to the content selection portion 810. This may indicate to the user that the touch portion 830 and content selection portion 810 are connected, and that by dragging the touch portion 830, the content selection portion 810 will also be moved. Referring back to FIG. 7C, although the part of the selection area 720 that the user touched has moved away, because the extended selection handle 777 has been displayed the user will see a connection between where they originally pressed and where the corresponding selection area 720 has now moved to. The extended selection handle 777 may be displayed as an animation, showing a transformation of the original selection handle 730 to the extended selection handle 777. Such an animation may be a neck portion extending out of the original selection handle 730 at the same rate as the underlying content moves up 780). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of the movement of content in response to a touch input as suggested in THORSANDER into Ording’s system because both of these systems are addressing moving the text cursor according to a touch movement. This modification would have been motivated by the desire to improve on existing selection handle technology (THORSANDER, [0062]). 
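As a further illustrative aside (again, not part of the prosecution record), the behavior the action maps to THORSANDER's extended selection handle ([0065], [0069]-[0071]) — a focus indicator held a "neck length" above the touch point, with the length varying by drag speed and font size within minimum and maximum bounds — can be sketched as follows. The coefficients, bounds, and function names are purely illustrative assumptions:

```python
# Hypothetical sketch of THORSANDER's extended-selection-handle offset
# ([0069]-[0071]): the neck length grows with drag speed and font size but
# stays within min/max bounds. All constants here are assumed values.
NECK_MIN_PX = 20.0
NECK_MAX_PX = 80.0

def neck_length(drag_speed_px_s, font_size_px):
    """Neck length as an assumed linear function of speed and font size."""
    length = 0.05 * drag_speed_px_s + 1.5 * font_size_px
    return max(NECK_MIN_PX, min(NECK_MAX_PX, length))

def indicator_location(touch_x, touch_y, drag_speed_px_s, font_size_px):
    # The content-selection end sits a neck-length above the touch point
    # (toward the upper edge), so the finger does not obscure the selection.
    return touch_x, touch_y - neck_length(drag_speed_px_s, font_size_px)
```

In this sketch a stationary touch keeps the indicator at the minimum offset above the finger, while a fast drag lets the neck extend toward its maximum, consistent with the reference's statement that the neck "extends faster than the finger moves."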
Regarding dependent claim 6, Ording teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Ording does not explicitly teach wherein the first spatial relationship includes: in accordance with a determination that the movement of the touch gesture to the second touch location has a first speed and a first movement distance, the second indicator location being a fourth predetermined distance from the second touch location; and in accordance with a determination that the movement of the touch gesture to the second touch location has a second speed, less than the first speed, and the first movement distance, the second indicator location being a fifth predetermined distance from the second touch location that is greater than the fourth predetermined distance. However, in the same field of endeavor, THORSANDER teaches wherein the first spatial relationship includes: in accordance with a determination that the movement of the touch gesture to the second touch location has a first speed and a first movement distance, the second indicator location being a fourth predetermined distance from the second touch location (Figs. 9A-9B; [0069]-[0071] As the extended selection handle is moved, the neck portion may extend 977 to increase the distance between the touch portion and the content selection portion. In other words, to prevent the physical location of the touch object `catching up` with the coupled part of the selection area 935, the neck portion extends faster than the finger moves. This extension may also cater for a changed angle of the user's finger. The length of the neck portion may change dynamically depending on factors including the speed of the drag, position of the selection area with respect to the edges of the screen, the detected angle of the user's finger and the size of the font of the content being selected. 
The neck portion may have a maximum length and it may have a minimum length); and in accordance with a determination that the movement of the touch gesture to the second touch location has a second speed, less than the first speed, and the first movement distance, the second indicator location being a fifth predetermined distance from the second touch location that is greater than the fourth predetermined distance (Figs. 9A-9B; [0069]-[0071] As the extended selection handle is moved, the neck portion may extend 977 to increase the distance between the touch portion and the content selection portion. In other words, to prevent the physical location of the touch object `catching up` with the coupled part of the selection area 935, the neck portion extends faster than the finger moves. This extension may also cater for a changed angle of the user's finger. The length of the neck portion may change dynamically depending on factors including the speed of the drag, position of the selection area with respect to the edges of the screen, the detected angle of the user's finger and the size of the font of the content being selected. The neck portion may have a maximum length and it may have a minimum length). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of the movement of content in response to a drag speed as suggested in THORSANDER into Ording’s system because both of these systems are addressing moving the text cursor according to a touch movement. This modification would have been motivated by the desire to improve on existing selection handle technology (THORSANDER, [0062]). Regarding dependent claim 16, it is a medium claim that corresponds to the device of claim 5. Therefore, it is rejected for the same reason as claim 5 above. Regarding dependent claim 17, it is a medium claim that corresponds to the device of claim 6. 
Therefore, it is rejected for the same reason as claim 6 above. Regarding dependent claim 24, it is a method claim that corresponds to the device of claim 5. Therefore, it is rejected for the same reason as claim 5 above. Regarding dependent claim 25, it is a method claim that corresponds to the device of claim 6. Therefore, it is rejected for the same reason as claim 6 above. Claims 7-8, 18-19 and 26-27 are rejected under 35 U.S.C. 103 as being unpatentable over Ording as applied to claims 1, 11 and 12, in view of Ruiz et al. (hereinafter Ruiz), US 20160274686 A1. Regarding dependent claim 7, Ording teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Ording further teaches wherein: displaying the focus indicator at the first indicator location includes displaying a first text content (Fig. 3; [0037] An insertion marker is displayed on a touch screen (302). The insertion marker may be displayed in an application that includes text entry, such as a memo pad, email, or short message service (SMS) application. In some embodiments, the insertion marker is displayed in a first area (for example, display tray 214) that also includes text entered by the user via a keyboard (for example, keyboard 210) that is located in a second area); the first indicator location corresponds to a first text insertion position at a first text insertion location in the first text content (Fig. 4; [0040] An insertion marker 402 may be displayed in the display tray 214 to indicate the location where the next entered character will be inserted); and the focus indicator is moved to the second indicator location in response to detecting the movement of the touch gesture to the second touch location ([0045] While the finger 212 is still in contact with the touch screen 208, the user may move the finger 212 across the touch screen 208, and thus moving the contact area 404 in the process. As shown in FIG. 
4E, the insertion marker placement aid 406 and the insertion marker 402 move along with the contact area 404. The insertion marker 402, which was at the end of the text 401, is now at a position that is closer to the middle of the text 401); the one or more programs further include instructions for: in response to detecting the movement of the touch gesture to the second touch location: moving the first text insertion position from the first text insertion location to a second text insertion location ([0045] While the finger 212 is still in contact with the touch screen 208, the user may move the finger 212 across the touch screen 208, and thus moving the contact area 404 in the process. As shown in FIG. 4E, the insertion marker placement aid 406 and the insertion marker 402 move along with the contact area 404. The insertion marker 402, which was at the end of the text 401, is now at a position that is closer to the middle of the text 401). Ording does not explicitly teach in accordance with a determination that the second indicator location is separated from the second text insertion location by greater than a fifth predetermined distance, displaying a visual indication of the text insertion position at the second text insertion location; and in accordance with a determination that the second indicator location is separated from the second text insertion location by less than the fifth predetermined distance, forgoing displaying a visual indication of the text insertion position at the second text insertion location. 
However, in the same field of endeavor, Ruiz teaches in accordance with a determination that the second indicator location is separated from the second text insertion location by greater than a fifth predetermined distance, displaying a visual indication of the text insertion position at the second text insertion location ([0026] In accordance with some embodiments, a method of cursor manipulation is performed at a portable multifunction device including one or more processors, memory, and a touch screen display. The method includes: displaying text on the display; displaying a cursor at a line within the text; detecting a two-finger swipe gesture on the touch screen display in a direction at least partially parallel to the line and towards an edge of the touch screen display; and in response to detecting the two-finger swipe gesture moving the cursor to a distal point of the text. For example, an end or beginning of a line or a top or bottom of a page or document; [0028] In accordance with some embodiments, the distal point of the text is at a location in the direction of the gesture. For example, an end or beginning of a line or a top or bottom of a page or document; [0029] In accordance with some embodiments, moving the cursor to the distal point of the text includes moving the cursor to a beginning or an end of the line of the text or a beginning or an end of the text (e.g., the top or bottom of a document or page) in accordance with the direction of the two-finger swipe); and in accordance with a determination that the second indicator location is separated from the second text insertion location by less than the fifth predetermined distance, forgoing displaying a visual indication of the text insertion position at the second text insertion location ([0034] In accordance with some embodiments, the method further places a cursor in the closest space after a preceding word when the distance is less than the predetermined threshold distance). 
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of manipulating a cursor within the electronic document presented on a display as suggested in Ruiz into Ording’s system because both of these systems are addressing moving the text cursor according to a touch movement. This modification would have been motivated by the desire to provide a simple and intuitive way to manipulate the cursor for content selection and editing (Ruiz, [0004]). Regarding dependent claim 8, Ording teaches all the limitations as set forth in the rejection of claim 1 that is incorporated. Ording does not explicitly teach wherein the one or more programs further include instructions for: displaying, via the touch-sensitive display device, content; and detecting a second touch gesture starting at a fifth touch location that includes a component of movement; in response to detecting the second touch gesture: in accordance with a determination that the fifth touch location does not correspond to the focus indicator, scrolling the content; and in accordance with a determination that the fifth touch location corresponds to the focus indicator, forgoing scrolling the content. However, in the same field of endeavor, Ruiz teaches wherein the one or more programs further include instructions for ([0009]): displaying, via the touch-sensitive display device, content ([0387] FIG. 4B is a flow chart illustrating a method 470 of cursor manipulation with a single touch input (e.g., a single-finger touch input), in accordance with some embodiments. The method 470 is performed at a portable multifunction device (e.g., the device 100 in FIG. 1A) with a touch screen display (e.g., the touch screen display 112 in FIG. 1A). 
As described below, the method 470 provides an expedient mechanism for moving the cursor, selecting the content, and providing options to edit the content at a portable multifunction device with a touch screen display; [0388] In some embodiments, the device 100 displays content of an electronic document on the touch screen display 112. In some embodiments, the content comprises text (e.g., plain text, unstructured text, formatted text, or text in a web page)); and detecting a second touch gesture starting at a fifth touch location that includes a component of movement ([0408] Still referring to FIG. 4B, if the device 100 determines (at 480) that the single-finger touch input is a drag gesture); in response to detecting the second touch gesture: in accordance with a determination that the fifth touch location does not correspond to the focus indicator, scrolling the content ([0388] In some embodiments, the device 100 displays content of an electronic document on the touch screen display 112. In some embodiments, the content comprises text (e.g., plain text, unstructured text, formatted text, or text in a web page). In other embodiments, the content comprises graphics with or without text. Moreover, the content may be editable or read-only. In addition to displaying the content, when no content is selected, the device 100 may display a cursor within the electronic document. In some embodiments, while displaying the content of the electronic document, the device 100 detects a single-finger touch input (e.g., a single-finger tap) at 472. The portable multifunction device 100 then determines whether prior to detecting the touch input, there is an existing selection of the content at 474; [0389] If the device 100 detects an existing selection (474—Yes), then the device 100 further determines if the single-finger touch input is located on the selection at 475. 
If the device 100 determines that the single-finger touch input is not located on the selection (475—No), then device 100 dismisses the selection at 476 and proceeds to step 480; [0408] the device 100 scrolls the content displayed in the content region of the touch screen display 112 in accordance with the direction of the drag at 488. For example, an upward drag scrolls the content upward, a downward drag scrolls the content downward, a left drag scrolls the content to the left, and likewise, a right drag scrolls the content to the right); and in accordance with a determination that the fifth touch location corresponds to the focus indicator, forgoing scrolling the content ([0389] if the device 100 determines that the single-finger touch input is located on the selection (475—Yes), the device 100 determines the type of the touch input at 477; [0390] If the device 100 determines the type of single-finger touch input is tap (down), lift (up), tap (down), and a drag (477—Tap-half-drag), the device 100 at 487 first dismisses the selection, then performs actions similar to those performed at step 471, such as selecting a word closest to the touch input and expanding the selection while dragging). It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teaching of manipulating a cursor within the electronic document presented on a display as suggested in Ruiz into Ording’s system because both of these systems are addressing moving the text cursor according to a touch movement. This modification would have been motivated by the desire to provide a simple and intuitive way to manipulate the cursor for content selection and editing (Ruiz, [0004]). Regarding dependent claim 18, it is a medium claim that corresponds to the device of claim 7. 
Therefore, it is rejected for the same reason as claim 7 above. Regarding dependent claim 19, it is a medium claim that corresponds to the device of claim 8. Therefore, it is rejected for the same reason as claim 8 above. Regarding dependent claim 26, it is a method claim that corresponds to the device of claim 7. Therefore, it is rejected for the same reason as claim 7 above. Regarding dependent claim 27, it is a method claim that corresponds to the device of claim 8. Therefore, it is rejected for the same reason as claim 8 above. Response to Arguments Applicant's arguments filed 12/15/2025 have been fully considered. In the Remarks, Applicant alleges that amended independent claims 1, 11, and 12 clarify that "moving the focus indicator to the second indicator location that has the first spatial relationship to the second touch location includes moving the focus indicator by an amount that is less than a threshold distance from the second touch location" and "moving the focus indicator to the third indicator location that has the second spatial relationship to the second touch location includes moving the focus indicator by an amount that is greater than the threshold distance from the second touch location." (Emphasis added). Applicant respectfully asserts that Ording does not disclose at least these newly added features of amended independent claims 1, 11, and 12. Examiner respectfully disagrees. Newly added features of amended independent claims 1, 11, and 12 are rejected for containing subject matter which was not described in the specification. Thus, claims 1, 11 and 12 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement as set forth above. Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Applicant is required under 37 C.F.R. § 1.111(c) to consider these references fully when responding to this action. 
Ronkainen (US 20140157201 A1) discloses an apparatus configured to provide a two-step cursor placement or selection, the first step providing zooming with pan ability in response to a hover input and the second step providing fine cursor placement or selection whilst a zoomed display is locked. It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 U.S.P.Q. 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 U.S.P.Q. 275, 277 (C.C.P.A. 1968)). Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to AMY P HOANG whose telephone number is (469)295-9134. The examiner can normally be reached M-TH 8:30-5:00PM. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JENNIFER WELCH, can be reached at 571-272-7212. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/AMY P HOANG/
Examiner, Art Unit 2143

/JENNIFER N WELCH/
Supervisory Patent Examiner, Art Unit 2143

Prosecution Timeline

May 29, 2024
Application Filed
Dec 23, 2024
Response after Non-Final Action
Nov 04, 2025
Non-Final Rejection — §102, §103, §112
Nov 20, 2025
Examiner Interview Summary
Nov 20, 2025
Applicant Interview (Telephonic)
Dec 15, 2025
Response Filed
Feb 27, 2026
Final Rejection — §102, §103, §112
Apr 08, 2026
Applicant Interview (Telephonic)
Apr 08, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602596
APPARATUS AND METHOD FOR VALIDATING DATASET BASED ON FEATURE COVERAGE
2y 5m to grant · Granted Apr 14, 2026
Patent 12572263
ACCESS CARD WITH CONFIGURABLE RULES
2y 5m to grant · Granted Mar 10, 2026
Patent 12536432
PRE-TRAINING METHOD OF NEURAL NETWORK MODEL, ELECTRONIC DEVICE AND MEDIUM
2y 5m to grant · Granted Jan 27, 2026
Patent 12475669
METHOD AND APPARATUS WITH NEURAL NETWORK OPERATION FOR DATA NORMALIZATION
2y 5m to grant · Granted Nov 18, 2025
Patent 12461595
SYSTEM AND METHOD FOR EMBEDDED COGNITIVE STATE METRIC SYSTEM
2y 5m to grant · Granted Nov 04, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
70%
Grant Probability
99%
With Interview (+64.2%)
3y 3m
Median Time to Grant
Moderate
PTA Risk
Based on 232 resolved cases by this examiner. Grant probability derived from career allow rate.
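The headline rate above follows directly from the raw counts in the Examiner Intelligence panel (163 granted of 232 resolved). A minimal sketch of that arithmetic, assuming the dashboard simply divides grants by resolved cases and rounds (the vendor's actual model is not disclosed here):

```python
# Reproduce the dashboard's headline figures from the raw examiner counts.
# The formulas below are assumptions for illustration, not the vendor's model.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Counts shown in the Examiner Intelligence panel: 163 granted / 232 resolved.
career = allow_rate(163, 232)
print(f"Career allow rate: {career:.0f}%")  # rounds to the displayed 70%

# The "+15.3% vs TC avg" delta implies a Tech Center average of roughly:
tc_avg = career - 15.3
print(f"Implied TC average: {tc_avg:.1f}%")
```

Note that the "99% With Interview" figure cannot be recomputed from the numbers shown here; it presumably comes from the examiner's interview-conditioned outcomes, which this page reports only as a +64.2% lift.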
