Prosecution Insights
Last updated: April 19, 2026
Application No. 18/168,364

SOFTWARE FOR KEYBOARD-LESS TYPING BASED UPON GESTURES

Final Rejection — §103, §DP
Filed
Feb 13, 2023
Examiner
KLICOS, NICHOLAS GEORGE
Art Unit
2118
Tech Center
2100 — Computer Architecture & Software
Assignee
Typyn Inc.
OA Round
8 (Final)
Grant Probability: 57% (Moderate)
Expected OA Rounds: 9-10
Time to Grant: 3y 6m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 57% (grants 57% of resolved cases; 205 granted / 361 resolved; +1.8% vs TC avg)
Interview Lift: +30.2% for resolved cases with interview (strong)
Avg Prosecution: 3y 6m typical timeline
Total Applications: 385 across all art units (24 currently pending)

Statute-Specific Performance

§101: 11.9% (-28.1% vs TC avg)
§103: 49.0% (+9.0% vs TC avg)
§102: 14.0% (-26.0% vs TC avg)
§112: 19.5% (-20.5% vs TC avg)
Tech Center averages are estimates. Based on career data from 361 resolved cases.
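The "vs TC avg" deltas above are mutually consistent: subtracting each delta from its allow rate recovers a single Tech Center baseline. A minimal sketch of that check (the ~40% baseline is inferred from the report's numbers, not stated in it):

```python
# Statute-specific allow rates and their deltas vs the Tech Center
# average, as reported above (in percent).
rates = {
    "§101": (11.9, -28.1),
    "§103": (49.0, +9.0),
    "§102": (14.0, -26.0),
    "§112": (19.5, -20.5),
}

# Implied baseline for each statute = rate - delta.
baselines = {k: round(rate - delta, 1) for k, (rate, delta) in rates.items()}

# All four statutes imply the same Tech Center average (~40.0%),
# which suggests the deltas were computed against one shared estimate.
print(baselines)
```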

Office Action

§103 §DP
DETAILED ACTION

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This Action is FINAL and is in response to the claims filed December 22, 2025. Claims 1-20 are currently pending, of which claims 1, 8, and 15 are currently amended.

Response to Arguments

Double Patenting Rejections: Applicant requests that the double patenting rejections be held in abeyance until the other rejections have been resolved. See Remarks 9. Examiner notes this request and that the amendments regarding the second input interface are still subject to the double patenting rejections in light of their respective rejections detailed below.

Prior Art Rejections: Applicant’s arguments with respect to the prior art rejections of the claims have been fully considered. Specifically, Applicant argues that the prior art of record does not teach the newly amended “prior to” language as it relates to the displaying of the menu. See Remarks 9-10. Examiner respectfully disagrees with Applicant’s characterization of the teachings of the prior references, specifically Chen. Chen is explicit that the lookup table can be displayed prior to the matching of the gesture. Specifically, Chen discloses “with the change in the motions, the display of pictures representing the available gestures may change accordingly, that is, the generation of the triggering lookup table is before the determination of a matched gesture.” See Chen para. [0051]. It is for at least these reasons and the reasons cited below that the claims remain rejected in this Action.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees.
A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c).
A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 11,609,693 B2 (hereinafter “the ‘693 patent”) in view of Martensson (U.S. Publication No. 2011/0069012) and Chen (U.S. Publication No. 2013/0106707). Claims 1, 8, and 15 of the present application are taught by at least claim 1 of the ‘693 patent. However, while the ‘693 patent discloses an index of characters, it does not disclose the vertical column and vertical position as claimed in the present application. Martensson teaches these features of the claim (See Martensson Fig. 6A and paras. [0074-5]: number wheel to enter characters, where the wheel is a vertical column with a plurality of characters that can be scrolled and selected). It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the gesture character input of the ‘693 patent with the vertical characters of Martensson.
One would have been motivated to combine these references because both references disclose arranging individual characters across a navigable axis and Martensson enhances the user experience of the ‘693 patent by allowing for more flexible ways to interact with the interface, as interfaces “may have limited space available for an input device and an output device.”

Furthermore, while claim 1 of the ‘693 patent discloses the right hand or left hand gestures and a number of fingers associated therewith, the ‘693 patent does not disclose two gestures and then a subsequent determination of character. Furthermore, the ‘693 patent does not explicitly display a gesture menu. Chen teaches displaying, prior to receiving a first indication of a first gesture of a first user, a menu on a display of a first electronic device indicating a plurality of gesture types, as well as wherein the first gesture is associated with the plurality of gesture types…and wherein the second gesture is associated with the plurality of gesture types. Chen also teaches a gesture of a right one or more fingers and a gesture of a left one or more fingers, as well as wherein the plurality of characters is determined based, at least in part, on the received first indication (See Chen Fig. 3 and paras. [0026], [0046], [0051], and [0062]: lookup table of gestures that “includes displaying pictures representing the corresponding gestures in the triggering lookup table”. This further includes identifying gestures based on which hand is in contact with the touch screen display. These different hand/finger gestures and the display thereof are being applied to the gesturing and character scrolling of the ‘693 patent and Martensson). It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the character scrolling of the ‘693 patent and Martensson with the gestures of Chen.
One would have been motivated to combine these references because both references disclose inputs using finger gestures, and Chen enhances the user experience of the ‘693 patent and Martensson by allowing for more efficient awareness of the gestures, mitigating potential conflicts with overlapping gestures across different user interface elements and applications (See Chen para. [0005]).

Claims 2, 9, and 16 of the present application are taught by at least claim 2 of the ‘693 patent. While the ‘693 patent is directed to a third electronic device, this would read on the second electronic device of the present application because it is performing the same function (without the second electronic device being required for additional input in the claim language of the present application).

Claims 3, 5, 10, 12, 17, and 19 of the present application are taught by at least claims 3 and 5 of the ‘693 patent. Claims 4, 11, and 18 of the present application are taught by at least claim 4 of the ‘693 patent. Claims 6, 13, and 20 of the present application are taught by at least claim 1 of the ‘693 patent. While the present application is directed to contact types of the gestures, the ‘693 patent is directed to more specific types of contact, such as right or left hand, number of fingers, and direction of movement. Claims 7 and 14 of the present application are not taught by the claims of the ‘693 patent. While a touch screen may be obvious in light of the gestures and fingers of the ‘693 patent, Martensson teaches a touch screen (See Martensson para. [0037]).

Claims 1, 2, 6-9, 13-16, and 20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of U.S. Patent No. 10,747,426 B2 (hereinafter “the ‘426 patent”) in view of Martensson and Chen. Claims 1, 8, and 15 of the present application are taught by at least claim 1 of the ‘426 patent.
However, while the ‘426 patent discloses an index of characters, it does not disclose the vertical column and vertical position as claimed in the present application. Martensson teaches these features of the claim (See Martensson Fig. 6A and paras. [0074-5]: number wheel to enter characters, where the wheel is a vertical column with a plurality of characters that can be scrolled and selected). It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the gesture character input of the ‘426 patent with the vertical characters of Martensson. One would have been motivated to combine these references because both references disclose arranging individual characters across a navigable axis and Martensson enhances the user experience of the ‘426 patent by allowing for more flexible ways to interact with the interface, as interfaces “may have limited space available for an input device and an output device.”

Furthermore, while claim 1 of the ‘426 patent discloses the right hand or left hand gestures and a number of fingers associated therewith, the ‘426 patent does not disclose two gestures and then a subsequent determination of character. Chen teaches displaying, prior to receiving a first indication of a first gesture of a first user, a menu on a display of a first electronic device indicating a plurality of gesture types, as well as wherein the first gesture is associated with the plurality of gesture types…and wherein the second gesture is associated with the plurality of gesture types. Chen also teaches a gesture of a right one or more fingers and a gesture of a left one or more fingers, as well as wherein the plurality of characters is determined based, at least in part, on the received first indication (See Chen Fig. 3 and paras.
[0026], [0046], [0051], and [0062]: lookup table of gestures that “includes displaying pictures representing the corresponding gestures in the triggering lookup table”. This further includes identifying gestures based on which hand is in contact with the touch screen display. These different hand/finger gestures and the display thereof are being applied to the gesturing and character scrolling of the ‘426 patent and Martensson). It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the character scrolling of the ‘426 patent and Martensson with the gestures of Chen. One would have been motivated to combine these references because both references disclose inputs using finger gestures, and Chen enhances the user experience of the ‘426 patent and Martensson by allowing for more efficient awareness of the gestures, mitigating potential conflicts with overlapping gestures across different user interface elements and applications (See Chen para. [0005]).

Claims 2, 9, and 16 of the present application are taught by at least claim 1 of the ‘426 patent. Claims 6, 13, and 20 of the present application are taught by at least claim 1 of the ‘426 patent. While the present application is directed to contact types of the gestures, the ‘426 patent is directed to more specific types of contact, such as right or left hand, number of fingers, and direction of movement. Claims 7 and 14 of the present application are not taught by the claims of the ‘426 patent. While a touch screen may be obvious in light of the gestures and fingers of the ‘426 patent, Martensson teaches a touch screen (See Martensson para. [0037]).

Claims 3-5, 10-12, and 17-19 are rejected on the ground of nonstatutory double patenting as being unpatentable over the combination of the ‘426 patent, Martensson, and Chen as applied above, further in view of Kagan et al. (U.S.
Publication No. 2015/0253885; hereinafter “Kagan”). Claims 3, 5, 10, 12, 17, and 19 of the present application are taught by at least claim 10 of the ‘426 patent. However, while the ‘426 patent discloses that one of the devices can be a wearable, the ‘426 patent does not disclose that both the first and second electronic device of the present application are wearables. Kagan teaches that both the first and second electronic devices can be wearables (See Kagan Figs. 7A-7D and paras. [0030], [0058], [0077], and [0130]: both computing devices can be wearables in communication with one another. This includes entering a query on a first wearable and having it execute on a separate device, which can be a wearable). It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the gesturing remote character input of the ‘426 patent, Martensson, and Chen with the wearable devices of Kagan. One would have been motivated to combine these references because both references disclose remotely communicating with a variety of devices, and Kagan enhances the user experience by being conscious of “using the entirety of the display in cases where the display is relatively small in size” while also providing the user a choice on what second device receives the information, therefore increasing use case flexibility (See Kagan Figs. 7B-7D and paras. [0025] and [0044]). Claims 4, 11, and 18 of the present application are not taught by the ‘426 patent, Martensson, or Chen. Kagan teaches the first and second wireless interfaces of the present application (See Kagan paras. [0038] and [0040]).

Examiner’s Note

The prior art rejections below cite particular paragraphs, columns, and/or line numbers in the references for the convenience of the applicant.
Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2, 4, 6-9, 11, 13-16, 18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Rider (U.S. Publication No. 2010/0085313; hereinafter “Rider”), and further in view of Martensson (U.S. Publication No. 2011/0069012), Chen (U.S. Publication No. 2013/0106707), and Laubach (U.S. Publication No. 2012/0235912).
As per claim 1, Rider teaches a method, comprising: receiving the first indication of the first [gesture of a right one or more fingers] of the first user detected by a first input interface of a first electronic device; receiving a second indication of a second [gesture of a left one or more fingers] of the first user detected by a second input interface, wherein the first input interface is separate from the second input interface; displaying a plurality of characters in a single vertical column and a selection window on a display of the first electronic device, wherein the plurality of characters is determined based, at least in part, on the received first indication from the first and the second input interfaces; and selecting a first character of the plurality of characters based on the adjustment to the vertical position of the plurality of characters (See Rider Figs. 4, 5, and paras. [0046-48]: detecting continuation of touch event (a gesture) over a first interface (the first set of input keys) until a secondary virtual keyboard of one or more secondary virtual input keys is presented to the user. “The secondary virtual keyboard is associated with the virtual input key (“AS” in this example) and has secondary virtual input keys 180 associated with the virtual input key “AS”.” Then, the user can select the rendered secondary character, such as the “Ô”. This secondary keyboard 176 (or 476) is a separate input interface).

However, while Rider explicitly teaches a keyboard layout of characters in both rows and columns, Rider does not explicitly teach that the characters can be displayed in a single vertical column, nor that the vertical position is adjusted. Martensson teaches the vertical column and the vertical position for the characters of Rider.
Martensson also discloses adjusting a vertical position of the plurality of characters while maintaining a fixed position of the selection window on the display based on the received second indication from the first and the second input interfaces of Rider (See Martensson Figs. 6A-6B and paras. [0074-78]: number wheel to enter characters, where the wheel is a vertical column with a plurality of characters that can be scrolled and selected via directional gestures while the central characters remain fixed in the center. This includes detecting multiple inputs for each selection. These inputs would be associated with the interfaces and gestures of Rider).

It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the horizontal characters of Rider with the vertical characters of Martensson. One would have been motivated to combine these references because both references disclose selecting characters on a keyboard-type layout, and Martensson enhances the user experience of Rider by allowing for more flexible ways to interact with the interface, as interfaces “may have limited space available for an input device and an output device” (See Martensson para. [0001]). Additionally, substituting vertical for horizontal can be preferred by the user, and thus giving that option to users can further the ease of use of the character inputs.

Furthermore, while Rider and Martensson teach using gesture inputs to scroll a plurality of characters in a vertical column, Rider and Martensson do not teach determining right or left finger, nor do they teach a menu of gesture types. Laubach further teaches a gesture of a right one or more fingers and a gesture of a left one or more fingers (See Figs. 31a, 31b, 36-38, and paras.
[0216], [0261], and [0319]: left hand or right hand gestures, with different functionalities associated therewith, including interacting with a touch screen keyboard based on the number of fingers used per hand. These gestures and their keyboards are being applied to the gesturing and character scrolling via the indications of Rider/Martensson).

It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the character scrolling of Rider/Martensson with the left and right finger gestures of Laubach. One would have been motivated to combine these references because both references disclose inputs using finger gestures, including identifying left and right sides. Laubach enhances the user experience of Rider/Martensson by allowing for more efficient navigation of user interfaces and invoking output actions therein, expanding upon the ways that users traditionally interact with keyboards (See Laubach paras. [0003] and [0009]).

Finally, while Rider, Martensson, and Laubach teach gesture inputs and finger tracking, none of Rider, Martensson, or Laubach teaches or suggests a gesture menu. Chen teaches displaying, prior to receiving a first indication of a first gesture of a first user, a menu on a display of a first electronic device indicating a plurality of gesture types, as well as wherein the first gesture is associated with the plurality of gesture types…and wherein the second gesture is associated with the plurality of gesture types (See Chen Fig. 3 and paras. [0026], [0046], [0051], and [0062]: lookup table of gestures that “includes displaying pictures representing the corresponding gestures in the triggering lookup table”. This further includes identifying gestures based on which hand is in contact with the touch screen display).
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the gestures of Rider/Martensson/Laubach with the gesture display of Chen. One would have been motivated to combine these references because both references disclose identifying gestures to control unique features and interactions within a user interface. Chen enhances the user experience of Rider/Martensson/Laubach by allowing for more efficient awareness of the gestures, mitigating potential conflicts with overlapping gestures across different user interface elements and applications (See Chen para. [0005]).

As per claim 6, Rider/Martensson/Laubach/Chen further teaches the method of claim 1, wherein the first indication of the first gesture comprises a second indication of a contact type of the first gesture, and wherein displaying the plurality of characters is performed based on the contact type (See Rider Figs. 4, 5, and paras. [0046-49]: secondary virtual input keys appear based on the gesture type, such as a continuous touch input followed by “sufficient force to actuate the switch 39 and lift the finger at the location X-Y location corresponding to the secondary virtual input key”; see also Martensson Fig. 6A and paras. [0067-70]: different pressures and/or gesture directions can determine what characters are displayed/rotated through the character wheel on the display).

As per claim 7, Rider/Martensson/Laubach/Chen further teaches the method of claim 1, wherein the first input interface comprises a touch screen (See Rider para. [0046]: touch event on touch screen display).

As per claims 8, 13, and 14, the claims are directed to a first electronic device that implements the same features as the method of claims 1, 6, and 7, respectively, and are therefore rejected for at least the same reasons therein.
Furthermore, Rider/Martensson/Laubach/Chen discloses a first electronic device comprising: a first input interface; and a first processor coupled to the first input interface, wherein the first processor is configured to perform steps comprising said method (See Rider paras. [0015-17]).

As per claims 15 and 20, the claims are directed to a computer program product that implements the same features as the method of claims 1 and 6, respectively, and are therefore rejected for at least the same reasons therein. Furthermore, Rider/Martensson/Laubach/Chen discloses a computer program product, comprising: a non-transitory computer readable medium comprising code to cause a processor to perform steps comprising said method (See Rider paras. [0015-17] and [0057]).

Claims 2-5, 9-12, and 16-19 are rejected under 35 U.S.C. 103 as being unpatentable over Rider/Martensson/Laubach/Chen as applied above, and further in view of Kagan et al. (U.S. Publication No. 2015/0253885; hereinafter “Kagan”).

As per claim 2, Rider/Martensson/Laubach/Chen further teaches the method of claim 1. However, while Rider/Martensson/Laubach/Chen teaches remote controls for the user interface features (See Martensson para. [0034]; see also Laubach para. [0086]), Rider/Martensson/Laubach/Chen does not teach second displays of a second electronic device. Kagan teaches displaying the selected first character on a second display of a second electronic device (See Kagan Figs. 7A-7D and paras. [0030], [0058], [0077], and [0130]: both computing devices can be wearables in communication with one another. This includes entering a query on a first wearable and having it execute on a separate device, which can be a wearable. Therefore, the selections of Rider/Martensson/Laubach/Chen will be displayed on the second electronic device of Kagan).
It would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to combine, with a reasonable expectation of success, the remote devices of Rider/Martensson/Laubach/Chen with the wearables of Kagan. One would have been motivated to combine these references because both references disclose remotely entering and executing content from one device to another, and Kagan enhances the user experience by being conscious of “using the entirety of the display in cases where the display is relatively small in size” while also providing the user a choice on what second device receives the information, therefore increasing use case flexibility as well (See Kagan Figs. 7B-7D and paras. [0025] and [0044]).

As per claim 3, while Rider/Martensson/Laubach/Chen/Kagan teaches the method of claim 2 and the second electronic device, Rider/Martensson/Laubach/Chen does not teach wearables. Kagan teaches wherein the second electronic device is a wearable electronic device (See Kagan Figs. 7A-7D and paras. [0030], [0058], [0077], and [0130]: both computing devices can be wearables in communication with one another. This includes entering a query on a first wearable and having it execute on a separate device, which can be a wearable). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine Rider/Martensson/Laubach/Chen with the teachings of Kagan for at least the same reasons as discussed above in claim 2.

As per claim 4, while Rider/Martensson/Laubach/Chen/Kagan teaches the method of claim 2 and the second electronic device, Rider/Martensson/Laubach/Chen does not teach wireless interfaces between multiple devices. Kagan teaches wherein the first and second electronic devices further comprise first and second wireless interfaces (See Kagan paras. [0038] and [0040]: wireless connection interfaces for devices to communicate with one another).
It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine Rider/Martensson/Laubach/Chen with the teachings of Kagan for at least the same reasons as discussed above in claim 2.

As per claim 5, while Rider/Martensson/Laubach/Chen teaches the method of claim 1 and the second electronic device, Rider/Martensson/Laubach/Chen does not teach wearables. Kagan teaches wherein the first electronic device is a wearable electronic device (See Kagan Figs. 7A-7D and paras. [0030], [0058], [0077], and [0130]: both computing devices can be wearables in communication with one another. This includes entering a query on a first wearable and having it execute on a separate device, which can be a wearable). It would have been obvious to one of ordinary skill in the art at the time the invention was filed to combine Rider/Martensson/Laubach/Chen with the teachings of Kagan for at least the same reasons as discussed above in claim 2.

As per claims 9-12, the claims are directed to a first electronic device that implements the same features as the method of claims 2-5, respectively, and are therefore rejected for at least the same reasons therein. As per claims 16-19, the claims are directed to a computer program product that implements the same features as the method of claims 2-5, respectively, and are therefore rejected for at least the same reasons therein.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Nicholas Klicos whose telephone number is (571) 270-5889. The examiner can normally be reached Mon-Fri 9:00 AM-5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott Baderman, can be reached at (571) 272-3644. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NICHOLAS KLICOS/
Primary Examiner, Art Unit 2118

Prosecution Timeline

Feb 13, 2023
Application Filed
Nov 02, 2023
Non-Final Rejection — §103, §DP
Feb 06, 2024
Response Filed
Feb 13, 2024
Final Rejection — §103, §DP
May 16, 2024
Request for Continued Examination
May 21, 2024
Response after Non-Final Action
Jun 14, 2024
Non-Final Rejection — §103, §DP
Sep 13, 2024
Response Filed
Sep 26, 2024
Final Rejection — §103, §DP
Nov 08, 2024
Response after Non-Final Action
Dec 19, 2024
Request for Continued Examination
Jan 02, 2025
Response after Non-Final Action
Feb 28, 2025
Non-Final Rejection — §103, §DP
Jun 02, 2025
Response Filed
Jun 09, 2025
Final Rejection — §103, §DP
Sep 05, 2025
Request for Continued Examination
Sep 10, 2025
Response after Non-Final Action
Sep 23, 2025
Non-Final Rejection — §103, §DP
Dec 22, 2025
Response Filed
Jan 13, 2026
Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12572212
GENERATING DEVICE IDENTIFIERS AND DEVICE CONTROLS BASED ON HAND GESTURES
2y 5m to grant Granted Mar 10, 2026
Patent 12564430
Computerized Process for Making a Patient-Specific Implant
2y 5m to grant Granted Mar 03, 2026
Patent 12563695
ELECTRONIC DEVICE AND HEAT DISSIPATION METHOD THEREFOR
2y 5m to grant Granted Feb 24, 2026
Patent 12508108
AXIAL DIRECTION AND DEPTH CHECKING GUIDE PLATE FOR IMPLANTING AND MANUFACTURE METHOD THEREOF
2y 5m to grant Granted Dec 30, 2025
Patent 12512697
CONTROL PROCESS FOR LOW VOLTAGE MICROGRIDS WITH DISTRIBUTED COMMUNICATION
2y 5m to grant Granted Dec 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 9-10
Grant Probability: 57%
With Interview: 87% (+30.2%)
Median Time to Grant: 3y 6m
PTA Risk: High
Based on 361 resolved cases by this examiner. Grant probability derived from career allow rate.
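As noted, the grant probability is derived from the examiner's career allow rate, and the interview figure applies the reported lift on top of it. A minimal sketch of that arithmetic (the counts come from this report; the additive treatment of the lift and the rounding are assumptions about how the dashboard computes them):

```python
# Career counts reported for this examiner.
granted = 205
resolved = 361

# Baseline grant probability = career allow rate.
allow_rate = granted / resolved        # ~0.568, shown as 57%

# Reported interview lift, applied additively to the baseline.
interview_lift = 0.302
with_interview = allow_rate + interview_lift   # ~0.870, shown as 87%

print(round(allow_rate * 100), round(with_interview * 100))
```

Both rounded values match the headline figures above (57% and 87%), which is consistent with the additive-lift assumption.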

Sign in with your work email

Enter your email to receive a magic link. No password needed.

Personal email addresses (Gmail, Yahoo, etc.) are not accepted.

Free tier: 3 strategy analyses per month