Prosecution Insights
Last updated: April 19, 2026
Application No. 19/224,159

MULTI-FUNCTION STYLUS WITH SENSOR CONTROLLER

Non-Final OA §112
Filed: May 30, 2025
Examiner: BLANCHA, JONATHAN M
Art Unit: 2623
Tech Center: 2600 — Communications
Assignee: Intel Corporation
OA Round: 1 (Non-Final)

Grant Probability: 62% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 2y 7m
With Interview: 71%

Examiner Intelligence

Career Allow Rate: 62% (408 granted / 661 resolved; at TC average)
Interview Lift: +9.4% (moderate) for resolved cases with interview
Typical Timeline: 2y 7m average prosecution; 17 applications currently pending
Career History: 678 total applications across all art units
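The headline probabilities above are simple ratios of the career counts. A minimal sketch of the arithmetic, using the figures from this report (408 granted of 661 resolved, +9.4-point interview lift); the function name is illustrative, not from any real API:

```python
# Reproduce the dashboard's headline numbers from the raw career counts.
# Inputs (408 granted, 661 resolved, +9.4 points with interview) are the
# values shown in the Examiner Intelligence card above.

def career_allow_rate(granted: int, resolved: int) -> float:
    """Share of resolved cases that ended in a grant, as a percentage."""
    return 100.0 * granted / resolved

base = career_allow_rate(408, 661)   # ~61.7%, displayed as 62%
with_interview = base + 9.4          # ~71.1%, displayed as 71%

print(round(base), round(with_interview))  # 62 71
```

This matches the card exactly: the "Grant Probability" is just the career allow rate rounded to a whole percentage, and the interview figure is an additive lift on top of it.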

Statute-Specific Performance

§101: 0.3% (-39.7% vs TC avg)
§103: 69.4% (+29.4% vs TC avg)
§102: 23.2% (-16.8% vs TC avg)
§112: 4.9% (-35.1% vs TC avg)
Deltas are measured against the Tech Center average estimate. Based on career data from 661 resolved cases.
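The "vs TC avg" deltas can be cross-checked against the per-statute rates shown above. A small sketch (values taken from this chart; the variable names are illustrative):

```python
# Each delta should equal the examiner's per-statute rate minus the
# Tech Center average. Rates and deltas below are the chart's values.
rows = {            # statute: (examiner rate %, delta vs TC avg %)
    "101": (0.3, -39.7),
    "103": (69.4, +29.4),
    "102": (23.2, -16.8),
    "112": (4.9, -35.1),
}

for statute, (rate, delta) in rows.items():
    implied_tc_avg = round(rate - delta, 1)
    print(f"§{statute}: implied TC avg = {implied_tc_avg}%")
```

Every row implies the same 40.0% Tech Center figure, which suggests the page compares each statute against a single flat TC estimate rather than per-statute averages; that is an inference from the numbers, not something the report states.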

Office Action

§112
Notice of Pre-AIA or AIA Status

The present application is being examined under the pre-AIA first to invent provisions.

Drawings

The drawings filed 5-30-25 have been accepted by the examiner.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.
Regarding claim 1, the claim recites the limitation “cause a change to a typed character input value presented via the display screen based on data associated with acceleration of the electronic stylus.” The specification in the parent application 13/687,167 describes that a “typed character input value” (e.g. typed from a keyboard, see “a keyboard input value can include an alphanumeric character…” discussed in [0009]) is presented via the display screen (e.g. on display 110, seen in Fig. 1) based on data associated with acceleration of the electronic stylus (“certain movements of the stylus may correspond with certain keyboard input values” discussed in [0009], and “certain patterns detected by the stylus 118 may correspond with various input operations, such as operations that transmit alphanumeric characters… the stylus 118 may use the sensor data together with a gesture event to generate an input value to send to the computing device 100. For example, the stylus 118 may adjust the gesture event to account for the tilt, acceleration…” discussed in [0014]).

However, there is only support for “changing” a “mouse input” based on data associated with acceleration of the electronic stylus (e.g. “the gesture event that indicates a cursor is to be displayed on a display device may be altered based on sensor data, such as the tilt or change in velocity of a device” discussed in [0037], with the “cursor” being a “mouse input”). A mouse input (e.g. a cursor) is not a “typed character input value” (the specification also distinguishes between them, see “certain movements of the stylus may correspond with certain keyboard input values or mouse input values, among others. In some embodiments, a keyboard input value can include an alphanumeric character, among others and a mouse input value can represent a cursor in two dimensional space or a selection of a portion of a graphical user interface, among others” discussed in [0009]) and changing the value of a cursor is not equivalent to changing a keyboard input (see also the 12-09-24 remarks in the parent application 17/738,807, where the applicant distinguishes e.g. a “line” from “typed alphanumeric text” on page 9). Therefore, the limitation “cause a change to a typed character input value presented via the display screen based on data associated with acceleration of the electronic stylus” is new matter.

Claim 8 recites the limitation “cause a change to a typed character input value displayed on the touchscreen based on the first signals,” which has the same issues as discussed above regarding claim 1.

Claim 15 recites the limitation “cause a change to typed alphanumeric text on the display screen of the electronic device based on the velocity of the movement,” which has the same issues as discussed above regarding claim 1.

Claims 2-7, 9-14, and 16-20 are dependent upon claims 1, 8, and 15, and so are rejected for the same reasons as discussed above.

Allowable Subject Matter

Claims 1-20 would be allowable if rewritten or amended to overcome the rejection(s) under 35 U.S.C. 112(a) set forth in this Office action. The following is a statement of reasons for the indication of allowable subject matter:

Regarding claim 1, Pance et al. (US 2012/0127088) discloses (Fig. 1 and 2) a system comprising: an electronic stylus (101); and an electronic device (103) including: a display screen (105); memory (162); instructions (“storage device 162 may store operating system software that includes a set of instructions” discussed in [0031]); and at least one processor circuit (160) to be programmed by the instructions (“set of instructions that are executable on the processing device 160” discussed in [0031]) to cause a change to a character (for example, adjusting the width of the lines in the letters, as discussed in [0047]) presented via the display screen (e.g. as “output shown on the touch screen” discussed in [0047]) based on data associated with acceleration of the electronic stylus (“acceleration information from an accelerometer in the haptic input device 101 may be used to change the output of a graphics creation or editing program” discussed in [0046]). However, Pance is only directed towards changing handwritten text (for example, “writing with a calligraphy pen” discussed in [0047]), and is unrelated to causing a change to a “typed” character input “value.”

Iwema et al. (US 2004/0021700) discloses (Fig. 1, 2, and 4) a system comprising: an electronic stylus (204); and an electronic device (100) including: a display screen (107); memory (120); instructions (“computer-executable instructions” discussed in [0038]); and at least one processor circuit (110) to be programmed by the instructions to cause a change to a typed character input (e.g. to add a “selection indicator such as marker 312, or other suitable visual indicator, could be provided to indicate which text (or other recognition result) displayed within region 314 has been selected” as discussed in [0050], seen in Fig. 4, while [0002] further clarifies the characters are “information that someone else has previously assembled, tabulated, typed…”, and see also “image of a keyboard wherein the user can "type"” discussed in [0051]) presented via the display screen (shown on the screen in Fig. 4) based on data associated with movement of the electronic stylus (“choose that text with, e.g., an appropriate pen gesture” discussed in [0052]). However, while Iwema also discloses changing a character input value (e.g. “inch” character values 304 are changed to “ink” values as discussed in [0053], and seen in Fig. 4B), Iwema only teaches changing the original character input values that were “recognized” and not “typed,” and so only teaches either changing typed text (e.g. by adding a selection indicator) or changing a character value (e.g. “inch” into “ink”), but not changing the “value” of a text that is “typed.” Iwema also fails to teach or suggest wherein the movement of the stylus is specifically an “acceleration.”

Case et al. (US 2015/0002484) discloses a system which causes a change (seen in Fig. 8, the text 810A can be changed to be bolded, as seen as 810B, see [0069]) to a typed character input value (“user to provide writing input, e.g., via a stylus or pen” discussed in [0037], and then “identify a user's input symbols and thereafter take the identified collection of symbols and translate them, e.g., into full text” discussed in [0072]) on the display (192, see Fig. 1) based on data associated with acceleration (810A is changed to 810B “on the basis of inferred emotion” as discussed in [0069], and “faster speed of input… indicative of a heightened emotional state” discussed in [0065]) of an electronic stylus (stylus 403A, seen in Fig. 4). However, Case has a filing date of June 28, 2013, and so is not valid prior art.
Therefore, none of the currently cited references of record teaches or suggests a circuit to “cause a change to a typed character input value presented via the display screen based on data associated with acceleration of the electronic stylus” when combined with each of the other claim limitations.

Regarding claim 8, Pance discloses (Fig. 1 and 2) a system comprising: an electronic stylus (101) including: a housing (“housing of the input device 101” discussed in [0042]); and at least one sensor (116) to output first signals associated with acceleration of the housing (“one sensor 116 may be an accelerometer” discussed in [0043]); and a mobile electronic device (103) including: a touchscreen (105); memory (162); machine readable instructions (“storage device 162 may store operating system software that includes a set of instructions” discussed in [0031]); and at least one processor circuit (160) to be programmed by the machine readable instructions (“set of instructions that are executable on the processing device 160” discussed in [0031]) to cause a change to a character (for example, adjusting the width of the lines in the letters, as discussed in [0047]) presented via the touchscreen (e.g. as “output shown on the touch screen” discussed in [0047]) based on the first signals (“acceleration information from an accelerometer in the haptic input device 101 may be used to change the output of a graphics creation or editing program” discussed in [0046]). However, Pance is only directed towards changing handwritten text (for example, “writing with a calligraphy pen” discussed in [0047]), and is unrelated to causing a change to “a typed character input value.”

Iwema discloses (Fig. 1, 2, and 4) a system comprising: an electronic stylus (204) including: a housing (seen in Fig. 2, e.g. to hold the elements such as the “buttons” discussed in [0045]); and a mobile electronic device (100) including: a touchscreen (107, with “touch-sensitive… display” discussed in [0045]); memory (120); machine readable instructions (“computer-executable instructions” discussed in [0038]); and at least one processor circuit (110) to be programmed by the machine readable instructions to cause a change to a typed character input (e.g. to add a “selection indicator such as marker 312, or other suitable visual indicator, could be provided to indicate which text (or other recognition result) displayed within region 314 has been selected” as discussed in [0050], seen in Fig. 4, while [0002] further clarifies the characters are “information that someone else has previously assembled, tabulated, typed…”, and see also “image of a keyboard wherein the user can "type"” discussed in [0051]) presented via the touchscreen (shown on the screen in Fig. 4) based on movement of the electronic stylus (“choose that text with, e.g., an appropriate pen gesture” discussed in [0052]). However, while Iwema also discloses changing a character input value (e.g. “inch” character values 304 are changed to “ink” values as discussed in [0053], and seen in Fig. 4B), Iwema only teaches changing the original character input values that were “recognized” and not “typed,” and so only teaches either changing typed text (e.g. by adding a selection indicator) or changing a character value (e.g. “inch” into “ink”), but not changing the “value” of a text that is “typed.” Iwema also fails to teach or suggest wherein the movement of the stylus is specifically an “acceleration.”

Case discloses a system which causes a change (seen in Fig. 8, the text 810A can be changed to be bolded, as seen as 810B, see [0069]) to a typed character input value (“user to provide writing input, e.g., via a stylus or pen” discussed in [0037], and then “identify a user's input symbols and thereafter take the identified collection of symbols and translate them, e.g., into full text” discussed in [0072]) on the display (192, see Fig. 1) based on data associated with acceleration (810A is changed to 810B “on the basis of inferred emotion” as discussed in [0069], and “faster speed of input… indicative of a heightened emotional state” discussed in [0065]) of an electronic stylus (stylus 403A, seen in Fig. 4). However, Case has a filing date of June 28, 2013, and so is not valid prior art.

Therefore, none of the currently cited references of record teaches or suggests a circuit to “cause a change to a typed character input value displayed on the touchscreen based on the first signals” when combined with each of the other claim limitations.

Regarding claim 15, Pance discloses (Fig. 1 and 2) a system comprising: an electronic device (103) including a display screen (105); an electronic stylus (101) including at least one sensor (116) to output first signals indicative of a velocity of movement of the electronic stylus housing (“one sensor 116 may be an accelerometer” discussed in [0043]), the electronic stylus to communicatively couple with the electronic device (with 166, see “receiver 166 may be configured to receive signals from the haptic input device 101” discussed in [0034]); and circuitry (160) to cause at least one of the electronic stylus or the electronic device to cause a change to alphanumeric text (for example, adjusting the width of the lines in the letters, as discussed in [0047]) on the display screen of the electronic device (e.g. as “output shown on the touch screen” discussed in [0047]) based on the velocity of the movement (“acceleration information from an accelerometer in the haptic input device 101 may be used to change the output of a graphics creation or editing program” discussed in [0046]). However, Pance is only directed towards changing handwritten text (for example, “writing with a calligraphy pen” discussed in [0047]), and is unrelated to causing a change to “typed” alphanumeric text.

Iwema discloses (Fig. 1, 2, and 4) a system comprising: an electronic device (100) including a display screen (107); an electronic stylus (204), the electronic stylus to communicatively couple with the electronic device (“a direct connection between the pen digitizer 165 and the serial port interface 106 is shown” discussed in [0040]); and circuitry (110) to cause at least one of the electronic stylus or the electronic device to cause a change to typed alphanumeric text (e.g. to add a “selection indicator such as marker 312, or other suitable visual indicator, could be provided to indicate which text (or other recognition result) displayed within region 314 has been selected” as discussed in [0050], seen in Fig. 4, while [0002] further clarifies the characters are “information that someone else has previously assembled, tabulated, typed…”, and see also “image of a keyboard wherein the user can "type"” discussed in [0051]) on the display screen of the electronic device (shown on the screen in Fig. 4) based on movement of the electronic stylus (“choose that text with, e.g., an appropriate pen gesture” discussed in [0052]). However, Iwema fails to teach or suggest wherein the movement of the stylus is specifically a “velocity” (instead “gestures” in general are described without details on the velocity or speed of the gestures, or “tapping” on a button such as 323 are used as discussed in [0052]).

Case discloses a system which causes a change (seen in Fig. 8, the text 810A can be changed to be bolded, as seen as 810B, see [0069]) to typed alphanumeric text (“user to provide writing input, e.g., via a stylus or pen” discussed in [0037], and then “identify a user's input symbols and thereafter take the identified collection of symbols and translate them, e.g., into full text” discussed in [0072]) on the display (192, see Fig. 1) based on data associated with acceleration (810A is changed to 810B “on the basis of inferred emotion” as discussed in [0069], and “faster speed of input… indicative of a heightened emotional state” discussed in [0065]) of an electronic stylus (stylus 403A, seen in Fig. 4). However, Case has a filing date of June 28, 2013, and so is not valid prior art.

Therefore, none of the currently cited references of record teaches or suggests a circuit to “cause at least one of the electronic stylus or the electronic device to cause a change to typed alphanumeric text on the display screen of the electronic device based on the velocity of the movement” when combined with each of the other claim limitations.

Claims 2-7, 9-14, and 16-20 are dependent upon claims 1, 8, and 15, and so would be allowable for the same reasons as discussed above.

Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN M BLANCHA whose telephone number is (571)270-5890. The examiner can normally be reached Monday to Friday, 9-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chanh Nguyen, can be reached at 571-272-7772. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JONATHAN M BLANCHA/
Primary Examiner, Art Unit 2623

Prosecution Timeline

May 30, 2025: Application Filed
Mar 02, 2026: Non-Final Rejection — §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603033: SCANNING IMAGE DATA TO AN ARRAY OF PIXELS AT AN INTERMEDIATE SCAN RATE DURING A TRANSITION BETWEEN DIFFERENT REFRESH RATES (2y 5m to grant; granted Apr 14, 2026)
Patent 12603060: Display Device (2y 5m to grant; granted Apr 14, 2026)
Patent 12598285: OPTICAL DISPLAY, IMAGE CAPTURING DEVICE AND METHODS WITH VARIABLE DEPTH OF FIELD (2y 5m to grant; granted Apr 07, 2026)
Patent 12585121: NEAR-EYE DISPLAY HAVING OVERLAPPING PROJECTOR ASSEMBLIES (2y 5m to grant; granted Mar 24, 2026)
Patent 12578801: METHOD AND DEVICE FOR DETECTING AND RESPONDING TO USER INPUT (2y 5m to grant; granted Mar 17, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 62%
With Interview: 71% (+9.4%)
Median Time to Grant: 2y 7m
PTA Risk: Low

Based on 661 resolved cases by this examiner. Grant probability derived from career allow rate.
