Prosecution Insights
Last updated: April 19, 2026
Application No. 18/462,912

Method and Device for Detecting a Touch Between a First Object and a Second Object

Non-Final OA §103
Filed: Sep 07, 2023
Examiner: HARRIS, DOROTHY H
Art Unit: 2625
Tech Center: 2600 — Communications
Assignee: Apple Inc.
OA Round: 3 (Non-Final)
Grant Probability: 62% (Moderate)
OA Rounds: 3-4
To Grant: 2y 8m
With Interview: 85%

Examiner Intelligence

Career Allow Rate: 62% (grants 62% of resolved cases; 560 granted / 898 resolved; at TC average)
Interview Lift: +22.3% (strong lift; resolved cases with interview)
Avg Prosecution: 2y 8m (typical timeline; 29 currently pending)
Total Applications: 927 (career history, across all art units)

Statute-Specific Performance

§101: 2.6% (-37.4% vs TC avg)
§103: 54.6% (+14.6% vs TC avg)
§102: 14.6% (-25.4% vs TC avg)
§112: 19.4% (-20.6% vs TC avg)
Tech Center average values are estimates. Based on career data from 898 resolved cases.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

In the response to this Office action, the Office respectfully requests that support be shown for language added to any original claims on amendment and any new claims. That is, indicate support for newly added claim language by specifically pointing to page(s) and line numbers in the specification and/or drawing figure(s). This will assist the Office in prosecuting this application.

The Office has cited particular figures, elements, paragraphs and/or columns and line numbers in the references as applied to the claims for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. In preparing responses, the applicant is respectfully requested to fully consider each of the cited references in its entirety as potentially teaching all or part of the claimed invention, as well as the context of the passages cited by the Office.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on September 16, 2025 has been entered.

Status of Claims

Applicant's Amendment filed September 16, 2025 is acknowledged.
- Claim(s) 31, 38, 45 is/are amended.
- Claim(s) 1-30 is/are canceled.
- Claim(s) 31-50 is/are pending in the application.

Priority

The application has claimed priority based on prior-filed U.S. Application Serial No. 17133043 (now U.S. Patent No. 11797132), filed on December 23, 2020, and prior-filed U.S. Application Serial No. 14223601 (now U.S. Patent No. 10877605), filed on March 24, 2014, which is a continuation-in-part of PCT/EP2014/053017, filed on February 17, 2014.

Specification

The specification has not been checked to the extent necessary to determine the presence of all possible minor errors. Applicant's cooperation is requested in correcting any errors of which applicant may become aware in the specification.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 31-50 is/are rejected under 35 U.S.C. 103 as being unpatentable over Saba et al., "Dante Vision: In-Air and Touch Gesture Sensing for Natural Surface Interaction with Combined Depth and Thermal Cameras," ESPA 2012, IEEE, 2012, pages 167-170, in view of Niikura et al., U.S. Patent Publication No.
20130088422; Larson et al., "HeatWave: Thermal Imaging for Surface User Interaction," CHI 2011, May 2011, pages 1-10; and Kurtenbach, "The Design and Evaluation of Marking Menus," Ph.D. Thesis, Department of Computer Science, University of Toronto, May 1993, pages 1-181.

Consider claim 31. Saba teaches a method comprising: obtaining a thermal image of a scene (see Saba page 169, figure 3, where the thermal image in red is aligned with the depth image in blue); and detecting a touch temperature interval (see Saba page 168, 3.2 Algorithms: for residual heat transfer detection and pressure classification, Saba uses the methods outlined in [6] E. Larson et al., "HeatWave: thermal imaging for surface user interaction," in Proceedings of the 2011 annual conference on Human factors in computing systems, 2011, pp. 2565-2574; note that Larson page 4, Software implementation, discloses "Heat traces are the residual heat left behind on a surface due to the heating of that surface by another warmer object, such as a human hand") within the remainder of the thermal image in response to determining a touch between a first object and a second object, wherein the first object is associated with a first temperature interval and the second object is associated with a second temperature interval, and wherein the touch temperature interval is based on the first temperature interval and the second temperature interval (see again the Larson methods cited above, including Larson page 4, Software implementation).

Saba discloses using the methods outlined in Larson with respect to detecting residual heat traces. Therefore, one of ordinary skill in the art would have been motivated to combine the teachings of Larson with Saba so as to utilize the methods outlined in Larson to detect residual heat traces.

Larson teaches that thermal traces can be used as a plausible substitute for multi-touch screens and can drive typical user interfaces in real-time with naturalistic interactions (see Larson figure 6, Marking Menu and Image editing application, and Prototypes and Interaction Techniques, third paragraph: "The first application is a multi-user and multi-touch drawing application that displays arbitrary gestures made by the users, and alters the brightness of displayed colors based on the pressure with which each user draws using three pressure levels. The second application uses line gestures for image manipulation. Images are chosen using marking menus [17], then once the images are displayed they can be translated, rotated, and scaled using thermal lines. The two applications are designed to demonstrate that thermal traces can be used as a plausible substitute for multi-touch screens and can drive typical user interfaces in real-time with naturalistic interactions. Images from interactions with each application can be seen in Figure 6").

Saba/Larson appears to be silent regarding temperature intervals. However, Saba expressly discloses extracting the average hand temperature and the standard deviation of hand temperature when distinguishing between different users (see Saba page 169, 3.2.6 Multi-User Classification).

Saba/Larson is silent regarding determining a first portion of the thermal image having a predefined thermal characteristic, and excluding the first portion of the thermal image to obtain a remainder of the thermal image.
In a related field of endeavor, Niikura teaches applying two thresholds (temperature intervals) to a thermal image to exclude thermal image objects having temperatures outside of the temperature range being analyzed (see Niikura figures 10A-10C and paragraphs 0115-0130, specifically, for example, paragraph 0125, where binarization with respect to the thresholds th1 and th2 is performed for the purpose of extracting parts corresponding to the temperature of the skin of a person while excluding parts having temperature lower or higher than the temperature of the skin of the person), so as to exclude, for example, objects having a temperature higher than body temperature and thereby limit analysis to thermal objects corresponding to elements of interest such as a human body part and/or residual heat traces on a surface. One of ordinary skill in the art would have been motivated to modify Saba/Larson with the teachings of Niikura to exclude objects having a temperature outside the range (temperature interval) of interest, so as to limit analysis to thermal objects of interest such as a human body part and/or residual heat traces on a surface, using known techniques with predictable results.

Saba is silent regarding determining a position associated with the touch temperature interval and triggering an input action based on the determined position. Larson expressly discusses that thermal traces can be used as a plausible substitute for multi-touch screens and can drive typical user interfaces (see above), but does not expressly describe determining a position associated with the touch temperature interval and triggering an input action based on the determined position. Examiner takes Official Notice that determining a touch location and triggering an input action based on the determined position is well known in the touch input art. However, in the interest of compact prosecution, Examiner notes that Larson references Kurtenbach, who describes marking-menu inputs in response to displaying a menu at a user-selected location (see Kurtenbach, On-line interactive methods, pages 16-21, specifically, for example, page 19: "If a user is unsure of what marks can be made, the user presses the pen against the display and waits for approximately 1/3 of a second. This signals to the system that no mark is being made and it then prompts the user with a radial menu of the available commands, which appears directly under the cursor. The user may then select a command from the radial menu by keeping the pen tip pressed and making a stroke towards the desired menu item. This results in the item being highlighted (see Figure 1.7). The selection is confirmed when the pen is lifted from the display." It is implicit that the location of the user pressing the pen against the display would be determined in order to subsequently detect the user's selection of a command and execute it). One of ordinary skill would have been motivated to modify Saba with the teachings of Larson/Kurtenbach to incorporate determining a position associated with the touch temperature interval and triggering an input action based on the determined position, so as to utilize thermal traces as a substitute for multi-touch screens and drive typical user interfaces in real-time with naturalistic interactions.
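As an editorial aside for readers tracing the cited technique: the two-threshold binarization the rejection draws from Niikura reduces to a per-pixel interval test on the thermal frame. A minimal sketch, assuming the frame arrives as a 2-D array of temperatures in degrees Celsius; the function name, the sample values, and the thresholds are illustrative assumptions, not values taken from Niikura:

    import numpy as np

    # Sketch of Niikura-style two-threshold binarization (figs. 10A-10C,
    # paras. 0115-0130): keep pixels whose temperature lies inside the
    # skin-temperature interval [th1, th2]; exclude everything hotter or
    # colder. Threshold values are assumed for illustration only.
    def binarize_temperature_interval(thermal, th1=28.0, th2=37.0):
        thermal = np.asarray(thermal, dtype=float)
        return (thermal >= th1) & (thermal <= th2)

    frame = np.array([[22.0, 31.5, 45.0],
                      [29.8, 36.2, 24.1]])
    mask = binarize_temperature_interval(frame)
    # A 45 C mug pixel or a 22 C tabletop pixel is excluded; skin-range
    # pixels survive for later touch analysis.
    print(mask)

The complement of this mask is what the rejection maps to "excluding the first portion of the thermal image to obtain a remainder."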
Consider claim 32. Saba as modified by Larson, Kurtenbach and Niikura teaches all the limitations of claim 31, and further teaches wherein the touch temperature interval is between the first temperature interval and the second temperature interval (see Larson page 4, Software implementation: "Heat traces are the residual heat left behind on a surface due to the heating of that surface by another warmer object, such as a human hand").

Consider claim 33. Saba as modified by Larson, Kurtenbach and Niikura teaches all the limitations of claim 31, and further teaches wherein the predefined thermal characteristic comprises a detected temperature greater than a threshold temperature (see Niikura figures 10A-10C and paragraphs 0115-0130, specifically, for example, paragraph 0125, where binarization with respect to the thresholds th1 and th2 is performed for the purpose of extracting parts corresponding to the temperature of the skin of a person while excluding parts having temperature lower or higher than the temperature of the skin of the person).

Consider claim 34. Saba as modified by Larson, Kurtenbach and Niikura teaches all the limitations of claim 31, and further teaches wherein the thermal image is comprised in a series of thermal image frames (see Saba pages 168-169, where multiple frames are utilized to obtain temperature differences, and Niikura figure 8, Frame N, N+1, and figure 12, F103), and wherein an excluded object is detected in accordance with a portion of each of the image frames having the predefined thermal characteristic (see Niikura paragraphs 0114-0181, specifically, for example, figure 12, F105-F107, and paragraphs 0125 and 0159-0164, where binarization with respect to the thresholds th1 and th2 is performed as described above).

Consider claim 35. Saba as modified by Larson, Kurtenbach and Niikura teaches all the limitations of claim 31, and further teaches classifying one or more pixels of the thermal image based on thermal characteristics at the one or more pixels (see Larson page 5, Uncalibrated Heat Trace Detection: "we only look for heat traces in pixel locations where the hand has recently traveled. This reduces the search space significantly, and thus drastically decreases computational complexity. We frame the detection of heat traces as a Bayesian estimation problem. In particular, we observe the likelihood of a pixel being part of a heat trace given three features: smoothed temperature, temporal derivative, and background subtracted temperature… When the probability of a pixel being a heat trace, P(hp|x), surpasses a global threshold, we classify the pixel as a heat trace").

Consider claim 36. Saba as modified by Larson, Kurtenbach and Niikura teaches all the limitations of claim 31, and further teaches wherein a first object comprises a portion of a user (see Saba page 169, figure 3, where a user's hand is illustrated).
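The Bayesian heat-trace test quoted for claim 35 can likewise be made concrete. The sketch below is only a naive-Bayes stand-in under assumed Gaussian class-conditional likelihoods: the three features follow the quoted passage, but every distribution parameter, the prior, and the 0.5 decision threshold are illustrative assumptions, not Larson's learned values.

    import numpy as np
    from scipy.stats import norm

    # Assumed class-conditional Gaussians (mean, std) for each feature:
    # smoothed temperature (C), temporal derivative (C per frame), and
    # background-subtracted temperature (C), per the quoted feature set.
    TRACE     = {"smoothed": (31.0, 1.5), "deriv": (-0.05, 0.05), "bgsub": (3.0, 1.0)}
    NOT_TRACE = {"smoothed": (24.0, 2.0), "deriv": (0.00, 0.02), "bgsub": (0.0, 0.5)}
    PRIOR_TRACE = 0.05  # assumed prior that any given pixel is a heat trace

    def p_heat_trace(smoothed, deriv, bgsub, prior=PRIOR_TRACE):
        """Posterior P(heat trace | features) via Bayes' rule."""
        x = {"smoothed": smoothed, "deriv": deriv, "bgsub": bgsub}
        like_t = np.prod([norm.pdf(x[k], *TRACE[k]) for k in x])
        like_n = np.prod([norm.pdf(x[k], *NOT_TRACE[k]) for k in x])
        return like_t * prior / (like_t * prior + like_n * (1 - prior))

    # Classify a pixel as a heat trace when the posterior passes a global
    # threshold, mirroring the P(hp|x) test in the quoted passage.
    print(p_heat_trace(30.5, -0.04, 2.6) > 0.5)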
Consider claim 37. Saba as modified by Larson, Kurtenbach and Niikura teaches all the limitations of claim 31, and further teaches wherein excluding the first portion of the thermal image to obtain a remainder of the thermal image further comprises: obtaining a visual image comprising at least part of a first object and at least part of a second object (see Saba page 169, figure 3, where the thermal image in red is aligned with the depth image in blue, the depth image corresponding to a visual image); and determining a visual feature of the at least part of the first object or the at least part of the second object (id.).

Saba/Larson/Niikura is silent regarding excluding the first portion of a thermal image in accordance with a determined visual feature. Saba, however, clearly teaches aligning depth (visual) images with thermal images so as to detect hand/finger/user inputs (see Saba pages 168-170, where hand detection, fingertip tracking and feature extraction are discussed; notice that "If the hand is above a threshold (about 2 cm), the hand detection is performed on Dt diff. This is because the homography no longer reliably transforms from the thermal perspective to depth due to parallax differences between the two cameras"). One of ordinary skill would have, without inventive inspiration, used a visual image to determine whether a hand or finger was present, so as to exclude portions of an image not corresponding to a hand or finger and thereby accurately detect user inputs, using skin color matching, contour detection, and motion tracking to identify various body parts and infer multi-touch gestures (see Saba page 167, 2. Motivation and related work: "Traditional (RGB) cameras have seen considerable use for detecting hand gestures and touch points [1-3]. There has been substantial work in skin color matching, contour detection, and motion tracking to identify various body parts and infer multi-touch gestures [1], [4], [9], [10] as well as pressure of touch via finger deformation [11].").

Consider claim 38. Saba as modified by Larson, Kurtenbach and Niikura teaches: determine a first portion of the thermal image having a predefined thermal characteristic, and exclude the first portion of the thermal image to obtain a remainder of the thermal image (see Niikura figures 10A-10C and paragraphs 0115-0130, specifically, for example, paragraph 0125, where binarization with respect to the thresholds th1 and th2 is performed for the purpose of extracting parts corresponding to the temperature of the skin of a person while excluding parts having temperature lower or higher than the temperature of the skin of the person); detect a touch temperature interval (see Saba page 168, 3.2 Algorithms, using the Larson methods cited for claim 31 above) within the remainder of the thermal image in response to determining a touch between a first object and a second object (see Saba page 169, 3.2.5 Feature Extraction, where features are fed into a C4.5 tree classifier: "The output of the classifier is one of two states: (1) touch down or (2) hover. This provides us with an efficient means of classifying if a user touches down onto the surface. In addition, we can combine this with the residual heat transfer detection implemented in [6] to dynamically retrain our touch classifier. If we detect residual heat transfer in previous frames from the thermal camera, then we know the user was pressing down on the surface"), wherein the first object is associated with a first temperature interval and the second object is associated with a second temperature interval (see Saba page 168, 3.1 Camera Hardware, where a wooden tabletop is used for interaction by the user's hand, and page 3, where the background and hand temperatures can be assumed to occupy significantly different temperature ranges), and wherein the touch temperature interval is based on the first temperature interval and the second temperature interval (see again the Larson methods cited above, including Larson page 4, Software implementation: "Heat traces are the residual heat left behind on a surface due to the heating of that surface by another warmer object, such as a human hand"); determine a position associated with the touch temperature interval (see Larson figure 6, Marking Menu and Image editing application, and Prototypes and Interaction Techniques, third paragraph: "The first application is a multi-user and multi-touch drawing application that displays arbitrary gestures made by the users, and alters the brightness of displayed colors based on the pressure with which each user draws using three pressure levels. The second application uses line gestures for image manipulation. Images are chosen using marking menus [17], then once the images are displayed they can be translated, rotated, and scaled using thermal lines. The two applications are designed to demonstrate that thermal traces can be used as a plausible substitute for multi-touch screens and can drive typical user interfaces in real-time with naturalistic interactions. Images from interactions with each application can be seen in Figure 6"; and Kurtenbach, On-line interactive methods, pages 16-21, specifically, for example, page 19: "If a user is unsure of what marks can be made, the user presses the pen against the display and waits for approximately 1/3 of a second. This signals to the system that no mark is being made and it then prompts the user with a radial menu of the available commands, which appears directly under the cursor. The user may then select a command from the radial menu by keeping the pen tip pressed and making a stroke towards the desired menu item. This results in the item being highlighted (see Figure 1.7). The selection is confirmed when the pen is lifted from the display"); and trigger an input action based on the determined position (see the same Larson and Kurtenbach passages quoted immediately above).

Saba does not appear to explicitly disclose a non-transitory computer readable medium comprising computer readable code executable by one or more processors. Saba explicitly discloses algorithms and processing (see Saba page 168, figure 3 and Methodology, where processing lag is constrained to be no more than one frame at 20 frames per second). Larson teaches processing on an external computer (see Larson page 4, Hardware, second paragraph, and page 4, Software implementation). Niikura teaches a CPU for executing programs stored in a ROM so as to control a system in response to a user's operations (see Niikura paragraph 0086: "CPU 2 collectively controls the television receiver 20 as a whole by executing programs stored in the ROM 4, for example. Specifically, the CPU 2 effects control by sending control commands and control data to the main function unit 6 via the I/O port 5 in response to the user's operations and the programs so that operations necessary in the main function unit 6 are executed"). One of ordinary skill in the art would have been motivated to further modify Saba to have a CPU and software for executing programs stored in a memory so as to control a system in response to a user's operations, using known techniques with predictable results.
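To make the touch-down/hover pipeline attributed to Saba concrete: per-fingertip features feed a decision tree, and residual heat traces observed in later thermal frames act as delayed ground truth for retraining. In this sketch, scikit-learn's CART tree stands in for the C4.5 classifier Saba names, and the two features, their values, and the retraining helper are illustrative assumptions:

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    # Assumed per-fingertip features: [height above surface (mm),
    # local thermal delta at the fingertip (C)].
    X_seed = np.array([[1.0, 2.5], [2.0, 1.8], [25.0, 0.1], [40.0, 0.0]])
    y_seed = np.array([1, 1, 0, 0])  # 1 = touch down, 0 = hover

    # CART tree as a stand-in for Saba's C4.5 classifier.
    clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_seed, y_seed)

    def retrain_from_heat_traces(X_hist, trace_detected):
        """Residual heat detected in later frames means those earlier
        frames were real touches: relabel them and refit, mirroring the
        dynamic-retraining idea in the quoted 3.2.5 passage."""
        y_hist = np.full(len(X_hist), 1 if trace_detected else 0)
        X = np.vstack([X_seed, X_hist])
        y = np.concatenate([y_seed, y_hist])
        return DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

    clf = retrain_from_heat_traces(np.array([[1.5, 2.2]]), trace_detected=True)
    print(clf.predict([[1.2, 2.4]]))  # -> [1]: classified as touch down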
Consider claim 45. Saba as modified by Larson, Kurtenbach and Niikura teaches a system comprising: one or more processors and one or more computer readable media comprising computer readable code executable by the one or more processors (see Niikura paragraph 0086, quoted above) to: obtain a thermal image of a scene (see Saba page 169, figure 3, where the thermal image in red is aligned with the depth image in blue); determine a first portion of the thermal image having a predefined thermal characteristic, and exclude the first portion of the thermal image to obtain a remainder of the thermal image (see the Niikura binarization passages cited for claim 38 above); detect a touch temperature interval within the remainder of the thermal image in response to determining a touch between a first object and a second object, wherein the first object is associated with a first temperature interval and the second object is associated with a second temperature interval, and wherein the touch temperature interval is based on the first temperature interval and the second temperature interval (see the Saba 3.2 Algorithms, 3.2.5 Feature Extraction, and 3.1 Camera Hardware passages and the Larson Software implementation passage cited for claim 38 above); determine a position associated with the touch temperature interval; and trigger an input action based on the determined position (see the Larson figure 6 and Kurtenbach page 19 passages quoted for claim 38 above).

Claims 39-44 and 46-50 recite claim limitations similar to those of claims 32-37, and thus are rejected under a similar rationale as detailed for claims 32-37 above.

Response to Arguments

Applicant's arguments with respect to the claims have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Regarding Applicant's assertion that Saba does not register the touch within a (temperature) interval, Examiner respectfully directs Applicant's attention to the rejection above, where Saba discloses extracting the average hand temperature and the standard deviation of hand temperature when distinguishing between different users (see Saba page 169, 3.2.6 Multi-User Classification), and Niikura teaches temperature thresholds corresponding to temperature intervals.

Regarding Applicant's assertion that Larson is limited to a drawing application, Examiner respectfully directs Applicant's attention to Larson figure 6, Marking Menu and Image editing application, and Prototypes and Interaction Techniques, third paragraph: "The first application is a multi-user and multi-touch drawing application that displays arbitrary gestures made by the users, and alters the brightness of displayed colors based on the pressure with which each user draws using three pressure levels. The second application uses line gestures for image manipulation. Images are chosen using marking menus [17], then once the images are displayed they can be translated, rotated, and scaled using thermal lines. The two applications are designed to demonstrate that thermal traces can be used as a plausible substitute for multi-touch screens and can drive typical user interfaces in real-time with naturalistic interactions. Images from interactions with each application can be seen in Figure 6."

Regarding Applicant's assertion that "Larson is limited to heat traces drawn by a user, but does not take into account the interval of time over which the user is creating the heat traces," Examiner respectfully notes that Applicant's claim language merely recites "temperature interval"; it does not include the term "time."

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Yoshimoto et al., U.S. Patent Publication No. 20110043489 (display device and control method); Murthi et al., U.S. Patent Publication No. 20120326959 (region of interest segmentation); Baker et al., U.S. Patent Publication No. 20120327218 (resource conservation based on a region of interest); and Stafford, U.S. Patent Publication No. 20130257751 (detection of interaction with a virtual object from finger color change).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Dorothy H Harris, whose telephone number is (571) 270-7539. The examiner can normally be reached Monday - Friday, 8am - 4pm.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, William Boddie, can be reached at 571-272-0666. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Dorothy Harris/
Primary Examiner, Art Unit 2625

Prosecution Timeline

Sep 07, 2023: Application Filed
Oct 30, 2024: Non-Final Rejection — §103
Feb 04, 2025: Response Filed
Apr 29, 2025: Final Rejection — §103
Sep 16, 2025: Request for Continued Examination
Oct 01, 2025: Response after Non-Final Action
Oct 27, 2025: Non-Final Rejection — §103
Mar 23, 2026: Applicant Interview (Telephonic)
Mar 23, 2026: Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601924 · ELECTRONIC DEVICE SECUREMENT STRAP
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12578810 · POSITION DETECTION DEVICE
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12579936 · DISPLAY DEVICE
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12567359 · PIXEL CIRCUIT, METHOD FOR DRIVING PIXEL CIRCUIT, DISPLAY PANEL AND DISPLAY APPARATUS
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12561025 · TOUCH APPARATUS
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 62%
With Interview: 85% (+22.3%)
Median Time to Grant: 2y 8m
PTA Risk: High
Based on 898 resolved cases by this examiner. Grant probability derived from career allow rate.
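
As a rough consistency check on the headline numbers above: the dashboard's actual model is not disclosed, so the snippet below only shows how the displayed figures line up arithmetically, under the assumption that the interview lift is additive percentage points on top of the career allow rate.

    granted, resolved = 560, 898          # examiner career totals shown above
    allow_rate = granted / resolved       # 0.6236... -> displayed as 62%
    print(f"career allow rate: {allow_rate:.1%}")

    interview_lift = 0.223                # +22.3 points, per the Interview Lift stat
    with_interview = allow_rate + interview_lift
    print(f"grant probability with interview: {with_interview:.0%}")  # ~85%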
