Prosecution Insights
Last updated: April 19, 2026
Application No. 17/622,430

TIME-OF-FLIGHT SENSING CIRCUITRY AND METHOD FOR OPERATING A TIME-OF-FLIGHT SENSING CIRCUITRY

Non-Final OA — §103, §DP
Filed: Dec 23, 2021
Examiner: VASQUEZ JR, ROBERT WILLIAM
Art Unit: 3645
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Sony Semiconductor Solutions Corporation
OA Round: 3 (Non-Final)
Grant Probability: 12% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 4y 1m
With Interview: -4%

Examiner Intelligence

Career Allow Rate: 12% (1 granted / 8 resolved; -39.5% vs TC avg) — grants only 12% of cases
Interview Lift: -16.7% (minimal lift, based on resolved cases with interview)
Avg Prosecution: 4y 1m (typical timeline)
Total Applications: 61 across all art units (53 currently pending)
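The card values above are simple ratios over the examiner's resolved cases. A minimal sketch of how they can be reproduced (not the dashboard's actual code; the ~52% Tech Center average is back-solved from the displayed -39.5% delta, not stated in the source):

```python
# Reproducing the "Examiner Intelligence" figures from raw counts.
# Illustrative only: the 0.52 TC average is an assumption inferred from
# the displayed -39.5% delta, not taken from the source data.

granted = 1
resolved = 8
tc_avg_allow_rate = 0.52  # assumed Tech Center average

allow_rate = granted / resolved          # 0.125, shown on the card as ~12%
delta_vs_tc = allow_rate - tc_avg_allow_rate

print(f"Career allow rate: {allow_rate:.1%}")   # 12.5%
print(f"vs TC avg: {delta_vs_tc:+.1%}")         # -39.5%
```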

Statute-Specific Performance

§101: 2.1% (-37.9% vs TC avg)
§103: 53.5% (+13.5% vs TC avg)
§102: 32.7% (-7.3% vs TC avg)
§112: 7.7% (-32.3% vs TC avg)
Tech Center averages are estimates • Based on career data from 8 resolved cases

Office Action

§103 §DP
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The Amendment filed January 30th, 2026 has been entered. Claims 1-2, 4-5, 7-12, 14-15, 17-18 and 20-25 remain pending in the application. Applicants' amendments to the claims have overcome the nonstatutory double patenting, § 101, and § 112(b) rejections previously set forth in the Final Office Action of October 30th, 2025.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-2, 5, 7, 9-12, 15, 17, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Metz et al. (United States Patent Application Publication 20160003937 A1), hereinafter Metz, in view of Smith et al. (United States Patent Application Publication 20180278843 A1), hereinafter Smith.
Regarding claim 1, Metz teaches A time-of-flight sensing circuitry for sensing image information in different imaging modes ([0017] a time-of-flight apparatus 100 in accordance with various embodiments; [0044] This analysis 904 may occur in a scene analysis module 116 of the processing device 102; [0062] For example, a module may be implemented as a hardware circuit comprising custom Very-Large-Scale Integrated (VLSI) circuits or gate arrays, off-the-shelf semiconductors such as logic chips, transistors, or other discrete components.), comprising circuitry configured to: detect light and output light sensing signals ([0017] at least one time-of-flight image sensor 106 operatively coupled to the processing device 102; [0020] Various kinds of light can be emitted by the illuminator 104 to be utilized in the disclosed embodiments,); and process the light sensing signals from the circuitry, wherein the circuitry is configured to dynamically set an imaging mode among the different imaging modes ([0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation; [0049] Altering these factors within the mode of operation allows for dynamic adaptation of the mode of operation to suit the needs of the scene or other settings.). Metz fails to teach the circuitry wherein an imaging mode sequence is a predetermined sequence of the different imaging modes which is initiated by the circuitry producing a trigger peak for a first imaging mode which is then followed by a second imaging mode. However, Smith teaches wherein an imaging mode sequence is a predetermined sequence of the different imaging modes which is initiated by the circuitry producing a trigger peak for a first imaging mode which is then followed by a second imaging mode ([Fig. 4]; [0050] The method 400 begins at block 410 with a command to configure the depth sensor 100.; [0051] the depth sensing system 300 described herein instead continues onto block 430 where it loads operation sequences for second through Nth depth sensing modes into groups of memory bins of the depth sensor 100.).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the predetermined imaging mode sequence similar to Smith, with a reasonable expectation of success. This would have the predictable result of arranging a predetermined configuration for basic operations of the imaging circuitry.

Regarding claim 2, Metz, as modified above, teaches the time-of-flight sensing circuitry according to claim 1, wherein the dynamical setting of the imaging mode among the different imaging modes is based on an imaging mode sequence ([0044] The method 900 continues by the processing device 102 analyzing 904 the at least one high-resolution image or the high-resolution depth map frame. This analysis 904 may occur in a scene analysis module 116 of the processing device 102 or an ambient light analysis module 120 that may be part of or separate from the scene analysis module 116; [0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation).

Regarding claim 5, Metz, as modified above, teaches The time-of-flight sensing circuitry according to claim 1, wherein the circuitry is configured to output a light sensing signal of a first type in a first imaging mode and a light sensing signal of a second type in a second imaging mode ([0044] The method 900 continues by the processing device 102 analyzing 904 the at least one high-resolution image or the high-resolution depth map frame. This analysis 904 may occur in a scene analysis module 116 of the processing device 102 or an ambient light analysis module 120 that may be part of or separate from the scene analysis module 116; [0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation).

Regarding claim 7, Metz, as modified above, teaches the time-of-flight sensing circuitry according to claim 5, wherein the first imaging mode includes a first time-of-flight imaging mode for acquiring distance information and the second imaging mode includes an infrared imaging mode for acquiring object information ([0044] By various embodiments, this 904 analysis may comprise determining 908 an ambient light interference level and/or determining 910 a depth of at least one foreground object 122 in the image; [0020] Various kinds of light can be emitted by the illuminator 104 to be utilized in the disclosed embodiments, including infrared (IR) light, visible spectrum light, or ultraviolet light).

Regarding claim 9, Metz, as modified above, teaches The time-of-flight sensing circuitry according to claim 5, wherein the circuitry is configured to output a light sensing signal of a third type in a third imaging mode, the third imaging mode including at least one of full-field time-of-flight imaging mode and mosaicked time-of-flight imaging mode ([0053] Like the ambient light interference threshold level, the threshold depth may represent a single threshold (fixed or variable) or multiple thresholds (fixed or variable) corresponding to different depths and multiple modes of operation depending upon the needs of the apparatus 100; [0058] By these multiple embodiments described above, the processing device 102 is capable of varying the mode of operation by varying the binning factor and/or the illuminator output power as is required by the scene).
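For background on the distance acquisition these modes perform (an editorial illustration, not part of the Office Action or the cited references), pulsed time-of-flight ranging uses the relation distance = c × round-trip time / 2:

```python
# Basic pulsed time-of-flight ranging: a light pulse travels to the target
# and back, so the one-way distance is half the round trip.

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """One-way distance (meters) from a measured round-trip time (seconds)."""
    return C * round_trip_s / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of range.
print(f"{tof_distance_m(10e-9):.3f} m")
```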
Regarding claim 10, Metz teaches the time-of-flight sensing circuitry according to claim 1, wherein the circuitry includes a sequencer circuitry and a register circuitry ([0044] The method 900 continues by the processing device 102 analyzing 904 the at least one high-resolution image or the high-resolution depth map frame. This analysis 904 may occur in a scene analysis module 116 of the processing device 102 or an ambient light analysis module 120 that may be part of or separate from the scene analysis module 116; [0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation). Metz fails to explicitly teach the register circuitry including multiple registers for storing data from the light signals and wherein those multiple registers are used for the imaging mode selection. However, Smith does teach a depth sensing time-of-flight device wherein the register circuitry includes multiple registers for storing data which are derived on the basis of the light sensing signals and wherein each imaging mode of the different imaging modes is based on a predetermined set of registers, and wherein the sequencer circuitry is adapted to dynamically select a set of registers for setting the imaging mode among the different imaging modes ([0044] A conventional depth sensor typically has multiple memory bins for holding programming instructions. Each bin can hold, for example, one of the operations shown in the operation sequences of Tables 1-4…Thus, using conventional methods, 6+6+10+10=32 memory bins could be required in order to program the depth sensor 100 to be capable of operating in all four of these depth sensing modes; [0045] The depth sensor 100 can be programmed to operate in any of the depth sensing modes illustrated in Tables 1-4, as well as others, by loading the respective operation steps (and associated settings) into the sensor's memory bins).
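The memory-bin arrangement Smith describes (a predetermined register set per imaging mode, selected by a sequencer stepping through a fixed mode sequence) can be sketched as follows. This is a hypothetical illustration: all names and register values are invented, not taken from the application or the cited references.

```python
# Hypothetical sketch of mode selection via predetermined register sets:
# each imaging mode maps to a fixed set of register values ("memory bins"),
# and a sequencer loads the next set on every trigger. All values invented.

MODE_REGISTERS = {
    "binned_tof": {"illum_power": 120, "binning": 4, "ir_only": False},
    "passive_ir": {"illum_power": 0, "binning": 1, "ir_only": True},
}

class Sequencer:
    """Steps through a predetermined imaging-mode sequence."""

    def __init__(self, mode_sequence):
        self.mode_sequence = list(mode_sequence)
        self.step = 0

    def trigger(self):
        """Return the next mode and its register set, then advance."""
        mode = self.mode_sequence[self.step % len(self.mode_sequence)]
        self.step += 1
        return mode, MODE_REGISTERS[mode]

seq = Sequencer(["binned_tof", "passive_ir"])
print(seq.trigger())  # first imaging mode in the predetermined sequence
print(seq.trigger())  # followed by the second imaging mode
```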
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the multiple-register dependent image mode sequencing similar to Smith, with a reasonable expectation of success. This would have the predictable result of allocating dedicated memory registers for the variety of imaging modes for faster processing time under specific environmental conditions.

Regarding claim 11, Metz teaches A method for operating a time-of-flight sensing circuitry for sensing image information in different imaging modes, wherein the time-of-flight sensing circuitry includes a circuitry for detecting light and outputting light sensing signals and circuitry for processing the light sensing signals from the light sensing circuitry ([0017] at least one time-of-flight image sensor 106 operatively coupled to the processing device 102), the method comprising: dynamically setting an imaging mode among the different imaging modes ([0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation; [0049] Altering these factors within the mode of operation allows for dynamic adaptation of the mode of operation to suit the needs of the scene or other settings.). Metz fails to teach the method wherein an imaging mode sequence is a predetermined sequence of the different imaging modes which is initiated by the circuitry producing a trigger peak for a first imaging mode which is then followed by a second imaging mode. However, Smith teaches the method wherein an imaging mode sequence is a predetermined sequence of the different imaging modes which is initiated by the circuitry producing a trigger peak for a first imaging mode which is then followed by a second imaging mode ([Fig. 4]; [0050] The method 400 begins at block 410 with a command to configure the depth sensor 100.; [0051] the depth sensing system 300 described herein instead continues onto block 430 where it loads operation sequences for second through Nth depth sensing modes into groups of memory bins of the depth sensor 100.).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the predetermined imaging mode sequence similar to Smith, with a reasonable expectation of success. This would have the predictable result of arranging a predetermined configuration for basic operations of the imaging circuitry.

Regarding claim 12, Metz, as modified above, teaches the method according to claim 11, wherein the dynamical setting of the imaging mode among the different imaging modes is based on an imaging mode sequence ([0044] The method 900 continues by the processing device 102 analyzing 904 the at least one high-resolution image or the high-resolution depth map frame. This analysis 904 may occur in a scene analysis module 116 of the processing device 102 or an ambient light analysis module 120 that may be part of or separate from the scene analysis module 116; [0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation).

Regarding claim 15, Metz, as modified above, teaches the method according to claim 11, further comprising: outputting a light sensing signal of a first type in a first imaging mode and a light sensing signal of a second type in a second imaging mode ([0044] The method 900 continues by the processing device 102 analyzing 904 the at least one high-resolution image or the high-resolution depth map frame. This analysis 904 may occur in a scene analysis module 116 of the processing device 102 or an ambient light analysis module 120 that may be part of or separate from the scene analysis module 116; [0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation).

Regarding claim 17, Metz, as modified above, teaches the method according to claim 15, wherein the first imaging mode includes a first time-of-flight imaging mode for acquiring distance information and the second imaging mode includes an infrared imaging mode for acquiring object information ([0044] By various embodiments, this 904 analysis may comprise determining 908 an ambient light interference level and/or determining 910 a depth of at least one foreground object 122 in the image; [0020] Various kinds of light can be emitted by the illuminator 104 to be utilized in the disclosed embodiments, including infrared (IR) light, visible spectrum light, or ultraviolet light).

Regarding claim 20, Metz teaches the method according to claim 11, wherein the circuitry includes a sequencer circuitry and a register circuitry ([0044] The method 900 continues by the processing device 102 analyzing 904 the at least one high-resolution image or the high-resolution depth map frame. This analysis 904 may occur in a scene analysis module 116 of the processing device 102 or an ambient light analysis module 120 that may be part of or separate from the scene analysis module 116; [0046] Using the analysis 904 from the scene analysis module 116, the processing device 102 is further configured to determine 912 a mode of operation). Metz fails to explicitly teach the register circuitry including multiple registers for storing data from the light signals and wherein those multiple registers are used for the imaging mode selection. However, Smith does teach a depth sensing time-of-flight device wherein the register circuitry includes multiple registers for storing data which are derived on the basis of the light sensing signals and wherein each imaging mode of the different imaging modes is based on a predetermined set of registers, and wherein the sequencer circuitry is adapted to dynamically select a set of registers for setting the imaging mode among the different imaging modes ([0044] A conventional depth sensor typically has multiple memory bins for holding programming instructions. Each bin can hold, for example, one of the operations shown in the operation sequences of Tables 1-4…Thus, using conventional methods, 6+6+10+10=32 memory bins could be required in order to program the depth sensor 100 to be capable of operating in all four of these depth sensing modes; [0045] The depth sensor 100 can be programmed to operate in any of the depth sensing modes illustrated in Tables 1-4, as well as others, by loading the respective operation steps (and associated settings) into the sensor's memory bins).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the multiple-register dependent image mode sequencing similar to Smith, with a reasonable expectation of success. This would have the predictable result of allocating dedicated memory registers for the variety of imaging modes for faster processing time under specific environmental conditions.

Claims 4, 8, 14, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Metz in view of Smith, further in view of Oggier (United States Patent Application Publication 20130148102 A1), hereinafter Oggier.
Regarding claim 4, Metz, as modified above, teaches the time-of-flight sensing circuitry according to claim 1, wherein the different imaging modes include a full frame mode (fig. 3; [0028] The various measurements for each pixel 310 are then processed per equations [3] and [1] above to produce the depth map frame 312. As shown in FIG. 3, each pixel 310 of the depth map frame 312 has a corresponding calculated z value represented by “D1”, “D2”, and so forth.), a binned frame mode (fig. 8; [0038] FIG. 8 is a modified version of FIG. 3 that illustrates the binning mode of operation 800), an infrared mode ([0020] Various kinds of light can be emitted by the illuminator 104 to be utilized in the disclosed embodiments, including infrared (IR) light, visible spectrum light, or ultraviolet light), a two-dimensional mode (fig. 3; [0028] The depth map frame 312 can then be used for many purposes, one of which includes combining it with a standard two-dimensional image taken simultaneously, or near simultaneously), a full field mode (fig. 3; [0028] The various measurements for each pixel 310 are then processed per equations [3] and [1] above to produce the depth map frame 312. As shown in FIG. 3, each pixel 310 of the depth map frame 312 has a corresponding calculated z value represented by “D1”, “D2”, and so forth.), and a mosaicked mode (fig. 8; [0038] FIG. 8 is a modified version of FIG. 3 that illustrates the binning mode of operation 800). Metz fails to explicitly teach a spot Time of Flight mode as an available imaging mode. However, Oggier teaches a spot Time of Flight mode as an ideal imaging mode option ([0035] Going to its extreme, the ideal case would be a single spot TOF measurement. In that case, the lateral resolution gets lost.).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the spot Time of Flight imaging mode similar to Oggier, with a reasonable expectation of success. This would have the predictable result of providing an additional imaging mode that is ideal for eliminating multi-path interference.

Regarding claim 8, Metz, as modified above, teaches the time-of-flight sensing circuitry according to claim 7. Metz fails to explicitly teach wherein the first time-of-flight imaging mode is a spot time of flight imaging mode. However, Oggier teaches a spot Time of Flight mode as an ideal imaging mode option ([0035] Going to its extreme, the ideal case would be a single spot TOF measurement. In that case, the lateral resolution gets lost.). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the spot Time of Flight imaging mode similar to Oggier, with a reasonable expectation of success. This would have the predictable result of providing an additional imaging mode that is ideal for eliminating multi-path interference.

Regarding claim 14, Metz, as modified above, teaches the method according to claim 11, wherein the different imaging modes include a full frame mode (fig. 3; [0028] The various measurements for each pixel 310 are then processed per equations [3] and [1] above to produce the depth map frame 312. As shown in FIG. 3, each pixel 310 of the depth map frame 312 has a corresponding calculated z value represented by “D1”, “D2”, and so forth.), a binned frame mode (fig. 8; [0038] FIG. 8 is a modified version of FIG. 3 that illustrates the binning mode of operation 800), an infrared mode ([0020] Various kinds of light can be emitted by the illuminator 104 to be utilized in the disclosed embodiments, including infrared (IR) light, visible spectrum light, or ultraviolet light), a two-dimensional mode (fig. 3; [0028] The depth map frame 312 can then be used for many purposes, one of which includes combining it with a standard two-dimensional image taken simultaneously, or near simultaneously), a full field mode (fig. 3; [0028] The various measurements for each pixel 310 are then processed per equations [3] and [1] above to produce the depth map frame 312. As shown in FIG. 3, each pixel 310 of the depth map frame 312 has a corresponding calculated z value represented by “D1”, “D2”, and so forth.), and a mosaicked mode (fig. 8; [0038] FIG. 8 is a modified version of FIG. 3 that illustrates the binning mode of operation 800). Metz fails to explicitly teach a spot Time of Flight mode as an available imaging mode. However, Oggier teaches a spot Time of Flight mode as an ideal imaging mode option ([0035] Going to its extreme, the ideal case would be a single spot TOF measurement. In that case, the lateral resolution gets lost.). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the spot Time of Flight imaging mode similar to Oggier, with a reasonable expectation of success. This would have the predictable result of providing an additional imaging mode that is ideal for eliminating multi-path interference.

Regarding claim 18, Metz, as modified above, teaches the method according to claim 17. Metz fails to explicitly teach wherein the first time-of-flight imaging mode is a spot time of flight imaging mode. However, Oggier teaches a spot Time of Flight mode as an ideal imaging mode option ([0035] Going to its extreme, the ideal case would be a single spot TOF measurement. In that case, the lateral resolution gets lost.). It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the spot Time of Flight imaging mode similar to Oggier, with a reasonable expectation of success. This would have the predictable result of providing an additional imaging mode that is ideal for eliminating multi-path interference.

Claims 21-24 are rejected under 35 U.S.C. 103 as being unpatentable over Metz in view of Smith, further in view of Pellman et al. (United States Patent Application Publication 20180131853 A1), hereinafter Pellman.

Regarding claim 21, Metz, as modified above, teaches the time-of-flight sensing circuitry of claim 1. Metz fails to teach the circuitry wherein the first imaging mode is a binned imaging mode producing depth information and the second imaging mode is a passive infrared imaging mode producing a 2D image representation of a scene. However, Pellman teaches a circuitry wherein the first imaging mode is a binned imaging mode producing depth information and the second imaging mode is a passive infrared imaging mode producing a 2D image representation of a scene ([0034] When used for computer vision, the imaging system 102 is configured to image objects in front of the user that are illuminated by passive ambient light in the visible wavelength range...In other embodiments, the world cameras (WC) 106 and 108, as well as the picture camera 110, may also be configured for dual functions, i.e., for imaging both visible and infrared light.; [0060] FIG. 16 illustrates schematically a mode of operating the image sensor 220 according to an embodiment of the present invention. The two-dimensional array of pixel cells 222 may be binned into 2×2 groups 224. Each group 224 includes four pixel cells 222a-222d. This mode of operation can be referred to as image sensor pixel binning.).
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the binning and passive infrared modes similar to Pellman, with a reasonable expectation of success. This would have the predictable result of ensuring there remains a full field passive scan of the environment in a non-compressed method for a live visual of the environment.

Regarding claim 22, Metz, as modified above, teaches the time-of-flight sensing circuitry of claim 21, wherein the second imaging mode is a second binned mode ([0046] The mode of operation comprises at least a binning factor for at least one binned lower-resolution image 802, 804, 806, 808 and a calculated illuminator power output level (i.e., P.sub.emitted). Alternatively, the binning mode configuration module could determine a lower binning factor (or a zero binning rate for a full-resolution image) based on the analysis 904.).

Regarding claim 23, Metz, as modified above, teaches the time-of-flight sensing circuitry of claim 1. Metz fails to teach the circuitry wherein the circuitry is configured to operate in a binned mode and the second imaging mode is passive infrared. However, Pellman teaches a circuitry wherein the circuitry is configured to operate in a binned mode and the second imaging mode is passive infrared ([0034] When used for computer vision, the imaging system 102 is configured to image objects in front of the user that are illuminated by passive ambient light in the visible wavelength range...In other embodiments, the world cameras (WC) 106 and 108, as well as the picture camera 110, may also be configured for dual functions, i.e., for imaging both visible and infrared light.; [0060] FIG. 16 illustrates schematically a mode of operating the image sensor 220 according to an embodiment of the present invention. The two-dimensional array of pixel cells 222 may be binned into 2×2 groups 224. Each group 224 includes four pixel cells 222a-222d. This mode of operation can be referred to as image sensor pixel binning.).

It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the binning and passive infrared modes similar to Pellman, with a reasonable expectation of success. This would have the predictable result of ensuring there remains a full field passive scan of the environment in a non-compressed method for a live visual of the environment.

Regarding claim 24, Metz, as modified above, teaches the time-of-flight sensing circuitry of claim 21, wherein the first imaging mode is a binned imaging mode which bins a subset of pixels, a number of the subset of pixels being less than a number of all of the pixels ([0034] According to one approach, the time-of-flight sensor 106 is capable of pixel binning when operating in a binning mode. For example, and with reference to FIG. 4, an example time-of-flight sensor 106 has groups of four sensor pixels 310 that are binned together to form a binned pixel 402.).

Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Metz in view of Smith, Pellman, and further in view of Sun et al. (United States Patent Application Publication 20160240579 A1), hereinafter Sun.

Regarding claim 25, Metz, as modified above, teaches the time-of-flight sensing circuitry of claim 24. Metz fails to teach the circuitry wherein each pixel in the subset of pixels is a single photon avalanche detector (SPAD). However, Sun teaches wherein each pixel in the subset of pixels is a single photon avalanche detector (SPAD) ([0023] TOF pixel array 210 is a two-dimensional (2D) array of visible light pixels 211 and SPAD pixels 212).
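The 2×2 pixel binning described in the Pellman and Metz passages above (four neighboring pixels combined into one binned pixel, trading lateral resolution for signal) can be illustrated with a minimal sketch; the summing choice and the toy frame below are assumptions for illustration, not taken from the references.

```python
# Minimal 2x2 pixel binning: each group of four neighboring pixels is
# combined (summed here; averaging is an equally common choice) into one
# binned pixel, halving resolution in each dimension.

def bin2x2(frame):
    """Bin a 2D list with even dimensions into 2x2 sums."""
    h, w = len(frame), len(frame[0])
    return [
        [frame[r][c] + frame[r][c + 1] + frame[r + 1][c] + frame[r + 1][c + 1]
         for c in range(0, w, 2)]
        for r in range(0, h, 2)
    ]

frame = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
    [13, 14, 15, 16],
]
print(bin2x2(frame))  # -> [[14, 22], [46, 54]]
```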
It would have been obvious to one of ordinary skill in the art prior to the effective filing date of this invention to modify the invention of Metz to comprise the SPAD pixels similar to Sun, with a reasonable expectation of success. This would have the predictable result of implementing a known sensor device common to the art into the overall system of the present invention.

Response to Arguments

Applicant's arguments filed January 5th, 2026 have been fully considered but they are not persuasive. Regarding the applicant's argument that the prior art of record fails to include circuitry which is initiated by the circuitry producing a trigger peak for a first imaging mode which is then followed by a second imaging mode, the examiner points to the above cited prior art reference in which Smith teaches that the two operating modes are triggered to begin operation through a command sent to the circuit. As the limitations of the claims do not distinguish this newly amended trigger peak as anything beyond that of a signal in the independent claim, the claim is interpreted under the broadest reasonable interpretation of one of ordinary skill in the art. As such, the signal sent to start and switch operating modes is understood to teach the same trigger as that of the instant application, and the rejection is maintained in this Non-Final Office Action.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROBERT WILLIAM VASQUEZ JR, whose telephone number is (571) 272-3745. The examiner can normally be reached Monday through Thursday and Flex Friday, 8:00-5:00 PST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, HELAL ALGAHAIM, can be reached at (571) 270-5227. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROBERT W VASQUEZ/
Examiner, Art Unit 3645

/HELAL A ALGAHAIM/
SPE, Art Unit 3645

Prosecution Timeline

Dec 23, 2021: Application Filed
Jun 26, 2025: Non-Final Rejection — §103, §DP
Sep 08, 2025: Response Filed
Oct 27, 2025: Final Rejection — §103, §DP
Jan 05, 2026: Response after Non-Final Action
Jan 30, 2026: Request for Continued Examination
Feb 15, 2026: Response after Non-Final Action
Mar 16, 2026: Non-Final Rejection — §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12436282 — DISTANCE MEASURING DEVICE
Granted Oct 07, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on the 1 most recent grant.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 12%
With Interview: -4% (-16.7% lift)
Median Time to Grant: 4y 1m
PTA Risk: High
Based on 8 resolved cases by this examiner. Grant probability derived from career allow rate.
