Prosecution Insights
Last updated: April 19, 2026
Application No. 18/549,311

OPTICAL SENSOR, METHOD FOR CONTROLLING OPTICAL SENSOR, AND CONTROL PROGRAM FOR OPTICAL SENSOR

Non-Final OA §103
Filed: Sep 06, 2023
Examiner: MALIKASIM, JONATHAN L
Art Unit: 3645
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Omron Corporation
OA Round: 1 (Non-Final)
Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 6m
With Interview: 79%

Examiner Intelligence

Career Allow Rate: 80%, above average (281 granted / 352 resolved; +27.8% vs TC avg)
Interview Lift: -0.9% (minimal), based on resolved cases with interview
Typical Timeline: 2y 6m average prosecution; 30 applications currently pending
Career History: 382 total applications across all art units
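The headline figures above follow directly from the raw counts shown on this page. A minimal sketch of the arithmetic (function names are illustrative, not the vendor's actual code):

```python
def allow_rate_pct(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Counts taken from the dashboard: 281 granted out of 352 resolved.
career = allow_rate_pct(281, 352)

# Applying the -0.9% interview lift shown above.
with_interview = career - 0.9

print(f"Career allow rate: {career:.1f}%")          # ~79.8%, displayed as 80%
print(f"With interview:    {with_interview:.1f}%")  # ~78.9%, displayed as 79%
```

Note that the displayed 80% and 79% are the rounded forms of 79.8% and 78.9%, which is why the page can report both an 80% baseline and a 79% with-interview figure from a lift of less than one point.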

Statute-Specific Performance

§101: 1.4% (-38.6% vs TC avg)
§103: 43.6% (+3.6% vs TC avg)
§102: 20.4% (-19.6% vs TC avg)
§112: 27.5% (-12.5% vs TC avg)
Deltas are measured against Tech Center average estimates. Based on career data from 352 resolved cases.
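The "vs TC avg" deltas in the table above are mutually consistent with a single implied Tech Center baseline. A hedged reconstruction (the baseline is inferred from the deltas, not stated on the page):

```python
# Examiner rates and deltas copied from the statute table above (percent).
examiner_rate = {"101": 1.4, "103": 43.6, "102": 20.4, "112": 27.5}
delta_vs_tc   = {"101": -38.6, "103": 3.6, "102": -19.6, "112": -12.5}

# Implied baseline for each statute: examiner rate minus its delta.
implied_tc_avg = {s: round(examiner_rate[s] - delta_vs_tc[s], 1)
                  for s in examiner_rate}

print(implied_tc_avg)  # every statute implies the same 40.0% baseline
```

That all four statutes back out to an identical 40.0% suggests the tool benchmarks against one flat Tech Center estimate rather than per-statute averages.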

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:

“an optical axis adjustment element” in claim 1; the instant applicant’s specification describes the corresponding structure for the optical axis adjustment element as being a liquid crystal device or equivalents including a MEMS mirror, an optical phased array, or an electro-optic crystal (para. [0023]);

“a reception unit” in claim 1; the instant applicant’s specification describes the corresponding structure for the reception unit as being an operation button 150 (para. [0025], [0035], and [0053]) or an I/O interface 170 (para. [0027], [0035], and [0053]).

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 1-2 and 4-7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Petterson (US 2005/0283989) in view of Slobodyanyuk et al. (US 2018/0059221).

Regarding independent claim 1, Petterson discloses, in Figures 1-2, An sensor (Petterson; Figs. 1-2; probe 4) comprising: a target object (Petterson; [0030] reference workpiece used for preliminary calibration step 20); a reception unit (Petterson; the input device that receives/communicates sensor data with the computer 5 and receives/communicates instructions/data from the computer user/operator, for example by using the keyboard shown in Fig. 1) configured to receive, in advance, a specified condition (Petterson; [0037] “production conditions”) relating to a detection result obtained via preliminary detection of a target object (Petterson; flowchart Fig. 2; step 20 for calibration based on using a sample/reference workpiece; [0030] “a preliminary calibration step 20 that is performed beforehand, i.e. before inspecting a series of workpieces on “production””); and a control unit (Petterson; computer 5) configured to set, as an inspection direction (Petterson; flowchart Fig. 2; step 20 for calibration based on using a sample/reference workpiece; [0030] “a preliminary calibration step 20 that is performed beforehand, i.e. before inspecting a series of workpieces on “production””), of detection results based on the detection signal obtained by scanning a preliminary target placed in advance (Petterson; [0030] “preliminary calibration step 20”; [0032] “Scanning the reference workpiece at the calibration step can be done by using any measuring device”), a detection result corresponding to the specified condition (Petterson; [0037] “production conditions”) received by the reception unit is obtained (Petterson; [0034] “The result of the calibration phase is a set of 3D points”; [0035] “calibration data”), and drive, in the inspection direction set, for an inspection target placed instead of the preliminary target and to perform detection processing (Petterson; flowchart Fig. 2; inspection step 24; [0040] take measurements on a production workpiece at production conditions that replaces the preliminary reference workpiece).
Petterson does not disclose An optical sensor comprising: a light-projecting element configured to project detection light; an optical axis adjustment element configured to adjust an optical axis of the detection light projected from the light-projecting element; a light-receiving element configured to receive the detection light reflected at a target object and output a detection signal; a control unit configured to set, as an inspection direction, a direction of the detection light in which, of detection results based on the detection signal obtained by scanning a preliminary target placed in advance with the detection light by driving the optical axis adjustment element, a detection result corresponding to the specified condition received by the reception unit is obtained, and drive the optical axis adjustment element to cause the detection light to be projected, in the inspection direction set, for an inspection target placed instead of the preliminary target and cause the light-projecting element and the light-receiving element to perform detection processing.

Slobodyanyuk teaches An optical sensor (Slobodyanyuk; Fig. 5; hybrid LIDAR system 500; [0049] 1-D detector array) comprising: a light-projecting element configured to project detection light (Slobodyanyuk; Fig. 5; [0047] “light source” such as a “laser array”); an optical axis adjustment element configured to adjust an optical axis of the detection light projected from the light-projecting element (Slobodyanyuk; Fig. 5; optical beam scanner 510; [0047] “focusing the light beam in another direction”); a light-receiving element configured to receive the detection light reflected at a target object and output a detection signal (Slobodyanyuk; Fig. 5; sensor 520; [0049] 1-D detector array); a direction of the detection light in which, with the detection light by driving the optical axis adjustment element, drive the optical axis adjustment element to cause the detection light to be projected, and cause the light-projecting element and the light-receiving element to perform detection processing (Slobodyanyuk; Fig. 5; hybrid LIDAR system 500; [0047] “focusing the light beam in another direction”; flowchart Fig. 7 with step 720 that describes using a controller to change a scan direction so that scanning is performed in a second direction that is different from a first direction).

It would have been obvious to one having ordinary skill at the effective filing date of the invention to substitute the 3D probe sensor taught by Petterson with the 1-D LIDAR optical sensor with an optical beam scanner that can control and focus the light beam in different directions as taught by Slobodyanyuk for the purpose of providing “a better resolution in one direction” (Slobodyanyuk; [0058]), to “provide fine resolution” (Slobodyanyuk; [0026]), and to provide “active remote sensing” (Slobodyanyuk; [0001]).

Regarding claim 2, Modified Petterson teaches the invention substantially the same as described above, and The optical sensor according to claim 1, wherein the reception unit receives a condition relating to distance as the specified condition (Slobodyanyuk; [0001] “to measure distances between objects”).

Regarding claim 4, Modified Petterson teaches the invention substantially the same as described above, and The optical sensor according to claim 1. Modified Petterson does not teach a display unit configured to show the inspection direction set by the control unit. Slobodyanyuk teaches a display unit configured to show the inspection direction set by the control unit (Slobodyanyuk; Fig. 8; [0070] output device 820 such as a display device). It would have been obvious to one having ordinary skill at the effective filing date of the invention to modify the sensor as taught by Modified Petterson to comprise a display unit as taught by Slobodyanyuk for the purpose of providing a means for the user/operator to view and understand the information for decision-making.

Regarding claim 5, Modified Petterson teaches the invention substantially the same as described above, and The optical sensor according to claim 1, wherein the control unit (Petterson; computer 5) uses, of detection results obtained by scanning with the detection light (Slobodyanyuk; Fig. 5; [0047] “light source” such as a “laser array”), a detection result obtained near the inspection direction set, and sets a detection condition of detection processing for the inspection target (Petterson; [0030] “preliminary calibration step 20”; [0032] “Scanning the reference workpiece at the calibration step can be done by using any measuring device”; [0037] “production conditions”).

Regarding independent claim 6, Modified Petterson teaches the invention substantially the same as described above regarding independent claim 1.

Regarding independent claim 7, Modified Petterson teaches the invention substantially the same as described above regarding independent claim 1, but is silent regarding A non-transitory storage medium storing a control program. Slobodyanyuk teaches A non-transitory storage medium storing a control program (Slobodyanyuk; Fig. 8; [0075] non-transitory storage device 825 to store application programs 845).
It would have been obvious to one having ordinary skill at the effective filing date of the invention to modify the sensor as taught by Modified Petterson to comprise a non-transitory storage medium as taught by Slobodyanyuk for the purpose of configuring/adapting a general purpose computer to perform specific operations (Slobodyanyuk; [0076] “used to configure and/or adapt a general purpose computer (or other device) to perform one or more operations in accordance with the described methods.”).

Claim(s) 3 is/are rejected under 35 U.S.C. 103 as being unpatentable over Petterson (US 2005/0283989) in view of Slobodyanyuk et al. (US 2018/0059221) as applied to claim 1 above, and further in view of Takahashi et al. (US 2019/0376893).

Regarding claim 3, Modified Petterson teaches the invention substantially the same as described above, and The optical sensor according to claim 1, wherein the control unit (Petterson; computer 5). Modified Petterson does not teach wherein the control unit performs a notification projection of projecting the detection light in the inspection direction set in a manner that the detection light is visually recognizable by a user. Takahashi teaches wherein the control unit performs a notification projection of projecting the detection light in the inspection direction set in a manner that the detection light is visually recognizable by a user (Takahashi; [0062] “Then, information representing a result of detection of the detection target light L3 is output to the control unit 6, and an intensity of the projection light L2 is controlled on the basis of a result of detection of the detection target light L3 that is acquired by the optical detector 5 (Step SOS: a control step). In this way, a projection image P on which the result of detection of the detection target light L3 is reflected is formed on the surface of the observation target S.”).
It would have been obvious to one having ordinary skill at the effective filing date of the invention to modify the control unit as taught by Modified Petterson to comprise a notification projection as taught by Takahashi in order to provide and secure a “sufficient amount of the detection target light” for the user/operator (Takahashi; [0063] “a projection image P can be displayed on the observation target S in a state in which a sufficient amount of the detection target light L3 is secured.”).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Urano et al. (US 2010/0004875) teaches, in Figure 15, a calibration detection system that has a set inspection condition step 313, a set inspection process step 315, and a set defect type/size determination processing conditions step 323. Bendall (US 2016/0171705) teaches determining “the desired measurement application (e.g. determining the deepest point, the highest point, or the clearance between two surfaces)”.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JONATHAN MALIKASIM, whose telephone number is (313) 446-6597. The examiner can normally be reached M-F, 8 am - 5 pm (CST). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Yuqing Xiao, can be reached at 571-270-3603. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JONATHAN MALIKASIM/
Primary Examiner, Art Unit 3645
3/6/26

Prosecution Timeline

Sep 06, 2023
Application Filed
Mar 06, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602913
APPARATUS FOR IDENTIFYING OBJECT PRIORITY FOR AUTONOMOUS DRIVING CONTROL OF VEHICLE
2y 5m to grant; granted Apr 14, 2026
Patent 12590807
INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD
2y 5m to grant; granted Mar 31, 2026
Patent 12488684
METHOD FOR PROVIDING AN ALREADY GENERATED TRAJECTORY OF A FIRST MOTOR VEHICLE FOR A SECOND MOTOR VEHICLE FOR FUTURE TRAVEL ALONG THE TRAJECTORY, COMPUTER PROGRAM PRODUCT AND ASSISTANCE SYSTEM
2y 5m to grant; granted Dec 02, 2025
Patent 12129748
ELECTRICAL SUBMERSIBLE PUMP Y-TOOL WITH PERMANENT COILED TUBING PLUG AND MILLABLE BALL VALVE
2y 5m to grant; granted Oct 29, 2024
Patent 12116857
WELL ACCESS APPARATUS AND METHOD
2y 5m to grant; granted Oct 15, 2024
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 80%
With Interview: 79% (-0.9%)
Median Time to Grant: 2y 6m
PTA Risk: Low
Based on 352 resolved cases by this examiner. Grant probability derived from career allow rate.
