Prosecution Insights
Last updated: April 19, 2026
Application No. 17/767,642

LASER WORKING SYSTEM FOR PERFORMING A WORKING PROCESS ON A WORKPIECE BY MEANS OF A LASER BEAM AND METHOD FOR MONITORING A WORKING PROCESS ON A WORKPIECE BY MEANS OF A LASER BEAM

Non-Final OA §103
Filed: Apr 08, 2022
Examiner: SCHNASE, PAUL DANIEL
Art Unit: 2877
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: Precitec GmbH & Co. KG
OA Round: 3 (Non-Final)
Grant Probability: 77% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 9m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 77%, above average (10 granted / 13 resolved; +8.9% vs TC avg)
Interview Lift: +37.5% (strong), measured across resolved cases with interview
Typical Timeline: 2y 9m average prosecution; 39 applications currently pending
Career History: 52 total applications across all art units

Statute-Specific Performance

§101: 5.8% (-34.2% vs TC avg)
§103: 41.1% (+1.1% vs TC avg)
§102: 25.9% (-14.1% vs TC avg)
§112: 27.3% (-12.7% vs TC avg)
Comparisons are against Tech Center average estimates • Based on career data from 13 resolved cases
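The headline rates above are simple arithmetic on the examiner's record. A minimal sketch in Python; the 72% without-interview baseline is an assumption back-solved from the stated 99% with-interview figure and +37.5% lift, not a number reported on this page:

```python
# Reproducing the dashboard's headline figures from the underlying
# counts. The 0.72 without-interview baseline is an ASSUMPTION,
# back-solved from the stated 99% with-interview probability and
# +37.5% lift; it is not reported directly on the page.
granted, resolved = 10, 13
career_allow_rate = granted / resolved            # 0.769... shown as 77%

with_interview = 0.99                             # stated probability
without_interview = 0.72                          # assumed baseline
interview_lift = with_interview / without_interview - 1   # +37.5%

print(f"Career allow rate: {career_allow_rate:.0%}")
print(f"Interview lift: {interview_lift:+.1%}")
```

The 77% career allow rate is simply 10 grants over 13 resolved cases, rounded to the nearest percent.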

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/09/2026 has been entered.

Response to Arguments

Rejections under 35 U.S.C. § 112(b)

Claim 20 is now viewed as broad rather than indefinite, as it is now clearer that the claim does not demand much of the sequence of spectral bands, so the rejection under 35 U.S.C. § 112(b) is withdrawn.

Rejections under 35 U.S.C. § 103

Applicant's first argument is that Staudt teaches imaging a single line of the workpiece rather than an areal extent as now claimed; however, this argument is moot. The present action relies on FRAMOS to teach scanning an areal extent of an imaging target.

Applicant's second argument is that using industrial vision to detect welding defects (as Sassi does) is irrelevant to evaluating the temperature of a weld (as Staudt does); however, this argument is not persuasive. Staudt describes temperature information as very valuable to understanding such factors as thermal gradients that determine solidification rate and the resulting microstructure, concluding that temperature information is useful for process control purposes (section 1, paragraph 4), clearly indicating that Staudt does not consider quality of the final result to be an irrelevant consideration. It should also be noted that, while each reference applied in a rejection under 35 U.S.C. § 103 must be analogous to the claimed invention, the references are not required to address the same problem as the claimed invention when they are in the same field of endeavor as the claimed invention (such as evaluation of laser materials processing systems through hyperspectral sensing or machine vision). See MPEP 2141.01(a) I, first paragraph.

Applicant's third argument is that Staudt teaches away from the use of image capturing strategies other than imaging a single line of a sample through a slit; however, this argument is not persuasive. While Staudt does teach the advantages of one form of hyperspectral imaging, other forms of hyperspectral imaging exist, such as the filter-based methods of FRAMOS, that are not subject to excessive spectral overlap in the way Staudt would have experienced by simply widening the slit. Additionally, a requirement of elaborate image reconstruction does not render a method non-functional, so Staudt's choice not to perform image reconstruction does not constitute teaching away from methods that employ some degree of image reconstruction, especially when those methods come with other advantages, such as imaging an areal extent of a workpiece.

Claim Objections

Claim 4 is objected to because of the following informalities: line 4 of claim 4 recites "an region" rather than "a region". Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-11, 13-15, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Staudt (Non-Patent Literature "Temperature determination in laser welding based upon a hyperspectral imaging technique") in view of FRAMOS (Non-Patent Literature "Imaging with Hyperspectral Sensors: The Optimum Design for your Application") and Sassi (Non-Patent Literature "A Smart Monitoring System for Automatic Welding Defect Detection").

Regarding claim 1, Staudt teaches a laser working system for carrying out a working process on a workpiece by a laser beam, said laser working system comprising: a laser working head (FIG. 3(a), labeled BEO D70, described in the first paragraph of section 3) for radiating a laser beam into a working region on said workpiece (FIG. 3(b) shows the process); a sensor unit for monitoring said working process with at least one hyperspectral sensor (FIG. 3, HSI-camera, described in section 2.1), said sensor unit being configured to capture a hyperspectral image with N times M pixels of a region of said workpiece, each pixel comprising L values (section 2.1 and FIG. 1 describe a system that captures a two-dimensional image (wavelength and y) in 256 × 256 pixels, so M and L may be regarded as 256, but scanning would be required for N to exceed the width of the slit as imaged onto the sensor), wherein the hyperspectral image has two spatial dimensions x and y and one spectral dimension λ (Staudt images in y and λ primarily, with the x dimension limited by the width of the slit (section 2.1 and FIG. 1)), and wherein N denotes a number of pixels in the first spatial dimension x (not explicitly enumerated, but determined by the width of the slit; see FIG. 1(b); note that this would have to be at least 1 for the slit to show up at all), M denotes a number of pixels in the second spatial dimension y (256 according to section 2.1, first paragraph), and L denotes the number of spectral bands in the spectral dimension λ of the hyperspectral image (256 according to section 2.1, first paragraph), where M, N and L are natural numbers greater than zero (pixel counts of features that appear in a pixelated image are necessarily natural numbers greater than 0), wherein L is equal to or greater than 16 (section 2.1 describes a system of assigning each column a wavelength; a resolution of 256 × 256 produces 256 columns, measuring 256 wavelengths, and 256 is equal to or greater than 16; also see FIG. 2, which shows measurements at a resolution far higher than 16 total points).

Staudt does not explicitly teach that the sensor unit is configured to scan multiple rows of the workpiece in a row direction to obtain the hyperspectral image of an areal extent of the region of the workpiece, with the x dimension not limited by the width of the slit. In the same field of endeavor of hyperspectral imaging, FRAMOS does teach a technique in which a sensor unit that images in one spatial dimension and one wavelength dimension (such as the one depicted in FIG. 1 of FRAMOS or the HSI-camera employed by Staudt) is configured to scan multiple rows of the workpiece in a row direction (page 2, section Pushbroom Scanning; the scanning is caused by relative motion between the camera and the object) to obtain the hyperspectral image of an areal extent of the region of the workpiece (section Pushbroom Scanning; the synchronized scanning and image capture allows the whole area to be captured for each wavelength), with the x dimension not limited by the width of the slit (page 2, section Pushbroom Scanning). By using a pushbroom scanning technique, FRAMOS is able to capture very detailed spectral information with a high maximum spatial resolution to enable more reliable identification and classification results over the entire area of a sample (page 2, section Pushbroom Scanning, paragraph 2). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the hyperspectral monitoring of laser welding systems of Staudt with the pushbroom scanning techniques of FRAMOS in order to gain the predictable benefit of imaging the whole sample rather than a narrow strip extending in the y direction, with a reasonable expectation of success.
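The pushbroom scanning just described (a slit sensor plus relative motion building up an areal image line by line) can be sketched in a few lines of Python. The shapes and the random stand-in data are illustrative assumptions, not values taken from Staudt or FRAMOS:

```python
import numpy as np

# Illustrative sketch of pushbroom scanning: a slit sensor captures
# one spatial line of M pixels in L spectral bands per exposure, and
# relative motion between camera and workpiece stacks N such line
# scans into an areal N x M x L hyperspectral cube. Shapes and the
# random stand-in data are assumptions for illustration only.
N, M, L = 64, 256, 256       # scan positions, line pixels, spectral bands

def capture_line_scan(position: int) -> np.ndarray:
    """Stand-in for one slit exposure: an (M, L) frame of intensities."""
    rng = np.random.default_rng(position)
    return rng.random((M, L))

cube = np.stack([capture_line_scan(x) for x in range(N)], axis=0)
assert cube.shape == (N, M, L)   # two spatial dimensions plus lambda
assert L >= 16                   # the claimed minimum band count
```

Each frame supplies the y and λ dimensions; the scan itself supplies x, which is why the x extent is no longer limited by the slit width.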
While Staudt does teach using the data for determining information about the working process, Staudt does not explicitly teach a computing unit configured to determine an input tensor based on the hyperspectral image and to determine an output tensor based on the input tensor using a transfer function, the output tensor containing information about said working process, wherein the transfer function between the input tensor and the output tensor is formed by a deep convolutional neural network, and wherein the information about said working process contains information about a working error and a type of working error. In the same field of endeavor of automated optical monitoring of laser welding systems through machine learning, Sassi teaches a computing unit (FIG. 3, computing PC) configured to determine an input tensor based on the image and to determine an output tensor based on the input tensor using a transfer function, the output tensor containing information about said working process (page 2, first paragraph, detecting weld defects), wherein the transfer function between the input tensor and the output tensor is formed by a deep convolutional neural network (page 4 describes their neural network learning as deep learning (using a deep neural network); page 7 describes their use of convolutional layers), and wherein the information about said working process contains information about a working error (page 6, final paragraph, defective workpieces are identified as scrap) and a type of working error (page 6, final paragraph, scrap workpieces are identified by type of defect). By using deep convolutional neural networks, Sassi is able to identify multiple types of abnormalities in the processing of images of laser welds (FIG. 2 shows several types of defects studied with the data set that Sassi used, but other data could be labeled with a different set of defects to train such a deep neural network to identify those defects instead) quickly (see Table III) and accurately (see Table II). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the hyperspectral monitoring of laser welding systems of Staudt, as modified by FRAMOS, with the machine learning-based defect identification and classification techniques of Sassi in order to gain the predictable benefit of identifying abnormalities in the welding process, such as those that may be associated with abnormalities in temperature distribution (see paragraph 4 of Staudt).

Regarding claim 2, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that said sensor unit is configured to sense radiation emitted by the captured region of said workpiece and/or reflected laser radiation and to output it as a hyperspectral image (second page, first paragraph).

Regarding claim 3, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that said sensor unit is configured to sense at least one of the following types of radiation: thermal radiation, radiation in the infrared range of light, radiation in the near-infrared range of light, radiation in the visible range of light, plasma radiation, reflected light of the laser beam, backscattered light of the laser beam, and light that is radiated by a lighting source and reflected (section 2.1, first paragraph indicates a spectral range of 350-1100 nm, which includes at least wavelengths in the infrared, near-infrared, and visible ranges; additionally, determining temperature in that manner requires sensing thermal radiation (see section 2.2)).

Regarding claim 4, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that the region of said workpiece captured in the hyperspectral image comprises at least one of the following regions on the workpiece: the working region, a region in the advance of said laser beam, a region trailing said laser beam, "an region" still to be machine worked and a machine worked region (FIG. 3 shows a high speed camera directed toward at least the working region).

Regarding claim 5, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that L is equal to or greater than 20 (section 2.1 describes a system of assigning each column a wavelength; a resolution of 256 × 256 produces 256 columns, measuring 256 wavelengths, and 256 is equal to or greater than 20; also see FIG. 2, which shows measurements at a resolution far higher than 20 total points).

Regarding claim 6, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that the spectral bands are of the same size (Staudt uses a diffraction grating to produce the wavelength dispersion on the camera sensor (FIG. 1(a)). A diffraction grating disperses light according to the formula sin(θm) = mλ/d, where θm is the angle of outgoing light, m is an integer (the order of diffraction), λ is the wavelength, and d is the spacing of the grating. The linear relationship between the wavelength and sin(θm), and between sin(θm) and where in the x direction the band will be projected onto a surface, indicates that pixels of substantially equal physical size will detect bands of substantially equal spectral size.).
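The grating relation underlying the claim 6 reasoning, sin(θm) = mλ/d, can be checked numerically. The grating pitch and wavelength grid below are illustrative assumptions, not parameters taken from Staudt:

```python
# Numerical check of the grating relation sin(theta_m) = m * lam / d.
# The grating pitch d and the wavelength grid are illustrative
# assumptions, not parameters from Staudt. In first order (m = 1),
# sin(theta) is linear in wavelength, so equally sized spectral bands
# map to equally spaced positions across the sensor columns.
d = 1e-6                                   # grating pitch: 1 um (assumed)
m = 1                                      # first diffraction order
wavelengths = [400e-9 + i * 50e-9 for i in range(5)]   # 400-600 nm

sines = [m * lam / d for lam in wavelengths]
steps = [b - a for a, b in zip(sines, sines[1:])]
assert all(abs(s - 0.05) < 1e-9 for s in steps)   # uniform spacing
```

Uniform steps in sin(θ) for uniform steps in λ are what let equally sized pixels intercept equally sized spectral bands.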
Regarding claim 7, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that said sensor unit is configured to capture hyperspectral images continuously and/or to capture one hyperspectral image per predetermined time interval (section 3, second paragraph indicates a predetermined time interval of ΔT = 33 µs).

Regarding claim 8, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that said sensor unit is configured to sense all L values for all of the N times M pixels of the hyperspectral image simultaneously, or wherein said sensor unit is configured to sense all pixel rows of the hyperspectral image simultaneously but in different spectral bands and the L values of the different spectral bands for each pixel row or for all n pixel rows sequentially (section 2.1, Principles of the HSI-system, describes the latter arrangement, with a slit aperture and dispersive setup resulting in different wavelengths in different values of x simultaneously).

Regarding claim 9, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches the use of a diffraction grating to cause different rows to see different wavelengths of light rather than mosaic filters or individual filters over particular rows or groups of rows, so does not explicitly teach that said hyperspectral sensor comprises a mosaic filter having N times M individual filters with L different transmission ranges, or wherein said hyperspectral sensor comprises a plurality of pixel rows and the pixels are arranged in a pixel row in a first direction (x) and said hyperspectral sensor comprises a row filter, in which an individual filter of a plurality of individual filters extends over at least one pixel row of said hyperspectral sensor and said plurality of individual filters with L different transmission ranges are arranged in a second direction (y) perpendicular to the first direction (x). In the same field of endeavor of hyperspectral imaging, FRAMOS does teach that said hyperspectral sensor comprises a mosaic filter having N times M individual filters with L different transmission ranges (described in section Snapshot Mosaic and depicted in FIG. 2, which uses N = 2048, M = 1088, and L = 16) or wherein said hyperspectral sensor comprises a plurality of pixel rows and the pixels are arranged in a pixel row in a first direction (x) and said hyperspectral sensor comprises a row filter, in which an individual filter of a plurality of individual filters extends over at least one pixel row of said hyperspectral sensor and said plurality of individual filters with L different transmission ranges are arranged in a second direction (y) perpendicular to the first direction (x) (described in section Pushbroom Scanning and shown in FIG. 1, which shows a number of different row groups, each imaging a different wavelength band due to the use of individual filters). By using filters, FRAMOS is able to provide an alternative to a slit and diffraction grating when collecting hyperspectral images in either snapshot or pushbroom scanning modes. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the hyperspectral monitoring of laser welding systems of Staudt, as modified by FRAMOS and Sassi, to use filters over specific pixels or rows of pixels as taught by FRAMOS as a substitute for the slit and diffraction grating, as an alternative way of imaging at all the spectral bands desired, with predictable results and a reasonable expectation of success.

Regarding claim 10, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that the at least one hyperspectral sensor has a spectral sensitivity range from 400 nm to 1800 nm, and/or from 400 nm to 950 nm, and/or from 400 nm to 1000 nm, and/or from 1000 nm to 1700 nm, and/or from 950 nm to 1800 nm, and/or from 1200 nm to 2000 nm (section 2.1, first paragraph; a hyperspectral sensor with a spectral range from 350 to 1100 nm will inherently be sensitive in spectral ranges of 400 nm to 950 nm and from 400 nm to 1000 nm).

Regarding claim 11, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that the at least one hyperspectral sensor comprises a CMOS camera, an infrared-enhanced CMOS sensor, a near-infrared (NIR) enhanced CMOS sensor, an InGaAs-based sensor, a graphene-based sensor, a sensor array and/or a diode array (the camera used, a Vision Research Phantom v1210 (section 2.1, first paragraph), is a CMOS camera (abstract and second paragraph of section 4), which can also be said to comprise a sensor array of at least 512 × 512 sensors (section 1, final paragraph; note that each pixel of a camera is a sensor, and the pixel array is a sensor array)).

Regarding claim 13, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). While Staudt does endorse the use of temperature information for process control purposes (paragraph 4), Staudt does not explicitly teach that said computing unit is configured to form the output tensor in real time and, based thereon, to output control data to a control unit of said laser working system. Sassi further teaches that said computing unit is configured to form the output tensor in real time (Table III lists processing times of a small fraction of a second when using a GPU, which are far faster than the 1.8 second design goal set forth in the final sentence of section III A; note that a more powerful GPU would allow for processing a greater quantity of data with similar speed) and, based thereon, to output control data to a control unit of said laser working system (page 3, final paragraph). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement the hyperspectral monitoring of laser welding systems and deep convolutional neural network of Staudt, as modified by FRAMOS and Sassi, to perform the processing in real time and use the result to control actions on the production line, per the further teachings of Sassi as suggested by Staudt (paragraph 4 of Staudt).

Regarding claim 14, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above).
Staudt further teaches that the output tensor further contains one of the following information: information about a state of the working process, the presence of at least one working error, a position of the working error on the workpiece, a probability of a working error of a certain type and spatial and/or areal extent of the working error (section 1, final paragraph; temperature of a weld is a type of information about a state of a working process).

Regarding claim 15, Staudt teaches a method for monitoring a working process on a workpiece by a laser beam, said method comprising the steps of: radiating a laser beam into a working region on a workpiece (FIG. 3(b) shows the process of applying the laser beam); acquiring a hyperspectral image (FIG. 3, using HSI-camera, described in section 2.1) with N by M pixels of a region of said workpiece, each pixel comprising L values (section 2.1 and FIG. 1 describe a system that captures a two-dimensional image (wavelength and y) in 256 × 256 pixels, so M and L may be regarded as 256, but scanning would be required for N to exceed the width of the slit as imaged onto the sensor), wherein L is equal to or greater than 16 (section 2.1 describes a system of assigning each column a wavelength; a resolution of 256 × 256 produces 256 columns, measuring 256 wavelengths, and 256 is equal to or greater than 16; also see FIG. 2, which shows measurements at a resolution far higher than 16 total points); wherein the hyperspectral image has two spatial dimensions x and y and a spectral dimension λ (Staudt images in y and λ primarily, with the x dimension limited by the width of the slit (section 2.1 and FIG. 1)) and wherein N indicates a number of pixels in the first spatial dimension x (not explicitly enumerated, but determined by the width of the slit; see FIG. 1(b); note that this would have to be at least 1 for the slit to show up at all), M indicates a number of pixels in the second spatial dimension y (256 according to section 2.1, first paragraph), and L indicates the number of spectral bands in the spectral dimension λ of the hyperspectral image (256 according to section 2.1, first paragraph), where M, N and L are natural numbers (pixel counts of features that appear in a pixelated image are necessarily natural numbers greater than 0).

Staudt does not explicitly teach scanning multiple rows of the workpiece in a row direction to obtain the hyperspectral image of an areal extent of the region of the workpiece. In the same field of endeavor of hyperspectral imaging, FRAMOS does teach a technique, using a camera that images in one spatial dimension and one wavelength dimension (such as the one depicted in FIG. 1 of FRAMOS or the HSI-camera employed by Staudt), of scanning multiple rows of the workpiece in a row direction (page 2, section Pushbroom Scanning; the scanning is caused by relative motion between the camera and the object) to obtain the hyperspectral image of an areal extent of the region of the workpiece (page 2, section Pushbroom Scanning; the synchronized scanning and image capture allows the whole area to be captured for each wavelength). By using a pushbroom scanning technique, FRAMOS is able to capture very detailed spectral information with a high maximum spatial resolution to enable more reliable identification and classification results over the entire area of a sample (page 2, section Pushbroom Scanning, paragraph 2). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the hyperspectral monitoring of laser welding method of Staudt with the pushbroom scanning techniques of FRAMOS in order to gain the predictable benefit of imaging the whole sample rather than a narrow strip extending in the y direction, with a reasonable expectation of success.

While Staudt does teach using the data for determining information about the working process, Staudt does not explicitly teach determining, based on the hyperspectral image, an input tensor; and determining, based on the input tensor and by means of a transfer function, an output tensor containing information about said working process, wherein the transfer function between the input tensor and the output tensor is formed by a deep convolutional neural network, and wherein the information about said working process contains information about a working error and a type of working error. In the same field of endeavor of automated optical monitoring of laser welding systems through machine learning, Sassi teaches determining, based on the image, an input tensor (page 3, preprocessing and feature extraction steps); and determining, based on the input tensor and by means of a transfer function, an output tensor containing information about said working process (FIG. 1, classification step, using the neural networks described), wherein the transfer function between the input tensor and the output tensor is formed by a deep convolutional neural network (page 4 describes their neural network learning as deep learning (using a deep neural network); page 7 describes their use of convolutional layers), and wherein the information about said working process contains information about a working error (page 6, final paragraph, defective workpieces are identified as scrap) and a type of working error (page 6, final paragraph, scrap workpieces are identified by type of defect). By using deep convolutional neural networks, Sassi is able to identify multiple types of abnormalities in the processing of images of laser welds (FIG. 2 shows several types of defects studied with the data set that Sassi used, but other data could be labeled with a different set of defects to train such a deep neural network to identify those defects instead) quickly (see Table III) and accurately (see Table II). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the hyperspectral monitoring of laser welding method of Staudt, as modified by FRAMOS, with the machine learning-based defect identification and classification techniques of Sassi in order to gain the predictable benefit of identifying abnormalities in the welding process, such as those that may be associated with abnormalities in temperature distribution (see paragraph 4 of Staudt).

Regarding claim 17, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that L is equal to or greater than 25 (section 2.1 describes a system of assigning each column a wavelength; a resolution of 256 × 256 produces 256 columns, measuring 256 wavelengths, and 256 is equal to or greater than 25; also see FIG. 2, which shows measurements at a resolution far higher than 25 total points).

Regarding claim 18, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that L is equal to or greater than 100 (section 2.1 describes a system of assigning each column a wavelength; a resolution of 256 × 256 produces 256 columns, measuring 256 wavelengths, and 256 is equal to or greater than 100).

Regarding claim 19, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that the spectral bands are adjacent to one another in a spectral dimension, wherein each spectral band is spaced apart from its subsequent spectral band by a distance equal to the size of the spectral band in the spectral dimension (FIG. 1(a); as pixels in a camera system are designed to avoid gaps between the regions captured by adjacent pixels, adjacent pixel columns would capture adjacent spectral bands).

Regarding claim 20, Staudt, as modified by FRAMOS and Sassi, teaches or renders obvious the laser working system according to claim 1 (as described above). Staudt further teaches that the spectral bands are arranged in a sequential order along a spectral dimension (FIG. 1(b); the lines that are spread out by the diffraction grating are arranged in a sequence in the x direction, which is the direction in which Staudt disperses the light using the diffraction grating), wherein each subsequent spectral band is located at an increased or a decreased wavelength of the sequential order in the spectral dimension (FIG. 1; the use of a diffraction grating naturally imposes an increasing or decreasing order in that sequence, depending on the direction in which one counts).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to PAUL D SCHNASE, whose telephone number is (703) 756-1691. The examiner can normally be reached Monday - Friday, 8:30 AM - 5:00 PM ET.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Tarifur Chowdhury, can be reached at (571) 272-2287. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PAUL SCHNASE/
Examiner, Art Unit 2877

/TARIFUR R CHOWDHURY/
Supervisory Patent Examiner, Art Unit 2877

Prosecution Timeline

Apr 08, 2022 — Application Filed
Apr 01, 2025 — Non-Final Rejection (§103)
Aug 08, 2025 — Response Filed
Oct 06, 2025 — Final Rejection (§103)
Jan 09, 2026 — Request for Continued Examination
Jan 26, 2026 — Response after Non-Final Action
Feb 03, 2026 — Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12601584 — MEASUREMENT METHOD OF SURFACE SHAPE AND SURFACE SHAPE MEASUREMENT DEVICE (granted Apr 14, 2026; 2y 5m to grant)
Patent 12559353 — DETERMINING POSITION OF A CONTAINER HANDLING EQUIPMENT (granted Feb 24, 2026; 2y 5m to grant)
Patent 12546715 — ELECTRONIC DEVICE AND METHOD FOR DETECTING FILTER STATUS (granted Feb 10, 2026; 2y 5m to grant)
Patent 12517039 — SYSTEM AND METHODS FOR GAS SPECTROSCOPIC SENSING WITH PHOTON COUNTING AND TUNABLE INTEGRATED PHOTONIC FILTERS (granted Jan 06, 2026; 2y 5m to grant)
Patent 12481039 — PHOTOELECTRIC SENSOR AND OPTICAL RANGEFINDER (granted Nov 25, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 77%
With Interview: 99% (+37.5%)
Median Time to Grant: 2y 9m
PTA Risk: High
Based on 13 resolved cases by this examiner. Grant probability derived from career allow rate.
