Prosecution Insights
Last updated: April 19, 2026
Application No. 18/259,749

MEASUREMENT SYSTEM, INSPECTION SYSTEM, MEASUREMENT DEVICE, MEASUREMENT METHOD, INSPECTION METHOD, AND PROGRAM

Final Rejection — §102, §103
Filed
Jun 28, 2023
Examiner
VAUGHN, ALEXANDER JOSEPH
Art Unit
2675
Tech Center
2600 — Communications
Assignee
Omron Corporation
OA Round
2 (Final)
73%
Grant Probability
Favorable
3-4
OA Rounds
2y 10m
To Grant
99%
With Interview

Examiner Intelligence

Grants 73% — above average
73%
Career Allow Rate
11 granted / 15 resolved
+11.3% vs TC avg
Strong +29% interview lift
+28.6%
Interview Lift
resolved cases with interview
Typical timeline
2y 10m
Avg Prosecution
20 currently pending
Career history
35
Total Applications
across all art units

Statute-Specific Performance

§101
6.3%
-33.7% vs TC avg
§103
52.5%
+12.5% vs TC avg
§102
30.0%
-10.0% vs TC avg
§112
11.3%
-28.7% vs TC avg
Black line = Tech Center average estimate • Based on career data from 15 resolved cases
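The figures above are internally consistent and can be sanity-checked with a short script. Note one assumption: the 40% Tech Center baseline is inferred by subtracting each displayed delta from its statute rate (all four imply the same baseline); the dashboard only shows the deltas, not the baseline itself.

```python
# Sanity-check the examiner dashboard figures.
# Assumption: the per-statute Tech Center baseline of 40% is inferred
# from the displayed deltas, not stated by the dashboard.

granted, resolved = 11, 15
allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.0f}%")  # matches the displayed 73%

tc_avg = 40.0  # inferred Tech Center baseline (assumption)
statute_rates = {"§101": 6.3, "§103": 52.5, "§102": 30.0, "§112": 11.3}
for statute, rate in statute_rates.items():
    delta = rate - tc_avg
    print(f"{statute}: {rate}% ({delta:+.1f}% vs TC avg)")
```

Running this reproduces every delta shown in the panel (§101 -33.7, §103 +12.5, §102 -10.0, §112 -28.7), which is how the 40% baseline was inferred.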

Office Action

§102 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

This action is responsive to applicant's amendments and remarks received on 10/27/2025.

Response to Arguments

Applicant's remarks received on 10/27/2025 contain no arguments but state that the amendments are not taught. Claims 1-3, 5-6, 8-11 and 14-16 stand rejected. Applicant should submit an argument under the heading "Remarks" pointing out disagreements with the examiner's contentions. Applicant must also discuss the references applied against the claims, explaining how the claims avoid the references or distinguish from them.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-2, 5, 9-11, 15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Taniguchi et al. (JP 2013186100 A), hereinafter Taniguchi.
Regarding claim 1, Taniguchi teaches A measurement system for measuring a shape of at least a part of a measurement target, the measurement system comprising: (Abstract see "a three-dimensional shape inspection method and device" and "A three dimensional shape inspection device includes a first three-dimensional shape sensor for acquiring first shape data to be inspected, a second three-dimensional shape sensor for acquiring second shape data different from the first shape data to be inspected, and a complementary integration unit that complementarily integrates the first shape data and the second shape data.").

a first feature point data generator configured to generate first feature point data indicating a shape of a predetermined portion of the measurement target from first image data or from first shape data, (Para. 25 discloses using a camera to acquire images and derive points of the object in the image indicating the object's shape to generate item shape data. Para. 19 see "In addition, in order to improve the measurement efficiency, an area identification unit that determines the unskilled hand of each method based on the CAD data 142 and identifies an area for acquiring shape data with the distance measurement sensor 131 and the two-dimensional camera 123.").

the first image data being obtained from imaging of the measurement target and including the shape of the at least the part of the measurement target, the first shape data being generated based on the first image data; (Para. 17 see "In the image pickup unit 120, the illumination unit 121 illuminates the sample 1 in an arbitrary direction, and the reflected light, scattered light, diffracted light, and diffused light are imaged by the two dimensional camera 123 using the lens 122, and the three-dimensional shape is acquired as two-dimensional image data." Para. 26-28 discloses using vectors to calculate point groups representing shape information from images.).
a second feature point data generator configured to generate second feature point data indicating the shape of the predetermined portion of the measurement target from second image data or from second shape data, (Para. 23 see "A measurement area is determined according to the performance of the distance measurement unit 130 to be used (S100), and a point group representing coordinates in 3D space by the distance measurement unit 130." Para. 19 see "In addition, in order to improve the measurement efficiency, an area identification unit that determines the unskilled hand of each method based on the CAD data 142 and identifies an area for acquiring shape data with the distance measurement sensor 131 and the two-dimensional camera 123.").

the second image data including the shape of the at least the part of the measurement target and being obtained differently from the first image data, the second shape data being generated based on the second image data; (Para. 18 see "The non-contact distance measurement sensor 131 measures the shape of the object surface and scans the y-stage 107 and the θ-stage 108 to output the three dimensional coordinates of many points as a point group. Many methods have been proposed for noncontact optical distance measurement sensors, and any method is applicable to the present invention. For example, a light cutting method based on triangulation." (Light cutting projects a pattern of light onto an object and uses a camera to capture the distorted pattern. Thus second image data is disclosed.)).

a calculator configured to calculate a positional correspondence of the predetermined portion of the measurement target between the first image data and the second image data or between the first shape data and the second shape data based on the first feature point data and the second feature point data; (Abstract see "a complementary integration unit that complementarily integrates the first shape data and the second shape data." Para. 29-36 discloses combining sensor data based on corresponding points using vector calculations.).

and a merged shape data generator configured to merge at least parts of the first image data and the second image data or at least parts of the first shape data and the second shape data based on the positional correspondence of the predetermined portion of the measurement target calculated by the calculator, and to generate merged shape data of the predetermined portion of the measurement target, (Abstract see "a complementary integration unit that complementarily integrates the first shape data and the second shape data." Para. 29-36 discloses combining sensor data based on corresponding points using vector calculations to create integrated shape data. Para. 19 see "In addition, in order to improve the measurement efficiency, an area identification unit that determines the unskilled hand of each method based on the CAD data 142 and identifies an area for acquiring shape data with the distance measurement sensor 131 and the two-dimensional camera 123.").

wherein the calculator includes an XY merging criterion generator and a Z merging criterion generator, wherein the XY merging criterion generator is configured to determine a correspondence of horizontal positions of the measured object and generate a horizontal merging criterion for the first shape data and the second shape data, and wherein the Z merging criterion generator is configured to determine a correspondence of vertical positions of the measured object and generate a vertical merging criterion for the first shape data and the second shape data. (Abstract see "a complementary integration unit that complementarily integrates the first shape data and the second shape data." Para. 34-36 discloses combining sensor data based on corresponding points using vector calculations. Para. 35 discloses coordinates of x, y, and z.).

Regarding claim 2, Taniguchi teaches The measurement system according to claim 1.
further comprising: a first imager configured to image the measurement target; (Para. 17 see "In the image pickup unit 120, the illumination unit 121 illuminates the sample 1 in an arbitrary direction, and the reflected light, scattered light, diffracted light, and diffused light are imaged by the two dimensional camera 123 using the lens 122, and the three-dimensional shape is acquired as two-dimensional image data.").

and a second imager configured to image the measurement target. (Para. 18 see "The non-contact distance measurement sensor 131 measures the shape of the object surface and scans the y-stage 107 and the θ-stage 108 to output the three dimensional coordinates of many points as a point group.").

Regarding claim 5, Taniguchi teaches The measurement system according to claim 1, wherein the first feature point data includes at least one of coordinate data indicating the shape of the predetermined portion of the measurement target, binary image data indicating the shape of the predetermined portion of the measurement target, multivalued image data indicating the shape of the predetermined portion of the measurement target, or height inflection point data indicating a three-dimensional shape of the predetermined portion of the measurement target, (Para. 25-26 discloses measuring a predetermined region with a camera and acquiring points in 3D space (coordinates) indicating shape data.).

and the second feature point data includes at least one of coordinate data indicating the shape of the predetermined portion of the measurement target, binary image data indicating the shape of the predetermined portion of the measurement target, multivalued image data indicating the shape of the predetermined portion of the measurement target, or height inflection point data indicating the three-dimensional shape of the predetermined portion of the measurement target. (Para. 23 discloses measuring a predetermined area with a distance measurement method and acquiring points in 3D space (coordinates) indicating shape data.).

Regarding claim 9, Taniguchi teaches The measurement system according to claim 1; an inspection system for inspecting a measurement target, the inspection system comprising: (Abstract see "a three-dimensional shape inspection method and device").

and a merged data inspector configured to determine whether the measurement target or a component included in the measurement target is acceptable based on the merged shape data generated by the merged shape data generator. (Para. 19 discloses combining data from the camera shape data and the distance measurement shape data and based on the merged shape data, determining if the quality of the measurement target is OK.).

Claim 10 is rejected under the same analysis as claim 1 above. Claim 11 is rejected under the same analysis as claim 1 above. Claim 15 is rejected under the same analysis as claim 9 above.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3, 6, 16 are rejected under 35 U.S.C. 103 as being unpatentable over Taniguchi et al. (JP 2013186100 A), hereinafter Taniguchi, in view of Murakami (JP 2012237729 A), hereinafter Murakami.

Regarding claim 3, Taniguchi teaches The measurement system according to claim 2.
wherein the first imager is a visible light camera, (Para. 17 see "For the two-dimensional camera 123, a CCD (Charge Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, or the like can be used.").

Taniguchi does not teach and the second imager is an X-ray camera. However, Murakami teaches and the second imager is an X-ray camera. (Para. 28 see "The X-ray inspection apparatus 100 includes an X-ray source 10 that outputs X-rays 18, an X-ray detector 23, an image acquisition control mechanism 30." Para. 39-40 discloses an x-ray imager that constructs three-dimensional data.).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Taniguchi to incorporate the teachings of Murakami to use an x-ray imager as the second imager to acquire second image and shape data. Doing so would predictably allow the imager to see through components and detect shape data that is occluded.

Regarding claim 6, Taniguchi teaches The measurement system according to claim 1. Taniguchi does not teach wherein the measurement target is a board on which a component is mounted, and the shape of the predetermined portion of the measurement target includes at least one of a shape of a wiring pattern on the board, a shape of a land on the board, a shape of an electrode in the component mounted on the board, or a shape of solder on the board.

However, Murakami teaches wherein the measurement target is a board on which a component is mounted, (Para. 29 see "The inspection object 1 is disposed between the X-ray source 10 and the X-ray detector 23. In the present embodiment, it is assumed that the inspection object 1 is a circuit board on which components are mounted.").
and the shape of the predetermined portion of the measurement target includes at least one of a shape of a wiring pattern on the board, a shape of a land on the board, a shape of an electrode in the component mounted on the board, or a shape of solder on the board. (Para. 65 see "the position of the inspection object (here, the solder ball) of the component mounted on the front side is further superimposed on the superimposed image." Para. 74 see "Here, for example, a fiducial mark can be used as a reference for such an overlay position. The fiducial mark is formed of a copper wiring pattern at the diagonal end of the substrate." Para. 90 see "The position of the electrode pad (land) is displayed in a rectangular shape.").

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Taniguchi to incorporate the teachings of Murakami to inspect solder, wiring, or electrodes on a board. Doing so would predictably allow a system to robustly identify distinct shapes or material properties that indicate the quality of the board.

Regarding claim 16, Taniguchi teaches The measurement method according to claim 11. Taniguchi does not teach A non-transitory computer readable medium storing a program for causing a computer to perform operations included in the method according to claim 11. However, Murakami teaches A non-transitory computer readable medium storing a program for causing a computer to perform operations included in the method according to claim 11. (Para. 37, 45, and 58 disclose processors and computer memory to execute a program to perform measurements and inspections.).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Taniguchi to incorporate the teachings of Murakami to include computer memory to store a program which performs the operations according to claim 11.
Doing so would predictably allow the system and method to easily be updated if the process must be changed by providing the process as a computer program as opposed to executing the process through specific hardware units.

Claims 8, 14 are rejected under 35 U.S.C. 103 as being unpatentable over Taniguchi et al. (JP 2013186100 A), hereinafter Taniguchi, in view of Murakami (JP 2012237729 A), hereinafter Murakami, and Roder et al. (US 20020015520 A1), hereinafter Roder.

Regarding claim 8, Taniguchi teaches The measurement system according to claim 1, wherein the first image data is imaged with a visible light camera, (Para. 17 see "For the two-dimensional camera 123, a CCD (Charge Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, or the like can be used.").

and the merged shape data generator generates the merged shape data of the predetermined portion of the measurement target by using, with priority, the second image data or the second shape data (Para. 36 discloses weighting corresponding point groups when combining sensor data. (Giving priority)).

Taniguchi does not teach the second image data is X-ray image data, for an area in a blind spot of the visible light camera in the first image data. However, Murakami teaches the second image data is X-ray image data, (Para. 28 see "The X-ray inspection apparatus 100 includes an X-ray source 10 that outputs X-rays 18, an X-ray detector 23, an image acquisition control mechanism 30." Para. 39-40 discloses an x-ray imager that constructs three-dimensional data.).

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Taniguchi to incorporate the teachings of Murakami to use an x-ray imager as the second imager to acquire second image and shape data. Doing so would predictably allow the imager to see through components and detect shape data that is occluded.
Furthermore, Roder teaches for an area in a blind spot of the visible light camera in the first image data. (Para. 85 see "Note that most of the solder joints to be inspected are hidden so that they cannot be inspected either visually or by using conventional X-ray inspection. By employing the laminography process described herein however, a cross-sectional view at or near the surface of the circuit board 610 can be taken which allows the solder connections of a BGA device to be analyzed.").

It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Taniguchi and Murakami to incorporate the teachings of Roder to prioritize x-ray shape data for areas that are blind spots to the visible light camera. Doing so would predictably allow the imager to see through components and detect shape data that is occluded.

Claim 14 is rejected under the same analysis as claim 8 above.

Allowable Subject Matter

Claim(s) 17-19 is/are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action.
In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Fujimoto et al. (WO 2016135856 A1) discloses a three-dimensional shape measurement system and measurement method.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEXANDER J VAUGHN whose telephone number is (571) 272-5253. The examiner can normally be reached M-F 8:30-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, ANDREW MOYER, can be reached on (571) 272-9523. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALEXANDER JOSEPH VAUGHN/
Examiner, Art Unit 2675

/EDWARD PARK/
Primary Examiner, Art Unit 2675

Prosecution Timeline

Jun 28, 2023
Application Filed
Aug 11, 2025
Non-Final Rejection — §102, §103
Oct 27, 2025
Response Filed
Jan 08, 2026
Final Rejection — §102, §103
Apr 09, 2026
Response after Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591955
SYSTEMS AND METHODS FOR GENERATING DYNAMIC DARK CURRENT IMAGES
2y 5m to grant Granted Mar 31, 2026
Patent 12579756
GRAPHICAL ASSISTANCE WITH TASKS USING AN AR WEARABLE DEVICE
2y 5m to grant Granted Mar 17, 2026
Patent 12573010
IMAGE PROCESSING APPARATUS, RADIATION IMAGING SYSTEM, IMAGE PROCESSING METHOD, AND NON-TRANSITORY COMPUTER-READABLE STORAGE MEDIUM
2y 5m to grant Granted Mar 10, 2026
Patent 12567265
VEHICLE, CONTROL METHOD THEREOF AND CAMERA MONITORING APPARATUS
2y 5m to grant Granted Mar 03, 2026
Patent 12521061
Method of Determining the Effectiveness of a Treatment on a Face
2y 5m to grant Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
73%
Grant Probability
99%
With Interview (+28.6%)
2y 10m
Median Time to Grant
Moderate
PTA Risk
Based on 15 resolved cases by this examiner. Grant probability derived from career allow rate.
