Prosecution Insights
Last updated: April 19, 2026
Application No. 18/274,797

DISTANCE MEASURING DEVICE AND DISTANCE MEASURING METHOD

Non-Final OA (§102, §103)
Filed: Jul 28, 2023
Examiner: MALKOWSKI, KENNETH J
Art Unit: 3667
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Nikon Vision Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 75% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 75% (480 granted / 642 resolved), +22.8% vs TC avg (above average)
Interview Lift: +19.1% for resolved cases with an interview vs. without (strong)
Avg Prosecution: 2y 7m typical timeline, with 22 applications currently pending
Total Applications: 664 across all art units (career history)
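
The headline numbers above are simple ratios over this examiner's resolved cases. As a minimal sketch of that arithmetic (the with/without-interview split below uses placeholder counts chosen only so the difference lands near the reported +19% lift; they are not data from this report):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Share of resolved applications that were granted."""
    return granted / resolved

career = allow_rate(480, 642)  # ≈ 0.748, displayed above as 75%

# Hypothetical with/without-interview split (placeholder counts, not report data).
with_interview = allow_rate(180, 200)
without_interview = allow_rate(300, 423)
interview_lift = with_interview - without_interview

print(f"Career allow rate: {career:.1%}")
print(f"Interview lift:    {interview_lift:+.1%}")
```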

Statute-Specific Performance

§101: 8.3% (-31.7% vs TC avg)
§103: 40.7% (+0.7% vs TC avg)
§102: 20.4% (-19.6% vs TC avg)
§112: 27.7% (-12.3% vs TC avg)
Tech Center averages are estimates. Based on career data from 642 resolved cases.
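
Each per-statute figure above is paired with a delta against the Tech Center average, but the averages themselves are not shown; they can be back-calculated as rate minus delta. A minimal sketch of that arithmetic (the derived averages are illustrative, inferred only from the figures above):

```python
# Back-calculate the implied Tech Center average for each statute from the
# figures shown above (examiner rate minus the delta vs. TC average).
# The derived averages are illustrative; the report shows only rate and delta.
examiner_rate = {"§101": 0.083, "§103": 0.407, "§102": 0.204, "§112": 0.277}
delta_vs_tc = {"§101": -0.317, "§103": 0.007, "§102": -0.196, "§112": -0.123}

for statute, rate in examiner_rate.items():
    tc_avg = rate - delta_vs_tc[statute]
    print(f"{statute}: examiner {rate:.1%} vs. implied TC avg {tc_avg:.1%}")
```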

Office Action

Rejection bases: §102, §103
DETAILED ACTION

Response to Amendment

The preliminary amendment filed 7/28/23 has been accepted and entered. Accordingly, claims 1-26 are canceled and new claims 27-46 are added. Accordingly, claims 27-46 are examined herein.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 27-31, 40-41 and 46 are rejected under 35 U.S.C. 102(a)(1) as anticipated by US 20160103209 to Masuda et al. (Masuda).

With respect to claims 27 and 46, Masuda discloses a distance measurement apparatus that projects light to measure a distance to a target object, the distance measurement apparatus comprising: a control unit that controls a projection state[1] of the light based on a detection result of the target object; (i.e., second control unit 47, ¶¶ 87-90; 61-72 – i.e., ¶ 88 “changes the inclination angle of the reflection surface and the direction of the optical axis L2 under the control of the second control unit 47”) a light projection unit that projects the light controlled by the control unit onto the target object; and (i.e., laser ranging unit 12, FIG. 14 and corresponding description) a processing unit that determines the distance to the target object based on a detection result of reflected light. (i.e., ¶ 61 “second control unit 47 is provided with a distance calculation unit 48. The distance calculation unit 48 measures the time (the reciprocation time of the laser beam LB) until the laser beam LB emitted from the laser light source 43 is reflected by the subject and received as the reflected beam RB by the light receiving element 46, and calculates the distance (hereinafter, referred to as distance information) from the digital camera 10 to the subject (laser radiation position) based on the measured value”) (¶¶ 54, 61-64, 71, 74-75, 87-89, FIG. 14-15 and corresponding description; i.e., the "laser irradiation position IP", the "direction of the optical axis L2", the "second control unit", the "laser ranging unit", the "distance calculation unit", and the "imaging device” correspond to the "detection result", the "projection state", the "control unit", the "projection unit", the "processing unit", and the "distance measurement device")

With respect to claim 28, Masuda discloses an image capture unit that captures an image of the target object, wherein the control unit detects the target object based on an image capture result obtained by the image capture unit. (¶¶ 61-64, 87-89, FIG. 15, and “imaging element” in various figures and corresponding descriptions)

With respect to claim 29, Masuda discloses wherein the light projection unit projects the light onto the target object via an optical system, and the image capture unit captures the image of the target object via the optical system or an image capturing optical system different from the optical system, to detect the target object. (¶¶ 61-64, 87-89, FIG. 15, i.e., laser ranging unit includes an optical system, such as a first objective lens, a dichroic mirror, and a reflecting mirror corresponding to the "optical system", and the first image D1 is acquired via an imaging lens 19 (corresponding to the "capturing optical system differing from the optical system”)

With respect to claim 30, Masuda discloses wherein the light projection unit includes a light source unit that emits the light, and the control unit controls any one of the light source unit and the optical system to control the projection state of the light. (¶¶ 61-64, 87-89, FIG. 15, laser ranging unit includes a laser light source (corresponding to the "light source unit"; i.e., ¶ 54 “laser light source 43”)

With respect to claim 31, Masuda discloses wherein the projection state of the light includes any one of a radiation direction of the light and an intensity of the light. (i.e., second control unit 47, ¶¶ 87-90; 61-72 – i.e., ¶ 88 “changes the inclination angle of the reflection surface and the direction of the optical axis L2 under the control of the second control unit 47”)

With respect to claim 40, Masuda discloses a display unit that displays, on a display screen, the image of the target object obtained by the image capture unit. (i.e., first image D1 is displayed on display unit 18, i.e., 18, FIG. 2-3, 9, 15; s12 and s19, Fig. 6; s33, s40, FIG. 17A; s44, s51 FIG. 17B; FIG. 7-8 and corresponding descriptions, i.e., target object can be, i.e., subject (laser radiation position), position IP, ¶¶ 61-66, 70-71)

With respect to claim 41, Masuda discloses wherein the display unit displays an object to be superimposed on the image of the target object, the object indicating a location on the target object onto which the light is projected, or the target object to which the distance is determined. (i.e., first image D1 is displayed on display unit 18, i.e., 18, FIG. 2-3, 9, 15; s12 and s19, Fig. 6; s33, s40, FIG. 17A; s44, s51 FIG. 17B; FIG. 7-8 and corresponding descriptions, i.e., target object can be, i.e., subject (laser radiation position), position IP, ¶¶ 61-66, 70-71; the laser irradiation position and the distance information are displayed within the first image D1 and a laser irradiation position identifying unit searches within the first image D1 for a region matching a second image D2. Because the second image D2 is image information in which a local region, the center of which is the irradiation position, has been captured during the radiation of laser light (i.e., ¶ 62); ¶ 81, FIG. 4-5, 7-8, 10 and corresponding descriptions)

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 32-33 are rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Japanese Patent Application Publication No. 2009-192415 to Arata et al. (Arata) (cited by Applicant).

With respect to claims 32-33, Masuda fails to explicitly disclose using image analysis to identify the target object. Arata, from the same field of endeavor, discloses a target ranging device comprising: a control means for detecting a target from an image captured by means of an imaging means, and controlling the angle of projection of a projection means on the basis of the position of the detected target in the image; and a distance calculation means for calculating the distance to a target, a pedestrian detection unit identifies a pedestrian serving as a target by means of comparison to a pre-generated learning model; and a display unit for instructing of processing results is provided and an analysis unit that analyzes the image capture result obtained by the image capture unit wherein the analysis unit analyzes the image of the target object in the image capture result to identify the target object. (¶¶ 15-17, 20, FIG. 1 and corresponding description, i.e., pedestrian detection unit and learning model disclose the analysis unit analyzing the image capture result to identify the target object) Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to implement the image analysis and machine learning model to identify a target object as taught by Arata, in the system of Masuda in order to provide the user with additional information and usability, i.e., clarifying the identity of unknown objects while the distance measurement apparatus is in use such that the user can confirm they are measuring the distance to the correct desired object.

Claim 34 is rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Arata and further in view of Japanese Patent Application Publication No. JP 2012-186538 to Riichi et al. (Rich) (cited by Applicant).

With respect to claim 34, Masuda in view of Arata fail to explicitly disclose edge detection. Rich, from the same field of endeavor, discloses analyzing an image of a target object including at least an edge detection method (¶ 31 carrying out subject recognition by extracting the outline of the subject from image data, and "extracting the outline of the subject"; 102, FIG. 3, extract contour; FIG. 5-6 and corresponding descriptions) Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to implement edge detection as taught by Rich into the system of Masuda in view of Arata in order to improve target object detection, i.e., by determining a size of the object (Rich, abstract) and by isolating the object from the background to provide a more precise distance measurement.

Claim 35 is rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Arata and further in view of Japanese Patent Application Publication No. JP 2005-156356 to Kazuhito (Kaz) (cited by Applicant).

With respect to claim 35, Masuda in view of Arata fail to explicitly disclose centering the target object. Kaz, from the same field of endeavor, discloses a subject distance measurement display device, wherein the imaging direction of an imaging unit is regulated so that the measurement point at which a distance has been measured enters a ranging frame displayed at the center position of the display screen of an image display unit (¶ 29 display control unit 83 generates an image obtained by combining the distance measurement frames by the image composition unit 60, and a composite image including the distance measurement frame at the center position of the captured image; FIG. 4 and corresponding description). Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to implement the functionality of Kaz, cited above, into the invention of Masuda, since both disclosures carry out the ranging of an arbitrary measurement point on an image, and therefore regulating the imaging unit so that the subject is imaged at the center of the imaging region of the first imaging element, and displaying the location on the subject that is being irradiated with laser light so that said location is positioned in the center of the screen, are matters that could easily have been achieved in the invention described in Masuda by a person skilled in the art on the basis of the aforementioned disclosure of Kaz. When doing so, configuring the laser irradiation position identifying unit so as to search, at the image center, within the first image D1 for a region matching the second image D2 is a matter that could have been addressed, as appropriate, by a person skilled in the art. The combination is further obvious in order to improve the ease of determining a specific portion of an image region in which a user wants to determine a distance (i.e., Kaz ¶¶ 7 the dimension measuring position variable element in the state where the camera or mobile phone device with these distance measuring devices is held by hand and the image to be photographed is displayed on the viewfinder, LCD monitor screen, etc. It is troublesome to specify the measurement position by operating and to specify the measurement point by operating the cursor movement key; 8 an easy-to-operate subject distance measurement display device that can easily specify a measurement point on a subject)

Claim 36 is rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Arata and further in view of US 20180165815 to Okada et al. (Okada).

With respect to claim 36, Masuda in view of Arata fail to explicitly disclose using an image difference between a plurality of images to aid in the identification. Okada, from the same field of endeavor, discloses identifying a search region in which a target is present on the basis of the differences in image data calculated from current image data and previous image data, wherein the "differences in image data" represent the image differences among images captured a plurality of times at different timings (¶ 67). Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to implement the differential detection method recited in Okada, in the system of Masuda in view of Arata in order to ensure the target object is continually tracked, i.e., that the same object is identified over time (Okada, ¶ 67) and with high accuracy (Okada, ¶¶ 6-8).

Claim 37 is rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Arata and further in view of Japanese Patent No. JP 11-88870 A to Kano (Kano) (cited by Applicant).

With respect to claim 37, Masuda in view of Arata fail to explicitly disclose determining resolution to enlarge or reduce the image of the target object. Kano, from the same field of endeavor, discloses an analysis unit detects resolution of the image, and the control unit controls the optical system based on a detection result of the resolution, to enlarge or reduce the image of the target object (¶¶ 34-35). Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to implement the sizing control disclosed in Kano above, in the system of Masuda in view of Arata in order to provide an appropriate level of zoom, according to user interest, to enable a user to identify various potential targets accurately within a field of view. In addition, the modification enables imaging with reduced blurring and higher reliability with respect to a moving object (Kano, ¶¶ 4-5, 34-35).

Claim 38 is rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Arata and further in view of US Patent No. 11893749 to Yang et al. (Yang).

With respect to claim 38, Masuda fails to explicitly disclose analyzing the image to detect a center of gravity of a target object. Yang, from the same field of endeavor, discloses analyzing the image to detect a center of gravity of a target object, to further determine and monitor a distance between a target and an imager (col. 8, ll. 48-56 the image photographed by the camera is a two-dimensional picture . . . Adjustment of a focal length during focus following is determined according to a distance between the living object and the camera, such that providing a method for monitoring a distance between a living object and a camera in real time is very valuable; col. 9, ll. 30-40 a distance between the living object and the camera is unfixed, sizes of the living object photographed at the same posture and at the same viewing angle are different, and according to the imaging principle, the present disclosure calculates the direction of the depth moving speed of the living object by calculating the change proportion of the size of the motion gravity center image within the initial time period; claim 7 a direction of the depth moving speed of the living object is calculated by calculating a change proportion of a size of the motion gravity center image within the initial time period; abstract; col. 10, ll. 30-60) Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to analyze the image to detect a center of gravity of a target object, as taught by Yang, such that the projected light of Masuda in view of Arata provides a focus at the center of gravity of the target object, in order to reduce blurring and maintain focus on a target object (Yang, col. 10, ll. 30-60) with a low cost, in a computationally efficient manner (Yang, abstract)

Claim 39 is rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Arata and further in view of US 20200402249 to Kim et al. (Kim).

With respect to claim 39, Masuda fails to explicitly disclose scanning the target object such that the distance is determined based on a scan position of the light on the target object and detected reflected light. Kim, from the same field of endeavor, discloses scanning the target object such that the distance is determined based on a scan position of the light on the target object and detected reflected light. (¶¶ 47-49, 61 process of measuring a distance from an object is performed for each frame . . . first frame acquired by the camera 10 and first scan data are acquired, the distance measurement unit 40 performs the above-described process to acquire a distance from the object, corresponding to the second frame, based on a vehicle moving distance. By repeatedly performing such a process until the next scan data is acquired, the distance measurement unit 40 acquires a distance from the object, corresponding to each of the frames between the scan periods; 9, 20, FIG. 5 and corresponding description; 18) Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to implement the scanning technique of Kim in the system of Masuda in view of Arata in order to provide high quality distance measurements via camera with low cost, effectively providing additional data akin to a stereo effect with only one camera (Kim, ¶ 21).

Claim 42 is rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Japanese Patent Application Publication No. JP 2005-156356 to Kazuhito (Kaz) (cited by Applicant).

With respect to claim 42, Masuda fails to explicitly disclose that the display unit displays the image of the target object obtained by the image capture unit such that a location on the target object onto which the light is projected is positioned at a center of a screen. Kaz, from the same field of endeavor, discloses a subject distance measurement display device, wherein the imaging direction of an imaging unit is regulated so that the measurement point at which a distance has been measured enters a ranging frame displayed at the center position of the display screen of an image display unit (¶ 29 display control unit 83 generates an image obtained by combining the distance measurement frames by the image composition unit 60, and a composite image including the distance measurement frame at the center position of the captured image; FIG. 4 and corresponding description). Accordingly, it would have been obvious to one of ordinary skill in the art at the time of effective filing date to implement the functionality of Kaz, cited above, into the invention of Masuda, since both disclosures carry out the ranging of an arbitrary measurement point on an image, and therefore regulating the imaging unit so that the subject is imaged at the center of the imaging region of the first imaging element, and displaying the location on the subject that is being irradiated with laser light so that said location is positioned in the center of the screen, are matters that could easily have been achieved in the invention described in Masuda by a person skilled in the art on the basis of the aforementioned disclosure of Kaz. When doing so, configuring the laser irradiation position identifying unit so as to search, at the image center, within the first image D1 for a region matching the second image D2 is a matter that could have been addressed, as appropriate, by a person skilled in the art. The combination is further obvious in order to improve the ease of determining a specific portion of an image region in which a user wants to determine a distance (i.e., Kaz ¶¶ 7 the dimension measuring position variable element in the state where the camera or mobile phone device with these distance measuring devices is held by hand and the image to be photographed is displayed on the viewfinder, LCD monitor screen, etc. It is troublesome to specify the measurement position by operating and to specify the measurement point by operating the cursor movement key; 8 an easy-to-operate subject distance measurement display device that can easily specify a measurement point on a subject)

Claims 43-44 are rejected under 35 U.S.C. 103 as being unpatentable over Masuda in view of Japanese Patent Application Publication No. JP 2001-124544 to Kiyoshi et al. (Kiyoshi) (cited by Applicant).

With respect to claims 43-44, Masuda fails to explicitly disclose that the display includes a touch sensor. Kiyoshi, from the same field of endeavor, discloses a camera-type distance measurement device, wherein: the back face of the camera body includes a touch panel, and a pointing pen is used to designate a measurement point; and a plurality of measurement points are designated, and a slope distance between two measurement points (corresponding to the "distance between a plurality of locations" set forth in claim 44) and the horizontal area (corresponding to the "area" set forth in claim 44) of a triangle comprising three measurement points are calculated. (¶¶ 14, 51, 72, FIG. 11 and 20 and corresponding descriptions, abstract) Accordingly, configuring the invention of Masuda so as to include a touch panel, use a pointing pen to designate the position at which the target exists in the first image D1, and calculate, when a plurality of measurement points have been designated, the slope distance between two measurement points or the horizontal area of a triangle comprising three measurement points is a matter that could easily have been achieved by a person skilled in the art on the basis of the aforementioned disclosure of Kiyoshi, and when the position at which the target exists in the first image D1 has been designated using a pointing pen, the designated position is irradiated with laser light in order to provide a more comprehensive ranging device to improve surveying wherein the user can select any pixel of a displayed image and determine the distance (Kiyoshi, abstract, ¶ 5)

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNETH J MALKOWSKI whose telephone number is (313)446-4854. The examiner can normally be reached 8:00 AM - 5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris Almatrahi can be reached at 313-446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KENNETH J MALKOWSKI/
Primary Examiner, Art Unit 3667

[1] No limiting definition provided, but includes at least direction or intensity of light (i.e., claim 31). Accordingly, under a BRI of the term, this includes anything that affects a property of the light based on a detection result.
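
The anticipation ground leans on Masuda's distance calculation unit, which times the round trip of the laser pulse and converts that time into a distance. For context, here is a minimal, generic time-of-flight sketch of that relationship (illustrative only; not code from Masuda or the application): distance = speed of light x round-trip time / 2.

```python
# Generic time-of-flight calculation of the kind Masuda's distance
# calculation unit 48 is described as performing: time the laser round trip
# and convert it to a distance. Illustrative sketch, not code from any
# cited reference.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to the target given the measured laser round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# Example: a ~667 ns round trip corresponds to roughly 100 m.
print(f"{distance_from_round_trip(667e-9):.1f} m")
```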

Prosecution Timeline

Jul 28, 2023
Application Filed
Mar 02, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12589745
VISUAL GUIDANCE METHOD FOR IMPROVING AUTONOMOUS NAVIGATION WITH ROW FOLLOWING CORRECTIONS IN STEREO CAMERA SYSTEMS
2y 5m to grant • Granted Mar 31, 2026
Patent 12583443
MOVING BODY CONTROL DEVICE, MOVING BODY CONTROL METHOD, AND MOVING BODY CONTROL PROGRAM
2y 5m to grant • Granted Mar 24, 2026
Patent 12571636
METHOD AND DEVICE WITH LANE DETECTION
2y 5m to grant • Granted Mar 10, 2026
Patent 12553733
COMPUTER-IMPLEMENTED METHOD FOR BEHAVIOR PLANNING OF AN AT LEAST PARTIALLY AUTOMATED EGO VEHICLE WITH A SPECIFIED NAVIGATION DESTINATION
2y 5m to grant • Granted Feb 17, 2026
Patent 12546621
TRAVELING TRACK GENERATION DEVICE AND TRAVELING TRACK GENERATION METHOD
2y 5m to grant • Granted Feb 10, 2026
Study what changed to get past this examiner, based on the 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 75%
With Interview: 94% (+19.1%)
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 642 resolved cases by this examiner. Grant probability derived from career allow rate.
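
The with-interview figure appears to be the career allow rate plus the historical interview lift (75% + 19.1% ≈ 94%). A minimal sketch of that assumed additive combination, capped at 100% (the report does not state its actual formula):

```python
# Assumed derivation of the with-interview projection: the career allow rate
# (baseline grant probability) plus the historical interview lift, capped at
# 100%. This additive model is an assumption that matches the displayed
# numbers; the report does not state its actual formula.
def with_interview_probability(base: float, lift: float) -> float:
    return min(base + lift, 1.0)

print(f"{with_interview_probability(0.75, 0.191):.0%}")  # 94%
```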
