Prosecution Insights
Last updated: April 19, 2026
Application No. 18/235,497

MEASURING DEVICE WITH TOF SENSOR

Non-Final OA: §103, §112

Filed: Aug 18, 2023
Examiner: ITSKOVICH, MIKHAIL
Art Unit: 2483
Tech Center: 2400 — Computer Networks
Assignee: Hexagon Technology Center GmbH
OA Round: 3 (Non-Final)
Grant Probability: 35% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 4y 0m
Grant Probability With Interview: 59%

Examiner Intelligence

Grants only 35% of cases.
Career Allow Rate: 35% (206 granted / 585 resolved; -22.8% vs TC avg)
Interview Lift: +23.8% in resolved cases with interview (strong +24% lift)
Avg Prosecution (typical timeline): 4y 0m; 62 currently pending
Career History: 647 total applications across all art units

Statute-Specific Performance

§101: 11.5% (-28.5% vs TC avg)
§103: 53.5% (+13.5% vs TC avg)
§102: 12.3% (-27.7% vs TC avg)
§112: 20.4% (-19.6% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 585 resolved cases
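The headline examiner metrics above follow from the raw counts on the card. A minimal illustrative sketch (not the dashboard's actual code) of that arithmetic, assuming a Tech Center average of 58.0% as implied by the "-22.8% vs TC avg" figure:

```python
# Derive the examiner-statistics card values from the underlying counts.
granted, resolved = 206, 585       # "206 granted / 585 resolved" from the card
allow_rate = granted / resolved    # career allow rate

tc_avg = 0.580                     # assumed Tech Center average (implied by the card)
delta_vs_tc = allow_rate - tc_avg  # examiner's deviation from the TC average

print(f"Career allow rate: {allow_rate:.1%}")   # -> 35.2%
print(f"Delta vs TC avg: {delta_vs_tc:+.1%}")   # -> -22.8%
```

The displayed "35%" is this ratio rounded down to a whole percent for the card.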

Office Action

§103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 03/03/2026 has been entered.

Response to Arguments

Applicant's arguments filed on 03/03/2026 have been fully considered but they are not persuasive.

Applicant notes: “Applicant has herein amended independent claim 1 to recite subject matter that is not taught or suggested by Bridges, Wong, and/or Camacho in any combination.” Examiner notes that some of the amendments appear to be inconsistent with the Specification. See section 112 below.

Applicant argues: “Paragraph [00120] and [00128] provides support for the amendment (cited below with emphasis added). "The emitting unit and the receiving unit may be understood as part of an electronic distance meter (EDM) or also known as range finder using time of flight … and "The measuring device 1 comprises a Time-Of-Flight (TOF) sensor 15 …” Examiner notes that Paragraphs 120 and 128 appear to indicate that the emitting unit, the receiving unit, and a Time-Of-Flight (TOF) sensor can be components of an electronic distance meter. These paragraphs do not appear to support that these are separate distance meters usable together as argued.
“The Applicant submits that the features of former claim 5 have been incorporated into independent claim 1, and it is now specified that: "the controlling and processing unit is configured to provide the pixel-related TOF data associated with the pixel-related image data so that each pixel of the pixel-related image data is assigned to at least one pixel of the pixel-related TOF data." (emphasis added) … The Office Action implicitly acknowledged that Bridges fails to teach these features by referring to paragraph [0183] of Wong. …”

Examiner disagrees. (a) Claim 1 does not incorporate the entirety of Claim 5. See the updated rejection of amended claim 1 below. (b) The Office has not “acknowledged that Bridges fails to teach these features.” As cited in the updated reasons for rejection below, both references remain relevant.

Regarding citations to Wong, Applicant argues: “The identified text portion appears to relate to the standard operation of a single 3D camera providing (monochromatic) brightness images and depth data. This corresponds to the TOF-image as defined in paragraph [0042] of the specification as well as in claim 1 of the present application (cited below):”

Examiner disagrees. First, Wong does not describe brightness images as monochromatic. Second, Specification Paragraph 42 indicates that “TOF data comprises at least range data and/or amplitude data,” which does not correspond to the brightness images or to the depth data of Wong. Finally, this argument is not directed to the language of the claim limitations.

“Applicant submits that the claim language does not specify whether the imaging sensor and TOF sensor being two distinct entities or not. Applicant submits that if the imaging sensor …” Examiner notes that what “the claim language does not specify” does not limit the claim.
Where prior art recites claimed features combined with additional features, omission of the additional features in the claim does not distinguish it over the prior art reference. M.P.E.P. 2144.04(II)(A); Ex parte Wu, 10 USPQ2d 2031 (Bd. Pat. App. & Inter. 1989); see also In re Larson, 340 F.2d 965, 144 USPQ 347 (CCPA 1965); and In re Kuhle, 526 F.2d 553, 188 USPQ 7 (CCPA 1975).

Applicant argues: “However, Bridges fails to teach any pixel association between. Only a general coordinate transformation between the TOF camera and the instrument is mentioned.”

Examiner disagrees. Bridges explicitly teaches: “determining the distance to an object surface over a plurality of pixels on the 3D-camera image sensor. … The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. …” See Bridges, Paragraphs 27, 37, 41 and the reasons for rejection below.

“Applicant submits that if the imaging sensor and the TOF sensor is substantially the same, by way of example they utilize common sensor elements, it follows that TOF sensor must be placed in the directing unit such that an optical axis of the TOF sensor is coaxially aligned with an optical axis of the first distance measuring unit. By way of example realized as the arrangement shown in Figure 3 of the present application. In accordance with Applicant's understanding, Wong fails to teach …”

Examiner notes that a theoretical “if” supported by examples in the present Application is not a claim limitation. Claim 1 is not limited to this example. See the updated reasons for rejection of the Claim 1 language below.

Applicant argues: “In accordance with Applicant's understanding, Wong fails to teach first distance measuring unit in the sense of the invention (laser rangefinder as mentioned e.g. in paragraph [00120]). Thus, cannot teach such arrangement. … Additionally, Bridges as shown above teaches many different arrangement, however fails to teach an arrangement where the TOF camera is arranged coaxially with the laser distance meter.”

In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Bridges is cited to teach the laser rangefinder with a 3D camera and a coaxial arrangement that corresponds to the example in the Specification, and Wong teaches other claimed features such as continuous updating of the data. See the reasons for rejection below.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-2, 4-21 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement.
The claim(s) contains subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

Claims 1-2, 4-21 recite “the measuring device comprises a second distance measuring unit with a Time-Of-Flight (TOF) sensor.” However, Applicant has not cited, and Examiner did not find, support in the Specification for the measuring device comprising a second distance measuring unit as claimed. The Specification indicates that “additional distance information” can be obtained from the same TOF sensor. See Specification, Paragraph 144.

Similarly, Claim 21 recites “the TOF sensor is coaxially aligned with an optical axis of the first distance measuring unit”; however, Applicant has not cited, and Examiner did not find, support in the Specification for this feature. The Specification describes “an optical axis of the TOF sensor can be coaxially aligned with an optical axis of the capturing unit” in Paragraph 48 and, more broadly, “and an optical axis of the capturing unit is coaxially aligned with an optical axis of the distance measuring unit” in Paragraph 41. The TOF sensor is described as an example or a component of a distance measuring unit but never as a substitute for a capturing unit as claimed.

Claim Construction

Note that, for purposes of compact prosecution, multiple reasons for rejection may be provided for a claim or a part of the claim. The rejection reasons are cumulative, and Applicant should review all the stated reasons as guides to improving the claim language and advancing the prosecution toward an allowance.

Claim scope is not limited by claim language that suggests or makes optional but does not require steps to be performed by a method claim, or by claim language that does not limit an apparatus claim to a particular structure.
However, examples of claim language, although not exhaustive, that may raise a question as to the limiting effect of the language in a claim are: (A) “adapted to” or “adapted for” clauses; (B) “wherein” clauses; and (C) “whereby” clauses. M.P.E.P. 2111.04. Other examples are where the claim passively indicates that a function is performed or a structure is used without requiring that the function or structure is a limitation on the claim itself. The clause may be given some weight to the extent it provides “meaning and purpose” to the claimed invention but not when “it simply expresses the intended result” of the invention. Hoffer v. Microsoft Corp., 405 F.3d 1326, 1329, 74 USPQ2d 1481, 1483 (Fed. Cir. 2005). Further, during prosecution, claim language that may or may not be limiting should be considered non-limiting under the standard of the broadest reasonable interpretation. See M.P.E.P. 904.01(a); In re Morris, 127 F.3d 1048, 44 USPQ2d 1023 (Fed. Cir. 1997).

Use of the word “means” (or “step for”) in a claim with functional language creates a rebuttable presumption that the claim element is to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is invoked is rebutted when the function is recited with sufficient structure, material, or acts within the claim itself to entirely perform the recited function.

Absence of the word “means” (or “step for”) in a claim creates a rebuttable presumption that the claim element is not to be treated in accordance with 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that 35 U.S.C. 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph) is not invoked is rebutted when the claim element recites function but fails to recite sufficiently definite structure, material, or acts to perform that function. M.P.E.P. 2181(I); Williamson v. Citrix Online, LLC, 792 F.3d 1339, 1348, 115 USPQ2d 1105, 1111 (Fed. Cir. 2015) (en banc) (quoting Watts v. XL Systems, Inc., 232 F.3d 877, 880 (Fed. Cir. 2000)); Personalized Media Communications, LLC v. International Trade Commission, 161 F.3d 696, 704 (Fed. Cir. 1998).

A substitute term acts as a generic placeholder for the term “means” and would not be recognized by one of ordinary skill in the art as being sufficiently definite structure for performing the claimed function. “The standard is whether the words of the claim are understood by persons of ordinary skill in the art to have a sufficiently definite meaning as the name for structure.” Williamson at 1349; see also Greenberg v. Ethicon Endo-Surgery, Inc., 91 F.3d 1580, 1583 (Fed. Cir. 1996). The Specification must disclose adequate structure for each of the claimed functions, and the structure for special-purpose functions must be more than simply a general-purpose computer or microprocessor; the specification must also disclose an algorithm for performing these claimed functions. Williamson at 1351.

Claims 1-2, 4-19 recite “a first distance measuring unit comprising an emitting unit configured for emitting collimated measuring radiation (T) and a receiving unit … a second distance measuring unit with a Time-Of-Flight (TOF) sensor, … a capturing unit, wherein the capturing unit comprises an image sensor,” reciting a generic term (“unit”) which is modified by further structures. These limitations do not invoke 35 U.S.C. 112(f) or 35 U.S.C. 112 (pre-AIA), sixth paragraph.
The claims further recite: “a base unit, … a support unit mounted on the base unit … a distance measuring unit comprising an emitting unit configured for emitting collimated measuring radiation (T) and a receiving unit configured for detecting at least a part of the collimated measuring radiation … a directing unit mounted in the support unit, … a controlling and processing unit configured at least for aligning (the controlling and processing unit comprises a target identification functionality … tracking functionality) … an illumination unit configured for illuminating …” reciting a generic term (“unit”) modified by functional language but not modified by structure or a structural term, and not naming a structure readily recognized by persons of skill in the art to perform the claimed function. These limitations invoke 35 U.S.C. 112(f) or 35 U.S.C. 112 (pre-AIA), sixth paragraph, and shall be construed to cover the corresponding structure described in the specification and equivalents thereof. See the construction of these terms in the context of the claim limitations, the Specification, and the prior art in the reasons for rejection of each claim element below.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 4-15, 21 are rejected under 35 U.S.C. 103 as being unpatentable over US 20150015895 to Bridges (“Bridges”) in view of US 20130226344 to Wong (“Wong”).

Regarding Claim 1:

“A measuring device for acquiring a three-dimensional measuring point related to a target in a scene, the measuring device comprises a base unit, (See an example base unit in Specification Fig. 1. Prior art provides a similar structure: “azimuth base 16” Bridges, Paragraph 23 and Fig. 1.)

a support unit mounted on the base unit and rotatable relative to the base unit around an azimuth axis (A), (For example, the support unit can be embodied as the “zenith carriage 14 mounted on azimuth base 16” Bridges, Paragraph 23 and Fig. 1.)

a first distance measuring unit comprising an emitting unit configured for emitting collimated measuring radiation (T) and a receiving unit configured for detecting at least a part of the collimated measuring radiation reflected (R) by the target, (Under the broadest reasonable interpretation consistent with the specification and ordinary skill in the art, (a) “the emitting unit is a light source,” embodying a laser or a laser diode, and (b) “the receiving unit may be understood as part of an electronic distance meter (EDM) or also known as range finder using time of flight, multiple frequency phase-shift or interferometry technology,” or other distance measuring technology such as a camera or a pair of cameras. See Specification, Paragraphs 120, 36, 39. Prior art teaches this embodiment: “The laser tracker uses a beam of light, such as laser beam 46 [collimated radiation] … Each light source 54 is placed near camera [receiving unit] 52 so that light from light source 54 is reflected off each retroreflector target 26 onto camera 52,” which in turn captures the signal and measures the distance. Bridges, Paragraphs 23, 25, and Fig. 1.)
a directing unit mounted in the support unit, rotatable relative to the support unit around an elevation axis (E) and configured for directing the measuring radiation towards the scene, (For example: “Laser beam 46 is pointed in the desired direction by motors [of the directing unit mounted] within the tracker laser (not shown) that rotate payload 15 about zenith axis 18 and azimuth axis 20.” Bridges, Paragraph 23, and Fig. 1.)

a capturing unit, wherein the capturing unit comprises an image sensor and is configured to capture at least a scene image of at least part of the scene, and (“Each camera [receiving unit] 52 comprises an optical detector [capturing unit], such as a photosensitive array … retroreflector images are readily distinguished from the background” Bridges, Paragraph 23, and Fig. 1.)

a controlling and processing unit configured at least for aligning the directing unit, (“a tracker control system to adjust the rotation angles of the mechanical azimuth and zenith axes of the laser tracker … a control system that sends a second signal to the first motor and a third signal to the second motor” Bridges, Paragraphs 4, 15, Claim 2.)

wherein the first distance measuring unit and the capturing unit are arranged in the directing unit and an optical axis of the capturing unit is coaxially aligned with an optical axis of the first distance measuring unit, (“Each light source 54 is placed near camera 52 so that light from light source 54 is reflected off each retroreflector target 26 onto camera 52,” thus the axes of all the units, the laser, the camera, and the imager, are coaxially aligned. Cumulatively, since the camera [distance measuring unit] and the imager [capturing device] make up the same physical device, their axes are coaxially the same. See Bridges, Paragraph 25, and Fig. 1. Also note that Specification Paragraph 5 confirms that this is a known feature in the art.)
the image sensor is configured to provide the scene image by generating pixel-related image data by detecting visual-spectrum (VIS) light and/or near-infrared-spectrum (NIR) light, (See supporting examples as CCD and CMOS sensors in Specification, Paragraphs 9-10. Prior art teaches: “The image sensor might be a photosensitive array such as a CMOS or CCD array,” which captures visible and/or near infrared spectra. See Bridges, Paragraph 28, and Fig. 1. Cumulatively note that “LIDAR uses ultraviolet, visible, or near infrared light to image objects and can be used with a wide range of targets” in Wang, Paragraphs 119, 175, and the statement of motivation below.)

the measuring device comprises a second distance measuring unit with a Time-Of-Flight (TOF) sensor, wherein the TOF sensor is configured to provide pixel-related TOF data of at least part of the scene as a TOF image, the pixel-related TOF data comprises at least range data and/or amplitude data for each pixel of the TOF image, (As noted above, “the receiving unit may be understood as part of an electronic distance meter (EDM) or also known as range finder using time of flight, multiple frequency phase-shift or interferometry technology,” or other distance measuring technology such as a camera or a pair of cameras. See Specification, Paragraphs 120, 36, 39. In this case the first distance measuring unit comprises the emitting unit and the receiving unit but not the sensor, and the second distance measuring unit comprises the TOF sensor but not the emitting unit or the receiving unit. Prior art teaches that the sensor of the receiving unit can be a TOF camera: “The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. … determining a distance to the target …” See Bridges, Paragraphs 27, 29-31. As a camera, it provides pixel-related TOF data.)
the controlling and processing unit comprises a target identification functionality (For example, “a TOF 3D-camera may identify the outline of an object that is to be inspected” See Bridges, Paragraphs 45, 52.) which is configured to process the scene image and the pixel-related TOF data to derive target information based thereon, … wherein the target information comprises a direction to the target with respect to the measuring device and TOF data associated to the target, and (“The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. … determining a distance to the target …” See Bridges, Paragraphs 27, 29-31.)

the controlling and processing unit comprises a target tracking functionality which is configured to [continuously] update the target information (“Processing system 800 comprises tracker processing unit 810 and optionally computer 80. … Auxiliary unit processor 870 optionally provides timing” indicating tracking over time. Bridges, Paragraphs 41, 52. See treatment of continuous operation below.)

[continuously] deriving a position of the target in the scene image by image processing of the scene image and (“For example, the 3D-camera 55 may capture an image of the process and the engine 826 is used to identify a desired object, such as a robot end effector. Using this information, the laser tracker 10 transforms this information into an azimuth angle and a zenith angle to allow the rotation of the payload 15 to the desired location and the rapid acquisition of a retroreflective target.” Bridges, Paragraph 52.)

[continuously] deriving TOF data for the target by means of the TOF sensor, and (“The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. … the 3D-camera 55 may be used with a laser tracker 10 in an automated system where the 3D-camera is used to identify components within the process.
For example, the 3D-camera 55 may capture an image of the process and the engine 826 is used to identify a desired object, such as a robot end effector. Using this information, the laser tracker 10 transforms this information into an azimuth angle and a zenith angle to allow the rotation of the payload 15 to the desired location and the rapid acquisition of a retroreflective target.” Bridges, Paragraphs 27, 52.)

which target tracking functionality is configured to continuously control directing of the measuring radiation towards the target based on the updated target information, (“a tracker control system to adjust the rotation angles of the mechanical azimuth and zenith axes of the laser tracker … a control system that sends a second signal to the first motor and a third signal to the second motor” Bridges, Paragraphs 4, 15, Claim 2.)

wherein the controlling and processing unit is configured to provide the pixel-related TOF data associated with the pixel-related image data so that each pixel of the pixel-related image data is assigned to at least one pixel of the pixel-related TOF data.” (For example, “includes a three-dimensional (3D) camera device 55. The 3D-camera device 55 is capable of capturing both visual and distance information. As used herein, a 3D-camera is a device having a single photosensitive array capable of determining the distance to an object surface over a plurality of pixels on the 3D-camera image sensor. … The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. … 3D-camera may provide received information to internal processors” See Bridges, Paragraphs 27, 37, 41. See a similar teaching in Wang, Paragraph 183 and the statement of motivation below.)
Bridges does not explicitly teach the claim features below. Wong teaches these features in the context of a 3D tracking camera system with a mount for controlling system rotation:

continuously update the target information by continuously updating the scene image … position of the target … TOF data (Bridges teaches the above features in the context of tracking over time, which implies a continuous operation. Cumulatively, Wang teaches: “The operations may include repeating 1416 operations 1402-1412 or operations 1406-1412, and optionally operation 1414, (e.g., continuously) to track motion of the object 12 within the scene 10. … supplies continuous position data” Wang, Paragraphs 181, 130. See tracking the object by position in the image and TOF data above.)

to provide a video stream of the scene, and (Bridges teaches the above features in the context of tracking over time, which implies capturing continuous images and sensor data, but does not explicitly teach capturing video. Wong teaches: “a camera 320 and/or other imaging device 450 disposed on the head 160 (see e.g., FIGS. 3A and 3C), which can be used to capture video or 3D volumetric point clouds from an elevated vantage point of the head 160 (e.g., for videoconferencing).” Wong, Paragraph 102. Also see video tracking in Paragraph 209 and the video conferencing based on a camera mounted on a robot in Paragraph 236.)

Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to supplement the teachings of Bridges with the above features as taught in Wong, in order to use a robot to continuously locate the object of interest and record video using at least the visual spectrum to provide video conferencing or object monitoring. Wong, Paragraph 92.
Finally, in reviewing the present application, there does not seem to be objective evidence that the claim limitations are particularly directed to: addressing a particular problem which was recognized but unsolved in the art, producing unexpected results at the level of the ordinary skill in the art, or any other objective indicators of non-obviousness.

Regarding Claim 2:

“The measuring device according to claim 1, wherein the controlling and processing unit comprises a pixel relating functionality configured to derive a distance value for each pixel of the TOF image and relating each pixel of the scene image to at least one pixel of the TOF image based on the distance values for the pixels of the TOF image.” (“photosensitive array capable of determining the distance to an object surface over a plurality of pixels on the 3D-camera image sensor. … The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. … determining a distance to the target …” See Bridges, Paragraphs 27, 29-31.)

Regarding Claim 4:

“The measuring device according to claim 1, wherein the TOF sensor and the capturing unit are configured and arranged relative to each other so that a field of view of the TOF sensor is greater than a field of view of the image sensor, in particular wherein a resolution of the image sensor is greater than a resolution of the TOF sensor.” (“Each light source 54 is placed near camera 52 so that light from light source 54 is reflected off each retroreflector target 26 onto camera 52,” and thus coaxially aligned on an optical axis. Bridges, Paragraph 25, and Fig. 1. Further, “The spinning LIDAR has limited positioning on a robot. For example, its generally position on top with an unobstructed field of view … sweeps about 360 degrees about the collar axis to form a substantially complete perimeter of the collar,” thus providing a greater field of view than the image sensor. Wang, Paragraphs 4, 25-30, and statement of motivation in Claim 1.)

Regarding Claim 5:

“The measuring device according to claim 1, wherein the pixels of the pixel-related image data are divided into pixel groups and each group is assigned to one respective pixel of the pixel-related TOF data.” (For example, “determining the distance to an object surface over a plurality of pixels” embodying a pixel group. Bridges, Paragraph 27. “distance and angle corresponding to each pixel” also embodying a group of different pixels assigned to one respective pixel. Bridges, Paragraph 27. Also note that the two can be divided into pixel groups and depth groups: “The CMOS sensor 1520 may include an array of pixel detectors 1522, or other arrangement of pixel detectors 1522, where each pixel detector 1522 is capable of detecting the intensity and phase of photonic energy impinging upon it. … The camera controller 1540 provides a sequence of operations that formats pixel data obtained by the CMOS sensor 1520 into a depth map and a brightness image,” assigning pixels of image data to pixels of the depth map. Wang, Paragraph 183. See statement of motivation in Claim 1.)

Regarding Claim 6:

“The measuring device according to claim 5, wherein the associated pixel-related TOF data and pixel-related image data are provided in an overlay manner.” (“the 3-D imaging sensor 450 includes a 3D time-of-flight (TOF) camera 1500 for obtaining depth image data. … The camera controller 1540 provides a sequence of operations that formats pixel data obtained by the CMOS sensor 1520 into a depth map and a brightness image,” providing overlapping pixel images. Wang, Paragraph 183. See statement of motivation in Claim 1.)
Regarding Claim 7:

“The measuring device according to claim 1, wherein the range data comprises a distance value for each pixel or range information for each pixel related to range measurement with the TOF sensor and/or the amplitude data comprises a signal strength related to an intensity of the detected collimated measuring radiation.” (“The CMOS sensor 1520 may include an array of pixel detectors 1522, or other arrangement of pixel detectors 1522, where each pixel detector 1522 is capable of detecting the intensity [amplitude] and phase of photonic energy impinging upon it. … The camera controller 1540 provides a sequence of operations that formats pixel data obtained by the CMOS sensor 1520 into a depth [distance or range] map and a brightness image.” Wang, Paragraph 183. See statement of motivation in Claim 1.)

Regarding Claim 8:

“The measuring device according to claim 1, wherein the measuring device comprises an illumination unit configured for illuminating at least a part of the scene with illumination radiation, wherein the illumination radiation comprises a modulated illumination signal and the pixel-related TOF data is generateable by detecting the modulated illumination signal provided by the illumination radiation.” (“One type of TOF camera uses an RF modulated light [radiation] source with a phase detector. … These devices work by modulating the outgoing beam with an RF carrier, measuring the phase shift of the reflected light, and determining a distance to the target based on the phase shift and on the speed of light in air.” Bridges, Paragraph 29 and similarly in Wong, Paragraph 183.)
Regarding Claim 9: “The measuring device according to claim 1, wherein the pixel-related image data is generateable by detecting non-modulated signal of the visual-spectrum (VIS) light and/or the near-infrared-spectrum (NIR) light by means of the image sensor, in particular a non-modulated illumination signal of the illumination radiation.” (“The 3D-camera may include, but is not limited to a light-field camera [detecting non-modulated light] and a time-of-flight (TOF) camera [detecting modulated light]. … One type of TOF camera uses an RF modulated light [radiation] source with a phase detector.” Bridges, Paragraphs 27, 29, and similarly in Wong, Paragraph 183.) Regarding Claim 10: “The measuring device according to claim 1, wherein the controlling and processing unit comprises a target differentiation functionality configured for differentiating at least two particular targets of a set of targets, (“One application for such a camera is to detect multiple retroreflectors in the field of view and measure each in an automated sequence.” Bridges, Paragraphs 7, 23. Also see application to objects in general in Bridges, Paragraphs 9, 45, 47, and similarly in Wong, Paragraph 183.) wherein the pixel-related TOF data is processed so that the target information comprises directions to the at least two particular targets of the set of targets with respect to the measuring device and respective TOF data for the at least two targets, the TOF data is derived and associated to each of the at least two targets, … in particular wherein a position of each of the at least two targets in the scene image is derived, in particular a direction to each of the at least two targets is derived with respect to the measuring device, and (“One application for such a camera is to detect multiple retroreflectors in the field of view and measure each in an automated sequence. 
… By measuring the radial distance between gimbal point 22 and retroreflector 26 and the rotation angles about the zenith and azimuth axes 18, 20, the position of retroreflector 26 is found within the spherical coordinate system of the laser tracker 10 (i.e. the device frame of reference).” Bridges, Paragraphs 7, 23.) the positions and the TOF data of each of the at least two targets are provided, in particular displayed, in an associated manner.” (“the three-dimensional images captured by the 3D-camera 55 may be displayed on the graphical display 218.” Bridges, Paragraph 44.) Regarding Claim 11: “The measuring device according to claim 10, wherein the target differentiation functionality is configured to receive a target selection criterion, to apply the target selection criterion on the TOF data of each of the at least two targets, (“a TOF 3D-camera may identify the outline [a target selection criterion] of an object that is to be inspected” Bridges, Paragraphs 45, 62, 36. Also note criterion for detecting retroreflectors in Paragraph 7.) to determine a matching measure for each of the at least two targets based on applying the target selection criterion on the TOF data and (For example comparing reflector and non-reflector objects in Bridges, Paragraph 7.) to select one target of the at least two targets based on the matching measures.” (For example comparing reflector and non-reflector objects in Bridges, Paragraph 7. Also see selecting objects by other criteria in Bridges, Paragraphs 45, 62, 36.) 
Regarding Claim 12: “The measuring device according to claim 1, wherein deriving the target information comprises processing the pixel-related TOF data by comparing first TOF data of a first group of pixels with second TOF data of a second group of pixels and identifying the target based on a difference between the first and the second TOF data.” (“the controller 500 receiving information from the speckle camera 1300 can use local cross-correlation with the reference image that gave the closest match.” Wong, Paragraphs 177-178. See statement of motivation in Claim 1.) Regarding Claim 13: “The measuring device according to claim 1, wherein the controlling and processing unit comprises a sub-tracking functionality configured to perform the steps of: processing the pixel-related TOF data, (“The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. … By measuring the radial distance between gimbal point 22 and retroreflector 26 and the rotation angles about the zenith and azimuth axes 18, 20, the position of retroreflector 26 is found [by processing the measurements] within the spherical coordinate system of the laser tracker 10 (i.e. the device frame of reference).” Bridges, Paragraphs 27, 23.) identifying a number of targets based on the processing of the pixel-related TOF data, (“One application for such a camera is to detect multiple retroreflectors in the field of view and measure each in an automated sequence. … By measuring the radial distance between gimbal point 22 and retroreflector 26 and the rotation angles about the zenith and azimuth axes 18, 20, the position of retroreflector 26 is found within the spherical coordinate system of the laser tracker 10 (i.e. the device frame of reference).” Bridges, Paragraphs 7, 23.) 
determining respective positions of the identified targets in the TOF image, (“By measuring the radial distance between gimbal point 22 and retroreflector 26 and the rotation angles about the zenith and azimuth axes 18, 20, the position of retroreflector 26 is found within the spherical coordinate system of the laser tracker 10 (i.e. the device frame of reference).” Bridges, Paragraphs 7, 23.) deriving TOF data for the identified targets, (“One type of TOF camera uses an RF modulated light source with a phase detector. … These devices work by modulating the outgoing beam with an RF carrier, measuring the phase shift of the reflected light, and determining a distance to the target” Bridges, Paragraph 29.) providing the TOF image or the scene image together with markers, each marker is associated with a respective identified target and (For example, “By measuring the radial distance between gimbal point 22 and retroreflector 26 and the rotation angles about the zenith and azimuth axes 18, 20, the position [marker based on TOF data associated with the target] of retroreflector 26 is found within the spherical coordinate system of the laser tracker 10 (i.e. the device frame of reference).” Bridges, Paragraphs 7, 23. In another example, where the detected objects are user gestures, the gestures are marked with appropriate data: “gestures may be associated with various controls of the laser tracker 10. In other words, there may be a rule of correspondence between each of a plurality of gestures and each of the plurality of commands or controls for the laser tracker 10. … The gestures that may be interpreted by the gesture engine 826 based on three-dimensional data acquired by the 3D-camera 55” including TOF data. See Bridges, Paragraphs 48-49.) 
each marker indicates the position of its associated identified target in the provided image, wherein each marker comprises an indicator indicating a measure of the TOF data for the respective target,” (“By measuring the radial distance between gimbal point 22 and retroreflector 26 and the rotation angles about the zenith and azimuth axes 18, 20, the position of retroreflector 26 is found within the spherical coordinate system of the laser tracker 10 (i.e. the device frame of reference).” Bridges, Paragraphs 7, 23.) Regarding Claim 14: “The measuring device according to claim 13, wherein the controlling and processing unit comprises a switching functionality configured to: receive a user input related to selecting one of the markers and (“The gestures may be performed by one or more of the operator's body parts, or the relative movement or position of those parts to each other (i.e. spatial configuration). Those gestures may be associated with various controls of the laser tracker 10.” Bridges, Paragraph 48.) control alignment of the directing unit so that the collimated measuring radiation is directed towards the target associated with the selected marker.” (“This image is then used to identify the location of the operator and allow the laser tracker 10 to rotate the payload 15 about the azimuth and zenith axis to allow rapid acquisition of the retroreflector 26 with the laser beam 46.” Bridges, Paragraph 51.) Regarding Claim 15: “The measuring device according to claim 1, wherein the measuring device comprises a zoom objective, wherein the zoom objective and the capturing unit are arranged so that an optical axis of the zoom objective and an optical axis of the capturing unit are coaxial and an orientation of the coaxial axes is alignable by means of the directing unit.” (“[T]he long range sensor 2190 may be an imaging sensor 450 (e.g., having optics or a zoom lens configured for relatively long range detection). 
In additional examples, the long range sensor 2190 is a camera (e.g., with a zoom lens), a laser range finder, LIDAR, RADAR, etc.” Wong, Paragraph 235. See treatment of alignment of the cameras and statement of motivation in Claim 1.) Regarding Claim 21: “The measuring device according to claim 20, wherein the TOF sensor is arranged in the directing unit and an optical axis of the TOF sensor is coaxially aligned with an optical axis of the first distance measuring unit.” (See reasons for rejection under section 112 above. Cumulatively: Under the broadest reasonable interpretation consistent with the specification and ordinary skill in the art, (a) the TOF sensor is the measuring part of the distance measuring unit, (b) coaxially aligned means pointing in the same direction rather than being coaxial or sharing an axis, and (c) the claimed result can be accomplished when “The distance measuring unit and the capturing unit are arranged in the directing unit and an optical axis of the capturing unit is coaxially aligned with an optical axis of the distance measuring unit.” See Specification, Paragraph 41. Prior art provides an example of this arrangement: “An exemplary gimbaled beam-steering mechanism [directing unit] 12 of laser tracker 10 … Laser beam 46 is pointed in the desired direction by motors [of the directing unit mounted] within the tracker laser (not shown) that rotate payload 15 about zenith axis 18 and azimuth axis 20. … In the exemplary embodiment, the laser tracker 10 further includes a three-dimensional (3D) camera device 55” Bridges, Paragraph 23, and Fig. 1. “The 3D-camera may include, but is not limited to a light-field camera and a time-of-flight (TOF) camera. … determining a distance to the target …” See Bridges, Paragraphs 27, 29-31. 
Thus, the TOF sensor is part of the 3D camera, which is arranged in the directing unit, and is therefore aligned with the optical axis of the laser unit (distance measuring unit) so as to detect light returned from the direction of the laser.) Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over US 20150015895 to Bridges (“Bridges”) in view of US 20130226344 to Wong (“Wong”) and in view of US 20220377242 to Camacho (“Camacho”). Regarding Claim 16: “The measuring device according to claim 15, wherein the controlling and processing unit comprises” — Bridges and Wong mention but do not teach the details of the claimed functionality below. Camacho teaches the claimed details in the context of tracking cameras: “a focusing functionality configured to: derive a focussing distance based on the TOF data or on distance information provided by the first distance measuring unit and (“The range-to-focus module estimates the lens position for ideal focus based on three input variables: target distance (e.g., as determined from the radar data), zoom lens position” Camacho, Paragraph 26.) control the zoom objective so that a particular zoom level is provided which zoom level correlates with the focussing distance.” (See application of focus in Wong, Paragraph 175. Camacho elaborates: “RFO auto-focus system adjusts the focus setting, the zoom setting, … dual lens system (e.g., a focusing lens and a zoom lens). Each of these lenses can be controlled programmatically … The range-to-focus module estimates the lens position for ideal focus based on three input variables: target distance (e.g., as determined from the radar data), zoom lens position” Camacho, Paragraphs 24-26.) 
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to supplement the teachings of Bridges and Wong to perform the above functions in the manner taught in Camacho, in order to provide the camera with an autofocus functionality. See Camacho, Paragraph 24. Finally, in reviewing the present application, there does not seem to be objective evidence that the claim limitations are particularly directed to: addressing a particular problem which was recognized but unsolved in the art, producing unexpected results at the level of the ordinary skill in the art, or any other objective indicators of non-obviousness. Claims 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over US 20150015895 to Bridges (“Bridges”) in view of US 20130226344 to Wong (“Wong”) and in view of US 20170230551 to Akkaya (“Akkaya”). Regarding Claim 17: “The measuring device according to claim 1, wherein the image sensor is configured to provide the scene image by generating pixel-related image data by detecting VIS light and NIR light, and (“The image sensor might be a photosensitive array such as a CMOS or CCD array, … A light field camera may operate based on natural light,” thus the sensors are capable of detecting visible and near-infrared light by design and are configured to detect natural light, which comprises visible and near-infrared light. Bridges, Paragraph 28. Similarly see the use of IR and non-infrared image sensors in combination with TOF in Wong, Paragraphs 13-14, 24, 117-119 and statement of motivation in Claim 1.) 
Bridges and Wong do not teach: “the measuring device comprises a switchable spectral filter for switching between VIS and IR imaging.” However, this is a common way to make a color and IR camera: Akkaya teaches this feature in the context of digital camera structures: “[T]he above examples stress the value of selectively blocking ambient visible light for combined visible and IR imaging, the stopband may be an IR band in alternative implementations. There, the light outside the stopband may include visible light. Other example light valves may use a mechanical structure that incorporates separate passive optical filters—e.g., red, green, blue, and IR. Using this approach, the light valve may switch to the desired optical filter by moving or rotating the mechanical structure to bring the appropriate filter in front of sensor array 28. Envisaged filter-switching modalities include … ” Akkaya, Paragraph 36. Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to supplement the teachings of Bridges and Wong to employ a switchable spectral filter for switching between VIS and IR imaging as taught in Akkaya, in order to acquire color and IR images using the full resolution of the image sensor. See Akkaya, Paragraphs 11, 36. Finally, in reviewing the present application, there does not seem to be objective evidence that the claim limitations are particularly directed to: addressing a particular problem which was recognized but unsolved in the art, producing unexpected results at the level of the ordinary skill in the art, or any other objective indicators of non-obviousness. Regarding Claim 18: “The measuring device according to claim 1, wherein the TOF sensor is embodied as an RGB-IR sensor having a four channel Bayer-pattern.” (“Configured for visible as well as IR imaging, camera 12 may also include a color filter array (CFA) 36 of color filter elements 38. 
The color filter elements are arranged in registry with sensor elements 30 of sensor array 28. An example CFA may present a Bayer pattern, i.e., a repeated tiling of 2x2 subarrays [four channels] … In implementations in which both visible and IR response is required at each sensor element, all of the color filter elements may be highly transmissive in the IR band of interest.” Akkaya, Paragraph 17. See statement of motivation in Claim 17.) Regarding Claim 19: “The measuring device according to claim 1 further comprising in-coupling components embodied as one of a beam splitter and a semi-transparent mirror for coaxially aligning the optical axis of the directing unit and the capturing unit.” (“Beam splitting optics may be used to align, in effect, the two sensor arrays on the same optical axis” Akkaya, Paragraph 10. See statement of motivation in Claim 17.) Regarding Claim 20: “The measuring device according to claim 1, wherein the receiving unit is spaced away from the capturing unit, (“Images acquired concurrently from different sensor arrays may exhibit parallax [spacing], which is objectionable if the images are to be registered to each other. Beam splitting optics may be used to align, in effect, the two sensor arrays on the same optical axis” Akkaya, Paragraph 10. See statement of motivation in Claim 17.) and the measuring device further comprising beam splitter for directing the part of the collimated measuring radiation reflected (R) by the target towards the receiving unit.” (“Beam splitting optics may be used to align, in effect, the two sensor arrays on the same optical axis” Akkaya, Paragraph 10. See statement of motivation in Claim 17.) Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to MIKHAIL ITSKOVICH whose telephone number is (571)270-7940. The examiner can normally be reached Mon. - Thu. 9am - 8pm. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Joseph Ustaris can be reached at (571)272-7383. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /MIKHAIL ITSKOVICH/Primary Examiner, Art Unit 2483

Prosecution Timeline

Aug 18, 2023
Application Filed
Jun 27, 2025
Non-Final Rejection — §103, §112
Sep 25, 2025
Response Filed
Jan 03, 2026
Final Rejection — §103, §112
Mar 03, 2026
Request for Continued Examination
Mar 14, 2026
Response after Non-Final Action
Mar 21, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12548733
Automating cryo-electron microscopy data collection
2y 5m to grant Granted Feb 10, 2026
Patent 12489911
IMAGE CODING METHOD, IMAGE DECODING METHOD, IMAGE CODING APPARATUS, RECEIVING APPARATUS, AND TRANSMITTING APPARATUS
2y 5m to grant Granted Dec 02, 2025
Patent 12477146
ENCODING AND DECODING METHOD, DEVICE AND APPARATUS
2y 5m to grant Granted Nov 18, 2025
Patent 12452404
METHOD FOR DETERMINING SPECIFIC LINEAR MODEL AND VIDEO PROCESSING DEVICE
2y 5m to grant Granted Oct 21, 2025
Patent 12432328
SYSTEM AND METHOD FOR RENDERING THREE-DIMENSIONAL IMAGE CONTENT
2y 5m to grant Granted Sep 30, 2025
Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
35%
Grant Probability
59%
With Interview (+23.8%)
4y 0m
Median Time to Grant
High
PTA Risk
Based on 585 resolved cases by this examiner. Grant probability derived from career allow rate.
