DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Applicant Response to Official Action
The response filed on 12/8/2025 has been entered and made of record.
Acknowledgment
The cancellation of claims 1-21 is acknowledged by the examiner.
The amendments to claims 22, 37-38, and 41, filed on 12/8/2025, are acknowledged by the examiner.
Response to Arguments
Applicant's arguments with respect to claims 22, 38, 41, and their dependent claims have been considered but are moot in view of the new grounds of rejection necessitated by the Applicant's amendments. The Examiner addresses the Applicant's main arguments below.
Regarding the drawing objection related to an inertial measurement unit, the Remarks filed on 12/8/2025 explain that the unit is embedded in a pose and motion correction module. Accordingly, the drawing objection is withdrawn.
Regarding the 35 U.S.C. 112(a) rejection, the Remarks filed on 12/8/2025 address the issue. Accordingly, the 35 U.S.C. 112(a) rejection is withdrawn.
Regarding the double patenting rejections, the Applicant reserves the right to file a terminal disclaimer and/or traverse the rejections once the other substantive issues have been resolved. Accordingly, the double patenting rejections are maintained.
Regarding the 35 U.S.C. 103 rejection, the Applicant amended the claims and then argued that, "The Office Action's reliance on Binder, Schlemmer, and Eguchi is in error at least because the order of operations in the combination of Binder, Schlemmer, and Eguchi differs completely from Applicant's claims." [Paragraph 4 on page 11 of the Remarks]. In support of this argument, the Applicant cited McIntyre (US Patent 5,546,156) and argued that "the Binder's point aid emitter is not caused to emit the light beam along the directed optical path" [Paragraph 4 on page 12 of the Remarks]. The Examiner respectfully disagrees with the Applicant's argument for several reasons. First, it is noted that the Office action did not cite McIntyre; hence, an argument based on one of McIntyre's embodiments may not be relevant. Second, the Applicant overlooks the fact that Binder cites several dozen references, not only McIntyre. Third, Binder states very clearly that various approaches can be used to guide the beam along the optical path: "Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both" [Binder: col. 6, line 6-9]. For example, "According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances" [Binder: col. 13, line 53-62], and "The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view" [Binder: col. 45, line 28-33]. Moreover, as shown in the Office action, Schlemmer and Eguchi also disclose the argued limitation. As a result, the Applicant's argument is not persuasive.
Accordingly, the Examiner respectfully maintains the rejections and the applicability of the art applied.
Double Patenting
The non-statutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A non-statutory double patenting rejection is appropriate where the claims at issue are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); and In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on a non-statutory double patenting ground provided the reference application or patent either is shown to be commonly owned with this application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP §§ 706.02(l)(1) - 706.02(l)(3) for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/forms/. The filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to http://www.uspto.gov/patents/process/file/efs/guidance/eTD-info-I.jsp.
Claims 22, 38, and 40 of the instant application are rejected on the ground of non-statutory double patenting as being unpatentable over related claims in U.S. Patent 17,022,483 B2. Although the claims at issue are not identical, they are not patentably distinct from each other because the instant claims are broader than the claims in U.S. Patent 17,022,483 B2.
Claims 22, 38, and 40 and their dependent claims of the instant application are rejected on the ground of non-statutory double patenting as being unpatentable over related claims in the U.S. Patent Application 18/106,026. Although the claims are not identical, they are not patentably distinct from each other.
Claims 22, 38, and 40 and their dependent claims of the instant application are rejected on the ground of non-statutory double patenting as being unpatentable over related claims in the U.S. Patent Application 18/382,294. Although the claims are not identical, they are not patentably distinct from each other.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims under pre-AIA 35 U.S.C. 103(a), the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were made absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and invention dates of each claim that was not commonly owned at the time a later invention was made in order for the examiner to consider the applicability of pre-AIA 35 U.S.C. 103(c) and potential pre-AIA 35 U.S.C. 102(e), (f) or (g) prior art under pre-AIA 35 U.S.C. 103(a).
Claims 22 and 26-39 are rejected under 35 U.S.C. 103 as being unpatentable over Binder (US Patent 11,255,663 B2) ("Binder") in view of Schlemmer (US Patent 11,589,570 B2) ("Schlemmer"), and further in view of Eguchi et al. (US Patent 6,477,403 B1) ("Eguchi").
Regarding claim 22, Binder meets the claim limitations as follows.
A system (i.e. automotive electronic systems) [Binder: col. 70, line 35] for damaging or killing plants (i.e. heat in the form of IR that can cause damage to sensitive objects) [Binder: col. 18, line 9-10], the system (i.e. automotive electronic systems) [Binder: col. 70, line 35] comprising:

a first camera ((i.e. CCD image sensors) [Binder: col. 22, line 22-23]; (i.e. a camera) [Binder: col. 45, line 3]) configured to capture images of plants in a field ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64]);

a second camera ((i.e. CCD image sensors) [Binder: col. 22, line 22-23]; (i.e. a camera) [Binder: col. 45, line 3]) configured to capture images of plants in the field ((i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64]; (i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]);

a light source configured to emit a light beam ((i.e. The actuator 501 may be used to activate or control the light emitted by a light source, being based on converting electrical energy or another energy to a light. The light emitted may be a visible light, or invisible light such as infrared, ultraviolet, X-ray or gamma rays. A shade, reflector, enclosing globe, housing, lens, and other accessories may be used, typically as part of a light fixture, in order to control the illumination intensity, shape or direction) [Binder: col. 263, line 61 – col. 264, line 62]; (i.e. The instrument comprising a light emitting unit for emitting a distance measuring light, a photodetecting unit for receiving and detecting a reflected distance measuring light from an object to be measured and a part of the distance measuring light emitted from the light emitting unit as an internal reference light, a sensitivity adjusting unit for electrically adjusting photodetecting sensitivity of the photodetecting unit, and a control arithmetic unit for calculating a measured distance based on a photodetection signal of the reflected distance measuring light from the photodetecting unit and based on a photodetection signal of the internal reference light, wherein the control arithmetic unit can measure a distance by selecting a prism mode measurement and a non-prism mode measurement, and controls so that photo-detecting sensitivity of the photo-detecting unit is changed by the sensitivity adjusting unit in response to the selected measurement mode) [Binder: col. 14, line 2-18]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]);

a control system (i.e. control system) [Binder: col. 71, line 31] comprising one or more actuators (i.e. Any apparatus or device herein may further comprise an actuator that converts electrical energy to affect or produce a physical phenomenon, the actuator may be coupled to be operated, controlled, or activated, by the processor, in response to a value of the first distance, the second distance, the first angle, or any combination, manipulation, or function thereof) [Binder: col. 162, line 17-23], wherein the control system (i.e. control system) [Binder: col. 71, line 31] is configured to direct an optical path of the light beam ((i.e. The actuator 501 may be used to activate or control the light emitted by a light source, being based on converting electrical energy or another energy to a light. The light emitted may be a visible light, or invisible light such as infrared, ultraviolet, X-ray or gamma rays. A shade, reflector, enclosing globe, housing, lens, and other accessories may be used, typically as part of a light fixture, in order to control the illumination intensity, shape or direction) [Binder: col. 263, line 61 – col. 264, line 62]; (i.e. Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both) [Binder: col. 6, line 6-9]; (i.e. According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances) [Binder: col. 13, line 53-62]; (i.e. The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view) [Binder: col. 45, line 28-33]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]); and

a computing system ((i.e. a processor) [Binder: col. 256, line 43]; (i.e. The block 263 further contains a digital image processor, which receives the digital data from the AFE, and processes this digital representation of the image) [Binder: col. 58, line 53-56]), wherein the computing system is configured to perform operations comprising ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]):

receiving (i.e. signal received from the sensor) [Binder: col. 2, line 66] a first image of at least one plant in the field captured by the first camera ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64]);

identifying a plant in the first image ((i.e. identifying at least a part of the field of view) [Binder: col. 45, line 33-34]; (i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]);

predicting a location of the plant based on the first image ((i.e. Most multi-sensor AF cameras allow manual selection of the active sensor, and many offer an automatic selection of the sensor using algorithms that attempt to discern the location of the subject) [Binder: col. 64, line 17-30]; (i.e. The gauge utilizes complementary simultaneous measurements based upon both Doppler and time of flight principles. A complete record can be produced of the location and shape of a target object even when the object has severe discontinuities such as the edges of a turbine blade.) [Binder: col. 47, line 55-60]; (i.e. The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (i.e. Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]);

causing ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) the second camera to capture a second image of a region of the field including the predicted location (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64];

predicting a target location of the plant ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. The gauge utilizes complementary simultaneous measurements based upon both Doppler and time of flight principles. A complete record can be produced of the location and shape of a target object even when the object has severe discontinuities such as the edges of a turbine blade.) [Binder: col. 47, line 55-60]; (i.e. The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]) in the second image (i.e. displaying a captured image in the display) [Binder: col. 245, line 5];

causing the control system ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) to direct the optical path of the light beam ((i.e. In the case of using light wave, the splitter 142 may consist of, comprise, or be based on, an optical beam splitter. Such an optical beam splitter may consist of, comprise, or be based on, two triangular glass prisms which are glued together at their base, a half-silvered mirror using a sheet of glass or plastic with a transparently thin coating of metal, a diffractive beam splitter, or a dichroic mirrored prism assembly which uses dichroic optical coatings. A polarizing beam splitter may consist of, comprise, or be based on Wollaston prism that use birefringent materials for splitting light into beams of differing polarization) [Binder: col. 212, line 20-30]; (i.e. Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both) [Binder: col. 6, line 6-9]; (i.e. According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances) [Binder: col. 13, line 53-62]; (i.e. The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view) [Binder: col. 45, line 28-33]) toward the predicted target location ((i.e. Mobile LIDAR (also mobile laser scanning) is when two or more scanners are attached to a moving vehicle to collect data along a path. These scanners are usually paired with other kinds of equipment, including GNSS receivers and IMUs. One example application is surveying bordering trees, etc. all need to be taken into account) [Binder: col. 305, line 42-48]; (i.e. Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]); and

causing ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) the light source to emit the light beam (i.e. The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40] along the optical path (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b] toward the predicted target location of the plant ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (i.e. Mobile LIDAR (also mobile laser scanning) is when two or more scanners are attached to a moving vehicle to collect data along a path. These scanners are usually paired with other kinds of equipment, including GNSS receivers and IMUs. One example application is surveying bordering trees, etc. all need to be taken into account) [Binder: col. 305, line 42-48]; (i.e. Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]) for a length of time sufficient (i.e. One of the mirrors is a broadband reflector and the other mirror is wavelength selective so that gain is favored on a single longitudinal mode, resulting in lasing at a single resonant frequency. The broadband mirror is usually coated with a low reflectivity coating to allow emission. The wavelength selective mirror is a periodically structured diffraction grating with high reflectivity) [Binder: col. 10, line 34-41] to damage or kill the plant.
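For clarity of the record on the disputed order of operations, the sequence recited in claim 22 can be paraphrased by the following minimal sketch. It is illustrative only: every function and class name is a hypothetical editorial label, and no particular implementation is attributed to the claims or to the cited art.

```python
# Illustrative paraphrase of the sequence of operations recited in claim 22.
# Every name here is a hypothetical editorial label; no implementation detail
# is attributed to the claims or to the cited references.

from dataclasses import dataclass


@dataclass
class Location:
    x: float
    y: float


def identify_plant(image) -> bool:
    """Placeholder: stands in for any detection method (none is recited)."""
    return image is not None


def predict_location(image) -> Location:
    """Placeholder coarse prediction from the first image."""
    return Location(0.0, 0.0)


def predict_target_location(image, coarse: Location) -> Location:
    """Placeholder refinement from the second image."""
    return coarse


def claimed_sequence(camera1, camera2, control_system, light_source, dwell_s):
    first_image = camera1.capture()                    # "receiving a first image"
    if identify_plant(first_image):                    # "identifying a plant"
        coarse = predict_location(first_image)         # "predicting a location"
        second_image = camera2.capture(region=coarse)  # "causing the second camera to capture"
        target = predict_target_location(second_image, coarse)  # "predicting a target location"
        control_system.direct_optical_path(target)     # "direct the optical path"
        light_source.emit(duration=dwell_s)            # "emit the light beam ... for a length
                                                       # of time sufficient to damage or kill"
```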
Binder does not explicitly disclose the following claim limitations (Emphasis added).
causing the light source to emit the light beam along the directed optical path toward the predicted target location of the plant for a length of time sufficient to damage or kill the plant.
However, in the same field of endeavor, Schlemmer further discloses the claim limitations, including the deficient claim limitations, as follows:
damaging or killing plants ((i.e. a system for semiautomatic and/or automatic weed removal) [Schlemmer: col. 1, line 19-20]; (i.e. By means of a laser, for example, weed removal can be performed (optically-) thermally. The energy applied by the laser allows weed to be removed precisely and quickly. Especially, it is conceivable to use a laser, by means of which the weed plant is at least partially vaporized or (optically-) thermally damaged in such a way that it dies) [Schlemmer: col. 4, line 10-16]).
at least one plant in the field captured by the first camera (i.e. wherein image data of at least one weed can be generated by means of the optical sensor and transmitted via the communication network to the server unit, which analyses the transmitted image data such that the weed can be determined) [Schlemmer: col. 2, line 9-13].
causing the control system to direct the optical path of the light beam toward the predicted target location (i.e. By means of a laser, for example, weed removal can be performed (optically-) thermally. The energy applied by the laser allows weed to be removed precisely and quickly. Especially, it is conceivable to use a laser, by means of which the weed plant is at least partially vaporized or (optically-) thermally damaged in such a way that it dies) [Schlemmer: col. 4, line 10-16].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder with Schlemmer to program the system to use the semi-automatic or automatic weed removal system of Schlemmer.
Therefore, the combination of Binder with Schlemmer will enable a lawn mower to kill weeds with an environmentally friendly approach [Schlemmer: col. 3, line 64 – col. 4, line 16].
In the same field of endeavor, Eguchi further discloses the claim limitations as follows:
causing the light source to emit the light beam along the directed optical path toward the predicted target location of the plant ((i.e. The scanning unit causes the light beams emitted from the plurality of optical paths of the first light guide to be incident on the object with the plurality of light beams being aligned such that a detection line is formed on the object) [Eguchi: col. 2, line 39-43]; (i.e. the scanning unit includes a deflector that deflects the plurality of light beams emitted from the tip of the plurality of optical paths of the first light guide toward the object with the plurality of beams aligned in parallel, and shifts the detection line in the direction perpendicular to the detection line with the plurality of beams remained to be aligned in parallel) [Eguchi: col. 2, line 52-58]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder and Schlemmer with Eguchi to implement the optical coherence tomography technique of Eguchi.
Therefore, the combination of Binder and Schlemmer with Eguchi will enable the system to utilize a tomography technique for object detection [Eguchi: col. 8, line 55-59].
Regarding claim 26, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein the operations further comprise tracking motion of the system ((i.e. The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (i.e. The controller 268a serve as both the control block 61 used by the angle meter #1 55 and the controller 268 used by the digital camera 260, and the display 266a serve as both the display 63 used by the angle meter #1 55 and the display 266 used by the digital camera 260. Preferably, the measurement lines 51a and 51b are aligned with and parallel to the digital camera 260 optical axis 272, and may further be in close proximity thereto, so that the object (or surface) sensed by the angle meter #1 55 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 55-64; Fig. 27a-b]; (i.e. Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]; (i.e. The correlator 19 is typically implemented using one of four predominant methods for active distance measurement: interferometric, triangulation, pulsed time-of-flight (TOF), and phase measuring. Interferometric methods may result in accuracies of less than one micrometer over ranges of up to several millimeters, while triangulation techniques may result in devices with accuracy in the micrometer range) [Binder: col. 3, line 38-44]) over an elapsed time (i.e. The time elapsed during transmission to echo reception gives information on the distance to the object) [Binder: col. 23, line 59-61].
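As an illustrative aside, the pulse time-of-flight principle quoted above from Binder reduces to a single multiplication: the recorded round-trip time is multiplied by the propagation velocity and halved for the out-and-back path. A minimal sketch follows, with hypothetical names and an assumed propagation speed of light in air.

```python
# Minimal sketch of pulse time-of-flight ranging as described in the quoted
# passages of Binder: the measured round-trip time is multiplied by the
# propagation velocity, and halved for the out-and-back path.
# All names are hypothetical.

C_AIR_M_PER_S = 299_702_547.0  # approximate speed of light in air

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting object from a round-trip pulse time."""
    return C_AIR_M_PER_S * round_trip_time_s / 2.0

# Example: a 66.7 ns round trip corresponds to roughly 10 m.
print(tof_distance_m(66.7e-9))  # ~10.0
```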
Regarding claim 27, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 26. Binder further meets the claim limitations as follows.
wherein the motion is tracked using one or more of the following (i.e. The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]: an inertial measurement unit (IMU), a global positioning system (GPS) or an inertial navigation system (INS) (i.e. A computer controls the scanning of the radar and the collection of data-points. A global positioning satellite (GPS) unit locates the precise portion of the radar and another unit loads a fixed referenced location to which all measurements) [Binder: col. 32, line 61-66].
Regarding claim 28, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 26. Binder further meets the claim limitations as follows.
wherein the operations further comprise adjusting the predicted target location based on the tracked motion (i.e. The instrument comprising a light emitting unit for emitting a distance measuring light, a photodetecting unit for receiving and detecting a reflected distance measuring light from an object to be measured and a part of the distance measuring light emitted from the light emitting unit as an internal reference light, a sensitivity adjusting unit for electrically adjusting photodetecting sensitivity of the photodetecting unit, and a control arithmetic unit for calculating a measured distance based on a photodetection signal of the reflected distance measuring light from the photodetecting unit and based on a photodetection signal of the internal reference light, wherein the control arithmetic unit can measure a distance by selecting a prism mode measurement and a non-prism mode measurement, and controls so that photo-detecting sensitivity of the photo-detecting unit is changed by the sensitivity adjusting unit in response to the selected measurement mode) [Binder: col. 14, line 2-18].
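As a purely illustrative aid for this limitation, one simple form of adjusting a predicted target location for tracked motion is a dead-reckoning shift. The sketch below uses hypothetical names; neither the claim nor the cited art is limited to this form.

```python
# Minimal sketch of adjusting a predicted target location for platform motion
# tracked over an elapsed time (dead reckoning). Hypothetical names only;
# neither the claim nor the cited art is limited to this form.

def adjust_for_motion(target_xy, velocity_xy, elapsed_s):
    """Shift a predicted (x, y) target opposite to the platform's displacement."""
    tx, ty = target_xy
    vx, vy = velocity_xy
    # The platform moved (vx*t, vy*t), so the target appears displaced by the
    # negative of that amount in the platform frame.
    return (tx - vx * elapsed_s, ty - vy * elapsed_s)

# Example: platform moving 0.5 m/s in +x for 0.2 s shifts the aim point 0.1 m back.
print(adjust_for_motion((2.0, 0.0), (0.5, 0.0), 0.2))  # (1.9, 0.0)
```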
Regarding claim 29, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
comprising a frame (i.e. the frame (on which the body is mounted)) [Binder: col. 69, line 34-35] supporting the first camera, the second camera ((i.e. CCD image sensors) [Binder: col. 22, line 22-23]; (i.e. surround view cameras) [Binder: col. 45, line 3]), the light source (i.e. Active systems use an infrared light source built into the car to illuminate the road ahead with light) [Binder: col. 72, line 24-25], the control system (i.e. an Electronic Control Unit (ECU) is a generic term for any embedded system that controls one or more of the electrical system or subsystems in a vehicle) [Binder: col. 73, line 23-26], and the computing system ((i.e. a processor) [Binder: col. 256, line 43]; (i.e. The block 263 further contains a digital image processor, which receives the digital data from the AFE, and processes this digital representation of the image) [Binder: col. 58, line 53-56]).
Regarding claim 30, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 29. Binder further meets the claim limitations as follows.
wherein the frame is configured to move over the field (i.e. a chassis is the underpart of a motor vehicle, consisting of the frame (on which the body is mounted)) [Binder: col. 69, line 33-35].
Regarding claim 31, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 30. Binder further meets the claim limitations as follows.
wherein the frame is configured to be moved autonomously or by a driver (i.e. a chassis is the underpart of a motor vehicle, consisting of the frame (on which the body is mounted)) [Binder: col. 69, line 33-35].
Regarding claim 32, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein the light beam is a laser beam (i.e. The optical pulse radar according to the present invention comprises a laser system) [Binder: col. 89, line 13-14].
Regarding claim 33, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein the light beam has a wavelength within a range from 300 nm to 100 µm ((i.e. Argon-ion lasers emit light in the range 351-528.7 nm. Depending on the optics and the laser tube a different number of lines is usable but the most commonly used lines are 458 nm, 488 nm and 514.5 nm. A nitrogen transverse electrical discharge in gas at atmospheric pressure (TEA) laser is an inexpensive gas laser producing UV light at 337.1 nm. Copper laser (copper vapor, and copper bromide vapor), with two spectral lines of green (510.6 nm) and yellow (578.2 nm), is the most powerful laser with the highest efficiency in the visible spectrum. Metal-ion lasers are gas lasers that typically generate ultraviolet wavelengths. Helium-silver (HeAg) 224 nm, neon-copper (NeCu) 248 nm, and helium-cadmium (HeCd) 325 nm are three examples. These lasers have particularly narrow oscillation linewidths of less than 3 GHz (0.5 picometers), making them candidates for use in fluorescence suppressed Raman spectroscopy. Examples of gas lasers are Helium-Neon (HeNe) laser operating at 632.8 nm, 543.5 nm, 593.9 nm, 611.8 nm, 1.1523 μm, 1.52 μm, or 3.3913 μm, Argon laser working at 454.6 nm, 488.0 nm, 514.5 nm, 351 nm, 363.8 nm, 457.9 nm, 465.8 nm, 476.5 nm, 472.7 nm, or 528.7 nm, also frequency doubled to provide 244 nm and 257 nm, Krypton laser working at 416 nm, 530.9 nm, 568.2 nm, 647.1 nm, 676.4 nm, 752.5 nm, or 799.3 nm, Xenon ion laser working at visible spectrum extending into the UV and IR, Nitrogen laser working at 337.1 nm, Carbon dioxide laser working at 10.6 μm, or 9.4 μm, and Carbon monoxide laser working at 2.6 to 4 μm or 4.8 to 8.3 μm) [Binder: col. 7, line 9-37]; (i.e. Some solid-state lasers can also be tunable using several intracavity techniques which employ etalons, prisms, and gratings, or a combination of these. Titanium-doped sapphire is widely used for its broad tuning range, 660 to 1080 nanometers. Alexandrite lasers are tunable from 700 to 820 nm, and they yield higher-energy pulses than titanium-sapphire lasers because of the gain medium's longer energy storage time and higher damage threshold. Ruby laser typically operates at 694.3 nm, Nd:YAG and NdCrYAG laser typically operates at 1.064 μm or 1.32 μm, Er:YAG laser typically operates at 2.94 μm, Neodymium YLF (Nd:YLF) solid-state laser typically operates at 1.047 and 1.053 μm, Neodymium doped Yttrium orthovanadate (Nd:YVO4) laser operates at 1.064 μm, Neodymium doped yttrium calcium oxoborate Nd:YCa4O(BO3)3 (Nd:YCOB) operates at ~1.060 μm or ~530 nm, Neodymium glass (Nd:Glass) laser typically operates at ~1.062 μm (Silicate glasses) or ~1.054 μm (Phosphate glasses), Titanium sapphire (Ti:sapphire) laser operates at 650-1100 nm, Thulium YAG (Tm:YAG) laser operates at 2.0 μm, Ytterbium YAG (Yb:YAG) laser operates at 1.03 μm, Ytterbium:2O3 (glass or ceramics) laser operates at 1.03 μm, Ytterbium doped glass laser (rod, plate/chip, and fiber) operates at 1.0 μm, Holmium YAG (Ho:YAG) laser operates at 2.1 μm, Chromium ZnSe (Cr:ZnSe) laser operates at 2.2-2.8 μm range, Cerium doped lithium strontium (or calcium) aluminum fluoride (Ce:LiSAF, Ce:LiCAF) operates at ~280 to 316 nm range, Promethium 147 doped phosphate glass (147Pm+3: Glass) solid-state laser operates at 933 nm or 1098 nm, Chromium doped chrysoberyl (alexandrite) laser operates at the range of 700 to 820 nm, and Erbium doped and erbium-ytterbium codoped glass lasers operate at 1.53-1.56 μm) [Binder: col. 7, line 64 – col. 8, line 28]).
Regarding claim 34, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein the light beam has a power within a range from 10W to 10kW (i.e. Carbon dioxide lasers, or CO2 lasers can emit hundreds of kilowatts at 9.6 μm and 10.6 μm) [Binder: col. 7, line 4-6].
Regarding claim 35, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein the control system comprises one or more mirrors (i.e. An embodiment uses a switchable mirror) [Binder: col. 96, line 33-34] for directing the optical path of the light beam (i.e. In the case of using light wave, the splitter 142 may consist of, comprise, or be based on, an optical beam splitter. Such an optical beam splitter may consist of, comprise, or be based on, two triangular glass prisms which are glued together at their base, a half-silvered mirror using a sheet of glass or plastic with a transparently thin coating of metal, a diffractive beam splitter, or a dichroic mirrored prism assembly which uses dichroic optical coatings. A polarizing beam splitter may consist of, comprise, or be based on Wollaston prism that use birefringent materials for splitting light into beams of differing polarization) [Binder: col. 212, line 20-30].
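As an illustrative aside on mirror-based beam steering, a two-axis mirror arrangement can be driven by computing pointing angles toward a ground-plane target. The geometry below (optics at an assumed height h above the field) and all names are hypothetical; no particular steering mechanism is attributed to the claims or to Binder.

```python
# Illustrative sketch of computing pointing angles for steering a beam to a
# ground-plane target (x, y) from optics mounted at height h above the field.
# Hypothetical geometry and names; for a reflective scanner, the physical
# mirror rotation is half of the computed beam deflection.

import math

def beam_angles_rad(x_m: float, y_m: float, h_m: float) -> tuple[float, float]:
    """Pan and tilt angles (radians) aiming the beam at (x, y) on the ground."""
    pan = math.atan2(x_m, h_m)                    # deflection about one axis
    tilt = math.atan2(y_m, math.hypot(x_m, h_m))  # deflection about the other
    return pan, tilt

# Example: target 0.3 m ahead and 0.1 m to the side, optics 1.0 m above ground.
print(beam_angles_rad(0.3, 0.1, 1.0))
```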
Regarding claim 36, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
further comprising a housing (i.e. the frame (on which the body is mounted)) [Binder: col. 69, line 35] containing the optical control system (i.e. a control system and a motor or tunable optical element to focus) [Binder: col. 63, line 50-51], wherein the housing comprises an escape portion configured for the light beam to pass through (i.e. Photons emitted into a mode of the waveguide will travel along the waveguide and be reflected several times from each end face before they exit. As a light wave passes through the cavity) [Binder: col. 9, line 34-37].
Regarding claim 37, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein the optical control system is configured to direct the light beam ((i.e. a control system and a motor or tunable optical element to focus) [Binder: col. 63, line 50-51]; (i.e. the conventional laser distance-measuring device can measure a straight distance of an object from the device) [Binder: col. 16, line 44-46; Fig. 5C]) toward the plant (i.e. a tree) [Binder: col. 16, line 50; Fig. 5C] while the system is moving relative to the field (i.e. imagery of the target is captured in both a still and moving format. Using a queuing mechanism for both distance and imagery data along with time stamps associated with each, a target's image, both in motion and still) [Binder: col. 63, line 18-22].
Regarding claim 38, Binder meets the claim limitations as follows.
A system (i.e. automotive electronic systems) [Binder: col. 70, line 35] for damaging or killing plants (i.e. heat in the form of IR that can cause damage to sensitive objects) [Binder: col. 18, line 9-10], the system (i.e. automotive electronic systems) [Binder: col. 70, line 35] comprising:a first camera ((i.e. CCD image sensors) [Binder: col. 22, line 22-23]; (i.e. a camera) [Binder: col. 45, line 3]) configured to capture images of plants in a field (((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64]);a second camera ((i.e. CCD image sensors) [Binder: col. 22, line 22-23]; (i.e. a camera) [Binder: col. 45, line 3]) configured to capture images of plants in the field (((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64]);a beam emitter configured to emit a light beam ((The actuator 501 may be used to activate or control the light emitted by a light source, being based on converting electrical energy or another energy to a light. The light emitted may be a visible light, or invisible light such as infrared, ultraviolet, X-ray or gamma rays. A shade, reflector, enclosing globe, housing, lens, and other accessories may be used, typically as part of a light fixture, in order to control the illumination intensity, shape or direction) [Binder: col. 263, line 61 – col. 264, line 62]; (i.e. The instrument comprising a light emitting unit for emitting a distance measuring light, a photodetecting unit for receiving and detecting a reflected distance measuring light from an object to be measured and a part of the distance measuring light emitted from the light emitting unit as an internal reference light, a sensitivity adjusting unit for electrically adjusting photodetecting sensitivity of the photodetecting unit, and a control arithmetic unit for calculating a measured distance based on a photodetection signal of the reflected distance measuring light from the photodetecting unit and based on a photodetection signal of the internal reference light, wherein the control arithmetic unit can measure a distance by selecting a prism mode measurement and a non-prism mode measurement, and controls so that photo-detecting sensitivity of the photo-detecting unit is changed by the sensitivity adjusting unit in response to the selected measurement mode) [Binder: col. 14, line 2-18]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]);an optical control system (i.e. control system) [Binder: col. 71, line 31] comprising one or more actuators (i.e. 
Any apparatus or device herein may further comprise an actuator that converts electrical energy to affect or produce a physical phenomenon, the actuator may be coupled to be operated, controlled, or activated, by the processor, in response to a value of the first distance, the second distance, the first angle, or any combination, manipulation, or function thereof) [Binder: col. 162, line 17-23]; (i.e. an ECU typically includes a relay, H-Bridge, injector, or logic drivers, or outputs for connecting to various actuators) [Binder: col. 73, line 47-49]) and one or more reflective elements (i.e. An embodiment uses a switchable mirror) [Binder: col. 96, line 33-34], wherein the control system (i.e. control system) [Binder: col. 71, line 31] is configured to direct an optical path of the light beam (i.e. In the case of using light wave, the splitter 142 may consist of, comprise, or be based on, an optical beam splitter. Such an optical beam splitter may consist of, comprise, or be based on, two triangular glass prisms which are glued together at their base, a half-silvered mirror using a sheet of glass or plastic with a transparently thin coating of metal, a diffractive beam splitter, or a dichroic mirrored prism assembly which uses dichroic optical coatings. A polarizing beam splitter may consist of, comprise, or be based on Wollaston prism that use birefringent materials for splitting light into beams of differing polarization) [Binder: col. 212, line 20-30]; (The actuator 501 may be used to activate or control the light emitted by a light source, being based on converting electrical energy or another energy to a light. The light emitted may be a visible light, or invisible light such as infrared, ultraviolet, X-ray or gamma rays. A shade, reflector, enclosing globe, housing, lens, and other accessories may be used, typically as part of a light fixture, in order to control the illumination intensity, shape or direction) [Binder: col. 263, line 61 – col. 264, line 62] ; (Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both) [Binder: col. 6, line 6-9]).(According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances) [Binder: col. 13, line 53-62]; (The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view” [Binder: col. 45, line 28-33]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]); anda computing system ((i.e. a processor) [Binder: col. 256, line 43]; (i.e. 
The block 263 further contains a digital image processor, which receives the digital data from the AFE, and processes this digital representation of the image) [Binder: col. 58, line 53-56]),
wherein the computing system is configured to perform operations comprising ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]):receiving (i.e. signal received from the sensor) [Binder: col. 2, line 66] a first image of at least one plant in the field captured by the first camera ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64]);identifying a plant in the first image ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (identifying at least a part of the field of view) [Binder: col. 45, line 33-34]; (i.e. a tree) [Binder: col. 16, line 50; Fig. 5C]);predicting a location of the plant based on the first image ((Most multi-sensor AF cameras allow manual selection of the active sensor, and many offer an automatic selection of the sensor using algorithms that attempt to discern the location of the subject) [Binder: col. 64, line 17-30]; (The gauge utilizes complementary simultaneous measurements based upon both Doppler and time of flight principles. A complete record can be produced of the location and shape of a target object even when the object has severe discontinuities such as the edges of a turbine blade.) [Binder: col. 47, line 55-60]; (i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]; (i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]);causing ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) the second camera to capture a second image of a region of the field including the predicted location (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 
242, line 62-64];

predicting a target location of the plant ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (The gauge utilizes complementary simultaneous measurements based upon both Doppler and time of flight principles. A complete record can be produced of the location and shape of a target object even when the object has severe discontinuities such as the edges of a turbine blade.) [Binder: col. 47, line 55-60]; (identifying at least a part of the field of view) [Binder: col. 45, line 33-34]; (i.e. a tree) [Binder: col. 16, line 50; Fig. 5C]; (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]) in the second image (displaying a captured image in the display) [Binder: col. 245, line 5];

causing the control system ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) to direct the optical path of the light beam ((i.e. In the case of using light wave, the splitter 142 may consist of, comprise, or be based on, an optical beam splitter. Such an optical beam splitter may consist of, comprise, or be based on, two triangular glass prisms which are glued together at their base, a half-silvered mirror using a sheet of glass or plastic with a transparently thin coating of metal, a diffractive beam splitter, or a dichroic mirrored prism assembly which uses dichroic optical coatings. A polarizing beam splitter may consist of, comprise, or be based on Wollaston prism that use birefringent materials for splitting light into beams of differing polarization) [Binder: col. 212, line 20-30]; (Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both) [Binder: col. 6, line 6-9]; (According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances) [Binder: col. 13, line 53-62]; (The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view) [Binder: col. 45, line 28-33]) toward the predicted target location ((i.e. Mobile LIDAR (also mobile laser scanning) is when two or more scanners are attached to a moving vehicle to collect data along a path. These scanners are usually paired with other kinds of equipment, including GNSS receivers and IMUs. One example application is surveying bordering trees, etc. all need to be taken into account) [Binder: col. 305, line 42-48]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]); and

causing ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) the light source to emit the light beam (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40] along the directed optical path ((Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both) [Binder: col. 6, line 6-9]; (According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances) [Binder: col. 13, line 53-62]; (The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view) [Binder: col. 45, line 28-33]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]) toward the predicted target location of the plant ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (i.e. Mobile LIDAR (also mobile laser scanning) is when two or more scanners are attached to a moving vehicle to collect data along a path. These scanners are usually paired with other kinds of equipment, including GNSS receivers and IMUs. One example application is surveying bordering trees, etc. all need to be taken into account) [Binder: col. 305, line 42-48]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]) for a length of time sufficient (i.e. One of the mirrors is a broadband reflector and the other mirror is wavelength selective so that gain is favored on a single longitudinal mode, resulting in lasing at a single resonant frequency. The broadband mirror is usually coated with a low reflectivity coating to allow emission. The wavelength selective mirror is a periodically structured diffraction grating with high reflectivity) [Binder: col. 10, line 34-41] to damage or kill the plant.
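For context only, the time-of-flight principle repeatedly quoted from Binder (col. 16, line 32-40) can be summarized in a short Python sketch; the function name and example values below are the editor's hypothetical illustration and are not taken from Binder or from the claims.

```python
# Minimal sketch of pulsed time-of-flight ranging as quoted from Binder:
# the recorded round-trip time of the laser pulse is multiplied by the
# propagation velocity, then halved because the pulse travels to the
# target and back. Names and values are illustrative only.

C = 299_792_458.0  # propagation velocity of light in vacuum (m/s)

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the target from a measured round-trip pulse time."""
    return C * round_trip_time_s / 2.0

# Example: an echo recorded 66.7 ns after emission corresponds to a
# target roughly 10 m away.
print(tof_distance(66.7e-9))  # ~10.0 (meters)
```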
Binder does not explicitly disclose the following claim limitations (Emphasis added).
causing the light source to emit the light beam along the directed optical path toward the predicted target location of the plant for a length of time sufficient to damage or kill the plant.
However, in the same field of endeavor, Schlemmer further discloses the deficient claim limitations as follows:
damaging or killing plants ((i.e. a system for semiautomatic and/or automatic weed removal) [Schlemmer: col. 1, line 19-20]; (i.e. By means of a laser, for example, weed removal can be performed (optically-) thermally. The energy applied by the laser allows weed to be removed precisely and quickly. Especially, it is conceivable to use a laser, by means of which the weed plant is at least partially vaporized or (optically-) thermally damaged in such a way that it dies) [Schlemmer: col. 4, line 10-16]).
at least one plant in the field captured by the first camera (i.e. wherein image data of at least one weed can be generated by means of the optical sensor and transmitted via the communication network to the server unit, which analyses the transmitted image data such that the weed can be determined) [Schlemmer: col. 2, line 9-13].
causing the control system to direct the optical path of the light beam toward the predicted target location (i.e. By means of a laser, for example, weed removal can be performed (optically-) thermally. The energy applied by the laser allows weed to be removed precisely and quickly. Especially, it is conceivable to use a laser, by means of which the weed plant is at least partially vaporized or (optically-) thermally damaged in such a way that it dies) [Schlemmer: col. 4, line 10-16].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder with Schlemmer to program the system to use the semi-automatic or automatic weed removal system of Schlemmer.
Therefore, the combination of Binder with Schlemmer will enable a lawn mowing system to kill weeds with an environmentally friendly approach [Schlemmer: col. 3, line 64 – col. 4, line 16].
In the same field of endeavor, Eguchi further discloses the claim limitations as follows:
causing the light source to emit the light beam along the directed optical path toward the predicted target location of the plant ((i.e. The scanning unit causes the light beams emitted from the plurality of optical paths of the first light guide to be incident on the object with the plurality of light beams being aligned such that a detection line is formed on the object) [Eguchi: col. 2, line 39-43]; (i.e. the scanning unit includes a deflector that deflects the plurality of light beams emitted from the tip of the plurality of optical paths of the first light guide toward the object with the plurality of beams aligned in parallel, and shifts the detection line in the direction perpendicular to the detection line with the plurality of beams remained to be aligned in parallel) [Eguchi: col. 2, line 52-58]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder and Schlemmer with Eguchi to implement the optical coherence tomography technique of Eguchi.
Therefore, the combination of Binder and Schlemmer with Eguchi will enable the system to utilize a tomography technique for object detection [Eguchi: col. 8, line 55-59].
Regarding claim 39, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein the one or more reflective elements comprises a first mirror controllable by a first actuator of the one or more actuators, and a second mirror controllable by a second actuator of the one or more actuators ((i.e. Among the inter-cavity doubled lasers,
VCSELs have shown much promise and potential to be the basis for a mass produced frequency doubled laser. A VECSEL is a vertical cavity, and is composed of two mirrors) [Binder: col. 290, line 58-62]; (i.e. a control system and a motor or tunable optical element to focus) [Binder: col. 63, line 50-51]).
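For illustration of the two-mirror, two-actuator steering arrangement recited in claim 39, the following Python sketch maps a desired target offset to a pair of mirror angles; the working geometry, the factor of two for mirror reflection, and all names are the editor's assumptions and are not disclosures of Binder.

```python
import math

# Illustrative sketch of steering a beam with two mirrors, each driven
# by its own actuator: the first mirror deflects the beam in X, the
# second in Y. A mirror rotated by theta deflects the reflected beam by
# 2*theta, so the commanded mirror angle is half the desired deflection.

def mirror_angles(x_m: float, y_m: float, distance_m: float) -> tuple[float, float]:
    """Return (first_mirror_rad, second_mirror_rad) to aim at (x, y)."""
    optical_x = math.atan2(x_m, distance_m)  # desired beam deflection in X
    optical_y = math.atan2(y_m, distance_m)  # desired beam deflection in Y
    return optical_x / 2.0, optical_y / 2.0  # mirror turns half the beam angle

# Example: aim 0.2 m right and 0.1 m up at a 1.5 m working distance.
ax, ay = mirror_angles(0.2, 0.1, 1.5)
print(math.degrees(ax), math.degrees(ay))
```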
Claims 23-25 and 40-41 are rejected under 35 U.S.C. 103 as being unpatentable over Binder (US Patent 11,255,663 B2), (“Binder”), in view of Schlemmer (US Patent 11,589,570 B2), (“Schlemmer”), in view of Eguchi et al. (US Patent 6,477,403 B1), (“Eguchi”), and further in view of Steinberg et al. (US Patent 10,776,639 B2), (“Steinberg”).
Regarding claim 23, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder further meets the claim limitations as follows.
wherein identifying a plant in the first image ((identifying at least a part of the field of view) [Binder: col. 45, line 33-34]; (i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]) is performed using a neural network.
Binder, Eguchi and Schlemmer do not explicitly disclose the following claim limitations (Emphasis added).
identifying a plant in the first image using a neural network.
In the same field of endeavor, Steinberg further discloses the claim limitations as follows:
identifying a plant in the first image using a neural network ((i.e. At step 1707, based on the classification information and the received measurements with the at least one associated confidence level, the at least one processor may identify a plurality of pixels as being associated with a particular object. As explained above, the at least one processor may identify the particular object using matching with the accessed classification information (e.g., similar to the matching of step 805 of method 800, described above). Additionally or alternatively, steps 1705 and 1707 may be performed with one or more neural networks such that the at least one processor may use the one or more neural networks to determine the best match (or a list of matches optionally including a confidence level for each match)) [Steinberg: col. 61, line 36-48]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder, Eguchi and Schlemmer with Steinberg to implement the neural network based object identification technique of Steinberg.
Therefore, the combination of Binder, Eguchi and Schlemmer with Steinberg will improve the performance of the system [Steinberg: col. 2, line 6-15].
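To make the matching step quoted from Steinberg (col. 61, line 36-48) concrete, here is a minimal Python sketch of comparing measured features against stored classification information and returning the best match with a confidence level; the feature vectors, class names, and confidence formula are invented for illustration only.

```python
# Illustrative sketch of a Steinberg-style matching step: measured
# features are compared against stored classification information and
# the best match is returned with a confidence level. All data below
# are hypothetical stand-ins.

from math import dist

CLASSIFICATION_INFO = {
    # class name -> reference feature vector (e.g., reflectivity, height)
    "plant": (0.65, 0.30),
    "soil":  (0.20, 0.02),
    "rock":  (0.40, 0.10),
}

def identify(measured: tuple[float, float]) -> tuple[str, float]:
    """Return (best matching class, confidence in [0, 1])."""
    distances = {name: dist(measured, ref)
                 for name, ref in CLASSIFICATION_INFO.items()}
    best = min(distances, key=distances.get)
    confidence = 1.0 / (1.0 + distances[best])  # 1.0 at a perfect match
    return best, confidence

print(identify((0.60, 0.28)))  # -> ('plant', ~0.95)
```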
Regarding claim 24, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 23. Binder further meets the claim limitations as follows.
wherein the neural network is configured to differentiate between (LIDAR technology is being used in robotics for the perception of the environment as well as object classification) [Binder: col. 305, line 64-66] a weed and a crop (i.e. the undesirable wrong determination of, for example, crop plants as weeds can be further reduced, resulting in additional protection of the plants not to be removed) [Schlemmer: col. 5, line 20-23].
Binder, Eguchi and Schlemmer do not explicitly disclose the following claim limitations (Emphasis added).
wherein the neural network is configured to differentiate between a weed and a crop.
In the same field of endeavor, Steinberg further discloses the claim limitations as follows:
wherein the neural network is configured to differentiate between a weed and a crop ((i.e. At step 1707, based on the classification information and the received measurements with the at least one associated confidence level, the at least one processor may identify a plurality of pixels as being associated with a particular object. As explained above, the at least one processor may identify the particular object using matching with the accessed classification information (e.g., similar to the matching of step 805 of method 800, described above). Additionally or alternatively, steps 1705 and 1707 may be performed with one or more neural networks such that the at least one processor may use the one or more neural networks to determine the best match (or a list of matches optionally including a confidence level for each match)) [Steinberg: col. 61, line 36-48]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder, Eguchi and Schlemmer with Steinberg to implement the neural network based object identification technique of Steinberg.
Therefore, the combination of Binder, Eguchi and Schlemmer with Steinberg will improve the performance of the system [Steinberg: col. 2, line 6-15].
Regarding claim 25, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 23. Binder further meets the claim limitations as follows.
comprises a convolutional neural network that is trained on images (digital images taken of the scanned area) [Binder: col. 305, line 35-36] of weeds and crops (i.e. the undesirable wrong determination of, for example, crop plants as weeds can be further reduced, resulting in additional protection of the plants not to be removed) [Schlemmer: col. 5, line 20-23].
Binder, Eguchi and Schlemmer do not explicitly disclose the following claim limitations (Emphasis added).
comprises a convolutional neural network that is trained on images of weeds and crops.
In the same field of endeavor, Steinberg further discloses the claim limitations as follows:
comprises a convolutional neural network ((i.e. The processor may apply any type of algorithm and/or hardware to achieve the above products, such as Computer Vision (CV), Machine Learning (ML), Convolutional Neural Network (CNN), Deep Learning (DL), Look Up Table (LUT), or the like) [Steinberg: col. 74, line 25-29]; (i.e. the "comparator" module may be implemented using a neural network (e.g., a CNN)) [Steinberg: col. 72, line 35-36]; (i.e. the feature extraction module may be implemented using a neural network (e.g., a convolutional neural network (CNN))) [Steinberg: col. 71, line 16-18]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder, Eguchi and Schlemmer with Steinberg to implement the neural network based object identification technique of Steinberg.
Therefore, the combination of Binder, Eguchi and Schlemmer with Steinberg will improve the performance of the system [Steinberg: col. 2, line 6-15].
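For illustration of a convolutional neural network trained on images of weeds and crops, a minimal PyTorch sketch follows; the architecture, input size, and the one random-data training step are the editor's assumptions and are not taken from any cited reference.

```python
import torch
import torch.nn as nn

# Minimal sketch of a binary weed-vs-crop convolutional classifier.
# Input: 3x64x64 RGB image patches of individual plants; output: two
# logits (index 0 = crop, index 1 = weed). All sizes are illustrative.

class WeedCropCNN(nn.Module):
    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # -> 16 x 32 x 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # -> 32 x 16 x 16
        )
        self.classifier = nn.Linear(32 * 16 * 16, 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# One illustrative training step on random tensors standing in for
# labeled weed/crop images.
model = WeedCropCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
images, labels = torch.randn(8, 3, 64, 64), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(images), labels)
loss.backward()
opt.step()
print(float(loss))
```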
Regarding claim 40, Binder and Schlemmer with Eguchi meet the claim limitations as set forth in claim 22. Binder and Schlemmer further meet the claim limitations as follows.
wherein identifying a plant in the first image ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (identifying at least a part of the field of view) [Binder: col. 45, line 33-34]) is performed using a neural network trained to differentiate between a weed and a crop ((LIDAR technology is being used in robotics for the perception of the environment as well as object classification) [Binder: col. 305, line 64-66]; (i.e. the undesirable wrong determination of, for example, crop plants as weeds can be further reduced, resulting in additional protection of the plants not to be removed) [Schlemmer: col. 5, line 20-23]).
Binder, Eguchi and Schlemmer do not explicitly disclose the following claim limitations (Emphasis added).
a neural network.
In the same field of endeavor, Steinberg further discloses the claim limitations as follows:
a neural network ((i.e. The processor may apply any type of algorithm and/or hardware to achieve the above products, such as Computer Vision (CV), Machine Learning (ML), Convolutional Neural Network (CNN), Deep Learning (DL), Look Up Table (LUT), or the like) [Steinberg: col. 74, line 25-29]; (i.e. the "comparator" module may be implemented using a neural network (e.g., a CNN)) [Steinberg: col. 72, line 35-36]; (i.e. the feature extraction module may be implemented using a neural network (e.g., a convolutional neural network (CNN))) [Steinberg: col. 71, line 16-18]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder, Eguchi and Schlemmer with Steinberg to implement the neural network based object identification technique of Steinberg.
Therefore, the combination of Binder, Eguchi and Schlemmer with Steinberg will improve the performance of the system [Steinberg: col. 2, line 6-15].
Regarding claim 41, Binder meets the claim limitations as follows.
A system (i.e. automotive electronic systems) [Binder: col. 70, line 35] for damaging or killing objects (i.e. heat in the form of IR that can cause damage to sensitive objects) [Binder: col. 18, line 9-10], the system (i.e. automotive electronic systems) [Binder: col. 70, line 35] comprising:

a first camera ((i.e. CCD image sensors) [Binder: col. 22, line 22-23]; (i.e. a camera) [Binder: col. 45, line 3]) configured to capture images of objects in a field (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64];

a second camera ((i.e. CCD image sensors) [Binder: col. 22, line 22-23]; (i.e. a camera) [Binder: col. 45, line 3]) configured to capture images of plants in the field ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64]);

a light source configured to emit a light beam ((The actuator 501 may be used to activate or control the light emitted by a light source, being based on converting electrical energy or another energy to a light. The light emitted may be a visible light, or invisible light such as infrared, ultraviolet, X-ray or gamma rays. A shade, reflector, enclosing globe, housing, lens, and other accessories may be used, typically as part of a light fixture, in order to control the illumination intensity, shape or direction) [Binder: col. 263, line 61 – col. 264, line 62]; (i.e. The instrument comprising a light emitting unit for emitting a distance measuring light, a photodetecting unit for receiving and detecting a reflected distance measuring light from an object to be measured and a part of the distance measuring light emitted from the light emitting unit as an internal reference light, a sensitivity adjusting unit for electrically adjusting photodetecting sensitivity of the photodetecting unit, and a control arithmetic unit for calculating a measured distance based on a photodetection signal of the reflected distance measuring light from the photodetecting unit and based on a photodetection signal of the internal reference light, wherein the control arithmetic unit can measure a distance by selecting a prism mode measurement and a non-prism mode measurement, and controls so that photo-detecting sensitivity of the photo-detecting unit is changed by the sensitivity adjusting unit in response to the selected measurement mode) [Binder: col. 14, line 2-18]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]);

a control system (i.e. control system) [Binder: col. 71, line 31] comprising one or more actuators (i.e. Any apparatus or device herein may further comprise an actuator that converts electrical energy to affect or produce a physical phenomenon, the actuator may be coupled to be operated, controlled, or activated, by the processor, in response to a value of the first distance, the second distance, the first angle, or any combination, manipulation, or function thereof) [Binder: col. 162, line 17-23], wherein the control system (i.e. control system) [Binder: col. 71, line 31] is configured to direct an optical path of the light beam ((The actuator 501 may be used to activate or control the light emitted by a light source, being based on converting electrical energy or another energy to a light. The light emitted may be a visible light, or invisible light such as infrared, ultraviolet, X-ray or gamma rays. A shade, reflector, enclosing globe, housing, lens, and other accessories may be used, typically as part of a light fixture, in order to control the illumination intensity, shape or direction) [Binder: col. 263, line 61 – col. 264, line 62]; (Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both) [Binder: col. 6, line 6-9]; (According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances) [Binder: col. 13, line 53-62]; (The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view) [Binder: col. 45, line 28-33]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]); and

a computing system ((i.e. a processor) [Binder: col. 256, line 43]; (i.e. The block 263 further contains a digital image processor, which receives the digital data from the AFE, and processes this digital representation of the image) [Binder: col. 58, line 53-56]), wherein the computing system is configured to perform operations comprising ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]):

receiving (i.e. signal received from the sensor) [Binder: col. 2, line 66] a first image of at least one object captured by the first camera (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64];

identifying an object in the first image (identifying at least a part of the field of view) [Binder: col. 45, line 33-34] using a neural network;

predicting a location of the object based on the first image ((Most multi-sensor AF cameras allow manual selection of the active sensor, and many offer an automatic selection of the sensor using algorithms that attempt to discern the location of the subject) [Binder: col. 64, line 17-30]; (The gauge utilizes complementary simultaneous measurements based upon both Doppler and time of flight principles. A complete record can be produced of the location and shape of a target object even when the object has severe discontinuities such as the edges of a turbine blade.) [Binder: col. 47, line 55-60]; (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]; (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b]);

causing ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) the second camera to capture a second image of a region including the predicted location (i.e. the object (or surface) sensed by the angle meter #1 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 62-64];

tracking motion of the second camera relative to the identified plant ((i.e. FIG. 5c depicts schematically a non-direct measuring of a height of a tree by an angle meter) [Binder: col. 180, line 57-58; Fig. 5C]; (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (i.e. The controller 268a serve as both the control block 61 used by the angle meter #1 55 and the controller 268 used by the digital camera 260, and the display 266a serve as both the display 63 used by the angle meter #1 55 and the display 266 used by the digital camera 260. Preferably, the measurement lines 51a and 51b are aligned with and parallel to the digital camera 260 optical axis 272, and may further be in close proximity thereto, so that the object (or surface) sensed by the angle meter #1 55 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 55-64; Fig. 27a-b]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]; (The correlator 19 is typically implemented using one of four predominant methods for active distance measurement: interferometric, triangulation, pulsed time-of-flight (TOF), and phase measuring. Interferometric methods may result in accuracies of less than one micrometer over ranges of up to several millimeters, while triangulation techniques may result in devices with accuracy in the micrometer range) [Binder: col. 3, line 38-44]);

predicting a target location of the object (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40] comprises correcting for the tracked motion of the second camera ((The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (i.e. The controller 268a serve as both the control block 61 used by the angle meter #1 55 and the controller 268 used by the digital camera 260, and the display 266a serve as both the display 63 used by the angle meter #1 55 and the display 266 used by the digital camera 260. Preferably, the measurement lines 51a and 51b are aligned with and parallel to the digital camera 260 optical axis 272, and may further be in close proximity thereto, so that the object (or surface) sensed by the angle meter #1 55 is the same object whose image is captured by the digital camera 260) [Binder: col. 242, line 55-64; Fig. 27a-b]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]; (The correlator 19 is typically implemented using one of four predominant methods for active distance measurement: interferometric, triangulation, pulsed time-of-flight (TOF), and phase measuring. Interferometric methods may result in accuracies of less than one micrometer over ranges of up to several millimeters, while triangulation techniques may result in devices with accuracy in the micrometer range) [Binder: col. 3, line 38-44]);

causing the control system ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) to direct the optical path of the light beam ((i.e. In the case of using light wave, the splitter 142 may consist of, comprise, or be based on, an optical beam splitter. Such an optical beam splitter may consist of, comprise, or be based on, two triangular glass prisms which are glued together at their base, a half-silvered mirror using a sheet of glass or plastic with a transparently thin coating of metal, a diffractive beam splitter, or a dichroic mirrored prism assembly which uses dichroic optical coatings. A polarizing beam splitter may consist of, comprise, or be based on Wollaston prism that use birefringent materials for splitting light into beams of differing polarization) [Binder: col. 212, line 20-30]; (Various optical components for beam shaping, deflection, or filtering such as lenses, wavelength filters, or mirrors may be provided and positioned as part of the optical transmission path or the optical reception path, or both) [Binder: col. 6, line 6-9]; (According to the invention, the receiver contains a light guide with a downstream opto-electronic transducer, in which the light guide inlet surface is arranged in the imaging plane of the reception object lens for long distances from the object and can be controllably moved from this position transversely to the optical axis. In an alternative embodiment, the light inlet surface is fixed and there are optical means outside the optical axis of the reception object lens, which for short object distances) [Binder: col. 13, line 53-62]; (The invention also includes a radiant energy source that works with the camera. The radiant energy source produces a beam of radiant energy and projects the beam during intermissions between readings. The beam produces a light pattern on an object within or near the camera's field of view) [Binder: col. 45, line 28-33]) toward the predicted target location ((i.e. Mobile LIDAR (also mobile laser scanning) is when two or more scanners are attached to a moving vehicle to collect data along a path. These scanners are usually paired with other kinds of equipment, including GNSS receivers and IMUs. One example application is surveying bordering trees, etc. all need to be taken into account) [Binder: col. 305, line 42-48]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]); and

causing ((i.e. a processor for executing the software) [Binder: col. 167, line 14-15]; (i.e. A non-transitory computer readable medium may include computer executable instructions stored thereon, and the instructions may include any of the steps) [Binder: col. 180, line 1-4]) the light source to emit the light beam (The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40] along the optical path (i.e. The pointing aid emitter produces a visible beam generally aligned with the optical axis of the camera objective lens such that the visible beam illuminates an object in the scene includes a scene measurement system that measures an aspect of the scene) [Binder: col. 45, line 6-11; Fig. 27b] toward the predicted target location of the object ((The microcontroller controls the radiation source to emit a modulated laser beam. The laser beam is received by the radiation receiver after being reflected by a target object, and is modulated by the microcontroller. The time that the laser beam takes during the journey is recorded, and is multiplied by a propagation velocity of the laser beam to determine the distance that the device is distant from the target object) [Binder: col. 16, line 32-40]; (i.e. Mobile LIDAR (also mobile laser scanning) is when two or more scanners are attached to a moving vehicle to collect data along a path. These scanners are usually paired with other kinds of equipment, including GNSS receivers and IMUs. One example application is surveying bordering trees, etc. all need to be taken into account) [Binder: col. 305, line 42-48]; (Measurements are made by determining the pulse time of flight of the distances of objects which respectively form a distance picture element and at which the transmission pulses are reflected) [Binder: col. 33, line 10-13]) for a length of time sufficient (i.e. One of the mirrors is a broadband reflector and the other mirror is wavelength selective so that gain is favored on a single longitudinal mode, resulting in lasing at a single resonant frequency. The broadband mirror is usually coated with a low reflectivity coating to allow emission. The wavelength selective mirror is a periodically structured diffraction grating with high reflectivity) [Binder: col. 10, line 34-41] to damage or kill the object;

causing the light source to be deactivated after the length of time sufficient to damage or kill the object.
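As a concrete illustration of the motion-correction step mapped above (predicting a target location comprises correcting for the tracked motion of the second camera), the Python sketch below shifts a predicted location by the displacement the camera accumulates between capture and firing; the constant-velocity model and all names are the editor's assumptions, not disclosures of the cited references.

```python
# Illustrative sketch of motion correction for a camera on a moving
# platform: the target location predicted from the second image is
# shifted by the displacement accumulated since that image was captured
# (e.g., from GNSS/IMU-style tracking). Constant velocity is assumed.

def corrected_target(predicted_xy: tuple[float, float],
                     camera_velocity_xy: tuple[float, float],
                     elapsed_s: float) -> tuple[float, float]:
    """Shift the prediction opposite to the camera's own motion."""
    px, py = predicted_xy
    vx, vy = camera_velocity_xy
    # In camera coordinates the scene appears to move opposite to the
    # platform, so subtract the platform displacement.
    return px - vx * elapsed_s, py - vy * elapsed_s

# Example: platform moving 1.0 m/s in +x, 50 ms capture-to-fire latency.
print(corrected_target((0.30, 0.10), (1.0, 0.0), 0.050))  # (0.25, 0.10)
```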
Binder does not explicitly disclose the following claim limitations (Emphasis added).
in the first image using a neural network; causing the light source to be deactivated after the length of time sufficient to damage or kill the object.
However, in the same field of endeavor, Schlemmer further discloses the deficient claim limitations as follows:
damaging or killing objects ((i.e. a system for semiautomatic and/or automatic weed removal) [Schlemmer: col. 1, line 19-20]; (i.e. By means of a laser, for example, weed removal can be performed (optically-) thermally. The energy applied by the laser allows weed to be removed precisely and quickly. Especially, it is conceivable to use a laser, by means of which the weed plant is at least partially vaporized or (optically-) thermally damaged in such a way that it dies) [Schlemmer: col. 4, line 10-16]).
at least one object in the field captured by the first camera (i.e. wherein image data of at least one weed can be generated by means of the optical sensor and transmitted via the communication network to the server unit, which analyses the transmitted image data such that the weed can be determined) [Schlemmer: col. 2, line 9-13].
causing the control system to direct the optical path of the light beam toward the predicted target location (i.e. By means of a laser, for example, weed removal can be performed (optically-) thermally. The energy applied by the laser allows weed to be removed precisely and quickly. Especially, it is conceivable to use a laser, by means of which the weed plant is at least partially vaporized or (optically-) thermally damaged in such a way that it dies) [Schlemmer: col. 4, line 10-16].
causing the light source to be deactivated (i.e. Other circuits are designed to communicate through a serial port with a host controller in order to set switches on or off) [Schlemmer: col. 197, line 15-17] after a length of time sufficient (i.e. starting of emitting an energy by an emitter 11, and ending after a set time interval) [Schlemmer: col. 200, line 12-14] to damage or kill the object (i.e. By means of a laser, for example, weed removal can be performed (optically-) thermally. The energy applied by the laser allows weed to be removed precisely and quickly. Especially, it is conceivable to use a laser, by means of which the weed plant is at least partially vaporized or (optically-) thermally damaged in such a way that it dies) [Schlemmer: col. 4, line 10-16].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder with Schlemmer to program the system to use the semi-automatic or automatic weed removal system of Schlemmer.
Therefore, the combination of Binder with Schlemmer will enable a lawn mowing system to kill weeds with an environmentally friendly approach [Schlemmer: col. 3, line 64 – col. 4, line 16].
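A minimal Python sketch of the timed emission described in the cited Schlemmer passages (start emission, end after a set time interval sufficient to damage or kill the weed, then deactivate) is given below; the LaserDriver interface and the dwell time are hypothetical illustrations, not Schlemmer's disclosure.

```python
import time

# Illustrative sketch of timed laser actuation: start emission, hold for
# a set interval chosen to be sufficient to damage or kill the weed,
# then deactivate. The driver interface and dwell time are invented.

class LaserDriver:
    def activate(self) -> None:
        print("laser on")

    def deactivate(self) -> None:
        print("laser off")

def treat_weed(driver: LaserDriver, dwell_s: float) -> None:
    """Emit for dwell_s seconds, then always deactivate."""
    driver.activate()
    try:
        time.sleep(dwell_s)   # stand-in for the set time interval
    finally:
        driver.deactivate()   # guarantee deactivation afterwards

treat_weed(LaserDriver(), dwell_s=0.2)
```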
In the same field of endeavor, Eguchi further discloses the claim limitations as follows:
causing the light source to emit the light beam along the directed optical path toward the predicted target location of the object ((i.e. The scanning unit causes the light beams emitted from the plurality of optical paths of the first light guide to be incident on the object with the plurality of light beams being aligned such that a detection line is formed on the object) [Eguchi: col. 2, line 39-43]; (i.e. the scanning unit includes a deflector that deflects the plurality of light beams emitted from the tip of the plurality of optical paths of the first light guide toward the object with the plurality of beams aligned in parallel, and shifts the detection line in the direction perpendicular to the detection line with the plurality of beams remained to be aligned in parallel) [Eguchi: col. 2, line 52-58]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder and Schlemmer with Eguchi to implement the optical coherence tomography technique of Eguchi.
Therefore, the combination of Binder and Schlemmer with Eguchi will enable the system to utilize a tomography technique for object detection [Eguchi: col. 8, line 55-59].
Binder, Eguchi and Schlemmer do not explicitly disclose the following claim limitations (Emphasis added).
identifying an object in the first image using a neural network.
In the same field of endeavor, Steinberg further discloses the claim limitations as follows:
identifying an object in the first image using a neural network ((i.e. At step 1707, based on the classification information and the received measurements with the at least one associated confidence level, the at least one processor may identify a plurality of pixels as being associated with a particular object. As explained above, the at least one processor may identify the particular object using matching with the accessed classification information (e.g., similar to the matching of step 805 of method 800, described above). Additionally or alternatively, steps 1705 and 1707 may be performed with one or more neural networks such that the at least one processor may use the one or more neural networks to determine the best match (or a list of matches optionally including a confidence level for each match)) [Steinberg: col. 61, line 36-48]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Binder, Eguchi and Schlemmer with Steinberg to implement the neural network based object identification technique of Steinberg.
Therefore, the combination of Binder, Eguchi and Schlemmer with Steinberg will improve the performance of the system [Steinberg: col. 2, line 6-15].
Reference Notice
Additional prior art references, included in the Notice of References Cited, made of record and not relied upon, are considered pertinent to applicant's disclosure.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Philip Dang, whose telephone number is (408) 918-7529. The examiner can normally be reached Monday-Thursday, 8:30 am to 5:00 pm (PST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sath Perungavoor can be reached on 571-272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Philip P. Dang/
Primary Examiner, Art Unit 2488