Prosecution Insights
Last updated: April 19, 2026
Application No. 18/015,924

SYSTEM AND METHOD FOR GEOLOCATION OF AN OBJECT IN WATER

Status: Final Rejection (§103)
Filed: Jan 12, 2023
Examiner: ZAK, JACQUELINE ROSE
Art Unit: 2666
Tech Center: 2600 — Communications
Assignee: Witted Srl
OA Round: 2 (Final)

Grant Probability: 67% (Favorable)
OA Rounds: 3-4
To Grant: 2y 10m
With Interview: 55%

Examiner Intelligence

Career Allow Rate: 67% (8 granted / 12 resolved; above average, +4.7% vs TC avg)
Interview Lift: -11.4% (allowance rate of resolved cases with vs. without interview)
Avg Prosecution: 2y 10m (typical timeline)
Total Applications: 58 across all art units; 46 currently pending

Statute-Specific Performance

§101: 5.7% (-34.3% vs TC avg)
§103: 56.3% (+16.3% vs TC avg)
§102: 21.1% (-18.9% vs TC avg)
§112: 13.8% (-26.2% vs TC avg)

Tech Center averages are estimates • Based on career data from 12 resolved cases
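The headline figures above are internally consistent with the raw counts shown. A quick sketch reproduces them; the variable names are illustrative, and it assumes the with-interview probability is simply the base rate plus the (negative) interview lift, which may not be how the dashboard's actual model works:

```python
# Reproduce the dashboard's headline figures from the counts shown above.
# Assumption: "With Interview" = base rate + interview lift (model unknown).

granted, resolved = 8, 12

career_allow_rate = granted / resolved        # 0.6667 -> displayed as 67%
with_interview = career_allow_rate + (-0.114) # 0.5527 -> displayed as 55%

print(f"Career allow rate: {career_allow_rate:.0%}")  # Career allow rate: 67%
print(f"With interview: {with_interview:.0%}")        # With interview: 55%
```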

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Status

Claims 1-16 are pending for examination in the application filed 11/12/2025. Claims 1-16 have been amended.

Priority

Acknowledgement is made of Applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent application IT102020000017104 filed on 07/14/2020. Acknowledgement is additionally made of the present application as a national stage entry of PCT/EP2021/069497, international filing date: 07/13/2021.

Response to Arguments and Amendments

Applicant’s arguments with respect to claims 1-16 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument, as facilitated by the newly added amendments.

Claim Objections

Claim 8 is objected to because of the following informalities: claim 8 recites the limitation “wherein the processing unit is configured to predict a next position of the light beam in the two-dimensional image captured by the camera…”. “Processing unit” should be “processor” in accordance with the amended claim language of claim 1. Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-6, 9-14, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Jongsma (US20170328982A1) in view of Sheldon (US20200056578A1).
Regarding claim 1, Jongsma teaches a system for geolocation of an object in water, ([0001] The present invention relates to systems and methods for positioning in an underwater environment and more particularly to a system of beacons for such use. The invention also relates to a subsea metrology system for determining the relative positions and orientations of two objects) the system comprising: a first device (beacon) configured to be immersed in, or to float on, water, the first device comprising a light source configured to emit a light beam ([0011] the system comprising: at least one beacon having a light source, located at a fixed position within the reference frame. [0015] beacons with light sources may be deployed at fixed positions with respect to the fixed reference frame e.g. on the seabed and/or in/on objects that are assumed to remain stationary with respect to the seabed. In alternative implementations, underwater positioning systems may be configured for providing relative positioning information for a rover with respect to a dynamic reference frame. 
A dynamic reference frame may for example be associated with another rover or leader vehicle that is provided with at least one beacon light source, and which is moveable underwater); a second device (ROV underwater imaging device) configured to be immersed in, or to float on, water, the second device comprising a camera configured to take two-dimensional images and a measuring device configured to provide an orientation of the camera relative to a main reference frame defined by three orthogonal axes ([0011] the system comprising: …an underwater imaging device, moveable with the rover in the reference frame to observe the light source from different viewpoints and determine direction data representing a direction or change in direction of the light source with respect to the imaging device; an orientation sensor, associated with the imaging device to determine an orientation of the imaging device with respect to the reference frame and generate orientation data; and a scaling element for providing scaling data representative of a distance between the imaging device and the light source. [0119] A reference frame XYZ is defined with respect to one of the two beacons 504, and this reference frame may be assumed to remain fixed with respect to the body of water 508); a processor operatively connected to at least the camera (Fig. 10. [0121] The ROV 502 further includes an orientation sensor 514, a processor 516, and a communications interface 518), the processor being configured to: determine a vertical distance between the first device and the second device by depth obtained from a depth gauge ([0048] a scaling element for providing scaling data representative of a distance between the imaging device and the light source. [0022] In another embodiment, the scaling element may comprise a depth sensor associated with the imaging device and capable of resolving changes in depth thereof. 
In this manner, movement in the Z direction between a first viewpoint and a second viewpoint may be determined and used alone or together with other data to evaluate the scaling data), capture a two-dimensional image of the first device through the camera ([0103] The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4), calculate the pixel position in the two-dimensional image of light beam emitted by the light source of the first device ([0070] In embodiments, the beacon may be provided with a processor configured for determining the position information for the rover from the direction data and the scaling data. Alternatively, the processor may only be configured to determine pixel coordinates of detected light sources in images acquired by the imaging device, and to store these coordinates for further use and/or transmission to the rover), and calculate one of (i): a position of the first device relative to the main reference frame based on the pixel position of the light beam, the orientation of the camera, a position of the second device relative to the main reference frame, and the vertical distance, and (ii) a position of the second device relative to the main reference frame based on the pixel position of the light beam, the orientation of the camera, a position of the first device relative to the main reference frame, and the vertical distance ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. 
The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse. The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature. Although not further discussed it will be understood that all additional readings required for performing such ranging will be provided either from sensors aboard the ROV or elsewhere. Once bearing and range are determined for the beacon 4 relative to the ROV, the Δx and Δy offsets from the ROV 2 to the beacon 4 can be evaluated. It will thus be understood that for a fixed location of the beacon 4, the position of the ROV 2 can be established. Conversely, if the position of the ROV 2 is known, the location of the beacon 4 may be established).

Jongsma does not teach determine a vertical distance between the first device and the second device by subtracting a depth of the first device in the water and a depth of the second device in the water, each of the depth of the first device in the water and the depth of the second device in the water being obtained from a respective depth gauge included in the respective device.

Sheldon, in the same field of endeavor of underwater depth analysis, teaches determine a vertical distance between the first device and the second device by subtracting a depth of the first device in the water and a depth of the second device in the water, each of the depth of the first device in the water and the depth of the second device in the water being obtained from a respective depth gauge included in the respective device (See Fig. 247. [1680] The ROV also incorporates a pressure and/or depth sensor that allows it to determine its depth, at least to an approximate degree.
[1682] The embodiment possesses stereoscopic cameras, and other sensors, mounted to an upper portion of the buoy 2150 that allow it to calculate the height of its own waterline, and/or the draft or depth of its inertial water tube 2158. Subtracting this tube depth from the ROV depth determined by the ROV's pressure and/or depth sensor (not shown) permits the embodiment to determine the “tube-relative depth” of the ROV relative to the bottom of its inertial water tube 2158 and/or to the tube-mounted acoustic sensor 2172 mounted thereto).

Therefore, it would have been obvious to a person of ordinary skill in the art at the time that the invention was made to modify the system of Jongsma with the teachings of Sheldon to subtract the depth of the first device and second device to determine a distance between the devices because "This enables the embodiment's control system (not shown) to make adjustments to the ROV's position (e.g., through commands to the ROV 2157 causing the ROV to execute specific thrust vectors relative to its own longitudinal axis and/or geometry)" [Sheldon 1683].

Regarding claim 2, Jongsma and Sheldon teach the system of claim 1. Jongsma further teaches wherein the position of the second device or the first device used to calculate the position of the first device or the second device, respectively, is stored in a data storage or provided by a position device ([0025] The system preferably comprises a processor arranged to receive and analyze the direction data and the scaling data to determine the position information), and wherein the depth of the first device in the water is stored in the data storage or measured by a depth gauge comprised in the first device ([0103] Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse.
The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature) and the depth of the second device in the water is stored in the data storage or measured by a depth gauge comprised in the second device ([0048] a scaling element for providing scaling data representative of a distance between the imaging device and the light source. [0022] In another embodiment, the scaling element may comprise a depth sensor associated with the imaging device and capable of resolving changes in depth thereof. In this manner, movement in the Z direction between a first viewpoint and a second viewpoint may be determined and used alone or together with other data to evaluate the scaling data).

Regarding claim 3, Jongsma and Sheldon teach the system of claim 2. Jongsma further teaches wherein the processor is operatively connected to one of the data storage and the position device to obtain one of the position of the second device and the position of the first device to calculate one of the position of the first device and the position of the second device, respectively ([0025] The system preferably comprises a processor arranged to receive and analyze the direction data and the scaling data to determine the position information), and wherein the processor is operatively connected to one of: (i) the data storage or the depth gauge of the first device to obtain the depth of the first device in the water ([0103] Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse.
The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature), and (ii) the data storage or the depth gauge of the second device to obtain the depth of the second device in the water to determine the vertical distance ([0121] The ROV 502 further includes an orientation sensor 514, a processor 516, and a communications interface 518. At least one from an acoustic transceiver 512, an INS 513, and a depth sensor 515 may also be present).

Regarding claim 4, Jongsma and Sheldon teach the system of claim 3. Jongsma further teaches wherein the position device comprises at least one of an absolute position sensor, a real-time kinematic (RTK) positioning system, a mobile-phone tracking system, a real-time locating system based on radio, optical or ultrasonic technology, and a positioning system based on methods of underwater acoustic positioning, wherein the first device or the second device is provided with the position device ([0020] According to another embodiment of the invention, the system further comprises an Inertial Navigation System (INS) associated and moveable together with the imaging device. INS's are generally conventional devices, used to provide relative and absolute local orientation and position information…Additionally, the INS should preferably be an aided INS, in that it is provided with additional inputs to improve the INS accuracy. This may include hydro acoustic positioning, a depth sensor providing an absolute depth measurement and/or a verticality sensor, providing orientation with respect to the earth's gravitational field and the USBL, SBL and DVL systems mentioned above).

Regarding claim 5, Jongsma and Sheldon teach the system of claim 1.
Jongsma further teaches wherein the first device comprises a first controller ([0034] The method may be implemented entirely by a suitable control device) connected to the light source to modulate the light beam so that the light beam transmits information about one of the position and a depth of the first device ([0112] The beacon 404 includes three light sources 430a, 430b, 430c, which are adjacently located at predetermined positions to define an origin OR of a local beacon reference frame. The three light sources 430a-c are mounted on a base 422, and are each adapted for projecting a respective beam of light 432a, 432b, 432c into the water 408, in a predetermined direction away from the origin OR of the local reference frame. [0114] Preferably, each of the three light sources 430a-c generates light of a different wavelength range. Alternatively or in addition, the three light sources 430a-c may be configured to flash in different time-patterns, to allow the processor in the rover to resolve the orientation of the projected (local) reference frame) and wherein the second device comprises an optical sensor configured to detect the light beam, the processor being connected to the optical sensor to obtain one of the position and depth of the first device based on the light beam detected by the optical sensor ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N).

Regarding claim 6, Jongsma and Sheldon teach the system of claim 1.
Jongsma further teaches wherein one of the first device and the second device has an acoustic emitter configured to emit an acoustic signal which represents one of the position and depth of the respective device ([0071] In further embodiments, the beacon may comprise a communication device arranged for wireless transmission of the position information and/or the determined pixel coordinates of detected light sources through the body of water to the rover. At least one of acoustic transmission, optical transmission, and electromagnetic transmission techniques may be used for conveying the position information from the beacon to the rover) and the other of the first device and the second device has an acoustic receiver configured to receive the acoustic signal emitted by the acoustic emitter, the processor being connected to the acoustic receiver to obtain one of the position and the depth of the one of the first device and the second device based on the signal received by the acoustic receiver ([0019] In an alternative arrangement, the scaling element may comprise an acoustic transponder located at a known position in or on the beacon with respect to the light source and a corresponding acoustic transceiver associated with the imaging device. By appropriate triangulation between the respective light sources and with the addition of a single distance measurement from one light source to the imaging device, the overall scale of the reference frame can be determined).

Regarding claim 9, Jongsma and Sheldon teach the system of claim 1.
Jongsma further teaches wherein the first device comprises at least two light sources configured to emit respective light beams, the distance between each couple of light sources being fixed ([0112] The beacon 404 includes three light sources 430a, 430b, 430c, which are adjacently located at predetermined positions to define an origin OR of a local beacon reference frame. The three light sources 430a-c are mounted on a base 422, and are each adapted for projecting a respective beam of light 432a, 432b, 432c into the water 408, in a predetermined direction away from the origin OR of the local reference frame), and wherein the processor is configured to: calculate the pixel position in the two-dimensional image of each light beam emitted by the at least two light sources of the first device, and one of (i): calculate the position of each of the at least two light sources relative to the main reference frame ([0070] the processor may only be configured to determine pixel coordinates of detected light sources in images acquired by the imaging device, and to store these coordinates for further use and/or transmission to the rover) based on the pixel position of the relevant light beam, the orientation of the camera, a position of the second device relative to the main reference frame and the vertical distance ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse.
The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature. Although not further discussed it will be understood that all additional readings required for performing such ranging will be provided either from sensors aboard the ROV or elsewhere. Once bearing and range are determined for the beacon 4 relative to the ROV, the Δx and Δy offsets from the ROV 2 to the beacon 4 can be evaluated. It will thus be understood that for a fixed location of the beacon 4, the position of the ROV 2 can be established. Conversely, if the position of the ROV 2 is known, the location of the beacon 4 may be established. In FIG. 2, the situation is illustrated for a two-dimensional configuration in which, for simplicity, only bearing is taken into consideration. It will be understood that in practice, elevation will also be taken into account and the Δz value will also be determined), and determine the orientation of the first device relative to the second device based on the calculated positions of the at least two light sources ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse. The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature. 
Although not further discussed it will be understood that all additional readings required for performing such ranging will be provided either from sensors aboard the ROV or elsewhere. Once bearing and range are determined for the beacon 4 relative to the ROV, the Δx and Δy offsets from the ROV 2 to the beacon 4 can be evaluated. It will thus be understood that for a fixed location of the beacon 4, the position of the ROV 2 can be established. Conversely, if the position of the ROV 2 is known, the location of the beacon 4 may be established), and (ii) calculate the position of each of the at least two light sources relative to the main reference frame based on the pixel position of the relevant light beam, an orientation of a rigid surface of the first device relative to the main reference frame, a position of the first device relative to the main reference frame and the vertical distance, and determine the orientation of the second device relative to the first device based on the calculated positions of the at least two light sources.

Regarding claim 10, Jongsma teaches a method for geolocation of an object in water, ([0001] The present invention relates to systems and methods for positioning in an underwater environment and more particularly to a system of beacons for such use. The invention also relates to a subsea metrology system for determining the relative positions and orientations of two objects) the method comprising: putting a first device (beacon) into water, the first device comprising a light source configured to emit a light beam ([0011] the system comprising: at least one beacon having a light source, located at a fixed position within the reference frame. [0015] beacons with light sources may be deployed at fixed positions with respect to the fixed reference frame e.g. on the seabed and/or in/on objects that are assumed to remain stationary with respect to the seabed.
In alternative implementations, underwater positioning systems may be configured for providing relative positioning information for a rover with respect to a dynamic reference frame. A dynamic reference frame may for example be associated with another rover or leader vehicle that is provided with at least one beacon light source, and which is moveable underwater); putting a second device (ROV underwater imaging device) into the water, the second device comprising a camera configured to take images; emitting the light beam by the light source; obtaining an orientation of the camera relative to a main reference frame defined by three orthogonal axes ([0011] the system comprising: …an underwater imaging device, moveable with the rover in the reference frame to observe the light source from different viewpoints and determine direction data representing a direction or change in direction of the light source with respect to the imaging device; an orientation sensor, associated with the imaging device to determine an orientation of the imaging device with respect to the reference frame and generate orientation data; and a scaling element for providing scaling data representative of a distance between the imaging device and the light source. [0119] A reference frame XYZ is defined with respect to one of the two beacons 504, and this reference frame may be assumed to remain fixed with respect to the body of water 508); obtaining a depth of the first device and a depth of the second device in the water; determining a vertical distance between the first device and the second device ([0048] a scaling element for providing scaling data representative of a distance between the imaging device and the light source. [0022] In another embodiment, the scaling element may comprise a depth sensor associated with the imaging device and capable of resolving changes in depth thereof. 
In this manner, movement in the Z direction between a first viewpoint and a second viewpoint may be determined and used alone or together with other data to evaluate the scaling data), capturing a two-dimensional image of the first device through the camera ([0103] The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4), calculating the pixel position in the two-dimensional image of the light beam emitted by the light source of the first device ([0070] In embodiments, the beacon may be provided with a processor configured for determining the position information for the rover from the direction data and the scaling data. Alternatively, the processor may only be configured to determine pixel coordinates of detected light sources in images acquired by the imaging device, and to store these coordinates for further use and/or transmission to the rover), and one of: (i) obtaining a position of the second device relative to the main reference frame and calculating a position of the first device relative to the main reference frame based on the pixel position of the light beam, the orientation of the camera, the position of the second device relative to the main reference frame and the vertical distance, and (ii) obtaining a position of the first device relative to the main reference frame and calculating a position of the second device relative to the main reference frame based on the pixel position of the light beam, the orientation of the camera, the position of the first device relative to the main reference frame and the vertical distance ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. 
Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse. The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature. Although not further discussed it will be understood that all additional readings required for performing such ranging will be provided either from sensors aboard the ROV or elsewhere. Once bearing and range are determined for the beacon 4 relative to the ROV, the Δx and Δy offsets from the ROV 2 to the beacon 4 can be evaluated. It will thus be understood that for a fixed location of the beacon 4, the position of the ROV 2 can be established. Conversely, if the position of the ROV 2 is known, the location of the beacon 4 may be established).

Jongsma does not teach determining a vertical distance between the first device and the second device by subtracting the depth of the first device in the water and the depth of the second device in the water, each of the depth of the first device in the water and the depth of the second device in the water being obtained from a respective depth gauge included in the corresponding device.

Sheldon, in the same field of endeavor of underwater depth analysis, teaches determining a vertical distance between the first device and the second device by subtracting the depth of the first device in the water and the depth of the second device in the water, each of the depth of the first device in the water and the depth of the second device in the water being obtained from a respective depth gauge included in the corresponding device (See Fig. 247. [1680] The ROV also incorporates a pressure and/or depth sensor that allows it to determine its depth, at least to an approximate degree.
[1682] The embodiment possesses stereoscopic cameras, and other sensors, mounted to an upper portion of the buoy 2150 that allow it to calculate the height of its own waterline, and/or the draft or depth of its inertial water tube 2158. Subtracting this tube depth from the ROV depth determined by the ROV's pressure and/or depth sensor (not shown) permits the embodiment to determine the “tube-relative depth” of the ROV relative to the bottom of its inertial water tube 2158 and/or to the tube-mounted acoustic sensor 2172 mounted thereto). Therefore, it would have been obvious to a person of ordinary skill in the art at the time that the invention was made to modify the method of Jongsma with the teachings of Sheldon to subtract the depth of the first device and second device to determine a distance between the devices because "This enables the embodiment's control system (not shown) to make adjustments to the ROV's position (e.g., through commands to the ROV 2157 causing the ROV to execute specific thrust vectors relative to its own longitudinal axis and/or geometry)" [Sheldon 1683]. Regarding claim 11, Jongsma and Sheldon teach the method of claim 10. Jongsma further teaches wherein the position of the second device or the first device used to calculate the position of the first device or the second device, respectively, is stored in a data storage or provided by a position device ([0025] The system preferably comprises a processor arranged to receive and analyze the direction data and the scaling data to determine the position information), and wherein the depth of the first device in the water is stored in the data storage or measured by a depth gauge comprised in the first device ([0103] Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse. 
The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature) and the depth of the second device in the water is stored in the data storage or measured by a depth gauge comprised in the second device ([0048] a scaling element for providing scaling data representative of a distance between the imaging device and the light source. [0022] In another embodiment, the scaling element may comprise a depth sensor associated with the imaging device and capable of resolving changes in depth thereof. In this manner, movement in the Z direction between a first viewpoint and a second viewpoint may be determined and used alone or together with other data to evaluate the scaling data). Regarding claim 12, Jongsma and Sheldon teach the method of claim 11. Jongsma further teaches wherein the position device comprises at least one of an absolute position sensor, a real-time kinematic (RTK) positioning system, a mobile-phone tracking, a real-time locating system based on radio, optical or ultrasonic technology, and a positioning system based on methods of underwater acoustic positioning, wherein the first device or the second device is provided with the position device ([0020] According to another embodiment of the invention, the system further comprises an Inertial Navigation System (INS) associated and moveable together with the imaging device. INS's are generally conventional devices, used to provide relative and absolute local orientation and position information…Additionally, the INS should preferably be an aided INS, in that it is provided with additional inputs to improve the INS accuracy. This may include hydro acoustic positioning, a depth sensor providing an absolute depth measurement and/or a verticality sensor, providing orientation with respect to the earth's gravitational field and the USBL, SBL and DVL systems mentioned above). Regarding claim 13, Jongsma and Sheldon teach the method of claim 10. 
Jongsma further teaches wherein the obtaining the position or the obtaining the depth of the first device comprises: modulating the emitted light beam so that the light beam transmits information about the position or depth of the first device, detecting the light beam by an optical sensor ([0112] The beacon 404 includes three light sources 430a, 430b, 430c, which are adjacently located at predetermined positions to define an origin OR of a local beacon reference frame. The three light sources 430a-c are mounted on a base 422, and are each adapted for projecting a respective beam of light 432a, 432b, 432c into the water 408, in a predetermined direction away from the origin OR of the local reference frame. [0114] Preferably, each of the three light sources 430a-c generates light of a different wavelength range. Alternatively or in addition, the three light sources 430a-c may be configured to flash in different time-patterns, to allow the processor in the rover to resolve the orientation of the projected (local) reference frame) and determining the position or depth of the first device based on the light beam detected by the optical sensor ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N). Regarding claim 14, Jongsma teaches the method of claim 10. 
Jongsma further teaches wherein the obtaining the position or the obtaining the depth of at least one of the first device and the second device in the water comprises: emitting an acoustic or electric signal which represents the position or depth of one between the first device or the second device ([0071] In further embodiments, the beacon may comprise a communication device arranged for wireless transmission of the position information and/or the determined pixel coordinates of detected light sources through the body of water to the rover. At least one of acoustic transmission, optical transmission, and electromagnetic transmission techniques may be used for conveying the position information from the beacon to the rover), receiving the acoustic or electric signal, and determining the position or depth of the one between the first device or the second device based on the received acoustic or electric signal ([0019] In an alternative arrangement, the scaling element may comprise an acoustic transponder located at a known position in or on the beacon with respect to the light source and a corresponding acoustic transceiver associated with the imaging device. By appropriate triangulation between the respective light sources and with the addition of a single distance measurement from one light source to the imaging device, the overall scale of the reference frame can be determined). Regarding claim 16, Jongsma teaches the method of claim 10. 
Jongsma further teaches wherein the first device comprises at least two light sources configured to emit respective light beams, the distance between each pair of light sources being fixed ([0112] The beacon 404 includes three light sources 430a, 430b, 430c, which are adjacently located at predetermined positions to define an origin OR of a local beacon reference frame. The three light sources 430a-c are mounted on a base 422, and are each adapted for projecting a respective beam of light 432a, 432b, 432c into the water 408, in a predetermined direction away from the origin OR of the local reference frame), the method further comprising: calculating the pixel position in the two-dimensional image of each light beam emitted by each light source of the first device; and one of: (i) calculating the position of each of the at least two light sources relative to the main reference frame ([0070] the processor may only be configured to determine pixel coordinates of detected light sources in images acquired by the imaging device, and to store these coordinates for further use and/or transmission to the rover) based on the pixel position of the respective light beam, the orientation of the camera, a position of the second device relative to the main reference frame and the vertical distance ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse. 
The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature. Although not further discussed it will be understood that all additional readings required for performing such ranging will be provided either from sensors aboard the ROV or elsewhere. Once bearing and range are determined for the beacon 4 relative to the ROV, the Δx and Δy offsets from the ROV 2 to the beacon 4 can be evaluated. It will thus be understood that for a fixed location of the beacon 4, the position of the ROV 2 can be established. Conversely, if the position of the ROV 2 is known, the location of the beacon 4 may be established. In FIG. 2, the situation is illustrated for a two-dimensional configuration in which, for simplicity, only bearing is taken into consideration. It will be understood that in practice, elevation will also be taken into account and the Δz value will also be determined), and determining the orientation of the first device relative to the second device based on the calculated positions of the at least two light sources ([0103] Operation of the positioning system 1 will now be explained with reference to FIG. 2, which shows a schematic plan view of the ROV 2 and beacon 4. The ROV 2 has its axis AR directed at a heading H with respect to North N. The processor 16 controls operation of the camera 10 to produce a photogrammetric image of the light source 30 on beacon 4. Based on the reading, the processor can calculate the bearing B to the light source 30 and its angle a with respect to North N. The processor 16 also interrogates the beacon 4 using the transceiver 12 to pulse the transponder 28 and detect a returned pulse. The transmission time is converted into a range R using conventional ranging techniques for the given water depth and temperature. 
Although not further discussed it will be understood that all additional readings required for performing such ranging will be provided either from sensors aboard the ROV or elsewhere. Once bearing and range are determined for the beacon 4 relative to the ROV, the Δx and Δy offsets from the ROV 2 to the beacon 4 can be evaluated. It will thus be understood that for a fixed location of the beacon 4, the position of the ROV 2 can be established. Conversely, if the position of the ROV 2 is known, the location of the beacon 4 may be established), and (ii) calculating the position of each of the at least two light sources relative to the main reference frame based on the pixel position of the respective light beam, an orientation of a rigid surface of the first device relative to the main reference frame, a position of the first device relative to the main reference frame and the vertical distance, and to determine the orientation of the second device relative to the first device based on the calculated positions of the at least two light sources. Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Jongsma in view of Sheldon and Shigeta (US20180263500A1). Regarding claim 7, Jongsma and Sheldon teach the system of claim 1. Jongsma further teaches wherein the first device and the second device are connected through which one of: (i) the first device transmits to the second device information on its depth or position, and (ii) the second device transmits to the first device information on its depth or position ([0071] In further embodiments, the beacon may comprise a communication device arranged for wireless transmission of the position information and/or the determined pixel coordinates of detected light sources through the body of water to the rover. At least one of acoustic transmission, optical transmission, and electromagnetic transmission techniques may be used for conveying the position information from the beacon to the rover). 
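The combination the examiner relies on throughout reduces to simple geometry: Jongsma's quoted [0103] resolves a bearing-and-range reading into the Δx and Δy offsets between the two devices, and Sheldon's [1680]-[1682] supplies the vertical offset by subtracting two depth-gauge readings. A minimal sketch of that arithmetic (an illustration of the cited teachings only; the function names and the flat, straight-ray assumptions are mine, not from any reference):

```python
import math

def horizontal_offsets(bearing_deg: float, range_m: float) -> tuple[float, float]:
    """Jongsma-style: resolve a bearing (angle a with respect to North)
    and an acoustic range R into east/north offsets (dx, dy)."""
    a = math.radians(bearing_deg)
    return range_m * math.sin(a), range_m * math.cos(a)

def vertical_distance(first_depth_m: float, second_depth_m: float) -> float:
    """Sheldon-style: subtract the two depth-gauge readings to get dz."""
    return first_depth_m - second_depth_m

# With the beacon fixed at a known position, adding the offsets locates the
# rover; conversely, a known rover position locates the beacon.
dx, dy = horizontal_offsets(bearing_deg=30.0, range_m=100.0)
dz = vertical_distance(first_depth_m=42.5, second_depth_m=12.5)
```

The split mirrors the rejection: the horizontal terms come from the photogrammetric bearing and acoustic range, while the vertical term needs nothing but the two pressure/depth sensors.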
Jongsma does not teach wherein the first device and the second device are connected to each other by a marine communication cable. Shigeta, in the same field of endeavor of underwater imaging, teaches wherein the first device and the second device are connected to each other by a marine communication cable ([0032] As shown in FIG. 1, the photoacoustic imaging apparatus 100 according to the embodiment of the present invention includes an illumination portion 1a and an illumination portion 1b, a probe body 2, and an apparatus body 3. The photoacoustic imaging apparatus 100 includes a cable 4a for connecting the illumination portion 1a (1b) and the apparatus body 3, and a cable 4b for connecting the probe body 2 and the apparatus body 3). Therefore, it would have been obvious to a person of ordinary skill in the art at the time that the invention was made to modify the system of Jongsma with the teachings of Shigeta to use a cable so that "The light source drive circuit 32 is configured to supply the power to the light source portion 11 based on a light trigger signal from the control portion 33" (Figure 2) [Shigeta 0048]. Claims 8 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Jongsma and Sheldon in view of Long (JP2018084976A). Regarding claim 8, Jongsma and Sheldon teach the system of claim 1. Long, in the same field of endeavor of light beam position prediction, teaches wherein the processing unit is configured to predict a next position of the light beam in the two-dimensional image captured by the camera (optical camera 11) by performing a recursive filtering algorithm based on at least an actual position and previous positions of the light beam in the two-dimensional image ([pg. 3 para. 3] The light beam recognition device 12 repeatedly performs the processing from S200 to S400. [pg. 7 para. 2] Alternatively, the light spot position may be set using a Kalman filter. 
That is, the light-axis position may be set by updating, with a Kalman filter, the positions of a plurality of maxima pixels extracted for the prediction pixel B_t. In other words, the predicted line S_t may be set by updating, with a Kalman filter, a straight line representing the state vector s_t-1 based on its positional relationship with the detected light-axis pixels. Specifically, the predicted straight line S_1 may be a straight line obtained by updating the initially set state vector s_0 using the Kalman filter based on the positional relationship with the position z_1 of the light beam pixel). Therefore, it would have been obvious to a person of ordinary skill in the art at the time that the invention was made to modify the system of Jongsma with the teachings of Long to use recursive filtering to "update the predicted straight line" of a light beam [Long pg. 8 para. 13]. Regarding claim 15, Jongsma and Sheldon teach the method of claim 10. Long, in the same field of endeavor of light beam position prediction, teaches predicting a next position of the light beam in the two-dimensional image captured by the camera (optical camera 11) by performing a recursive filtering based on at least an actual position and previous positions of the light beam in the two-dimensional image ([pg. 3 para. 3] The light beam recognition device 12 repeatedly performs the processing from S200 to S400. [pg. 7 para. 2] Alternatively, the light spot position may be set using a Kalman filter. That is, the light-axis position may be set by updating, with a Kalman filter, the positions of a plurality of maxima pixels extracted for the prediction pixel B_t. In other words, the predicted line S_t may be set by updating, with a Kalman filter, a straight line representing the state vector s_t-1 based on its positional relationship with the detected light-axis pixels. 
Specifically, the predicted straight line S_1 may be a straight line obtained by updating the initially set state vector s_0 using the Kalman filter based on the positional relationship with the position z_1 of the light beam pixel). Therefore, it would have been obvious to a person of ordinary skill in the art at the time that the invention was made to modify the method of Jongsma with the teachings of Long to use recursive filtering to "update the predicted straight line" of a light beam [Long pg. 8 para. 13].
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jacqueline R Zak whose telephone number is (571)272-4077. The examiner can normally be reached M-F 9-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Emily Terrell can be reached at (571) 270-3717. 
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JACQUELINE R ZAK/
Examiner, Art Unit 2666
/EMILY C TERRELL/
Supervisory Patent Examiner, Art Unit 2666
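Stripped of its translation artifacts, the Long passage quoted against claims 8 and 15 describes a standard recursive estimator: maintain a state (light-spot position and velocity), predict the next pixel from it, then blend in each new observation. A minimal constant-velocity sketch of that idea (an alpha-beta-style simplification of a Kalman filter with a fixed gain; entirely my own illustration, not code from Long):

```python
def predict_next_pixel(observations, gain=0.5):
    """Recursively filter observed (x, y) light-spot pixel centers
    (oldest first) and return the predicted next position under a
    constant-velocity model. `gain` stands in for the gain a full
    Kalman filter would compute from its covariances."""
    (px, py), (vx, vy) = observations[0], (0.0, 0.0)
    for ox, oy in observations[1:]:
        qx, qy = px + vx, py + vy                  # predict from current state
        rx, ry = ox - qx, oy - qy                  # innovation (residual)
        px, py = qx + gain * rx, qy + gain * ry    # correct position estimate
        vx, vy = vx + gain * rx, vy + gain * ry    # correct velocity estimate
    return px + vx, py + vy                        # one-step-ahead prediction

# A spot drifting one pixel per frame is predicted to keep drifting:
x, y = predict_next_pixel([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)])
```

Restricting the next-spot search to a neighborhood of this prediction is what lets the recognizer reject spurious maxima, which is the "update the predicted straight line" rationale the rejection borrows.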

Prosecution Timeline

Jan 12, 2023
Application Filed
Aug 07, 2025
Non-Final Rejection — §103
Nov 12, 2025
Response Filed
Dec 29, 2025
Final Rejection — §103
Feb 20, 2026
Interview Requested

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586340
PIXEL PERSPECTIVE ESTIMATION AND REFINEMENT IN AN IMAGE
2y 5m to grant Granted Mar 24, 2026
Patent 12462343
MEDICAL DIAGNOSTIC APPARATUS AND METHOD FOR EVALUATION OF PATHOLOGICAL CONDITIONS USING 3D OPTICAL COHERENCE TOMOGRAPHY DATA AND IMAGES
2y 5m to grant Granted Nov 04, 2025
Patent 12373946
ASSAY READING METHOD
2y 5m to grant Granted Jul 29, 2025
Based on this examiner's 3 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
67%
Grant Probability
55%
With Interview (-11.4%)
2y 10m
Median Time to Grant
Moderate
PTA Risk
Based on 12 resolved cases by this examiner. Grant probability derived from career allow rate.
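The projection figures above follow from the examiner's career counts; a quick check of the arithmetic (assuming, per the footnote, that grant probability is simply the career allow rate and that the interview lift is applied as an additive percentage-point adjustment; that reading is mine, not a documented formula):

```python
granted, resolved = 8, 12                            # career counts shown above
career_allow_rate = 100 * granted / resolved         # ~66.7, displayed as 67%
interview_lift = -11.4                               # percentage points
with_interview = career_allow_rate + interview_lift  # ~55.3, displayed as 55%
```

Note that applying the lift to the unrounded rate reproduces the displayed 55%, whereas 67 - 11.4 would round to 56.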
