Prosecution Insights
Last updated: April 19, 2026
Application No. 18/779,165

MONOCULAR VISUAL PATH FOLLOWING

Status: Final Rejection (§103)
Filed: Jul 22, 2024
Examiner: DEL VALLE, LUIS GERARDO
Art Unit: 3666
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: NEC Corporation Of America
OA Round: 2 (Final)

Grant Probability: 72% (Favorable)
Expected OA Rounds: 3-4
Median Time to Grant: 2y 11m
Grant Probability with Interview: 96%

Examiner Intelligence

Career Allow Rate: 72% — above average (+20.1% vs TC avg); 111 granted / 154 resolved
Interview Lift: +23.8% — strong, comparing resolved cases with vs. without an interview
Typical Timeline: 2y 11m average prosecution; 30 applications currently pending
Career History: 184 total applications across all art units
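
As a sanity check, the headline figures above reconcile as follows. This is a minimal sketch under one assumption of ours: that the "with interview" figure is the career allow rate plus the interview lift in percentage points (the dashboard does not publish its model).

```python
# Sanity check on the examiner stats above. The additive-lift model is an
# assumption; the dashboard does not disclose how it combines the figures.

granted, resolved = 111, 154

career_allow_rate = granted / resolved               # 111 / 154 = 0.721 -> reported "72%"
interview_lift = 0.238                               # reported "+23.8%" interview lift

with_interview = career_allow_rate + interview_lift  # 0.721 + 0.238 = 0.959 -> reported "96%"

print(f"career allow rate: {career_allow_rate:.1%}")  # 72.1%
print(f"with interview:    {with_interview:.1%}")     # 95.9%
```

Both derived values round to the displayed 72% and 96%, so the dashboard's numbers are at least internally consistent under this reading.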

Statute-Specific Performance

Statute    Rate     vs TC Avg
§101       13.1%    −26.9%
§103       60.5%    +20.5%
§102       11.2%    −28.8%
§112       12.7%    −27.3%

Tech Center averages are estimates. Based on career data from 154 resolved cases.
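
The implied Tech Center baselines can be backed out from the table, assuming each delta is simply the examiner's rate minus the TC average in percentage points; that interpretation is ours, not stated by the dashboard.

```python
# Back out the implied Tech Center averages from the per-statute deltas above.
# Assumes delta = examiner_rate - tc_average (percentage points); our reading.

examiner_rate = {"§101": 13.1, "§103": 60.5, "§102": 11.2, "§112": 12.7}
delta_vs_tc   = {"§101": -26.9, "§103": +20.5, "§102": -28.8, "§112": -27.3}

for statute, rate in examiner_rate.items():
    tc_average = rate - delta_vs_tc[statute]  # e.g. §103: 60.5 - 20.5 = 40.0
    print(f"{statute}: examiner {rate:.1f}% vs TC avg ~{tc_average:.1f}%")
```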

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Examiner's Response re: Claim Objections

Applicant's arguments, see Page 8, filed 09 Jan 26, with respect to Claims 1, 11, and 14-15 have been fully considered and are persuasive. The Claim Objections of Claims 1, 11, and 14-15 have been withdrawn.

Examiner's Response re: 112(b) Rejection

Applicant's arguments, see Pages 8-9, filed 09 Jan 26, with respect to Claims 1, 8, 11, and 14-15 have been fully considered and are persuasive. The 112(b) Rejection of Claims 1, 8, 11, and 14-15 has been withdrawn.

Examiner's Response re: 103 Rejection

Applicant's arguments, see Pages 9-19, filed 09 Jan 2026, with respect to the rejection(s) of claim(s) 1-15 under 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground(s) of rejection is made in view of Dincal, Gros, Chiu, and Zimmerman.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim(s) 1-7 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Dincal et al., US 20200211203 A1 (herein, Dincal) in view of Gros et al., US 20190066283 A1 (herein, Gros), Zimmerman, US 20160364248 A1 (herein, Zimmerman), and in further view of Chiu et al., US 20190114507 A1 (herein, Chiu).

Regarding Claim 1, Dincal discloses a method for planar path navigation of a moving platform (FIG. 9, ¶[0058] – “…ROV 62 navigates through the subsea environment to a desired location therein by identifying and tracking the position (relative to the ROV 62) of the visual pattern 72…”), comprising: receiving a first video frame from a camera (FIG. 1, #64 – camera and ¶[0031] – “…computer 9 receives images or video frames of visual pattern 7 captured and transmitted by camera assembly 4…”) or a data interface; receiving a target pixel separation value from an external process (FIG. 1 and ¶[0026] – “…a digital image by an algorithm executable by a processor whereby the algorithm may determine pixel coordinates of the visual indicator within the digital image…”); identifying and selecting in the first video frame two separate vertical edges (FIG. 3, #202 – video frame – the video frame has two vertical edges that are separate); determining the horizontal coordinates corresponding (¶[0034] – “…coordinates (X.sub.n, Y.sub.n),…”) to the two selected vertical edges in the first video frame; obtaining the relative horizontal coordinate of a waypoint (FIG. 3 and ¶[0058] – “…a desired location (i.e. has horizontal coordinates)… ROV 62 may use offshore posts 90 as guide posts or waypoints…”) located between said two vertical edges (¶[0058] – “ROV 62 may use offshore posts 90 as guide posts or waypoints” – i.e., the waypoint is between the two vertical edges); and receiving a video frame (FIG. 3 illustrates video frame 205) from the camera or the data interface; determining the current horizontal coordinates of the two previously selected vertical edges in the video frame (¶[0006] – “…a first set of pixel coordinates associated with the first visual pattern in the first image and a second set of pixel coordinates associated with the first visual pattern…”); computing a current pixel separation value (¶[0006] – “…a first set of pixel coordinates associated with the first visual pattern in the first image…”) between the two previously selected vertical edges in the video frame (¶[0006] – “…determine a movement of the first visual pattern relative to the second visual pattern between the first image and the second image…”).

Dincal discloses performing at each sensing iteration in a path segment (¶[0052] – “…camera 64 is coupled or mounted to ROV 62, which follows the offshore structure 66 as it travels through the subsea environment to ensure offshore structure 66 remains within a fie…”) but does not disclose wherein each sensing iteration is conducted following a navigation of said moving platform toward said waypoint. However, Gros teaches wherein each sensing iteration (¶[0036] – “…resulting in a carved 3D shape 100 being formed as the process iterates….”) is conducted following a navigation of said moving platform toward said waypoint (¶[0018] – “…movement can be based on either locally generated path waypoints or guidance or path guidance and waypoints generated by a remote system and communicated to the drone…”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by Dincal to include a sensing iteration pertaining to the moving platform as taught by Gros. Doing so improves the navigation of the platform by providing the requisite iteration pertaining to whether the waypoint has moved, thus providing the operator with additional information so as to effectively navigate the craft.

Modified Dincal discloses, responsive to determining that the current pixel separation value exceeds the target pixel separation value (¶[0043] – “…magnitude and direction of first movement 210 can be determined using computer 9 based on the difference between pixel coordinates (X.sub.n, Y.sub.n), (X.sub.n+1, Y.sub.n+1); the magnitude and direction of second movement 212 can be determined based on the difference between pixel coordinates (X.sub.n+1, Y.sub.n+1), (X.sub.n+2, Y.sub.n+2),…”), terminating the path segment, and subsequently restarting or terminating the method (Claim 18 – “…a direction of a movement of the first visual pattern based on changes in the pixel coordinates…” – i.e., a change of direction requires a stop and restarting), but does not disclose an indication that the moving platform has reached the waypoint. However, Zimmerman teaches an indication that the moving platform has reached the waypoint (¶[0036] – “…A message may be indicative of a vehicle type of a UAV, indicative of a model of a UAV, indicative of a history of a previously controlled UAV by the GUI, indicative of a history of a UAV previously controlled by a user using the GUI, indicative of an emergency associated with a UAV, indicative of a time threshold, indicative of a maintenance history of a UAV, indicative of a UAV arriving at a waypoint, or indicative of a UAV sensing an object….”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by modified Dincal to include an indication when the moving platform reaches its waypoint as taught by Zimmerman. Doing so improves the information management of the moving platform by informing the operator when the moving platform reaches its waypoint.

Modified Dincal does not disclose computing a navigational parameter; and outputting the navigational parameter. However, Chiu teaches computing the navigational parameter (See above Claim Objection) (¶[0020] – “…a navigation inference engine for computing navigation information for the navigation system,…”); and outputting the navigational parameter (¶[0020] – “…the navigation inference engine stores feature constraints as factors in a factor graph….”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by modified Dincal to include the computation and outputting of the navigational parameter as taught by Chiu. Doing so improves the efficiency of the method by providing the navigational parameter during the operation of the ROV.

Regarding Claim 2, modified Dincal further discloses, further comprising: receiving a relaxation parameter from an external process (Claim 6 – “…the processor… determine a magnitude and a direction of a movement of the visual pattern in physical units from the first position of the visual pattern to the second position of the visual pattern…”); and wherein the navigational parameter in part comprises the relaxation parameter (Chiu, ¶[0020] – “…to make an inlier/outlier decision for each of the objects…”).

Regarding Claim 3, modified Dincal further discloses wherein obtaining the relative horizontal coordinate of the waypoint comprises: receiving an absolute horizontal coordinate (FIG. 11 illustrates absolute horizontal coordinate) of the waypoint (90) in the first video frame from an external process (FIG. 11 illustrates the external process); calculating the relative horizontal coordinate of the waypoint (FIG. 11 and ¶[0057] – “…plurality of offshore posts 90 such that a visual pattern 72 positioned on an exterior of a first offshore post 90 falls within the field of view 70…”).

Regarding Claim 4, modified Dincal further discloses, further comprising: receiving a current heading angle of the moving platform from an external process (FIG. 11 illustrates the heading angle of the ROV per 800).

Regarding Claim 5, modified Dincal further discloses wherein obtaining the relative horizontal coordinate of the waypoint (90) comprises receiving a relative horizontal coordinate (FIG. 11 illustrates a relative horizontal coordinate) of the waypoint in the first video frame from an external process (FIG. 11 and ¶[0058] – “…by tracking the relative position of the visual patterns 72 positioned on the exteriors of offshore posts 90, …”).

Regarding Claim 6, modified Dincal further discloses wherein the navigational parameter comprises the motion direction horizontal coordinate for the current sensing iteration in the path segment (FIG. 1 illustrates the x-axis and ¶[0037] – “…the determined 3D physical displacements of visual pattern 7 calculated by the computer is representative of the 3D physical displacements of at least the portion of the offshore structure…”).

Regarding Claim 7, modified Dincal further discloses wherein the navigational parameter comprises the heading angle of the moving platform for the current sensing iteration in the path segment (FIG. 1 illustrates the heading angle per x-y-z axis).

Regarding Claim 14, Dincal discloses a system comprising (Abstract): a processor configured to execute stored executable instructions (¶[0030] – “…Processor 30 executes software 38 (e.g., machine-readable instructions)…”); and a non-transitory computer readable medium storing executable instructions that, when executed by a processor (¶[0030] – “…The software 38 may comprise non-transitory computer readable medium…”), cause the computer system to perform a method for planar navigation of a moving platform (FIG. 9, ¶[0058] – “…ROV 62 navigates through the subsea environment to a desired location therein by identifying and tracking the position (relative to the ROV 62) of the visual pattern 72…”), the method comprising: receiving a first video frame from a camera (FIG. 1, #64 – camera and ¶[0031] – “…computer 9 receives images or video frames of visual pattern 7 captured and transmitted by camera assembly 4…”) or a data interface; receiving a target pixel separation value from an external process (FIG. 1 and ¶[0026] – “…a digital image by an algorithm executable by a processor whereby the algorithm may determine pixel coordinates of the visual indicator within the digital image…”); identifying and selecting in the first video frame two separate vertical edges (FIG. 3, #202 – video frame – the video frame has two edges that are vertical); determining the horizontal coordinates corresponding (¶[0034] – “…coordinates (X.sub.n, Y.sub.n),…”) to the two selected vertical edges in the first video frame (¶[0006] – “…a first set of pixel coordinates associated with the first visual pattern in the first image and a second set of pixel coordinates associated with the first visual pattern…”); obtaining the relative horizontal coordinate of a waypoint (FIG. 3 and ¶[0058] – “…a desired location (i.e. has horizontal coordinates)… ROV 62 may use offshore posts 90 as guide posts or waypoints…”) located between said two vertical edges (¶[0058] – “ROV 62 may use offshore posts 90 as guide posts or waypoints” – i.e., the waypoint is between the two vertical edges); and receiving a video frame (FIG. 3 illustrates video frame 205) from the camera or the data interface; determining the current horizontal coordinates of the two previously selected vertical edges in the video frame (¶[0006] – “…a first set of pixel coordinates associated with the first visual pattern in the first image and a second set of pixel coordinates associated with the first visual pattern…”); computing a current pixel separation value (¶[0006] – “…a first set of pixel coordinates associated with the first visual pattern in the first image…”) between the two previously selected vertical edges in the video frame (¶[0006] – “…determine a movement of the first visual pattern relative to the second visual pattern between the first image and the second image…”).

Dincal discloses performing at each sensing iteration in a path segment (¶[0052] – “…camera 64 is coupled or mounted to ROV 62, which follows the offshore structure 66 as it travels through the subsea environment to ensure offshore structure 66 remains within a fie…”) but does not disclose wherein each sensing iteration is conducted following a navigation of said moving platform toward said waypoint. However, Gros teaches wherein each sensing iteration (¶[0036] – “…resulting in a carved 3D shape 100 being formed as the process iterates….”) is conducted following a navigation of said moving platform toward said waypoint (¶[0018] – “…movement can be based on either locally generated path waypoints or guidance or path guidance and waypoints generated by a remote system and communicated to the drone…”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by Dincal to include a sensing iteration pertaining to the moving platform as taught by Gros. Doing so improves the navigation of the platform by providing the requisite iteration pertaining to whether the waypoint has moved, thus providing the operator with additional information so as to effectively navigate the craft.

Modified Dincal discloses, responsive to determining that the current pixel separation value exceeds the target pixel separation value (¶[0043] – “…magnitude and direction of first movement 210 can be determined using computer 9 based on the difference between pixel coordinates (X.sub.n, Y.sub.n), (X.sub.n+1, Y.sub.n+1); the magnitude and direction of second movement 212 can be determined based on the difference between pixel coordinates (X.sub.n+1, Y.sub.n+1), (X.sub.n+2, Y.sub.n+2),…”), terminating the path segment, and subsequently restarting or terminating the method (Claim 18 – “…a direction of a movement of the first visual pattern based on changes in the pixel coordinates…” – i.e., a change of direction requires a stop and restarting), but does not disclose an indication that the moving platform has reached the waypoint. However, Zimmerman teaches an indication that the moving platform has reached the waypoint (¶[0036] – “…A message may be indicative of a vehicle type of a UAV, indicative of a model of a UAV, indicative of a history of a previously controlled UAV by the GUI, indicative of a history of a UAV previously controlled by a user using the GUI, indicative of an emergency associated with a UAV, indicative of a time threshold, indicative of a maintenance history of a UAV, indicative of a UAV arriving at a waypoint, or indicative of a UAV sensing an object….”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by modified Dincal to include an indication when the moving platform reaches its waypoint as taught by Zimmerman. Doing so improves the information management of the moving platform by informing the operator when the moving platform reaches its waypoint.

Modified Dincal does not disclose computing a navigational parameter; and outputting the navigational parameter. However, Chiu teaches computing a navigational parameter (¶[0020] – “…a navigation inference engine for computing navigation information for the navigation system,…”); and outputting the navigational parameter (¶[0020] – “…the navigation inference engine stores feature constraints as factors in a factor graph….”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by modified Dincal to include the computation and outputting of the navigational parameter as taught by Chiu. Doing so improves the efficiency of the method by providing the navigational parameter during the operation of the ROV.

Claim(s) 11-13 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Dincal et al., US 20200211203 A1 (herein, Dincal) in view of Gros, Zimmerman, and in further view of Kron, US 20210341946 A1 (herein, Kron).

Regarding Claim 11, Dincal discloses a method for planar path navigation of a moving platform (FIG. 9, ¶[0058] – “…ROV 62 navigates through the subsea environment to a desired location therein by identifying and tracking the position (relative to the ROV 62) of the visual pattern 72…”), comprising: receiving a first video frame from a camera or a data interface (FIG. 1 and ¶[0026] – “…a digital image by an algorithm executable by a processor whereby the algorithm may determine pixel coordinates of the visual indicator within the digital image…”); receiving a target angular separation value from an external process (FIG. 1 illustrates a target angular separation value from the process of 4 to 7); identifying and selecting in the first video frame a first vertical edge and an adjacent second edge separated by a non-zero angle (FIG. 3 illustrates the edges separated by the non-zero angles represented by the movements 210 and 214) for said first vertical edge; receiving a time-dependent goal function of the horizontal coordinate of the first edge from an external process (¶[0036] – “…quantified as a physical rotation of visual pattern 7 over time through multi-dimensional space…”); and receiving a video frame from the camera (FIG. 1, #64 – camera and ¶[0031] – “…computer 9 receives images or video frames of visual pattern 7 captured and transmitted by camera assembly 4…”) or the data interface; determining the current horizontal coordinate corresponding to the first edge in the video frame (¶[0034] – “…coordinates (X.sub.n, Y.sub.n),…”); computing the current angular separation value between the first and the second edges (FIG. 3 illustrates the angular separation).

Dincal discloses performing at each sensing iteration in a path segment (¶[0052] – “…camera 64 is coupled or mounted to ROV 62, which follows the offshore structure 66 as it travels through the subsea environment to ensure offshore structure 66 remains within a fie…”) and a heading angle (FIG. 1 illustrates a heading angle of ROV) but does not disclose wherein each sensing iteration is conducted following a navigation of said moving platform according to a heading angle computed in a previous sensing iteration. However, Gros teaches wherein each sensing iteration (¶[0036] – “…resulting in a carved 3D shape 100 being formed as the process iterates….”) is conducted following a navigation of said moving platform toward a heading angle computed in a previous iteration (¶[0018] – “…movement can be based on either locally generated path waypoints or guidance or path guidance and waypoints generated by a remote system and communicated to the drone…” – i.e., from a previous iteration). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by Dincal to include a sensing iteration pertaining to the moving platform as taught by Gros. Doing so improves the navigation of the platform by providing the requisite iteration pertaining to whether the waypoint has moved, thus providing the operator with additional information so as to effectively navigate the craft.

Modified Dincal discloses, responsive to determining that the current pixel separation value exceeds the target pixel separation value (¶[0043] – “…magnitude and direction of first movement 210 can be determined using computer 9 based on the difference between pixel coordinates (X.sub.n, Y.sub.n), (X.sub.n+1, Y.sub.n+1); the magnitude and direction of second movement 212 can be determined based on the difference between pixel coordinates (X.sub.n+1, Y.sub.n+1), (X.sub.n+2, Y.sub.n+2),…”), terminating the path segment, and subsequently restarting or terminating the method (Claim 18 – “…a direction of a movement of the first visual pattern based on changes in the pixel coordinates…” – i.e., a change of direction requires a stop and restarting), but does not disclose an indication that the moving platform has reached the waypoint. However, Zimmerman teaches an indication that the moving platform has reached the waypoint (¶[0036] – “…A message may be indicative of a vehicle type of a UAV, indicative of a model of a UAV, indicative of a history of a previously controlled UAV by the GUI, indicative of a history of a UAV previously controlled by a user using the GUI, indicative of an emergency associated with a UAV, indicative of a time threshold, indicative of a maintenance history of a UAV, indicative of a UAV arriving at a waypoint, or indicative of a UAV sensing an object….”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by modified Dincal to include an indication when the moving platform reaches its waypoint as taught by Zimmerman. Doing so improves the information management of the moving platform by informing the operator when the moving platform reaches its waypoint.

Modified Dincal does not disclose computing the heading angle of the moving platform for the current sensing iteration in the path segment; and outputting the heading angle of the moving platform. However, Kron teaches computing the heading angle of the moving platform for the current sensing iteration in the path segment; and outputting the heading angle of the moving platform (FIG. 16, #120 – method, and ¶[0203-0208] – “FIG. 16 is a flowchart illustrating a method 120 for controlling heading angle ψ of an aircraft… using the commanded bank angle ϕcmd to control one or more actuators 34 of the aircraft during flight (see block 130).”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by Dincal to include the computation and outputting of the heading angle as taught by Kron. Doing so improves the efficiency of the method by providing the heading angle during the operation of the ROV.

Regarding Claim 12, modified Dincal further discloses, further comprising: receiving a current heading angle of the moving platform from an external process (FIG. 1 illustrates the heading angle per 100).

Regarding Claim 13, modified Dincal further discloses, further comprising: receiving a relaxation parameter from an external process (Claim 6 – “…the processor… determine a magnitude and a direction of a movement of the visual pattern in physical units from the first position of the visual pattern to the second position of the visual pattern…”); and wherein the navigational parameter in part comprises the relaxation parameter (Chiu, ¶[0020] – “…to make an inlier/outlier decision for each of the objects…”).

Regarding Claim 15, Dincal discloses a system comprising (Abstract): a processor configured to execute stored executable instructions (¶[0030] – “…Processor 30 executes software 38 (e.g., machine-readable instructions)…”); and a non-transitory computer readable medium storing executable instructions that, when executed by a processor (¶[0030] – “…The software 38 may comprise non-transitory computer readable medium…”), cause the computer system to perform a method for planar navigation of a moving platform (FIG. 9, ¶[0058] – “…ROV 62 navigates through the subsea environment to a desired location therein by identifying and tracking the position (relative to the ROV 62) of the visual pattern 72…”), the method comprising: receiving a first video frame from a camera (FIG. 1, #64 – camera and ¶[0031] – “…computer 9 receives images or video frames of visual pattern 7 captured and transmitted by camera assembly 4…”) or a data interface; receiving a target angular separation value from an external process (FIG. 1 illustrates a target angular separation value from the process of 4 to 7); identifying and selecting in the first video frame a first vertical edge and an adjacent second edge separated by a non-zero angle (FIG. 3 illustrates the vertical edges separated by the non-zero angles represented by the movements 210 and 214); receiving a time-dependent goal function of the horizontal coordinate of the first edge from an external process (¶[0036] – “…quantified as a physical rotation of visual pattern 7 over time through multi-dimensional space…”); and receiving a video frame (FIG. 3 illustrates video frame 205) from the camera or the data interface; determining the current horizontal coordinates of the two previously selected substantially vertical edges in the video frame (¶[0006] – “…a first set of pixel coordinates associated with the first visual pattern in the first image and a second set of pixel coordinates associated with the first visual pattern…”); computing the current angular separation value between the first and the second edges (FIG. 3 illustrates the angular separation).

Dincal discloses performing at each sensing iteration in a path segment (¶[0052] – “…camera 64 is coupled or mounted to ROV 62, which follows the offshore structure 66 as it travels through the subsea environment to ensure offshore structure 66 remains within a fie…”) and a heading angle (FIG. 1 illustrates a heading angle of ROV) but does not disclose wherein each sensing iteration is conducted following a navigation of said moving platform according to a heading angle computed in a previous sensing iteration. However, Gros teaches wherein each sensing iteration (¶[0036] – “…resulting in a carved 3D shape 100 being formed as the process iterates….”) is conducted following a navigation of said moving platform toward a heading angle computed in a previous iteration (¶[0018] – “…movement can be based on either locally generated path waypoints or guidance or path guidance and waypoints generated by a remote system and communicated to the drone…” – i.e., from a previous iteration). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by Dincal to include a sensing iteration pertaining to the moving platform as taught by Gros. Doing so improves the navigation of the platform by providing the requisite iteration pertaining to whether the waypoint has moved, thus providing the operator with additional information so as to effectively navigate the craft.

Modified Dincal discloses, responsive to determining that the current pixel separation value exceeds the target pixel separation value (¶[0043] – “…magnitude and direction of first movement 210 can be determined using computer 9 based on the difference between pixel coordinates (X.sub.n, Y.sub.n), (X.sub.n+1, Y.sub.n+1); the magnitude and direction of second movement 212 can be determined based on the difference between pixel coordinates (X.sub.n+1, Y.sub.n+1), (X.sub.n+2, Y.sub.n+2),…”), terminating the path segment, and subsequently restarting or terminating the method (Claim 18 – “…a direction of a movement of the first visual pattern based on changes in the pixel coordinates…” – i.e., a change of direction requires a stop and restarting), but does not disclose an indication that the moving platform has reached the waypoint defined according to said first and said second edge. However, Zimmerman teaches an indication that the moving platform has reached the waypoint (¶[0036] – “…A message may be indicative of a vehicle type of a UAV, indicative of a model of a UAV, indicative of a history of a previously controlled UAV by the GUI, indicative of a history of a UAV previously controlled by a user using the GUI, indicative of an emergency associated with a UAV, indicative of a time threshold, indicative of a maintenance history of a UAV, indicative of a UAV arriving at a waypoint, or indicative of a UAV sensing an object….”) defined according to said first and said second edge (Dincal, ¶[0036]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by modified Dincal to include an indication when the moving platform reaches its waypoint as taught by Zimmerman. Doing so improves the information management of the moving platform by informing the operator when the moving platform reaches its waypoint.

Modified Dincal does not disclose computing the heading angle of the moving platform for the current sensing iteration in the path segment; and outputting the heading angle of the moving platform. However, Kron teaches computing the heading angle of the moving platform for the current sensing iteration in the path segment; and outputting the heading angle of the moving platform (FIG. 16, #120 – method, and ¶[0203-0208] – “FIG. 16 is a flowchart illustrating a method 120 for controlling heading angle ψ of an aircraft… using the commanded bank angle ϕcmd to control one or more actuators 34 of the aircraft during flight (see block 130).”). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the method as disclosed by Dincal to include the computation and outputting of the heading angle as taught by Kron. Doing so improves the efficiency of the method by providing the heading angle during the operation of the ROV.

Allowable Subject Matter

Claims 8-9 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The references cited but not utilized in the Office Action pertain to the system and method for planar path navigation of a moving platform.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LUIS G DEL VALLE whose telephone number is (303) 297-4313. The examiner can normally be reached Monday-Friday, 0730-1630 MST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Anne Antonucci, can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LUIS G DEL VALLE/
Examiner, Art Unit 3666

/ANNE MARIE ANTONUCCI/
Supervisory Patent Examiner, Art Unit 3666
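
For readers who want the mechanics rather than the legalese, the following is a minimal, hypothetical sketch of the claim-1 sensing loop as the rejection characterizes it: per iteration, re-locate two previously selected vertical edges, compute their pixel separation, steer toward a waypoint between them, and terminate the path segment once the separation exceeds the target. The frame source, edge tracker, and steering model are stand-ins of ours, and all names are illustrative; this is not the applicant's actual implementation.

```python
"""Hypothetical sketch of the claim-1 sensing loop, per the rejection's reading."""

from typing import Callable, Tuple


def follow_path_segment(
    get_frame: Callable[[], int],                        # camera or data interface
    locate_edges: Callable[[int], Tuple[float, float]],  # x-coords of the two tracked vertical edges
    steer: Callable[[float], None],                      # consumes the navigational parameter
    waypoint_rel_x: float,                               # waypoint position between the edges, in [0, 1]
    target_separation_px: float,                         # target pixel separation (external process)
) -> None:
    """Run sensing iterations until the edge separation exceeds the target."""
    while True:
        frame = get_frame()                              # one sensing iteration per navigation step
        x_left, x_right = locate_edges(frame)            # current horizontal edge coordinates
        separation = abs(x_right - x_left)               # current pixel separation value

        if separation > target_separation_px:
            print("waypoint reached: terminating path segment")
            return                                       # caller restarts or terminates the method

        # Navigational parameter: horizontal coordinate to steer toward,
        # interpolated between the two edges at the waypoint's relative position.
        steer(x_left + waypoint_rel_x * (x_right - x_left))


if __name__ == "__main__":
    # Synthetic demo: edges appear to spread apart as the platform advances,
    # so the separation eventually exceeds the target and the segment ends.
    frames = iter(range(1000))
    follow_path_segment(
        get_frame=lambda: next(frames),
        locate_edges=lambda t: (100.0 - 2.0 * t, 220.0 + 2.0 * t),
        steer=lambda x: print(f"steer toward x = {x:.1f} px"),
        waypoint_rel_x=0.5,
        target_separation_px=200.0,
    )
```

Claims 11 and 15 swap the pixel-separation test for an angular-separation test and output a heading angle instead of a horizontal coordinate, but the loop structure is the same.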

Prosecution Timeline

Jul 22, 2024 — Application Filed
Oct 17, 2025 — Non-Final Rejection (§103)
Jan 09, 2026 — Response Filed
Mar 04, 2026 — Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597040: SHARED CHECKLISTS FOR ONBOARD ASSISTANT (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596010: DISPLAY DEVICE, DISPLAY METHOD, AND STORAGE MEDIUM (granted Apr 07, 2026; 2y 5m to grant)
Patent 12592151: SYSTEM AND METHOD FOR MULTI-IMAGE-BASED VESSEL PROXIMITY SITUATION RECOGNITION SUPPORT (granted Mar 31, 2026; 2y 5m to grant)
Patent 12570325: VEHICLE MOVING METHOD AND VEHICLE (granted Mar 10, 2026; 2y 5m to grant)
Patent 12546615: SYSTEMS AND METHODS FOR PREDICTING FUEL CONSUMPTION EFFICIENCY (granted Feb 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 72%
With Interview: 96% (+23.8%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate

Based on 154 resolved cases by this examiner. Grant probability derived from career allow rate.
