Prosecution Insights
Last updated: April 19, 2026
Application No. 18/788,453

MARINE NAVIGATION SYSTEM

Non-Final OA — §102, §103, §112
Filed
Jul 30, 2024
Examiner
KAZIMI, MAHMOUD M
Art Unit
3665
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Volvo Penta Corporation
OA Round
3 (Non-Final)
64%
Grant Probability
Moderate
3-4
OA Rounds
3y 2m
To Grant
79%
With Interview

Examiner Intelligence

Grants 64% of resolved cases
64%
Career Allow Rate
131 granted / 204 resolved
+12.2% vs TC avg
Strong interview lift
+15.2%
Interview Lift
allow rate with vs. without an interview, across resolved cases
Typical timeline
3y 2m
Avg Prosecution
36 currently pending
Career history
240
Total Applications
across all art units
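
The headline figures above are consistent with simple proportions over the career totals. A minimal sketch (assuming, as the projections footnote later on this page states, that grant probability is the raw career allow rate and that the interview lift is applied additively):

    granted, resolved = 131, 204         # career totals shown above
    allow_rate = granted / resolved      # 0.642 -> the 64% career allow rate
    with_interview = allow_rate + 0.152  # +15.2% lift -> roughly the 79% figure
    print(f"{allow_rate:.1%} base, {with_interview:.1%} with interview")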

Statute-Specific Performance

§101
21.2%
-18.8% vs TC avg
§103
56.2%
+16.2% vs TC avg
§102
12.3%
-27.7% vs TC avg
§112
8.5%
-31.5% vs TC avg
Deltas measured against a Tech Center average estimate • Based on career data from 204 resolved cases
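
One detail worth noting: subtracting each "vs TC avg" delta from its rate backs out the same baseline for every statute, which suggests the comparison uses a single flat Tech Center estimate rather than per-statute averages. A quick check (assuming delta = rate minus TC average, as the legend indicates):

    rates = {"§101": (21.2, -18.8), "§103": (56.2, 16.2),
             "§102": (12.3, -27.7), "§112": (8.5, -31.5)}
    for statute, (rate, delta) in rates.items():
        print(statute, f"implied TC average = {rate - delta:.1f}%")  # 40.0% for all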

Office Action

§102 §103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This communication is in response to the amendment to application 18/788,453 filed on 02/18/2026. Claims 1, 10, 17 and 18 have been amended. Claims 1-20 are pending and examined in the instant office action. The rejections are as stated below.

Priority

Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in Application No. EP23189784.4, filed on 08/04/2023.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 02/18/2026 has been entered.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Regarding claims 1, 9-10, 17 and 18, the terms “over time and continuously different” are relative terms which render the claims indefinite. The terms “over time and continuously different” are not defined by the claims, the specification does not provide a standard for ascertaining the requisite degree, and one of ordinary skill in the art would not be reasonably apprised of the scope of the invention. Examiner notes claims 2-16 and 18-20 depend from claims 1 and 17.

Response to Arguments

Applicant’s arguments submitted on 02/18/2026 with respect to the previous 35 U.S.C. 102(a)(1) rejection of claim 1 have been fully considered but are unpersuasive.

With respect to the previous 35 U.S.C. 102(a)(1) rejection of claim 1, Applicant argues the cited art of record fails to explicitly disclose collecting scanned data “over time” and identifying portions that are “continuously different” from nautical chart information, asserting that Johnson performs only an instantaneous comparison and cannot distinguish transient objects from persistent discrepancies. Examiner respectfully disagrees. Johnson discloses a navigation system that receives environmental sensor data and compares the sensed environment with information stored in a navigation database to identify mismatches between the sensed data and chart information (See at least p. 27, lines 11-20). Environmental sensing systems used in marine navigation, such as radar, operate by repeatedly scanning the surrounding environment and generating successive sensor readings during operation. Accordingly, Johnson collects scanned data over time as the system continuously receives sensor measurements while the vessel is underway (See at least p. 13, lines 7-20). Further, Johnson identifies discrepancies when sensed objects do not correspond to entries in the navigation database. If such an object continues to be detected while remaining absent from the database, the discrepancy persists across successive scans. Thus, the sensed features remain continuously different from the chart information during the time the feature continues to appear in the sensor data.

Applicant’s argument that the claim requires distinguishing transient objects from persistent discrepancies is not commensurate with the claim scope. The claim does not recite any temporal validation algorithm, persistence threshold, or filtering process for distinguishing transient objects. Even transient objects would appear in sensor scans and therefore constitute scanned data that differs from chart information during the time the object is detected. Accordingly, Johnson discloses the claimed limitations. Examiner notes independent claim 17 is rejected under the same rationale as above. For at least the above reasons, the previous 35 U.S.C. 102(a)(1) rejection is maintained.
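
The §102 dispute above turns on temporal validation that the examiner says the claims do not recite. Purely as an illustrative sketch (not taken from the application, from Johnson, or from this Office action; every name and the threshold are invented), a persistence filter of the kind the applicant's argument presupposes could be as simple as a per-cell counter over successive scans:

    # Hypothetical sketch only; nothing below appears in the claims or in Johnson.
    # A per-cell counter that separates transient detections (e.g. a passing
    # vessel) from discrepancies that persist across successive sensor scans.
    from collections import defaultdict

    PERSISTENCE_THRESHOLD = 5  # invented value: scans a mismatch must survive

    mismatch_counts = defaultdict(int)  # grid cell -> consecutive mismatch count

    def persistent_mismatches(scan_cells, chart_cells):
        """Return the cells that have differed from the chart for enough scans."""
        current = scan_cells - chart_cells        # sensed but not charted
        for cell in list(mismatch_counts):
            if cell not in current:
                del mismatch_counts[cell]         # transient: reset on first miss
        for cell in current:
            mismatch_counts[cell] += 1
        return {c for c, n in mismatch_counts.items() if n >= PERSISTENCE_THRESHOLD}

Under the examiner's reading no such filter is required: any mismatch counts as "continuously different" for as long as it keeps appearing in the scan data, which is why the argued distinction is treated as outside the claim scope.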
Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked. As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function. Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “a positioning unit configured to …” and “storage unit” in claims 1 and 4. Because these claim limitation(s) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1-2 and 4-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Johnson et al., WO2021178603 A1, hereinafter referred to as Johnson.

Regarding claim 1, Johnson discloses a marine navigation system for a marine vessel (In accordance with various embodiments of the present disclosure, sensor fusion navigation systems may be provided by various portable and/or fixed navigational sensors associated with a mobile structure, i.e. marine vessel – See at least “Detailed Description”, page 6, lines 12-14), comprising a positioning unit configured to detect a position of the marine vessel (Sensors may be mounted to or within the mobile structure (e.g., a watercraft), may be integrated with other sensor assemblies, or may be integrated within a portable device. Examples of portable devices include portable global navigation satellite system (GNSS), i.e. positioning unit, devices – See at least “Detailed Description”, page 6, lines 21-25), a nautical chart database (Generate further images and/or charts, i.e. nautical chart database, augmented with navigation data from navigation sensors and/or derived from the processed images – See at least “Detailed Description”, page 52, lines 26-29), a display being configured to present a view of nautical chart information and the position of the marine vessel on the nautical chart (In one embodiment, user interface may be adapted to display a time series of various sensor information and/or other parameters as part of or overlaid on a graph or map, i.e. nautical chart, which may be referenced to a position and/or orientation of mobile structure, i.e. marine vessel – See at least “Detailed Description”, page 13, lines 18-20), a short range distance sensor module arranged on the marine vessel (Sensors may be mounted to or within the mobile structure (e.g., a watercraft) – See at least “Detailed Description”, page 6, lines 21-23), the short range distance sensor module being configured to provide scanned data of a surrounding environment of the marine vessel (Other modules may include other and/or additional sensors used to provide additional environmental information of mobile structure – See at least “Detailed Description”, page 17, lines 3-5), a control unit being operatively connected with the short range distance sensor module, the positioning unit, the nautical chart database and the display, the control unit is configured to process the scanned data to provide a scanned view of the surrounding environment (In one embodiment, user interface may be adapted to receive a sensor or control signal over communication links formed by one or more associated logic devices. User interface may be adapted to process sensor and/or control signals to determine sensor and/or other information. For example, a sensor signal may include a position of mobile structure. User interface may be adapted to display a time series of various sensor information and/or other parameters as part of or overlaid on a graph or map, i.e. nautical chart, which may be referenced to a position and/or orientation of mobile structure – See at least “Detailed Description”, page 13, lines 7-20. Controller may be adapted to execute one or more control loops for actuated device control and/or performing other various operations of mobile structure and/or system. In some embodiments, a control loop may include processing sensor signals and/or sensor information in order to control one or more operations of mobile structure and/or various elements of system – See at least “Detailed Description”, page 14, lines 31-33 and page 15, lines 1-3), wherein the control unit is configured to present an augmented view of the nautical chart and scanned data on the display (Embodiments of the present disclosure can implement processing steps to integrate visible or thermal images into a marine navigation system, for use with augmented reality or watercraft automation, as described herein. For example, Fig. 17 illustrates image processing steps to generate images and/or charts augmented with navigational data using sensor fusion for navigation systems in accordance with an embodiment of the disclosure – See at least “Detailed Description”, page 57, lines 13-17), and wherein the control unit is configured to collect scanned data over time and to identify and indicate portions of the scanned data which are determined to be continuously different from the nautical chart information in the augmented view (For example, a sensor signal may include a position of mobile structure. User interface may be adapted to display a time series of various sensor information and/or other parameters as part of or overlaid on a graph or map, i.e. nautical chart, which may be referenced to a position and/or orientation of mobile structure – See at least “Detailed Description”, page 13, lines 7-20. In other embodiments, controller may compare data of other sensors related to other aspects of the database and/or the integrated model and determine any mismatches within the data of navigational database. The mismatches may be directed to any aspect of navigational database or integrated model. For example, mismatches between terrain features, wildlife, mobile structure position, environmental conditions, and/or other aspects of data within navigational database may be highlighted. The mismatches may then be highlighted within display, such as through renderings within the integrated model – See at least “Detailed Description”, page 27, lines 20-28).

Regarding claim 2, Johnson discloses wherein the scanned view is presented as a graphical overlay to the nautical chart information on the display (For example, controller may be configured to combine at least a portion of the navigational data received in block with a horizon-stabilized image or a synthetic viewpoint elevated image to produce a combined image, for example, and render the combined image on a display of a user interface of the mobile structure. The combined image may include the horizon-stabilized image or the synthetic viewpoint elevated image overlaid, i.e. graphical overlay, with object identification information and/or other navigational information corresponding to the mobile structure or a detected object, for example, or a navigational chart comprising navigational data derived, at least in part, from the horizon-stabilized image or the synthetic viewpoint elevated image – See at least “Detailed Description” page 63, lines 28-33 and page 64, lines 1-4).

Regarding claim 4, Johnson discloses wherein the scanned data is stored in the storage unit (Controller may be implemented as any appropriate logic device (e.g., memory storage device) that may be adapted to execute, store, and/or receive appropriate instructions, such as software instructions implementing a control loop for controlling various operations of mobile structure, and/or system. Such software instructions may also implement methods for processing sensor signals, determining sensor information – See at least “Detailed Description”, page 14, lines 10-20).

Regarding claim 5, Johnson discloses wherein the control unit is configured to determine an object provided by the scanned data (In some embodiments, other modules may include a radar system, LIDAR systems, and/or other environmental sensors providing measurements and/or other sensor signals that can be displayed to a user and/or used by other devices of system (e.g., controller) to provide operational control of mobile structure that compensates for environmental conditions, such as an object in a path of mobile structure – See at least “Detailed Description”, page 17, lines 5-13).

Regarding claim 6, Johnson discloses an object database with known objects and/or classifications of scanned data points, the control unit is configured to compare the scanned data with the object database for determining an object substantially similar to the scanned data (Controller may compare data of other sensors related to other aspects of the database, i.e. object database, and/or the integrated model and determine any matches or mismatches within the data, i.e. scanned data, of navigational database. The matches and/or mismatches may be directed to any aspect of navigational database or integrated model – See at least “Detailed Description”, page 27, lines 20-24).

Regarding claim 7, Johnson discloses wherein the control unit is configured to divide the scanned data into segmented objects, the control unit is configured to present the segmented objects graphically different on the display (For example, matches and/or mismatches between terrain features, wildlife (e.g., flora and/or fauna), mobile structure position, environmental conditions, and/or other aspects of data within navigational database may be highlighted. The matches and/or mismatches may then be highlighted within display, such as through renderings within the integrated model – See at least “Detailed Description”, page 27, lines 24-28).

Regarding claim 8, Johnson discloses wherein the augmented view is corrected by redrawing the overlay or by adding other geometrical indications and/or color (For example, in augmented image (an augmented version of the synthetic viewpoint elevated image of Fig. 14B), distance indicators have been added so as to have a perspective roughly following the surface of the water in their respective areas. Moreover, a shaded/hatched area has been added to indicate an area too shallow to traverse. In Fig. 5, AIS data and a distance indicator have been added to the image – See at least “Detailed Description”, page 54, lines 21-25).

Regarding claim 9, Johnson discloses wherein the system collects scanned data over time as a function of the marine vessel’s position and orientation, and presents an averaged scanned view as an overlay to the nautical chart (In one embodiment, user interface may be adapted to display a time series of various sensor information and/or other parameters as part of or overlaid on a graph or map, which may be referenced to a position and/or orientation of mobile structure. For example, user interface may be adapted to display a time series of positions, headings, and/or orientations of mobile structure and/or other elements of system overlaid on a geographical map, which may include one or more graphs indicating a corresponding time series of actuator control signals, sensor information, and/or other sensor and/or control signals – See at least “Detailed Description” page 13, lines 18-25).

Regarding claim 10, Johnson discloses wherein the system compares the scanned data collected over time with the nautical chart information to identify the portions of the scanned data that are continuously different to the nautical chart information (In various embodiments, controller may be configured to apply various types of image processing to the sonar data when rendering an integrated model, such as processing to visually differentiate real time and prior-acquired image data and other data, to visually indicate a relative age of different portions of such data, to visually indicate surface orientations of above water and underwater features, and/or to provide additional methods to visually differentiate different above water and underwater features and/or different above water and underwater feature characteristics from one another – See at least “Detailed Description” page 43, lines 14-20).

Regarding claim 11, Johnson discloses wherein the scanned stationary objects are updated in the augmented view based on real-time scanned data gradually by redrawing the augmented view (A general algorithm structure corresponding to Fig. 17 may be as follows: provide relative coordinates of detected objects in images and/or other navigational data to a coordinate frame transformer to derive absolute positions for the detected objects; provide the derived absolute object positions and corresponding AIS, chart, and/or otherwise provided cataloged absolute object positions to an object matching and sensor fusion process (e.g., which matches detected objects to cataloged objects based on the derived and cataloged positions, and which adjusts, refines, and/or aligns the derived and/or cataloged positions to each other through sensor fusion and aggregation of multiple types of object detection for each object) to generate an object database, and then use the object database to generate augmented reality image renderings and/or to help provide for autopiloting of mobile structure (e.g., by updating object positions) – See at least “Detailed Description” page 57, lines 32-33 and page 58, lines 1-11).
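
The passage cited for claim 11 outlines a general algorithm: transform detected objects into absolute coordinates, then match them against cataloged (AIS/chart) object positions. As a hypothetical sketch of the matching step only (the gate distance and all names here are invented, not taken from Johnson):

    # Hypothetical nearest-neighbor matching sketch; invented parameters.
    import math

    GATE_M = 50.0  # invented: max distance for a detection to match a charted object

    def match_objects(detected, cataloged):
        """Pair each detected object with the nearest cataloged object within GATE_M.

        detected and cataloged map object ids to (x, y) absolute positions.
        """
        matches, unmatched = {}, []
        for d_id, (dx, dy) in detected.items():
            best = min(cataloged.items(),
                       key=lambda kv: math.hypot(kv[1][0] - dx, kv[1][1] - dy),
                       default=None)
            if best and math.hypot(best[1][0] - dx, best[1][1] - dy) <= GATE_M:
                matches[d_id] = best[0]
            else:
                unmatched.append(d_id)  # candidate chart discrepancy
        return matches, unmatched

Objects that stay unmatched scan after scan are exactly the persistent chart discrepancies discussed in the Response to Arguments above.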
Regarding claim 12, Johnson discloses wherein the control unit is configured to differentiate the scanned data, the nautical chart, and the augmented view, or blend them together on the display (Once the images are received and/or processed, controller may be configured to combine navigational data with the images to produce augmented reality views and/or other sensor fusion views, as described herein. For example, controller may be configured to combine at least a portion of the navigational data received in block with a horizon-stabilized image or a synthetic viewpoint elevated image to produce a combined image, i.e. blend, for example, and render the combined image on a display of a user interface of the mobile structure – See at least “Detailed Description”, page 63, lines 26-33).

Regarding claim 13, Johnson discloses wherein the short range distance sensor module comprises one or more LiDAR sensor(s) and/or one or more short-range radar(s) (Other modules may include other and/or additional sensors used to provide additional environmental information of mobile structure. Other modules may include a radar system, LIDAR systems – See at least “Detailed Description”, page 17, lines 3-8).

Regarding claim 14, Johnson discloses wherein the short range distance sensor module comprises one or more LiDAR sensor(s), the LiDAR sensors providing LiDAR scanned data (Other modules may include other and/or additional sensors used to provide additional environmental information, i.e. scanned data, of mobile structure. Other modules may include LIDAR systems – See at least “Detailed Description”, page 17, lines 3-8).

Regarding claim 15, Johnson discloses wherein the LiDAR scanned data is raw LiDAR data, the control unit is configured to translate the raw LiDAR data into a 2D grid map with probability and/or class values, the 2D grid map being the scanned view (Embodiments of the present disclosure provide sensor fusion that may combine sensor data from a plurality of sensors and present the sensor data according to a single reference frame, thereby providing enhanced data to a user that may also be more intuitive and easier to interpret than individually referenced data from each of the sensors. In certain embodiments, a 2D integrated model may also be rendered from the sensor data – See at least “Detailed Description”, page 6, lines 25-30).
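
In conventional robotics terms, claim 15's translation of raw LiDAR returns into a 2D grid map with probability values describes an occupancy grid. A generic, hypothetical illustration of that technique (not taken from the application or from Johnson; the cell size and sensor model are invented):

    # Hypothetical occupancy-grid sketch; generic technique, invented parameters.
    import math

    CELL = 1.0  # grid resolution in metres (invented)

    def fold_sweep(ranges_m, angles_rad, grid, hit=0.7, prior=0.5):
        """Fold one LiDAR sweep into per-cell occupancy probabilities."""
        for r, a in zip(ranges_m, angles_rad):
            x, y = r * math.cos(a), r * math.sin(a)  # polar return -> x/y
            cell = (int(x // CELL), int(y // CELL))  # bucket into the 2D grid
            p = grid.get(cell, prior)
            # Bayes-style update toward "occupied" on each return in the cell
            grid[cell] = (p * hit) / (p * hit + (1 - p) * (1 - hit))
        return grid

A grid of this kind is one plausible form of the claimed "scanned view" that the system then overlays on the nautical chart.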
Regarding claim 16, Johnson discloses a marine vessel comprising a marine navigation system of claim 1 (In accordance with various embodiments of the present disclosure, sensor fusion navigation systems may be provided by various portable and/or fixed navigational sensors associated with a mobile structure, i.e. marine vessel – See at least “Detailed Description”, page 6, lines 12-15).

Regarding claim 17, Johnson discloses a marine navigation method comprising (In accordance with various embodiments of the present disclosure, sensor fusion navigation systems may be provided by various portable and/or fixed navigational sensors associated with a mobile structure, i.e. marine vessel – See at least “Detailed Description”, page 6, lines 12-14) determining a position of a marine vessel (Sensors may be mounted to or within the mobile structure (e.g., a watercraft), may be integrated with other sensor assemblies, or may be integrated within a portable device. Examples of portable devices include portable global navigation satellite system (GNSS) devices – See at least “Detailed Description”, page 6, lines 21-25), displaying nautical chart information and the position of the marine vessel on the nautical chart (In one embodiment, user interface may be adapted to display a time series of various sensor information and/or other parameters as part of or overlaid on a graph or map, i.e. nautical chart, which may be referenced to a position and/or orientation of mobile structure, i.e. marine vessel – See at least “Detailed Description”, page 13, lines 18-20), providing scanned data of a surrounding environment of the marine vessel by a short range distance sensor module arranged on the marine vessel (Other modules may include other and/or additional sensors used to provide additional environmental information of mobile structure – See at least “Detailed Description”, page 17, lines 3-5), processing the scanned data to provide a scanned view of the surrounding environment (In one embodiment, user interface may be adapted to receive a sensor or control signal over communication links formed by one or more associated logic devices. User interface may be adapted to process sensor and/or control signals to determine sensor and/or other information. For example, a sensor signal may include a position of mobile structure. User interface may be adapted to display a time series of various sensor information and/or other parameters as part of or overlaid on a graph or map, i.e. nautical chart, which may be referenced to a position and/or orientation of mobile structure – See at least “Detailed Description”, page 13, lines 7-20. Controller may be adapted to execute one or more control loops for actuated device control and/or performing other various operations of mobile structure and/or system. In some embodiments, a control loop may include processing sensor signals and/or sensor information in order to control one or more operations of mobile structure and/or various elements of system – See at least “Detailed Description”, page 14, lines 31-33 and page 15, lines 1-3), presenting an augmented view of the nautical chart and scanned data on a display, and collecting scanned data over time and identifying and indicating portions of the scanned data which are determined to be continuously different from the nautical chart information in the augmented view (Embodiments of the present disclosure can implement processing steps to integrate visible or thermal images into a marine navigation system, for use with augmented reality or watercraft automation, as described herein. For example, Fig. 17 illustrates image processing steps to generate images and/or charts augmented with navigational data using sensor fusion for navigation systems in accordance with an embodiment of the disclosure – See at least “Detailed Description”, page 57, lines 13-17. In other embodiments, controller may compare data of other sensors related to other aspects of the database and/or the integrated model and determine any mismatches within the data of navigational database. The mismatches may be directed to any aspect of navigational database or integrated model. For example, mismatches between terrain features, wildlife, mobile structure position, environmental conditions, and/or other aspects of data within navigational database may be highlighted. The mismatches may then be highlighted within display, such as through renderings within the integrated model – See at least “Detailed Description”, page 27, lines 20-28).

Regarding claim 18, Johnson discloses: comparing the scanned view of scanned data with the nautical chart information (Controller may compare data of other sensors related to other aspects of the database, and/or the integrated model and determine any matches or mismatches within the data, i.e. scanned data, of navigational database. The matches and/or mismatches may be directed to any aspect of navigational database or integrated model – See at least “Detailed Description”, page 27, lines 20-24) and identifying the portions of the scanned data being continuously different to the nautical chart information so as to provide an augmented view of the nautical chart information including the scanned view (Provide relative coordinates of detected objects in images and/or other navigational data to a coordinate frame transformer to derive absolute positions for the detected objects; provide the derived absolute object positions and corresponding AIS, chart, and/or otherwise provided cataloged absolute object positions to an object matching and sensor fusion process (e.g., which matches detected objects to cataloged objects based on the derived and cataloged positions, and which adjusts, refines, and/or aligns the derived and/or cataloged positions to each other through sensor fusion and aggregation of multiple types of object detection for each object) to generate an object database, and then use the object database to generate augmented reality image renderings and/or to help provide for autopiloting of mobile structure (e.g., by updating object positions) – See at least “Detailed Description” page 57, lines 32-33 and page 58, lines 1-11).

Regarding claim 19, Johnson discloses presenting the scanned view as a graphical overlay to the nautical chart information on the display (For example, controller may be configured to combine at least a portion of the navigational data received in block with a horizon-stabilized image or a synthetic viewpoint elevated image to produce a combined image, for example, and render the combined image on a display of a user interface of the mobile structure. The combined image may include the horizon-stabilized image or the synthetic viewpoint elevated image overlaid with object identification information and/or other navigational information corresponding to the mobile structure or a detected object, for example, or a navigational chart comprising navigational data derived, at least in part, from the horizon-stabilized image or the synthetic viewpoint elevated image – See at least “Detailed Description” page 63, lines 28-33 and page 64, lines 1-4).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 3 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Johnson et al., WO2021178603 A1, in view of Kubota et al., US 20130286022 A1, hereinafter referred to as Johnson and Kubota, respectively.

Regarding claim 3, Johnson fails to explicitly disclose wherein the graphical overlay provides information to an operator or captain regarding differences between the scanned view and the nautical chart information. However, Kubota teaches wherein the graphical overlay provides information to an operator or captain regarding differences between the scanned view and the nautical chart information (As described above, the controller acquires perimeter information of the ship from the sweep memory, the other-ship information memory, and the nautical chart information memory, and displays the acquired perimeter information on the display unit while changing the scales of the first and second ranges – See at least ¶70. In the setting of the display mode, the user can set which perimeter information to display within the first range and the second range individually. Additionally, the user can set to display ships as the echoes, the TT symbols, or the AIS symbols, or both the TT and AIS symbols by superimposing onto each other – See at least ¶73). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Johnson and include the feature of wherein the graphical overlay provides information to an operator or captain regarding differences between the scanned view and the nautical chart information, as taught by Kubota, to provide a display device for simultaneously displaying detailed information of a range near a predetermined location (e.g., a position of a movable body) (See at least ¶7 of Kubota).

Regarding claim 20, Johnson fails to explicitly disclose providing information on the graphical overlay to an operator regarding differences between the scanned view and the nautical chart information. However, Kubota teaches providing information on the graphical overlay to an operator regarding differences between the scanned view and the nautical chart information (As described above, the controller acquires perimeter information of the ship from the sweep memory, the other-ship information memory, and the nautical chart information memory, and displays the acquired perimeter information on the display unit while changing the scales of the first and second ranges – See at least ¶70. In the setting of the display mode, the user can set which perimeter information to display within the first range and the second range individually. Additionally, the user can set to display ships as the echoes, the TT symbols, or the AIS symbols, or both the TT and AIS symbols by superimposing onto each other – See at least ¶73). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Johnson and include the feature of providing information on the graphical overlay to an operator regarding differences between the scanned view and the nautical chart information, as taught by Kubota, to provide a display device for simultaneously displaying detailed information of a range near a predetermined location (e.g., a position of a movable body) (See at least ¶7 of Kubota).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MAHMOUD M KAZIMI whose telephone number is (571) 272-3436. The examiner can normally be reached M-F 7am-5pm.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Erin Bishop, can be reached at 571-270-3713. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

RESPECTFULLY SUBMITTED
/MAHMOUD M KAZIMI/
Examiner, Art Unit 3665

Prosecution Timeline

Jul 30, 2024
Application Filed
Sep 23, 2025
Non-Final Rejection — §102, §103, §112
Dec 10, 2025
Response Filed
Dec 30, 2025
Final Rejection — §102, §103, §112
Feb 18, 2026
Request for Continued Examination
Mar 05, 2026
Response after Non-Final Action
Mar 07, 2026
Non-Final Rejection — §102, §103, §112
Apr 13, 2026
Interview Requested
Apr 15, 2026
Applicant Interview (Telephonic)
Apr 15, 2026
Examiner Interview Summary

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602046
AUTOMATIC WATERCRAFT PILOTING SYSTEM AND AUTOMATIC WATERCRAFT PILOTING ASSIST METHOD
2y 5m to grant Granted Apr 14, 2026
Patent 12592152
NAVIGATION PLANNING SYSTEM AND NAVIGATION PLANNING METHOD
2y 5m to grant Granted Mar 31, 2026
Patent 12566435
Predictive Modeling of Aircraft Dynamics
2y 5m to grant Granted Mar 03, 2026
Patent 12565263
INDEPENDENT STEERING CONTROL SYSTEM AND METHOD
2y 5m to grant Granted Mar 03, 2026
Patent 12554343
INPUT DEVICE
2y 5m to grant Granted Feb 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
64%
Grant Probability
79%
With Interview (+15.2%)
3y 2m
Median Time to Grant
High
PTA Risk
Based on 204 resolved cases by this examiner. Grant probability derived from career allow rate.
