DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This action is in response to amendments and remarks filed on 01/21/2026. The examiner notes the following adjustments to the claims by the applicant:
Claims 1, 5, 7-9, 14, and 17 are amended;
Claims 4, 6, and 20 are cancelled; and
No claims are added.
Therefore, Claims 1-3, 5, and 7-19 remain pending, of which Claims 1, 14, and 17 are independent claims.
In light of the instant amendments and arguments:
Further examination resulted in new rejections of Claims 1-3, 7-9, 11-12, and 14-19 under 35 U.S.C. § 102, and of Claims 5, 10, and 13 under 35 U.S.C. § 103, as detailed below.
THIS ACTION IS MADE FINAL, as necessitated by Applicant's amendment.
Response to Arguments
Applicant presents the following arguments regarding the previous office action:
A. To overcome the 35 U.S.C. § 103 rejection, the applicant has amended each independent claim to include the additional underlined limitations: “determine a target within the live sonar image; cause presentation of a highlight feature over the target within the live sonar image such that the target is highlighted from the perspective of the user; and cause the target within the live sonar image to remain highlighted while the target moves within the live sonar image.”;
B. “Johnson fails to account for tracking a target in the augmented view, including highlighting the target that is in the live sonar and causing the highlighting to follow the target around. This feature is shown in FIG. 6A of the present application, for example. When rejecting that feature, the Examiner points to a different augmented reality view that shows a stationary information pop-up. That does not include tracking a target in live sonar in the augmented view.”;
C. “Applicant submits that the rejection of Claims 2-3, 5, 7-12, 15-16, and 18-19 are overcome for at least those reasons discussed above. Therefore, Applicant submits that Claims 2-3, 5, 7-12, 15-16, and 18-19 are patentable and in condition for allowance.”
Applicant's arguments A., B., and C. are directed to the newly amended subject matter and, accordingly, have been addressed in the rejections below.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-3, 7-9, 11-12, and 14-19 are rejected under 35 U.S.C. § 102 as being anticipated by Johnson et al. (US 12,240,571 B2, henceforth Johnson).
Regarding Claim 1, Johnson discloses the limitations: an augmented reality system {system 220, Fig. 2A; “In various embodiments, it can be beneficial to generate augmented reality (AR) display views (e.g., for rendering on a display of user interface 120 of system 100, display 226 of system 220, and/or displays 426 of portable imaging device/system 420) that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view.”, Col. 51, Lns. 43-51} for use in a watercraft {Figs. 1, 4 and 28}, the augmented reality system comprising: an augmented reality device comprising a display configured to allow a user to view a portion of a surrounding environment through the display {wearable portable imaging device 420, Fig. 4}, wherein the portion of the surrounding environment corresponds to a current viewpoint of the user such that the user would otherwise see the portion of the surrounding environment if the augmented reality device was not present {as represented in Fig. 4, wearable portable imaging device 420 includes the lower shaded/augmented reality portion 432 and real imagery 434: “FIG. 4 includes some of the features above waterline 205 illustrated in scene 200 of FIGS. 2 and 3, and, in particular in the FOY of displays 426, includes detected waterlines 205b, portions 430 of the FOY that extend below respective waterlines 205b, and portions 434 of the FOY that extend above respective waterlines 205b. Portions 430 may include color and/or intensity shading 432 rendered by a controller (e.g., controller 221 of FIG. 2A) to distinguish depths, relative distances, various characteristics of bathymetric data, and/or other characteristics of various underwater features.”, Col. 29, Lns. 5-15}; one or more processors; and one or more memory devices {controller 221 and memory 222, Fig. 2A} comprising computer program code configured, when executed by the one or more processors, to cause the one or more processors {Col. 22, Lns. 26-35} to: determine a position and a line of sight of the augmented reality device {as represented in Fig. 4 for head-mounted, glass-type augmented reality device}; receive a live sonar image {use for real-time sonar data: “substantially real time sonar data may be rendered in color and prior-acquired and/or survey map sonar data may be rendered in greyscale. In some embodiments, a relative age of once real time sonar data may be indicated by reducing a chrominance level of the sonar data as the sonar data ages.”, Col. 27, Lns. 60-67}; correlate the position and the line of sight of the augmented reality device to a corresponding portion of the live sonar image {“Sonar data provided by the associated transducer assembly may be rendered using position data and/or orientation data provided by the OPS to correlate the sonar data with portion 430, for example, and/or to facilitate other rendering processing described herein.”, Col. 29, Lns. 
36-40}; and cause presentation of the live sonar image on at least one portion of the display so that the portion of the surrounding environment and the corresponding portion of the live sonar image are both visible in the display from a perspective of the user {“portable imaging device 420 may be configured to determine portion 430 of the FOV of display 426 and use an OPS and actuator in an associated transducer assembly (e.g., actuator 116 coupled to transducer assembly 112 of sonar system 110 in FIG. 1B) to ensonify at least a subset of portion 430 substantially in real time as a user adjusts a position or orientation of wearable portable imaging device 420 by, for example, moving the user's head.”, Col. 29, Lns. 28-36 and Fig. 4}; determine a target within the live sonar image; cause presentation of a highlight feature over the target within the live sonar image such that the target is highlighted from the perspective of the user {1912/1913 corresponding to a fish location, Fig. 19: “As shown in FIG. 19 hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish) within display view 1900 corresponding to a waypoint stored in memory and/or a detected object”, Col. 53, Lns. 61-67; real-time sonar data and aged/sourced sonar data (and a method to visually distinguish between the two) is discussed in Col. 27, Lns. 60-67}; and cause the target within the live sonar image to remain highlighted while the target moves within the live sonar image {in Fig. 19, multiple objects are identified according to the distance from the mobile structure/AR-glasses-user, including an augmented reality overlay indicative of a fish (1912, Fig. 19) identified by sonar; “generate augmented reality (AR) display views…that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view…As shown in FIG. 19, hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish) within display view 1900 corresponding to a waypoint stored in memory and/or a detected object.”, Col., 51, Lns. 43-67}.
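Examiner's note: solely to illustrate the correlate-and-highlight functionality discussed above, the following Python sketch shows one possible way a device position and line of sight could be correlated with the in-view portion of a live sonar image, and a tracked target kept highlighted as it moves between sonar frames. All names, data structures, and values are hypothetical and are not taken from Johnson or from the instant application.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float            # device position (metres, boat-relative) -- hypothetical
    y: float
    heading_deg: float  # line of sight, degrees clockwise from north

@dataclass
class SonarTarget:
    target_id: int
    range_m: float      # distance of the echo from the transducer
    bearing_deg: float  # bearing of the echo, degrees clockwise from north

def in_view(pose: Pose, target: SonarTarget, fov_deg: float = 60.0) -> bool:
    # Correlate the device line of sight with the portion of the live sonar
    # image that falls inside the display's field of view.
    rel = (target.bearing_deg - pose.heading_deg + 180.0) % 360.0 - 180.0
    return abs(rel) <= fov_deg / 2.0

def highlight_overlays(pose: Pose, frame: list, tracked: set) -> list:
    # One highlight overlay per tracked target still present in the current
    # sonar frame, so the highlight follows the target as the target moves.
    overlays = []
    for t in frame:
        if t.target_id in tracked and in_view(pose, t):
            rel = (t.bearing_deg - pose.heading_deg + 180.0) % 360.0 - 180.0
            overlays.append({"id": t.target_id, "screen_bearing_deg": rel,
                             "range_m": t.range_m, "style": "highlight"})
    return overlays

# The same target id reappears at a new bearing in the next frame, and the
# overlay is regenerated at the new location, i.e., the target remains highlighted.
pose = Pose(0.0, 0.0, 90.0)
tracked = {7}
print(highlight_overlays(pose, [SonarTarget(7, 56.0, 95.0)], tracked))
print(highlight_overlays(pose, [SonarTarget(7, 54.0, 101.0)], tracked))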
Regarding Claim 2, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein the augmented reality device is at least one of smart glasses or a headset, wherein the display is provided in the form of lenses {glasses mounted display in Fig. 4}, wherein causing the live sonar image to be presented on at least one portion of the display of the augmented reality device results in the live sonar image being shown in the lenses in a partially transparent manner so that both the portion of the surrounding environment and the corresponding portion of the live sonar image are visible {as represented in Fig. 4, wearable portable imaging device 420 includes the lower shaded/augmented reality portion 432 and real imagery 434: “FIG. 4 includes some of the features above waterline 205 illustrated in scene 200 of FIGS. 2 and 3, and, in particular in the FOY of displays 426, includes detected waterlines 205b, portions 430 of the FOY that extend below respective waterlines 205b, and portions 434 of the FOY that extend above respective waterlines 205b. Portions 430 may include color and/or intensity shading 432 rendered by a controller (e.g., controller 221 of FIG. 2A) to distinguish depths, relative distances, various characteristics of bathymetric data, and/or other characteristics of various underwater features.”, Col. 29, Lns. 5-15}.
Regarding Claim 3, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein the augmented reality device provided in the form of a phone, a tablet, a computer, a headset, or a marine electronic device {220, Fig. 2A and glasses mounted augmented display in Fig. 4}, wherein the augmented reality device comprises a screen {AR display view 2300, Fig. 23; 423, Fig. 4} and a camera {“FIG. 2B illustrates a diagram 201 of sensor fusion navigational system 100/220… sensor fusion navigational system 100/220 includes cameras/imaging modules (e.g., similar to visible spectrum imaging module 223 and/or infrared imaging module 224) mounted at positions 262, 264, and 266 providing a monitoring perimeter for mobile structure 101 corresponding roughly to the combined fields of view (FOVs) 263, 265, and 267, as shown.”, Col. 25, Lns. 14-28}, wherein the display is the screen {AR display view 2300, Fig. 23; imaging modules 423 and displays 426, Fig. 4}, wherein the camera is configured to generate one or more images of the portion of the surrounding environment {2330, Fig. 23}, and wherein the one or more images of the portion of the surrounding environment are presented on the screen alongside the live sonar image so that both the one or more images of the portion of the surrounding environment {“sonar system 110, which in turn includes transducer assembly 112…Transducer assembly 112 may be implemented with a sonar orientation and/or position sensor (OPS)”, Col. 17, Ln. 62 to Col. 18, Lns. 27, and “it can be beneficial to generate augmented reality (AR) display views (e.g., for rendering on a display of user interface 120 of system 100, display 226 of system 220, and/or displays 426 of portable imaging device/system 420) that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view.”, Col. 51, Lns. 43-51} and the corresponding portion of the live sonar image are visible on the screen {430, Fig. 4}.
Regarding Claim 7, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein the moving target is a fish {1912/1913 corresponding to fish location, Fig. 19: “As shown in FIG. 19 hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish)”, Col. 53, Lns. 61-65} or another underwater animal, and wherein the computer program code is configured, when executed by the one or more processors, to cause the one or more processors to: overlay at least one of a movement speed or a direction vector for the moving target on the at least one portion of the display of the augmented reality device {Col. 51, Ln. 43 – Col. 52, Ln. 50 teaches adding markings/labels/indicators to an AR display view (Fig. 19) for visible and non-visible objects (including non-visible objects like fish), and Col. 52, Lns. 26-50 teaches supplementing this by adding directional indicators (2024, Fig. 20); additionally, Col. 50, Lns. 28-43 teaches determining positional and velocity information related to objects evaluated by “controller 130 and/or 221” (i.e., Figs. 1A and 2A, respectively), which one skilled in the art would recognize may be the numerical data added to the AR display view, as in Figs. 19-20}.
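Examiner's note: as a purely illustrative aid to the movement speed and direction vector overlay discussed above, the following Python sketch (hypothetical names and values; not drawn from Johnson) derives a speed and a unit direction vector from two successive sonar-derived positions of a tracked target.

import math

def speed_and_direction(p0, p1, dt_s):
    # p0, p1: (x, y) positions in metres at consecutive sonar pings; dt_s: ping interval in seconds.
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    dist = math.hypot(dx, dy)
    speed_mps = dist / dt_s
    direction = (dx / dist, dy / dist) if dist else (0.0, 0.0)  # unit direction vector
    return speed_mps, direction

# e.g., a fish that moved about 1.5 m between pings 0.5 s apart
print(speed_and_direction((10.0, 4.0), (11.4, 4.5), 0.5))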
Regarding Claim 8, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitations: wherein the moving target is a fish or another underwater animal {1912/1913 corresponding to fish location, Fig. 19: “As shown in FIG. 19 hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish)”, Col. 53, Lns. 61-65}, and wherein the computer program code is configured, when executed by the one or more processors, to cause the one or more processors to: overlay identification information for the moving target on the at least one portion of the display of the augmented reality device {Col. 51, Ln. 43 – Col. 52, Ln. 50 teaches adding markings/labels/indicators to an AR display view (Fig. 19) for visible and non-visible objects (including non-visible objects like fish), and Col. 52, Lns. 26-50 teaches supplementing this by adding directional indicators (2024, Fig. 20); additionally, Col. 50, Lns. 28-43 teaches determining positional and velocity information related to objects evaluated by “controller 130 and/or 221” (i.e., Figs. 1A and 2A, respectively), which one skilled in the art would recognize may be the numerical data added to the AR display view, as in Figs. 19-20}.
Regarding Claim 9, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitations: wherein the moving target is a fish or another underwater animal {1912/1913 corresponding to fish location, Fig. 19: “As shown in FIG. 19 hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish)”, Col. 53, Lns. 61-65}, wherein the computer program code is configured, when executed by the one or more processors, to cause the one or more processors to: determine a position on a water surface corresponding to a current location or an anticipated location for the moving target {1912, Fig. 19}; and overlay a target representation proximate to the position on the at least one portion of the display of the augmented reality device {identification of fish location 1912/1913 in Fig. 19}.
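Examiner's note: solely to illustrate the claimed determination of a water-surface position corresponding to a moving target's current or anticipated location, the following Python sketch (hypothetical names and values; not drawn from Johnson) keeps only the horizontal components of a target's position, optionally after extrapolating its motion, to place a target representation at the surface.

def surface_marker(position, velocity=(0.0, 0.0, 0.0), lead_time_s=0.0):
    # position/velocity are (x, y, depth) tuples; the marker keeps only the
    # horizontal components after extrapolating the motion by lead_time_s.
    x = position[0] + velocity[0] * lead_time_s
    y = position[1] + velocity[1] * lead_time_s
    return (x, y)  # point on the water surface for the target representation

# marker above the current location, and above the location anticipated 2 s ahead
print(surface_marker((12.0, 30.0, 6.5)))
print(surface_marker((12.0, 30.0, 6.5), velocity=(0.0, 0.7, -0.1), lead_time_s=2.0))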
Regarding Claim 11, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitation: further comprising: a marine electronic device attached to the watercraft {elements of the sensor fusion navigational system for mobile structure 101, Fig. 1A; “As depicted in FIG. 1B, mobile structure 101 includes actuated sonar system 110, which in tum includes transducer assembly 112 coupled to transom 107b of mobile structure 101 through assembly bracket/actuator 116 and transom bracket/electrical conduit 114.”, Col. 15, Lns. 62-66}, wherein the augmented reality device {AR system 220, Fig. 2A} is configured to record a live video feed {imaging modules 223 and 224, Fig. 2A} and send the live video feed to the marine electronic device or to the one or more memory devices {controller 221 and memory 222, Fig. 2A}.
Regarding Claim 12, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein at least one of a travel path representation, a target location representation, a hazard representation, topographical representations indicating depth information {in Fig. 19, multiple objects are identified according to the distance from the mobile structure/AR-glasses-user, with a flag added (1910,1912, 1920, 1922) to symbolically identify the object and mark the distance with a numerical value; Col. 51, Lns 43-60}, or a second image are presented on the display, wherein the second image is at least one of a sonar image, a radar image, an underwater image, or a video.
Regarding Claim 14, Johnson discloses the limitations: an augmented reality device {system 220, Fig. 2A, and wearable portable imaging device 420, Fig. 4; “In various embodiments, it can be beneficial to generate augmented reality (AR) display views (e.g., for rendering on a display of user interface 120 of system 100, display 226 of system 220, and/or displays 426 of portable imaging device/system 420) that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view.”, Col. 51, Lns. 43-51} for use in a watercraft {Figs. 1, 4 and 28}, the augmented reality device comprising: a display configured to allow the user to view a portion of a surrounding environment through the display {wearable portable imaging device 420, Fig. 4}, wherein the portion of the surrounding environment corresponds to a current viewpoint such that the user would otherwise see the portion of the surrounding environment if the augmented reality device was not present {as represented in Fig. 4, wearable portable imaging device 420 includes the lower shaded/augmented reality portion 432 and real imagery 434: “FIG. 4 includes some of the features above waterline 205 illustrated in scene 200 of FIGS. 2 and 34, and, in particular in the FOY of displays 426, includes detected waterlines 205b, portions 430 of the FOY that extend below respective waterlines 205b, and portions 434 of the FOY that extend above respective waterlines 205b. Portions 430 may include color and/or intensity shading 432 rendered by a controller (e.g., controller 221 of FIG. 2A) to distinguish depths, relative distances, various characteristics of bathymetric data, and/or other characteristics of various underwater features.”, Col. 29, Lns. 5-15}; one or more processors; and one or more memory devices {controller 221 and memory 222, Fig. 2A} comprising computer program code configured, when executed by the one or more processors, to cause the one or more processors {Col. 22, Lns. 26-35} to: determine a position and a line of sight {as represented in Fig. 4 for head-mounted, glass-type augmented reality device}; receive a live sonar image {use for real-time sonar data: “substantially real time sonar data may be rendered in color and prior-acquired and/or survey map sonar data may be rendered in greyscale. In some embodiments, a relative age of once real time sonar data may be indicated by reducing a chrominance level of the sonar data as the sonar data ages.”, Col. 27, Lns. 60-67};
correlate the position and the line of sight to a corresponding portion of the live sonar image {“Sonar data provided by the associated transducer assembly may be rendered using position data and/or orientation data provided by the OPS to correlate the sonar data with portion 430, for example, and/or to facilitate other rendering processing described herein.”, Col. 29, Lns. 36-40}; and cause presentation of the live sonar image on at least one portion of the display so that the portion of the surrounding environment and the corresponding portion of the live sonar image are both visible in the display from a perspective of the user {“portable imaging device 420 may be configured to determine portion 430 of the FOV of display 426 and use an OPS and actuator in an associated transducer assembly (e.g., actuator 116 coupled to transducer assembly 112 of sonar system 110 in FIG. 1B) to ensonify at least a subset of portion 430 substantially in real time as a user adjusts a position or orientation of wearable portable imaging device 420 by, for example, moving the user's head.”, Col. 29, Lns. 28-36 and Fig. 4}; determine a target within the live sonar image; cause presentation of a highlight feature over the target within the live sonar image such that the target is highlighted from the perspective of the user {1912/1913 corresponding to a fish location, Fig. 19: “As shown in FIG. 19 hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish) within display view 1900 corresponding to a waypoint stored in memory and/or a detected object”, Col. 53, Lns. 61-67; real-time sonar data and aged/sourced sonar data (and a method to visually distinguish between the two) is discussed in Col. 27, Lns. 60-67}; and cause the target within the live sonar image to remain highlighted while the target moves within the live sonar image {in Fig. 19, multiple objects are identified according to the distance from the mobile structure/AR-glasses-user, including an augmented reality overlay indicative of a fish (1912, Fig. 19) identified by sonar; “generate augmented reality (AR) display views…that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view…As shown in FIG. 19, hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish) within display view 1900 corresponding to a waypoint stored in memory and/or a detected object.”, Col., 51, Lns. 43-67}.
Regarding Claim 15, Johnson discloses all the limitations of Claim 14, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein the augmented reality device is provided in the form of at least one of smart glasses or a headset, wherein the display is provided in the form of lenses {glasses mounted display in Fig. 4}, wherein the display is provided in the form of lenses, wherein causing the live sonar image to be presented in the display of the augmented reality device results in the live sonar image being shown in the lenses in a partially transparent manner so that both the portion of the surrounding environment and the corresponding portion of the live sonar image are visible {as represented in Fig. 4, wearable portable imaging device 420 includes the lower shaded/augmented reality portion 432 and real imagery 434: “FIG. 4 includes some of the features above waterline 205 illustrated in scene 200 of FIGS. 2 and 3, and, in particular in the FOY of displays 426, includes detected waterlines 205b, portions 430 of the FOY that extend below respective waterlines 205b, and portions 434 of the FOY that extend above respective waterlines 205b. Portions 430 may include color and/or intensity shading 432 rendered by a controller (e.g., controller 221 of FIG. 2A) to distinguish depths, relative distances, various characteristics of bathymetric data, and/or other characteristics of various underwater features.”, Col. 29, Lns. 5-15}.
Regarding Claim 16, Johnson discloses all the limitations of Claim 14, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein the augmented reality device provided in the form of a phone, a tablet, a computer, a headset, or a marine electronic device {220, Fig. 2A and glasses mounted augmented display in Fig. 4}, wherein the augmented reality device comprises a screen {AR display view 2300, Fig. 23; 423, Fig. 4} and a camera {“FIG. 2B illustrates a diagram 201 of sensor fusion navigational system 100/220… sensor fusion navigational system 100/220 includes cameras/imaging modules (e.g., similar to visible spectrum imaging module 223 and/or infrared imaging module 224) mounted at positions 262, 264, and 266 providing a monitoring perimeter for mobile structure 101 corresponding roughly to the combined fields of view (FOVs) 263, 265, and 267, as shown.”, Col. 25, Lns. 14-28}, wherein the display is the screen {AR display view 2300, Fig. 23; imaging modules 423 and displays 426, Fig. 4}, wherein the camera is configured to generate one or more images of the portion of the surrounding environment {2330, Fig. 23}, and wherein the one or more images of the portion of the surrounding environment are presented on the screen alongside the live sonar image so that both the one or more images of the portion of the surrounding environment {“sonar system 110, which in turn includes transducer assembly 112…Transducer assembly 112 may be implemented with a sonar orientation and/or position sensor (OPS)”, Col. 17, Ln. 62 to Col. 18, Lns. 27, and “it can be beneficial to generate augmented reality (AR) display views (e.g., for rendering on a display of user interface 120 of system 100, display 226 of system 220, and/or displays 426 of portable imaging device/system 420) that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view.”, Col. 51, Lns. 43-51} and the corresponding portion of the live sonar image are visible on the screen {430, Fig. 4}.
Regarding Claim 17, Johnson discloses the limitations: a method for using an augmented reality device {system 220, Fig. 2A, and wearable portable imaging device 420, Fig. 4; “In various embodiments, it can be beneficial to generate augmented reality (AR) display views (e.g., for rendering on a display of user interface 120 of system 100, display 226 of system 220, and/or displays 426 of portable imaging device/system 420) that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view.”, Col. 51, Lns. 43-51}, the method comprising: determining a position and a line of sight of the augmented reality device {as represented in Fig. 4 for head-mounted, glass-type augmented reality device}; receiving a live sonar image {use for real-time sonar data: “substantially real time sonar data may be rendered in color and prior-acquired and/or survey map sonar data may be rendered in greyscale. In some embodiments, a relative age of once real time sonar data may be indicated by reducing a chrominance level of the sonar data as the sonar data ages.”, Col. 27, Lns. 60-67}; correlating the position and the line of sight of the augmented reality device to a corresponding portion of the live sonar image {“Sonar data provided by the associated transducer assembly may be rendered using position data and/or orientation data provided by the OPS to correlate the sonar data with portion 430, for example, and/or to facilitate other rendering processing described herein.”, Col. 29, Lns. 36-40}; causing presentation of the live sonar image on at least one portion of the display so that the portion of the surrounding environment and the corresponding portion of the live sonar image are both visible in the display from a perspective of the user {“portable imaging device 420 may be configured to determine portion 430 of the FOV of display 426 and use an OPS and actuator in an associated transducer assembly (e.g., actuator 116 coupled to transducer assembly 112 of sonar system 110 in FIG. 1B) to ensonify at least a subset of portion 430 substantially in real time as a user adjusts a position or orientation of wearable portable imaging device 420 by, for example, moving the user's head.”, Col. 29, Lns. 28-36 and Fig. 4}; determine a target within the live sonar image; cause presentation of a highlight feature over the target within the live sonar image such that the target is highlighted from the perspective of the user {1912/1913 corresponding to a fish location, Fig. 19: “As shown in FIG. 19 hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish) within display view 1900 corresponding to a waypoint stored in memory and/or a detected object”, Col. 53, Lns. 61-67; real-time sonar data and aged/sourced sonar data (and a method to visually distinguish between the two) is discussed in Col. 27, Lns. 60-67}; and cause the target within the live sonar image to remain highlighted while the target moves within the live sonar image {in Fig. 19, multiple objects are identified according to the distance from the mobile structure/AR-glasses-user, including an augmented reality overlay indicative of a fish (1912, Fig. 
19) identified by sonar; “generate augmented reality (AR) display views…that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view…As shown in FIG. 19, hidden waypoint flags 1910 and 1912 each include a flagpole base 1911 and 1913, respectively, and are positioned and/or oriented on surface 205a to indicate a range to (e.g., 56 m, or 71 m) or relative position of a submerged object type (e.g., navigational hazard, or fish) within display view 1900 corresponding to a waypoint stored in memory and/or a detected object.”, Col., 51, Lns. 43-67}.
Regarding Claim 18, Johnson discloses all the limitations of Claim 17, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein the augmented reality device is provided in the form of at least one of smart glasses or a headset, wherein the display is provided in the form of lenses {glasses mounted display in Fig. 4}, wherein the display is provided in the form of lenses, wherein causing the live sonar image to be presented in the display of the augmented reality device results in the live sonar image being shown in the lenses in a partially transparent manner so that both the portion of the surrounding environment and the corresponding portion of the live sonar image are visible {as represented in Fig. 4, wearable portable imaging device 420 includes the lower shaded/augmented reality portion 432 and real imagery 434: “FIG. 4 includes some of the features above waterline 205 illustrated in scene 200 of FIGS. 2 and 3, and, in particular in the FOY of displays 426, includes detected waterlines 205b, portions 430 of the FOY that extend below respective waterlines 205b, and portions 434 of the FOY that extend above respective waterlines 205b. Portions 430 may include color and/or intensity shading 432 rendered by a controller (e.g., controller 221 of FIG. 2A) to distinguish depths, relative distances, various characteristics of bathymetric data, and/or other characteristics of various underwater features.”, Col. 29, Lns. 5-15}.
Regarding Claim 19, Johnson discloses all the limitations of Claim 17, as discussed supra. In addition, Johnson explicitly recites the limitation: wherein the augmented reality device provided in the form of a phone, a tablet, a computer, a headset, or a marine electronic device {220, Fig. 2A and glasses mounted augmented display in Fig. 4}, wherein the augmented reality device comprises a screen {AR display view 2300, Fig. 23; 423, Fig. 4} and a camera {“FIG. 2B illustrates a diagram 201 of sensor fusion navigational system 100/220… sensor fusion navigational system 100/220 includes cameras/imaging modules (e.g., similar to visible spectrum imaging module 223 and/or infrared imaging module 224) mounted at positions 262, 264, and 266 providing a monitoring perimeter for mobile structure 101 corresponding roughly to the combined fields of view (FOVs) 263, 265, and 267, as shown.”, Col. 25, Lns. 14-28}, wherein the display is the screen {AR display view 2300, Fig. 23; imaging modules 423 and displays 426, Fig. 4}, wherein the camera is configured to generate one or more images of the portion of the surrounding environment {2330, Fig. 23}, and wherein the one or more images of the portion of the surrounding environment are presented on the screen alongside the live sonar image so that both the one or more images of the portion of the surrounding environment {“sonar system 110, which in turn includes transducer assembly 112…Transducer assembly 112 may be implemented with a sonar orientation and/or position sensor (OPS)”, Col. 17, Ln. 62 to Col. 18, Lns. 27, and “it can be beneficial to generate augmented reality (AR) display views (e.g., for rendering on a display of user interface 120 of system 100, display 226 of system 220, and/or displays 426 of portable imaging device/system 420) that include graphics or AR overlays designed to indicate hidden or metadata characteristics of a particular identified object or position, or chart-referenced object or position, intuitively so as to reduce risk of user confusion when viewing the AR display view.”, Col. 51, Lns. 43-51} and the corresponding portion of the live sonar image are visible on the screen {430, Fig. 4}.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 5 and 10 are rejected under 35 U.S.C. § 103 as being unpatentable over the combination of Johnson and Snyder et al. (US 2021/0255627 A1, henceforth Snyder).
Regarding Claim 5, Johnson discloses all the limitations of Claim 1, as discussed supra. Johnson does not appear to explicitly recite the limitation: wherein the augmented reality device is configured to enable a selection or deselection of the target by a user by detecting a particular hand gesture in a particular direction, a voice command, a particular eye movement, or another head or body movement.
However, Snyder explicitly recites the limitation: wherein the augmented reality device is configured to enable a selection or deselection of the target by a user by detecting a particular hand gesture in a particular direction, a voice command, a particular eye movement, or another head or body movement {“As shown in FIG. 10, the user may yell “Stop the boat!”, and the system 100 may cause marine devices 104 to take actions to stop the marine vessel 10. Some non-limiting examples of voice commands include volume-up, volume-down, display sonar, display chart, man overboard (MOB), record sonar, stop recording sonar, way point, new route, and various other words and/or phrases that may be associated with marine based applications.”, ¶[0119]}.
Johnson and Snyder are analogous art because they both deal with the augmented reality display of information to a watercraft operator.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Johnson and Snyder before them, to modify the teachings of Johnson to include the teachings of Snyder in order to provide hands-free control of watercraft operation, for example quickly stopping the engine by voice command {¶[0119]}.
Regarding Claim 10, Johnson discloses all the limitations of Claim 1, as discussed supra. In addition, Johnson explicitly recites the limitation: adjust operation of a motor or a sonar transducer array based on the user input, wherein the operation of the motor or the sonar transducer array is adjusted by turning the motor or the sonar transducer array on or off, by changing an orientation of the motor or the sonar transducer array, or by changing a motor speed {sonar data used to affect mobile structure control: “In some embodiments, image data, orientation and/or position data, and/or sonar data acquired and/or processed in blocks 602-614 may be used to control operation of a mobile structure 101, such as by controlling steering sensor/actuator 150 and/or propulsion system 170 to steer mobile structure 101 to avoid hazards and/or to follow a provided route, to steer mobile structure 101 according to an orientation of display 226, for example, and/or according to positions and/or depths of floor 206, bottom feature 207, fish 208, and/or submerged objects 209 as well as other terrain and weather features.”, ¶[0151]}.
Johnson does not appear to explicitly recite the limitation: wherein the augmented reality device is configured to enable a selection or deselection of the target by a user by detecting a particular hand gesture in a particular direction, a voice command, a particular eye movement, or another head or body movement.
However, Snyder explicitly recites the limitation: wherein the augmented reality device is configured to enable a selection or deselection of the target by a user by detecting a particular hand gesture in a particular direction, a voice command, a particular eye movement, or another head or body movement {“As shown in FIG. 10, the user may yell “Stop the boat!”, and the system 100 may cause marine devices 104 to take actions to stop the marine vessel 10. Some non-limiting examples of voice commands include volume-up, volume-down, display sonar, display chart, man overboard (MOB), record sonar, stop recording sonar, way point, new route, and various other words and/or phrases that may be associated with marine based applications.”, ¶[0119]}. The rationale and motivation to combine Johnson and Snyder are the same as those set forth above with respect to Claim 5.
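Examiner's note: purely for illustration of the selection/deselection and device-control functionality addressed by the combination of Johnson and Snyder, the following Python sketch maps recognized user inputs (voice phrases or gestures) to a target selection state and a simple motor on/off state. The command strings and field names are hypothetical and are not taken from either reference.

def handle_user_input(event: dict, state: dict) -> dict:
    # event: a recognized voice phrase or gesture; state: current selection/device state.
    kind, value = event.get("kind"), event.get("value")
    if kind == "voice" and value == "select target":
        state["selected_target"] = event.get("target_id")
    elif kind == "voice" and value == "deselect target":
        state["selected_target"] = None
    elif kind == "voice" and value == "stop the boat":
        state["motor_on"] = False            # adjust motor operation per the user input
    elif kind == "gesture" and value == "point":
        state["selected_target"] = event.get("target_id")
    return state

state = {"selected_target": None, "motor_on": True}
state = handle_user_input({"kind": "voice", "value": "select target", "target_id": 7}, state)
state = handle_user_input({"kind": "voice", "value": "stop the boat"}, state)
print(state)  # {'selected_target': 7, 'motor_on': False}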
Claim 13 is rejected under 35 U.S.C. § 103 as being unpatentable over the combination of Johnson and Smith et al. (US 10,585,190 B2, henceforth Smith).
Regarding Claim 13, Johnson discloses all the limitations of Claim 12, as discussed supra. Johnson does not appear to explicitly recite the limitation: wherein the second image is presented on the display, wherein a particular position is emphasized in both the second image and in the live sonar image.
However, Smith explicitly recites the limitation: wherein the second image is presented on the display {right hand image in Fig. 6A, see below}, wherein a particular position is emphasized in both the second image and in the live sonar image {as represented in Fig. 6A, in which sonar identified fish are labeled with the corresponding sonar determined depth: “Fish identified based on the signals obtained from the various transducers of the sonar device are shown in their associated regions, along with depth information. The right image shows a depth-based view of the highlighted quadrant, with a visualization of the different fish depths.”, Col. 12, Lns. 23-28}.
Johnson and Smith are analogous art because they both deal with displaying data and images captured with sonar technology.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Johnson and Smith before them, to modify the teachings of Johnson to include the teachings of Smith in order to provide additional context for the displayed sonar data {Fig. 6A of Smith, reproduced below}.
[Smith, Fig. 6A: media_image1.png (317 × 556, greyscale)]
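Examiner's note: solely to illustrate emphasizing the same particular position in both the live sonar image and a second image, the following Python sketch (hypothetical view parameters; not drawn from Johnson or Smith) converts one shared world position into pixel coordinates in each of two views so that a marker can be rendered at that position in both.

def world_to_pixel(world_xy, view):
    # view: dict with 'origin' (world coordinates of pixel (0, 0)) and 'scale' (pixels per metre).
    px = (world_xy[0] - view["origin"][0]) * view["scale"]
    py = (world_xy[1] - view["origin"][1]) * view["scale"]
    return (round(px), round(py))

target_world = (15.0, 42.0)                              # shared position of interest
sonar_view = {"origin": (0.0, 0.0), "scale": 4.0}        # live sonar image
second_view = {"origin": (-50.0, -50.0), "scale": 1.0}   # second image (e.g., chart or radar)
marks = {name: world_to_pixel(target_world, v)
         for name, v in (("sonar", sonar_view), ("second", second_view))}
print(marks)  # the same world position emphasized in both views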
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
US 10,481,259 B2 – Tracking targets with live sonar, with the targets being highlighted on the display image using white triangles.
US 2020/0341140 A1 – Using icons to track the location of fish in a real-time sonar image, with the size of the fish, or school of fish, being represented by the size of the icon;
WO 2021/119366 A1 - The use of bounding boxes and labels to identify stationary and moving objects in a real-time image of a marine environment, including sonar imaging.
US 2015/0078123 A1 – Supplementing a live image of a marine environment with information tags 700, which are displayed at all times on the screen, including tagging fish.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RICHARD EDWIN GEIST whose telephone number is (703)756-5854. The examiner can normally be reached Monday-Friday, 9am-6pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christian Chace can be reached at (571) 272-4190. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.E.G./Examiner, Art Unit 3665 /CHRISTIAN CHACE/Supervisory Patent Examiner, Art Unit 3665