DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This Final rejection is in response to Applicant’s amendment of 22 July 2025. Claims 1-8 and 18-27 have been canceled. Claims 9-17 and 28-38 are currently pending, as discussed below.
Examiner notes that the rejections are based on the broadest reasonable interpretation of the claim language. Applicant is kindly invited to consider each reference as a whole. References are to be interpreted as they would be by one of ordinary skill in the art rather than by a novice. See MPEP 2141. Therefore, the relevant inquiry when interpreting a reference is not what the reference expressly discloses on its face but what the reference would teach or suggest to one of ordinary skill in the art.
Response to Arguments
Applicant's arguments filed 7/22/2025 have been fully considered and are persuasive in part. Arguments regarding the 35 U.S.C. § 101 rejection have been fully considered and are persuasive; the 35 U.S.C. § 101 rejection of claims 9-17 and 28-38 set forth in the Office action of 4/23/2025 is withdrawn. The limitations “sensor module” and “controller” each include a generic placeholder (module, controller), are followed by functional language (“configured to detect gestures…”, “configured to receive and store…”), and are not modified by language that provides sufficient structure under prong (C) of MPEP § 2181; therefore, the interpretation of “sensor module” and “controller” under 35 U.S.C. 112(f) is sustained. The 35 U.S.C. 112(f) interpretation of “marine device” is withdrawn. Arguments regarding the 35 U.S.C. § 112(a) rejections of claims 9-17 and 28-38 have been fully considered and are persuasive; the 35 U.S.C. § 112(a) rejections of claims 9-17 and 28-38 are withdrawn. The amendment to claim 31 has been fully considered and is persuasive; the 35 U.S.C. § 112(a) rejection of claim 31 is withdrawn. Arguments regarding the 35 U.S.C. § 112(b) rejections of claims 9, 12, 16, 28, 31, 35, 36, and 38 have been fully considered and are persuasive; the 35 U.S.C. § 112(b) rejections of claims 9, 12, 16, 28, 31, 35, 36, and 38 are withdrawn. Arguments regarding the obviousness rejection of the independent claims have been fully considered and are not persuasive; the 35 U.S.C. § 103 rejection of claims 9, 11-12, 15-17, 28, 30-31, and 34-38 is sustained.
Examiner’s Response: Examiner has carefully considered Applicant’s arguments and respectfully disagrees. Estabrook teaches controlling operations of a marine vessel through gestures because Examiner interprets “gestures of a user” to encompass a user moving part of their body to move the helm. The argument that a person of ordinary skill in the art would be discouraged from the proposed combination of Estabrook and Bertrand because the combination would require the addition of sensors is not persuasive, because Estabrook does not prohibit the addition of sensors. Furthermore, neither Estabrook nor Bertrand teaches that the combination of detecting gestures to control the vessel would not work for its intended use, and Estabrook does not restrict the addition of sensors to detect gestures; therefore, it would have been obvious to modify the helm control mechanism with Bertrand’s handheld device to improve control of the operation of the marine vessel.
Information Disclosure Statement
The information disclosure statement (IDS) filed on 7/11/2025 has been considered by the examiner.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
Sensor module in claims 9, 10, 11, 28, 29 and 30
Controller in claims 9 and 28
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Upon review of the specification, the following appears to be the corresponding structure for the sensor module:
" the sensor module may include a radar-based motion sensor to detect gestures and/or actions from the user ", [¶ 10]
" the sensor module may include a vibration sensor for detecting vibrations of the user (e.g., stomping, sounds, vocalizations)", [¶ 11]
“The sensor module 106 may transmit signals to the controller 102 indicating a detected condition, event, and/or user input. The sensor module 106 may include various sensors for detecting conditions, events, and/or user inputs. Some non-limiting examples of sensors that may be part of the sensor module 106 include tactile sensors (e.g., pressure sensors, strain gauges, capacitive touch sensors), three-axis sensors, analog sensors, vibration sensors, chemical sensors, electromagnetic sensors, environmental sensors, flow sensors, navigational sensors, position sensors, optical sensors, and temperature sensors. The sensor module 106 may be located anywhere relative to the marine vessel, such as within an MFD, mounted separately to the marine vessel, and/or within a mobile media device 50 (e.g., shown in FIG. 3)” [¶ 11]
“For example, as shown in FIG. 3, the sensor module 106 may be built-in to a mobile media or smart device, which senses, processes, and/or interprets a user input locally to determine what signal to send to the remote MFD, built-in to the marine vessel 10, which then uses a communication bus or integration hub to transmit a control signal to the marine device 104, which may receive and then further process the data and/or signal. In some embodiments, the sensor module 106 may include a camera and/or other optical or visual sensor as well as a local dedicated image processor for rapidly analyzing data captured via the one or more sensors.” [¶ 43]
“the sensor module 106 (e.g., microphone, accelerometer, vibration sensor) positioned in the foot pedal” [¶ 44]
“The sensor module 106 may include various sensors (e.g., touchscreen, touchpad, trackball, camera, orientation sensor, 3D laser, facial recognition sensor, facial expression or gesture recognition sensor, electroencephalography (EEG) sensor) configured to detect user inputs for controlling operation of the sonar system 120 and/or display via the controller 102. In some embodiments, the sensor module 106 may be positioned within or correspond with one or more devices/locations (e.g., MFD, handheld device or FOB, phone, tablet, computing device, adjustable device mount, VR headset, glasses frames or other headgear, trolling motor foot pedal or buttons, other user input assemblies). Some non-limiting examples of user inputs include direction of user's face; direction or orientation of FOB, mobile media device 50, and/or sensor module 106 in user's hand or worn on user's body, adjustably or permanently mounted to marine vessel 10, chair, trolling motor assembly, and/or other marine device 104 thereon; pinch-zoom-pan or other touch; and body or hand gestures.” [¶ 53]
Upon review of the specification, the following appears to be the corresponding structure for the controller:
“the controller 102 may be an assembly or system of multiple processors and/or circuitry distributed across various devices.” [¶ 43]
“the controller 102 may be a processor built-in to the MFD of the marine vessel 10 and/or an integration hub 401 (which may be separate from or within the MFD)” [¶ 44]
“The controller 102 may include at least one processing component (e.g., processor) and memory including instructions configured to cause the processor to perform various actions and/or functions including display of images on the display 40 associated with the sonar system 120.” [¶ 61]
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 9, 11-12, 15-17, 28, 30-31, and 34-38 are rejected under 35 U.S.C. 103 as being unpatentable over Estabrook (US 20170205828 A1) in view of Bertrand et al. (US 20190137993 A1).
Regarding claim 9, Estabrook teaches A system for controlling operations of a marine vessel through gestures of a user, the system comprising: a sensor module mounted to the marine vessel (see at least [¶ 7, Estabrook]: “two or more sensor units may be implemented to detect helm movement. For example, one sensor unit may be mounted to the helm, as mentioned above, and a second, separate sensor unit may be mounted at another location on the marine vessel”) and configured to detect positioning of the helm (see at least [¶ 5, Estabrook]: “The present disclosure is directed to technology that encompasses the use of a sensor to detect helm movement. Once helm movement is detected, autonomous control of the marine vessel may be disengaged to allow a user to manually steer the marine vessel by providing input to the helm”) by: detecting and transmitting a first position of one or more objects within a field of view of the sensor module at a first time, and detecting and transmitting a second position of the one or more objects within the field of view at a second time, wherein the second time is a time differential later than the first time (position of the object/helm is measured at a certain time, and then position is measured 1 second later, see at least [¶ 53, Estabrook]: “helm movement may be determined based upon the measured rate of helm movement in the same direction over a time period that yields an accumulated helm movement exceeding the predetermined autopilot disengagement threshold helm angle. For example, if a gyroscope sensor measures helm 208 being turned at a rate of 30 degrees per second and this rate is maintained for a full second, then processing unit 220 may calculate that helm 208 has been turned 30 degrees, and compare this value to the disengagement threshold helm angle to determine whether movement of the helm 208 exceeds a particular disengagement threshold helm angle”); and a controller configured to: receive and store the detected first and second positions of the one or more objects from the sensor module in a memory (helm movement data is stored in the memory unit 356, see at least [¶ 71, Estabrook]: “processing unit 352 may be configured to retrieve, process, and/or analyze data stored in memory unit 356, to store data to memory unit 356, to replace data stored in memory unit 356, to analyze helm movement data and/or marine vessel movement data received from sensor units (e.g., sensor units 218.1-218.3), to process helm movement data and marine vessel movement data”), generate a difference data set by comparing the first and second positions of the one or more objects, modify the difference data set by filtering difference data falling within a marine vessel movement data set, wherein the marine vessel movement data set is based on movement of the marine vessel between the first and second times, process the modified difference data set to identify helm movement (the difference data is the helm movement data and is modified by subtracting the marine vessel movement data resulting in the actual helm movement by the user, see at least [¶ 7, 56, Estabrook]: “the first sensor unit measures helm movement and the second sensor unit measures the overall movement of the marine vessel. The second sensor unit may transmit marine vessel movement data to one or more marine vessel components.
The marine vessel movement data may provide a base reference movement that is utilized by the one or more marine vessel components and/or the first sensor unit to subtract overall marine vessel movement from the helm movement. The resulting difference may better indicate actual helm movement”), determine a desired action based on the amount of helm movement (disengage autopilot action is determined based on the identified gesture of the helm, see at least [¶ 7, Estabrook]: “CCU 206 may disengage the autonomous steering of the rudder 216 when the movement of the helm exceeds a predetermined autopilot disengagement threshold. For example, the predetermined autopilot disengagement threshold may be an amount of rotation (e.g., accelerometer data from sensor unit indicating rotation in excess of 45 degrees will cause disengagement of the autonomous steering), rate of rotation (e.g., gyroscope data from sensor unit indicating rotation in excess of 3 degrees/second for 2 seconds), or rate of rotational acceleration (e.g., gyroscope data from sensor unit indicating acceleration of rotation in excess of 3 degrees/second for 2 seconds). CCU 206 may cause autopilot 204 to enter a standby mode or to temporarily pause sending control signals to ECU 208 when autopilot functionality is disengaged”), and transmit a signal to a marine device to cause the marine device to operate according to the desired action (autopilot steering system 200 sends a signal to a marine device (CCU 206) to disengage the autopilot, see at least [¶ 44, Estabrook]: “one or more other components of autopilot steering system 200 may determine when the movement of the helm exceeds a predetermined autopilot disengagement threshold and transmit a command to autopilot 204, which is received by CCU 206 and acted upon to disengage autopilot 204. In such a case, CCU 206 may disengage autopilot 204 without necessarily analyzing the movement data”).
Estabrook does not explicitly teach a sensor module configured to detect gestures of the user, identify one or more gestures, and determine a desired action based on the identified one or more gestures.
Bertrand, directed to a handheld device for navigating a marine vessel, teaches a sensor module configured to detect gestures of the user, identify one or more gestures, and determine a desired action based on the identified one or more gestures (the handheld device controls the vessel’s motors based on detected user gestures, see at least [¶ 57, Bertrand]: “The handheld device 200 is configured to control the configuration and operation of the one or more marine vessel 100 motors (e.g., trolling motor(s) 120, primary motor(s) 122, and/or thruster(s) 124) based on a current position or orientation of the handheld device 200, user movements (e.g., gestures), and/or inputs received from user interface 220”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook’s method of providing user input to the helm to incorporate the teachings of Bertrand, which teaches a sensor module configured to detect gestures of the user, identify one or more gestures, and determine a desired action based on the identified one or more gestures, since both references are related to controlling marine vessels and incorporation of the teachings of Bertrand increases the versatility and safety of the overall system by utilizing gesture detection to steer the vessel in a case where the user is unable to physically reach the helm due to an emergency.
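Examiner’s note (illustrative only): the mapping above can be summarized, under stated assumptions, as a minimal sketch. The sketch below is not code from Estabrook or Bertrand; the function names, the single-axis position samples, and the 45-degree disengagement threshold are hypothetical, chosen only to mirror the examples quoted from Estabrook at ¶¶ 7 and 53 (gyroscope at 30 degrees/second sustained for one second, compared against a disengagement threshold):

# Illustrative sketch only (not from Estabrook or Bertrand); the
# threshold and sample values are hypothetical.

DISENGAGE_THRESHOLD_DEG = 45.0  # hypothetical autopilot disengagement threshold

def identify_helm_movement(first_positions, second_positions, vessel_movement):
    """Generate a difference data set from positions sampled at two times,
    then filter out motion attributable to the vessel itself (Estabrook's
    "base reference movement"), leaving actual helm movement."""
    difference = [p2 - p1 for p1, p2 in zip(first_positions, second_positions)]
    # Subtract overall marine vessel movement so only user-induced
    # helm movement remains in the modified difference data set.
    return [d - v for d, v in zip(difference, vessel_movement)]

def determine_action(helm_movement_deg):
    """Map accumulated helm movement to a desired action: disengage the
    autopilot when the threshold is exceeded, otherwise take no action."""
    if abs(helm_movement_deg) > DISENGAGE_THRESHOLD_DEG:
        return "disengage_autopilot"
    return "no_action"  # below threshold: continue autonomous control

# Example mirroring Estabrook ¶ 53: 32 degrees of raw object movement less
# 2 degrees of vessel movement leaves 30 degrees of helm movement, below
# the 45-degree threshold, so the autopilot remains engaged.
movement = identify_helm_movement([0.0], [32.0], [2.0])
assert determine_action(sum(movement)) == "no_action"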
Regarding claim 28, Estabrook teaches A marine electronic device for controlling operations of a marine vessel, the marine electronic device comprising: a display (chart plotter contains a display, see at least [¶ 57, Estabrook]: “A marine vessel component (e.g., course control unit, chart plotter, etc.) may receive the helm movement data and cause disengagement of the autonomous steering of the rudder when the helm movement data exceeds the predetermined autopilot disengagement threshold”); a sensor module configured to detect positioning of the helm by (see at least [¶ 5, Estabrook]: “The present disclosure is directed to technology that encompasses the use of a sensor to detect helm movement. Once helm movement is detected, autonomous control of the marine vessel may be disengaged to allow a user to manually steer the marine vessel by providing input to the helm”): detecting and transmitting a first position of one or more objects within a field of view of the sensor module at a first time, and detecting and transmitting a second position of the one or more objects within the field of view at a second time, wherein the second time is a time differential later than the first time (position of the object/helm is measured at a certain time, and then position is measured 1 second later, see at least [¶ 53, Estabrook]: “helm movement may be determined based upon the measured rate of helm movement in the same direction over a time period that yields an accumulated helm movement exceeding the predetermined autopilot disengagement threshold helm angle. For example, if a gyroscope sensor measures helm 208 being turned at a rate of 30 degrees per second and this rate is maintained for a full second, then processing unit 220 may calculate that helm 208 has been turned 30 degrees, and compare this value to the disengagement threshold helm angle to determine whether movement of the helm 208 exceeds a particular disengagement threshold helm angle”); and a controller configured to: receive and store the detected first and second positions of the one or more objects from the sensor module in a memory (helm movement data is stored in the memory unit 356, see at least [¶ 71, Estabrook]: “processing unit 352 may be configured to retrieve, process, and/or analyze data stored in memory unit 356, to store data to memory unit 356, to replace data stored in memory unit 356, to analyze helm movement data and/or marine vessel movement data received from sensor units (e.g., sensor units 218.1-218.3), to process helm movement data and marine vessel movement data”), generate a difference data set by comparing the first and second positions of the one or more objects, modify the difference data set by filtering difference data falling within a marine vessel movement data set, wherein the marine vessel movement data set is based on movement of the marine vessel between the first and second times, process the modified difference data set to identify helm movement (the difference data is the helm movement data and is modified by subtracting the marine vessel movement data resulting in the actual helm movement by the user, see at least [¶ 7, 56, Estabrook]: “the first sensor unit measures helm movement and the second sensor unit measures the overall movement of the marine vessel. The second sensor unit may transmit marine vessel movement data to one or more marine vessel components. 
The marine vessel movement data may provide a base reference movement that is utilized by the one or more marine vessel components and/or the first sensor unit to subtract overall marine vessel movement from the helm movement. The resulting difference may better indicate actual helm movement”), determine a desired action based on the amount of helm movement (disengage autopilot action is determined based on the identified gesture of the helm, see at least [¶ 7, Estabrook]: “CCU 206 may disengage the autonomous steering of the rudder 216 when the movement of the helm exceeds a predetermined autopilot disengagement threshold. For example, the predetermined autopilot disengagement threshold may be an amount of rotation (e.g., accelerometer data from sensor unit indicating rotation in excess of 45 degrees will cause disengagement of the autonomous steering), rate of rotation (e.g., gyroscope data from sensor unit indicating rotation in excess of 3 degrees/second for 2 seconds), or rate of rotational acceleration (e.g., gyroscope data from sensor unit indicating acceleration of rotation in excess of 3 degrees/second for 2 seconds). CCU 206 may cause autopilot 204 to enter a standby mode or to temporarily pause sending control signals to ECU 208 when autopilot functionality is disengaged”), and operate a marine device according to the desired action (autopilot steering system 200 sends a signal to a marine device (CCU 206) to disengage the autopilot, see at least [¶ 44, Estabrook]: “one or more other components of autopilot steering system 200 may determine when the movement of the helm exceeds a predetermined autopilot disengagement threshold and transmit a command to autopilot 204, which is received by CCU 206 and acted upon to disengage autopilot 204. In such a case, CCU 206 may disengage autopilot 204 without necessarily analyzing the movement data”).
Estabrook does not explicitly teach a sensor module configured to detect gestures of the user, identify one or more gestures, and determine a desired action based on the identified one or more gestures.
Bertrand, directed to a handheld device for navigating a marine vessel, teaches a sensor module configured to detect gestures of the user, identify one or more gestures, and determine a desired action based on the identified one or more gestures (the handheld device controls the vessel’s motors based on detected user gestures, see at least [¶ 57, Bertrand]: “The handheld device 200 is configured to control the configuration and operation of the one or more marine vessel 100 motors (e.g., trolling motor(s) 120, primary motor(s) 122, and/or thruster(s) 124) based on a current position or orientation of the handheld device 200, user movements (e.g., gestures), and/or inputs received from user interface 220”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook’s method of providing user input to the helm to incorporate the teachings of Bertrand, which teaches a sensor module configured to detect gestures of the user, identify one or more gestures, and determine a desired action based on the identified one or more gestures, since both references are related to controlling marine vessels and incorporation of the teachings of Bertrand increases the versatility and safety of the overall system by utilizing gesture detection to steer the vessel in a case where the user is unable to physically reach the helm due to an emergency.
Regarding claim 38, Estabrook teaches a method for controlling operations of a marine vessel, the method comprising: receiving, from a sensor module (see at least [¶ 7, Estabrook]: “two or more sensor units may be implemented to detect helm movement. For example, one sensor unit may be mounted to the helm, as mentioned above, and a second, separate sensor unit may be mounted at another location on the marine vessel”), a detected first position at a first time and a detected second position at a second time of the one or more objects, wherein the first time is different than the second time (position of the object/helm is measured at a certain time, and then position is measured 1 second later, see at least [¶ 53, Estabrook]: “helm movement may be determined based upon the measured rate of helm movement in the same direction over a time period that yields an accumulated helm movement exceeding the predetermined autopilot disengagement threshold helm angle. For example, if a gyroscope sensor measures helm 208 being turned at a rate of 30 degrees per second and this rate is maintained for a full second, then processing unit 220 may calculate that helm 208 has been turned 30 degrees, and compare this value to the disengagement threshold helm angle to determine whether movement of the helm 208 exceeds a particular disengagement threshold helm angle”), generating a difference data set by comparing the first and second positions of the one or more objects, modifying the difference data set by filtering difference data falling within a marine vessel movement data set, wherein the marine vessel movement data set is based on movement of the marine vessel between the first and second times, processing the modified difference data set to identify helm movement (the difference data is the helm movement data and is modified by subtracting the marine vessel movement data resulting in the actual helm movement or gesture by the user, see at least [¶ 7, 56, Estabrook]: “the first sensor unit measures helm movement and the second sensor unit measures the overall movement of the marine vessel. The second sensor unit may transmit marine vessel movement data to one or more marine vessel components. The marine vessel movement data may provide a base reference movement that is utilized by the one or more marine vessel components and/or the first sensor unit to subtract overall marine vessel movement from the helm movement. The resulting difference may better indicate actual helm movement”), determining a desired action based on the amount of helm movement (disengage autopilot action is determined based on the identified gesture of the helm, see at least [¶ 7, Estabrook]: “CCU 206 may disengage the autonomous steering of the rudder 216 when the movement of the helm exceeds a predetermined autopilot disengagement threshold. For example, the predetermined autopilot disengagement threshold may be an amount of rotation (e.g., accelerometer data from sensor unit indicating rotation in excess of 45 degrees will cause disengagement of the autonomous steering), rate of rotation (e.g., gyroscope data from sensor unit indicating rotation in excess of 3 degrees/second for 2 seconds), or rate of rotational acceleration (e.g., gyroscope data from sensor unit indicating acceleration of rotation in excess of 3 degrees/second for 2 seconds). 
CCU 206 may cause autopilot 204 to enter a standby mode or to temporarily pause sending control signals to ECU 208 when autopilot functionality is disengaged”), and causing operation of a marine device according to the desired action (autopilot steering system 200 sends a signal to a marine device (CCU 206) to disengage the autopilot, see at least [¶ 44, Estabrook]: “one or more other components of autopilot steering system 200 may determine when the movement of the helm exceeds a predetermined autopilot disengagement threshold and transmit a command to autopilot 204, which is received by CCU 206 and acted upon to disengage autopilot 204. In such a case, CCU 206 may disengage autopilot 204 without necessarily analyzing the movement data”).
Estabrook does not explicitly teach identify one or more gestures, and determining a desired action based on the identified one or more gestures.
Bertrand, directed to a handheld device for navigating a marine vessel, teaches identify one or more gestures, and determining a desired action based on the identified one or more gestures (the handheld device identifies gestures, see at least [¶ 57, Bertrand]: “The handheld device 200 is configured to control the configuration and operation of the one or more marine vessel 100 motors (e.g., trolling motor(s) 120, primary motor(s) 122, and/or thruster(s) 124) based on a current position or orientation of the handheld device 200, user movements (e.g., gestures), and/or inputs received from user interface 220”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook’s method of providing user input to the helm to incorporate the teachings of Bertrand, which teaches identifying one or more gestures and determining a desired action based on the identified one or more gestures, since both references are related to controlling marine vessels and incorporation of the teachings of Bertrand increases the versatility and safety of the overall system by utilizing gesture detection to steer the vessel in a case where the user is unable to physically reach the helm due to an emergency.
Regarding claims 11 and 30, Estabrook in view of Bertrand teach the system of claim 9 (re-claim 11) and the marine electronic device of claim 28 (re-claim 30), wherein the sensor module determines gestures based on one or more of a position change and an orientation change of the helm within the modified difference data set.
Bertrand, directed to a handheld device for navigating a marine vessel, further teaches wherein the sensor module uses an external tracked device as the only object detected in the field of view and determines gestures based on one or more of a position change and an orientation change of the external tracked device (the external tracked device is the handheld device 200, which determines gestures based on the position and orientation of the external tracked device, see at least [¶ 39, Bertrand]: “The handheld device 200 is configured to generate at least one control signal for the one or more motors based on a position (e.g., GPS coordinates, altitude, etc.), a pointing direction (e.g., North, East, South, West, etc.), and/or an orientation (e.g., pitch, roll, yaw, etc.) of the handheld device 200. In some embodiments, the handheld device 200 is also configured to generate at least one control signal for a motor based on a gesture performed with the handheld device 200. For example, a gesture can correspond to a series of detected position, direction, and/or orientation measurements”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook in view of Bertrand’s method of providing user input to the helm to further incorporate the teachings of Bertrand, which teaches wherein the sensor module uses an external tracked device as the only object detected in the field of view and determines gestures based on one or more of a position change and an orientation change of the external tracked device, since both references are related to controlling marine vessels and incorporation of the teachings of Bertrand increases the versatility and safety of the overall system by utilizing a gesture detection device to steer the vessel in a case where the user is unable to physically reach the helm due to an emergency.
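Examiner’s note (illustrative only): Bertrand’s statement that “a gesture can correspond to a series of detected position, direction, and/or orientation measurements” (¶ 39) might be sketched as follows. The gesture names and the 20-degree yaw-change cutoff are hypothetical and appear in neither reference:

# Illustrative sketch only; gesture names and the yaw-change cutoff
# are hypothetical, not values disclosed by Bertrand.
from typing import List, Tuple

YAW_CHANGE_DEG = 20.0  # hypothetical minimum yaw change that counts as a gesture

def classify_gesture(samples: List[Tuple[float, float, float]]) -> str:
    """Classify a gesture from a series of (pitch, roll, yaw) orientation
    samples reported by an external tracked device, per Bertrand's notion
    of a gesture as a series of orientation measurements."""
    if len(samples) < 2:
        return "none"
    yaw_change = samples[-1][2] - samples[0][2]
    if yaw_change > YAW_CHANGE_DEG:
        return "sweep_right"  # e.g., command the trolling motor to starboard
    if yaw_change < -YAW_CHANGE_DEG:
        return "sweep_left"   # e.g., command the trolling motor to port
    return "none"

# A device yawed 30 degrees to starboard across three samples
# classifies as a rightward sweep gesture.
assert classify_gesture([(0, 0, 0.0), (0, 0, 15.0), (0, 0, 30.0)]) == "sweep_right"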
Regarding claims 12 and 31, Estabrook in view of Bertrand teach the system of Claim 9 (re-claim 12), and the marine electronic device of claim 28 (re-claim 31).
Bertrand, directed to a handheld device for navigating a marine vessel, further teaches wherein the marine device is the display, and the display is a multi-functional display (see at least [¶ 70, Fig. 7A and 7B, Bertrand]: “As shown in FIGS. 7A and 7B, the marine vessel display system 300 can include at least one input 314 for receiving data from one or more marine input sources 316; a display 308 for presenting information representative of at least some of the data from the marine input sources 316; and a processing system 302 in communication with the inputs 314 and the display 308.”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook in view of Bertrand’s method of providing user input to the helm to further incorporate the teachings of Bertrand, which teaches wherein the marine device is the display, and the display is a multi-functional display, since both references are related to controlling marine vessels and incorporation of the teachings of Bertrand increases the utility of the overall system by providing options for displaying information from the multiple marine data input sources.
Regarding claims 15 and 34, Estabrook in view of Bertrand teach the system of Claim 9 (re-claim 15) and the marine electronic device of Claim 28 (re-claim 34), wherein the movement of the marine vessel between the first and second times is captured by an accelerometer (marine vessel movement measured by an accelerometer, see at least [¶ 30, Estabrook]: “Sensor array 224 may be implemented as any suitable number and/or type of sensors configured to measure, monitor, and/or quantify one or more environmental characteristics such as helm movement or marine vessel movement. For example, sensor array 224 may include one or more accelerometers, gyroscopes, and/or magnetometers configured to measure sensor metrics in one or more axes”).
Regarding claims 16 and 35, Estabrook in view of Bertrand teach the system of claim 9 (re-claim 16) and the marine electronic device of claim 28 (re-claim 35).
Bertrand, directed to a handheld device for navigating a marine vessel, further teaches wherein the marine device is a trolling motor (the marine device is a trolling motor, see at least [¶ 48, Bertrand]: “The handheld device 200 can include a transmitter 218A and a receiver 218B. Controller 202 may control the transmitter 218A to send communications (e.g., control signals, directional and/or orientation measurements, etc.) to a motor (e.g., trolling motor 120, primary motor 122, and/or thruster 124, as shown in FIGS. 5B through 5D)”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook in view of Bertrand’s method of providing user input to the helm to further incorporate the teachings of Bertrand, which teaches wherein the marine device is a trolling motor, since both references are related to controlling marine vessels and incorporation of the teachings of Bertrand increases the versatility and safety of the overall system, as Bertrand teaches that “a user may prefer to use the trolling motor instead of the primary motor when navigating the marine vessel at low speeds or through environments that require precision (e.g., navigating around obstacles and/or in shallow water)”, [¶ 2, Bertrand].
Regarding claims 17 and 37, Estabrook in view of Bertrand teach the system of claim 9 (re-claim 17) and the marine electronic device of claim 28 (re-claim 37), wherein the determined desired action is no action based on the identified one or more gestures being unintended commands (method 400, at block 408, determines whether the helm and marine vessel movement has exceeded the disengagement threshold and, if not, continues autopilot and takes no action, see at least [¶ 84, Fig. 4, Estabrook]: “Method 400 may include determining whether movement of the helm exceeds a predetermined autopilot disengagement threshold (block 408). This may include, for example, receiving a helm steering indicator from another marine vessel component, as discussed herein with reference to FIGS. 2 and 3 (block 408). This may also include, for example, analyzing the helm movement data or a combination of helm movement data and marine vessel movement data to determine whether the helm has been steered beyond a predetermined autopilot disengagement threshold helm angle (block 408). If so, method 400 may proceed to disengage the autopilot (block 410). If not, method 400 may revert back to continuing to control the rudder using the autopilot (block 404)”).
Regarding claim 36, Estabrook in view of Bertrand teach the marine electronic device of claim 28, wherein the marine device is the marine electronic device (the marine device is the autopilot steering system 200, see at least [¶ 11, Fig. 2, Estabrook]: “FIG. 2 is a block diagram example illustrating interconnected marine vessel components of an autopilot steering system 200”).
Claims 10 and 29 are rejected under 35 U.S.C. 103 as being unpatentable over Estabrook (US 20170205828 A1) in view of Bertrand et al. (US 20190137993 A1) as applied to claims 9, 11-12, 15-17, 28, 30-31, and 34-38 above, and further in view of Di Censo et al. (US 20160185385 A1).
Regarding claims 10 and 29, Estabrook in view of Bertrand teach the system of claim 9 (re-claim 10) and the marine electronic device of claim 28 (re-claim 29).
Estabrook in view of Bertrand does not explicitly teach wherein the sensor module is configured to detect objects within the field of view using a projected light pattern.
Di Censo, directed to human-vehicle interfaces for a steering wheel control system, teaches wherein the sensor module is configured to detect objects within the field of view using a projected light pattern (hand gestures are detected using laser or structured light sensors, which use a projected light pattern, see at least [¶ 42, Fig. 5, Di Censo]: “FIG. 5 illustrates techniques for modifying different vehicle parameters via one of the input regions 210 of FIG. 2 and one or more hand gestures, according to various embodiments. As described above, the system 100 may include one or more sensors (e.g., visual sensors, depth sensors, infrared sensors, time-of-flight sensors, ultrasound sensors, radar sensors, laser sensors, thermal sensors, structured light sensors) that track the location of the hand(s) of a user. In such embodiments, a user hand gesture may be detected by the control application 130 via the sensor(s), and a selected vehicle parameter may be modified based on the gesture”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook in view of Bertrand’s method of providing user input to the helm to further incorporate the teachings of Di Censo, which teaches wherein the sensor module is configured to detect objects within the field of view using a projected light pattern, since both are related to controlling the steering of vehicles and incorporation of the teachings of Di Censo would increase the utility and safety of the steering system, as Di Censo teaches that “the steering wheel controls can be operated with a low cognitive load, reducing the degree to which operation of vehicle systems distracts the driver from driving tasks. Additionally, the techniques described herein enable the multiple sets of physical buttons typically found in conventional vehicle systems to be replaced with a simpler and less expensive interface”, [¶ 50, Di Censo].
Claims 13, 14, 32, and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Estabrook (US 20170205828 A1) in view of Bertrand et al. (US 20190137993 A1) as applied to claims 9, 11-12, 15-17, 28, 30-31, and 34-38 above, and further in view of Yanai (US 20140022171 A1).
Regarding claims 13 and 32, Estabrook in view of Bertrand teach the system of Claim 12 (re-claim 13) and the marine electronic device of claim 28 (re-claim 32).
Bertrand, directed to a handheld device for navigating a marine vessel, teaches wherein the identified one or more gestures is a touch input (touch input gestures are identified, see at least [¶ 66, Fig. 6H, Bertrand]: “as shown in FIG. 6H, the touch-sensitive input device 224 of user interface 220 may provide touch inputs, such as a swipe in the lateral direction (swipe left or swipe right), a swipe in the vertical direction (swipe up or swing down), or any combination thereof”).
Estabrook in view of Bertrand does not explicitly teach wherein the identified one or more gestures is a hand wave.
Yanai, directed to a system and method for implementing a remote-controlled user interface using close-range object tracking, teaches wherein the identified one or more gestures is a hand wave (see at least [¶ 58, Yanai]: “the user may control an external system, platform or device by waving his hands or making predefined gestures near the control device”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook in view of Bertrand’s method of providing user input to the multi-functional display to further incorporate the teachings of Yanai, which teaches wherein the identified one or more gestures is a hand wave to control the display, since both are related to controlling electronic devices using gestures and incorporation of the teachings of Yanai would enhance the user experience of the overall system: in a case where the user is unable to touch the display, the user can control the display using hand-wave gestures, as Yanai notes that developments in the “field of gesture recognition have shown the benefits of using gestures or movement tracking to enhance the user experience for controlling an electronic device”, [¶ 9, Yanai].
Regarding claims 14 and 33, Estabrook in view of Bertrand and Yanai teach the system of claim 13 (re-claim 14) and the marine electronic device of claim 32 (re-claim 33).
Bertrand, directed to a handheld device for navigating a marine vessel, further teaches wherein the desired action is changing a display mode (see at least [¶ 70, Figs. 7A and 7B, Bertrand]: “As shown in FIGS. 7A and 7B, the marine vessel display system 300 can include at least one input 314 for receiving data from one or more marine input sources 316; a display 308 for presenting information representative of at least some of the data from the marine input sources 316; and a processing system 302 in communication with the inputs 314 and the display 308. As described in more detail below, the processing system 302 may implement a plurality of modes of operation, each of which may cause the display 308 to present information representative of data from predetermined ones of the marine input sources 316 and in selected formats. The marine vessel display system 300 may further comprise a location determining component 312 that furnishes geographic position data for the marine vessel 100 (similar in function to location determining components 232, 174). The processing system 302 may implement a mode selector 304 configured to select between a plurality of modes of operation, respective ones of which present information representative of data from selected marine input sources 316 on the display 308”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, with a reasonable expectation of success, to have modified Estabrook in view of Bertrand and Yanai’s method of providing user input to the multi-functional display using a hand-wave gesture to further incorporate the teachings of Bertrand, which teaches wherein the desired action is changing a display mode, since both are related to controlling marine electronic devices and incorporation of the teachings of Bertrand would enhance the user experience of the overall system by displaying relevant information to the user and switching the screen between multiple display modes.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IRENE C KHUU whose telephone number is (703)756-1703. The examiner can normally be reached Monday - Friday 0900-1730.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Rachid Bendidi can be reached on (571)272-4896. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov.