Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This Office Action is in response to Applicant's Amendment and Arguments filed on 11/25/2025.
Claim(s) 1-7 and 10-20 are pending for examination.
This Action is made FINAL.
Response to Arguments
With regards to claim(s) 1 and 5-14 previously rejected under 35 U.S.C. 102 and claim(s) 2-4 and 15-20 previously rejected under 35 U.S.C. 103, Applicant's arguments have been fully considered, but are deemed moot in view of new grounds of rejection.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are “a provisional initial information specifier configured to…” as recited in claim 12.
Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
Regarding the provisional initial information specifier, the specification states in para [0038-0040] “As illustrated in Fig. 2, the provisional initial information specifier 20 may include a camera 21, an operational input interface 22, and a provisional initial information setting sub-module 23. The provisional initial information setting sub-module 23 can be implemented as provisional initial information setting circuitry.
The camera 21 may be connected to the operational input interface 22. The camera 21 may be, for example, a monocular camera, which images an area including the object (for example, a quay). The camera 21 may output the captured image to the operational input interface 22.
The operational input interface 22 may be, for example, realized by a touch panel. The operational input interface 22 may display the inputted image. The operational input interface 22 may accept an operational input from a user, and detect an operated position on the image (a locus of the operation). The operational input interface 22 may output the operated position (the locus of the operation) to the provisional initial information setting sub-module 23.”
The structure of the provisional initial information specifier will be interpreted as a camera, touch panel, and/or circuitry.
If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recites sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1, 5-7, 13 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Akuzawa et al. (US 20190308713 A1, hereinafter known as Akuzawa) in view of Hara et al. (US 20220001970 A1, hereinafter known as Hara).
Regarding claim 1, Akuzawa teaches A ship navigation assistance system, comprising: a measurement sensor configured to acquire measurement information on a quay using a ranging result of an area including the quay that is an anchorage target of a ship;
{Para [0055-0056] “The sensing device 46 detects the shapes of objects surrounding the boat body 2 and the positional relationship between the objects and the boat body 2. The positional relationship between the objects and the boat body 2 includes the distance between the objects and the boat body 2 and the direction in which the object is positioned with respect to the boat body 2. Objects surrounding the boat body 2 include, for example, piers, wharves, other boats, obstructions, or the like.
The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.”
}
and processing circuitry configured to determine provisional initial information based on a
{para [0068] “In step S203, the controller 41 evaluates whether there is an input of the target position for the shore arrival. Here, the input of the target position on the environment map 62 is accepted by the input 44. The operator touches the possible shore arrival position on the environment map 62, such that the touched position is inputted as the target position. The input 44 outputs target position information which indicates the target position to the controller 41.”
Para [0070] “In step S205, the controller 41 corrects the target position. The controller 41 corrects the target position based on the possible shore arrival space SP1. For example, as illustrated in FIG. 10, when an inputted target position IP1 is outside of the possible shore arrival space SP1, the controller 41 corrects a target position Tp so that the target position is within the possible shore arrival space SP1. When an inputted target position IP2 is inside the possible shore arrival space SP1, the controller 41 corrects the target position Tp so that the target position becomes the center position of the possible shore arrival space SP1.”
Para [0074] “In step S207, the controller 41 displays the target position and the target bearing with an icon 71′ on the environment map 62. Here, as illustrated in FIG. 9, the controller 41 sets the target position corrected in step S205 or the target position automatically set in step S206 as the target position, and displays the icon 71′ which indicates the host boat in the position on the environment map 62. The icon 71′ is displayed in the target bearing determined by the controller 41 in the initial state. The controller 41 determines the target bearing of the boat body 2 based on the shape of the shore arrival location, the current bearing, the distance to the target position, or the like. For example, when the shore arrival location is a pier, the controller 41 determines a direction along the edge of the shore arrival location as the target bearing. Alternatively, the controller 41 may determine a direction that defines a predetermined angle with the direction along the edge of the shore arrival location, as the target bearing. Moreover, the controller 41 may change the target bearing in response to the current bearing or the distance to the target position.”
Para [0074] “In step S206, the controller 41 automatically sets the target position. Here, as illustrated in FIG. 12, the controller 41 sets the closest position in the current bow direction among the positions along the shore arrival location, as the target position.”
Meaning there are multiple possible positions for the target position.
}
update characteristic information on the quay using the initial characteristic information on the quay and characteristic information before updating on the object, and the measurement information.
{Para [0056] “The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.”
Para [0063-0064] “The shapes of the shore arrival location, the obstructions, and the surrounding structures recognized by the controller 41 are displayed on the environment map 62. While not illustrated in FIG. 9, other boats recognized by the controller 41 are also displayed on the environment map 62. The controller 41 displays the current position and the current bearing of the boat body 2 obtained from the position information on the environment map 62 with an icon 71 of the boat body 2.
The environment map 62 is updated in real time due to the repeated detection of the position information by the positional sensor 45 and the repeated detection of the environment information by the sensing device 46. The plurality of operating keys include a scale changing key 63. By operating the scale changing key 63, the displayed scale of the environment map 62 is enlarged or reduced.”
Where a map is being updated based on the environment information, which includes the measurement information.
In order to update the map, objects are being recognized fig. 5 and para [0061] “In step S104, the controller 41 or the FPGA 49 recognizes a shore arrival location, another boat, an obstruction, or a surrounding structure based on the environment information. The shore arrival location is, for example, a pier. The controller 41 or the FPGA 49 recognizes another boat or an obstruction based on the shape of the object detected by the sensing device 46. For example, the controller 41 or the FPGA 49 recognizes the shore arrival location and the surrounding structure based on the height and length of the object detected by the sensing device 46.”
In order to recognize an object, some initial/prior data on the object must be used.
}
wherein the characteristic information on the quay includes a quay line including a vector quantity determined by a spatial relationship between the ship and the object.
{Para [0055-0057] “The sensing device 46 detects the shapes of objects surrounding the boat body 2 and the positional relationship between the objects and the boat body 2. The positional relationship between the objects and the boat body 2 includes the distance between the objects and the boat body 2 and the direction in which the object is positioned with respect to the boat body 2. Objects surrounding the boat body 2 include, for example, piers, wharves, other boats, obstructions, or the like.
The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.
The environment information indicates the shape of the shore arrival location and the positional relationship between the shore arrival location and the boat body 2. The environment information may indicate the shore arrival location or other boats surrounding the boat body 2. The environment information may indicate the shore arrival location or structures or obstructions surrounding the boat body 2. The environment information is indicated, for example, by coordinates of point groups indicating the position of an object detected by the sensing device 46. Alternatively, the environment information may be the shape and position of an object captured by image recognition.”
It should be noted that a vector is merely the difference between two points, and thus a point group can be trivially converted into a vector quantity. It should also be noted that a quay line vector is contemplated by Akuzawa, as the point groups are determined and the angle between the boat and the object is determined as well. Additionally, by rendering a line between two points (as done in fig. 9), one can say a vector has been established.
Para [0065] “FIG. 6 is a flow chart illustrating processing to set a target position of the shore arrival. As illustrated in step S201 in FIG. 6, the controller 41 determines a possible shore arrival space. The controller 41 determines the possible shore arrival space based on the environment information. As illustrated in FIG. 10, the controller 41 determines a position along the object recognized as the shore arrival location, as a possible shore arrival space SP1.”
[media_image1.png: 674 × 454, greyscale]
}
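The examiner's observation above, that a point group can be trivially converted into a vector quantity, can be illustrated with the following sketch. The coordinates are hypothetical and illustrative only; they are not drawn from Akuzawa's disclosure.

```python
import math

# Hypothetical endpoints of a point group along a detected quay edge,
# in boat-relative coordinates (metres); illustrative values only.
p_start = (10.0, 5.0)
p_end = (10.0, 25.0)

# A vector quantity is merely the difference between the two points.
quay_vector = (p_end[0] - p_start[0], p_end[1] - p_start[1])

# Distance from the boat (at the origin) to the nearest quay point, and
# the bearing of the quay line, follow from elementary geometry.
distance_to_start = math.hypot(p_start[0], p_start[1])
quay_bearing_deg = math.degrees(math.atan2(quay_vector[1], quay_vector[0]))
```

This sketch shows only that a line rendered between two detected points carries both a distance and a direction, i.e., a vector quantity.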
the processing circuitry is further configured, based on the updated characteristic information, to suppress errors in a distance and a direction between the ship and the quay line when the ship is moving.
{Para [0009] “In addition, a possible shore arrival space of the boat body is determined in the surrounding environment based on the environment information, and the shore arrival target position is corrected based on the possible shore arrival space. As a result, even if an inexperienced user makes an error while inputting a shore arrival target position, the shore arrival target position is corrected to a suitable position. As a result, the boat is able to arrive at the shore easily even in an unspecified harbor.”
}
Akuzawa does not teach, determine provisional initial information based on a provisional quay line and a provisional quay reference point
However, Hara teaches determine provisional initial information based on a provisional quay line and a provisional quay reference point
{Para [0073-0074] “The docking point setting unit 41 in FIG. 1 detects, from the local map 36, a candidate location for automatic docking of the ship 95. As illustrated in FIG. 2, on the local map 36, a group of points that are considered to represent docking facilities appears so as to line up in one direction in front of the occlusion area 38. Therefore, the docking point setting unit 41 uses an appropriate calculation algorithm to detect a straight line 39 along the group of points. The straight line 39 represents the direction of a docking facility.
The user may set a target point (may be referred to as a docking point B1 in the following description) for the actual automatic docking of the ship 95 at a location near the straight line 39 detected by the docking point setting unit 41 via the interface unit 81 described below. The user sets the docking point B1 near the docking facility (specifically, the straight line 39) in which docking is considered to be possible in consideration of the entire length of the ship 95. FIG. 3 illustrates an example of the docking point B1, although the land form is different from that in the example of FIG. 2. The ship 95 is docked in a direction along the direction of the docking facility (the direction of the straight line 39). The user makes a selection as to which direction the bow is to be turned upon docking, in other words, whether the portside is to be brought into the dock or the bow side is to be brought into the dock.”
}
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Akuzawa to incorporate the teachings of Hara to set a provisional quay line as it lets the user determine which direction they want the boat to be docked in (para [0074] “The user may set a target point (may be referred to as a docking point B1 in the following description) for the actual automatic docking of the ship 95 at a location near the straight line 39 detected by the docking point setting unit 41 via the interface unit 81 described below. The user sets the docking point B1 near the docking facility (specifically, the straight line 39) in which docking is considered to be possible in consideration of the entire length of the ship 95. FIG. 3 illustrates an example of the docking point B1, although the land form is different from that in the example of FIG. 2. The ship 95 is docked in a direction along the direction of the docking facility (the direction of the straight line 39). The user makes a selection as to which direction the bow is to be turned upon docking, in other words, whether the portside is to be brought into the dock or the bow side is to be brought into the dock.”)
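Hara's detection of “a straight line along the group of points” can be sketched, for illustration only, as an ordinary least-squares line fit over hypothetical map points. Hara does not specify the “appropriate calculation algorithm,” so the fit below is an assumed example, not Hara's actual method.

```python
# Hypothetical 2-D points that "line up in one direction" in front of an
# occlusion area, as in Hara's local map; values are illustrative only.
points = [(0.0, 1.9), (1.0, 3.1), (2.0, 4.0), (3.0, 5.1), (4.0, 5.9)]

n = len(points)
sx = sum(x for x, _ in points)
sy = sum(y for _, y in points)
sxx = sum(x * x for x, _ in points)
sxy = sum(x * y for x, y in points)

# Ordinary least-squares fit y = slope * x + intercept; the fitted line
# plays the role of Hara's straight line 39 representing the direction
# of the docking facility.
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
```

The slope of the fitted line gives the direction along which the ship would be docked, consistent with Hara's para [0073-0074].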
Regarding claim 5, Akuzawa in view of Hara teaches The ship navigation assistance system of claim 1. Akuzawa further teaches wherein the measurement sensor is further comprising: a rangefinder configured to perform three-dimensional ranging of the area including the object; and measurement information generation circuitry configured to generate the measurement information using a result of the three-dimensional ranging.
{Para [0055-0057] “The sensing device 46 detects the shapes of objects surrounding the boat body 2 and the positional relationship between the objects and the boat body 2. The positional relationship between the objects and the boat body 2 includes the distance between the objects and the boat body 2 and the direction in which the object is positioned with respect to the boat body 2. Objects surrounding the boat body 2 include, for example, piers, wharves, other boats, obstructions, or the like.
The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.
The environment information indicates the shape of the shore arrival location and the positional relationship between the shore arrival location and the boat body 2. The environment information may indicate the shore arrival location or other boats surrounding the boat body 2. The environment information may indicate the shore arrival location or structures or obstructions surrounding the boat body 2. The environment information is indicated, for example, by coordinates of point groups indicating the position of an object detected by the sensing device 46. Alternatively, the environment information may be the shape and position of an object captured by image recognition.”
}
Regarding claim 6, Akuzawa in view of Hara teaches The ship navigation assistance system of claim 5. Akuzawa further teaches wherein the rangefinder is further comprising an optical rangefinder.
{Para [0056] “The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.”
}
Regarding claim 7, Akuzawa in view of Hara teaches The ship navigation assistance system of claim 5. Akuzawa further teaches wherein the measurement sensor is further comprising an attitude measurement sensor configured to measure an attitude of the ship,
{Para [0054] “The positional sensor 45 detects the current position and the current bearing of the boat body 2 and outputs position information indicating the current position and the current bearing. The positional sensor 45 is, for example, an inertial navigation device and includes a global navigation satellite system (GNSS) device 47 and an inertial measurement unit (IMU) 48. The GNSS device 47 detects the current position and the boat speed of the boat body 2. The IMU 48 detects the angular speed and the acceleration of the boat body 2. In addition, the current bearing of the boat body 2 is detected by the GNSS device 47 and the IMU 48. The current bearing may be detected by a plurality of GNSS devices, a magnetic bearing sensor, or an electronic compass.”
}
and wherein the measurement information generation circuitry is further configured to generate the measurement information using the result of the three-dimensional ranging and the attitude.
{Para [0064] “The environment map 62 is updated in real time due to the repeated detection of the position information by the positional sensor 45 and the repeated detection of the environment information by the sensing device 46.”
}
Regarding claim 10, Akuzawa in view of Hara teaches The ship navigation assistance system of claim 9. Akuzawa further teaches wherein the characteristic information on the object contains coordinates of a quay reference point.
{Para [0057] “The environment information indicates the shape of the shore arrival location and the positional relationship between the shore arrival location and the boat body 2. The environment information may indicate the shore arrival location or other boats surrounding the boat body 2. The environment information may indicate the shore arrival location or structures or obstructions surrounding the boat body 2. The environment information is indicated, for example, by coordinates of point groups indicating the position of an object detected by the sensing device 46. Alternatively, the environment information may be the shape and position of an object captured by image recognition.”
}
Regarding claim 11, Akuzawa in view of Hara teaches The ship navigation assistance system of claim 10. Akuzawa further teaches wherein the measurement sensor is further comprising a position measurement sensor configured to measure a position of the ship,
{Para [0054] “The positional sensor 45 detects the current position and the current bearing of the boat body 2 and outputs position information indicating the current position and the current bearing. The positional sensor 45 is, for example, an inertial navigation device and includes a global navigation satellite system (GNSS) device 47 and an inertial measurement unit (IMU) 48. The GNSS device 47 detects the current position and the boat speed of the boat body 2. The IMU 48 detects the angular speed and the acceleration of the boat body 2. In addition, the current bearing of the boat body 2 is detected by the GNSS device 47 and the IMU 48. The current bearing may be detected by a plurality of GNSS devices, a magnetic bearing sensor, or an electronic compass.”
}
and wherein the processing circuitry is further configured to update the quay reference point using the attitude and the position of the ship, and the quay line.
{ Para [0064] “The environment map 62 is updated in real time due to the repeated detection of the position information by the positional sensor 45 and the repeated detection of the environment information by the sensing device 46.”
Para [0065] “As illustrated in FIG. 10, the controller 41 determines a position along the object recognized as the shore arrival location, as a possible shore arrival space SP1. For example, the controller 41 detects the disposition of the pier from the environment information and determines a predetermined range along the pier as the possible shore arrival space SP1.”
Where fig. 10 is an example of the environment map.
}
Regarding claim 12, Akuzawa in view of Hara teaches The ship navigation assistance system of claim 1. Akuzawa further teaches further comprising: a provisional initial information specifier configured to accept a specification of provisional initial information for the characteristic information on the object; wherein: the processing circuitry is further configured to set the initial characteristic information on the object using the provisional initial information and the measurement information.
{Para [0056] “The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.”
Para [0063-0064] “The shapes of the shore arrival location, the obstructions, and the surrounding structures recognized by the controller 41 are displayed on the environment map 62. While not illustrated in FIG. 9, other boats recognized by the controller 41 are also displayed on the environment map 62. The controller 41 displays the current position and the current bearing of the boat body 2 obtained from the position information on the environment map 62 with an icon 71 of the boat body 2.
The environment map 62 is updated in real time due to the repeated detection of the position information by the positional sensor 45 and the repeated detection of the environment information by the sensing device 46. The plurality of operating keys include a scale changing key 63. By operating the scale changing key 63, the displayed scale of the environment map 62 is enlarged or reduced.”
Where a map is being updated based on the environment information, which includes the measurement information.
In order to update the map, objects are being recognized fig. 5 and para [0061] “In step S104, the controller 41 or the FPGA 49 recognizes a shore arrival location, another boat, an obstruction, or a surrounding structure based on the environment information. The shore arrival location is, for example, a pier. The controller 41 or the FPGA 49 recognizes another boat or an obstruction based on the shape of the object detected by the sensing device 46. For example, the controller 41 or the FPGA 49 recognizes the shore arrival location and the surrounding structure based on the height and length of the object detected by the sensing device 46.”
In order to recognize an object, some initial data on the object must be used. In this case the provisional information is used to recognize the object, and then, when it is recognized, the measurement information is used to include it on the map.
}
Regarding claim 13, it recites a method having limitations similar to those of claim 1 and therefore is rejected on the same basis.
Regarding claim 14, it recites A non-transitory computer readable medium having limitations similar to those of claim 1 and therefore is rejected on the same basis.
Additionally, Akuzawa teaches A non-transitory computer readable medium storing instructions that, when executed by processing circuitry, cause a computer system to perform a method comprising:
{Para [0038] “Next, the boat operating mechanism and the control system of the boat 1 will be explained. FIG. 4 is a schematic view illustrating the boat operating mechanism and the control system of the boat 1. As illustrated in FIG. 4, the boat 1 includes a controller 41. The controller 41 includes a computation device such as a CPU and a storage device such as a RAM or a ROM, and is configured or programmed so as to control the boat 1.”
}
Claim(s) 2-4 and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Akuzawa et al. (US 20190308713 A1, hereinafter known as Akuzawa) in view of Hara et al. (US 20220001970 A1, hereinafter known as Hara) and Lambert et al. (US 20200309529 A1, hereinafter known as Lambert).
Regarding Claim 2, Akuzawa in view of Hara teaches The ship navigation assistance system of claim 1.
Akuzawa in view of Hara does not teach, wherein the processing circuitry is further configured: to calculate a difference between the initial characteristic information or the characteristic information before updating and each of a plurality of measurement information; to set a weighting coefficient to each of the plurality of measurement information using the difference; and to calculate updated characteristic information using the weighting coefficient and the plurality of measurement information.
However, Lambert et al. teaches wherein the processing circuitry is further configured: to calculate a difference between the initial characteristic information or the characteristic information before updating and each of a plurality of measurement information; to set a weighting coefficient to each of the plurality of measurement information using the difference; and to calculate updated characteristic information using the weighting coefficient and the plurality of measurement information.
{Para [0028] “In some embodiments in which lidars are used as exteroceptive sensors, the image matching may use an iterative closest point (ICP) algorithm. An ICP algorithm may seek to minimize the difference between two point clouds (e.g., acquired by lidars). One point cloud, referred to as the reference or target, may be fixed, while the other point cloud, referred to as the source, is transformed to best match the reference. The algorithm iteratively revises the transformation needed to minimize an error metric. The transformation may include a combination of translation and rotation. The error metric may be a distance from the source point cloud to the reference point cloud, such as the sum of squared differences between coordinates of the matched pairs. A final optimal transformation may be used to estimate the incremental relative motion of the exteroceptive sensors 140 (thus the incremental relative motion of the dynamic platform to which the exteroceptive sensor is attached). In some other embodiments, other correlation methods, such as optical flow, may be used to estimate the incremental relative motions cross the series of images.”
Para [0040] “the sensor fusion engine 120 may use the inputs from the INS 110 and the SLAM unit 130 to produce estimates of current positions and estimates of current orientations that tend to be more accurate than those based solely on the INS 110. In some embodiments, the sensor fusion engine may work in a two-step process. In a prediction step, the sensor fusion engine 120 may produce estimates of current positions and estimates of current orientations, along with their uncertainties. Once the outcome of the next measurements (e.g., the inputs from the INS 110 and the SLAM unit 130) are observed, the estimates may be updated using a weighted average of the measurements. The weighting in the average may be based on the errors associated with measurements. For example, more weights may be given to measurements with higher certainly (i.e., less errors). The sensor fusion engine 120 may also estimate errors of the estimates of current positions and errors of the estimates of current orientations. The sensor fusion engine 120 may further estimate sensor errors, such as the errors of linear accelerations and errors of the angular velocities measured by the three-axis accelerometer 112 and the three-axis gyroscope 114, respectively, as well as errors of the GNSS data and/or errors of the wheel odometer data.”
}
It would have been prima facie obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Akuzawa in view of Hara to incorporate the teachings of Lambert to weigh measurements based on the difference with a reference because it allows for measurements with high error to be weighted less which would improve accuracy. (Para [0040] “the sensor fusion engine 120 may use the inputs from the INS 110 and the SLAM unit 130 to produce estimates of current positions and estimates of current orientations that tend to be more accurate than those based solely on the INS 110. In some embodiments, the sensor fusion engine may work in a two-step process. In a prediction step, the sensor fusion engine 120 may produce estimates of current positions and estimates of current orientations, along with their uncertainties. Once the outcome of the next measurements (e.g., the inputs from the INS 110 and the SLAM unit 130) are observed, the estimates may be updated using a weighted average of the measurements. The weighting in the average may be based on the errors associated with measurements. For example, more weights may be given to measurements with higher certainly (i.e., less errors).”)
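The error-weighted averaging taught by Lambert para [0040] can be illustrated with the following sketch. All identifiers are illustrative, and the inverse-difference weighting shown here is one plausible reading of "more weights may be given to measurements with higher certainty," not the actual method of either reference.

```python
def update_estimate(reference, measurements, eps=1e-6):
    """Fuse several measurements of the same quantity into an updated estimate.

    Each measurement is weighted by the inverse of its difference from the
    current reference value, so a measurement that deviates more from the
    reference (i.e., has higher apparent error) contributes less to the
    weighted average. The eps term avoids division by zero when a
    measurement exactly matches the reference.
    """
    weights = [1.0 / (abs(m - reference) + eps) for m in measurements]
    total = sum(weights)
    return sum(w * m for w, m in zip(weights, measurements)) / total

# A measurement far from the prior estimate (12.0) barely moves the result,
# while the two measurements close to the reference dominate.
print(round(update_estimate(10.0, [10.1, 9.9, 12.0]), 2))
```

Under this scheme, the updated characteristic information stays close to consistent measurements while outliers are largely suppressed, which is the accuracy benefit cited in the obviousness rationale above.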
Regarding Claim 3, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 2.
Lambert further teaches wherein the processing circuitry is further configured to set, as the weighting coefficient, a first weighting coefficient set to a distance between the object and the ship, and a second weighting coefficient set to a direction of the object on the basis of the ship, and to calculate the updated characteristic information using the first weighting coefficient and the second weighting coefficient.
{Para [0040] “the sensor fusion engine 120 may use the inputs from the INS 110 and the SLAM unit 130 to produce estimates of current positions and estimates of current orientations that tend to be more accurate than those based solely on the INS 110. In some embodiments, the sensor fusion engine may work in a two-step process. In a prediction step, the sensor fusion engine 120 may produce estimates of current positions and estimates of current orientations, along with their uncertainties. Once the outcome of the next measurements (e.g., the inputs from the INS 110 and the SLAM unit 130) are observed, the estimates may be updated using a weighted average of the measurements. The weighting in the average may be based on the errors associated with measurements. For example, more weights may be given to measurements with higher certainly (i.e., less errors). The sensor fusion engine 120 may also estimate errors of the estimates of current positions and errors of the estimates of current orientations.”
Where both the position error and the orientation error are calculated separately, and the position and orientation measurements are weighted accordingly. Thus, it is implied that the position average would be weighted based on the position error (a first weighting coefficient) and the orientation average would be weighted based on the orientation error (a second weighting coefficient).
Akuzawa already teaches that the position is a distance between the object and the ship and that the orientation is a direction of the object with respect to the ship in para [0063] and para [0080].
}
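The two-coefficient interpretation above can be sketched as follows, with the distance readings weighted by one set of coefficients and the bearing readings by another. The function name, the tuple layout, and the inverse-difference weighting are all hypothetical illustrations, not the disclosed method of Akuzawa, Hara, or Lambert.

```python
def fuse_object_track(prior, measurements, eps=1e-6):
    """Fuse (distance, bearing) measurements of one object using two
    separate weighting coefficients.

    prior        -- (distance, bearing) tuple: the estimate before updating
    measurements -- list of (distance, bearing) tuples from the sensor
    """
    # First weighting coefficient: set per-measurement from the difference
    # between the measured distance and the prior distance.
    dist_w = [1.0 / (abs(d - prior[0]) + eps) for d, _ in measurements]
    # Second weighting coefficient: set per-measurement from the difference
    # between the measured bearing and the prior bearing.
    bear_w = [1.0 / (abs(b - prior[1]) + eps) for _, b in measurements]
    dist = sum(w * d for w, (d, _) in zip(dist_w, measurements)) / sum(dist_w)
    bear = sum(w * b for w, (_, b) in zip(bear_w, measurements)) / sum(bear_w)
    return dist, bear
```

Because the two averages use independent coefficients, a reading with an outlier distance but a consistent bearing is discounted only in the distance update, matching the implication that position and orientation errors are handled separately.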
Regarding Claim 4, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 2.
Akuzawa further teaches wherein the processing circuitry is further configured to calculate the updated characteristic information using the characteristic information before updating and the calculated characteristic information.
{Para [0063-0064] “The shapes of the shore arrival location, the obstructions, and the surrounding structures recognized by the controller 41 are displayed on the environment map 62. While not illustrated in FIG. 9, other boats recognized by the controller 41 are also displayed on the environment map 62. The controller 41 displays the current position and the current bearing of the boat body 2 obtained from the position information on the environment map 62 with an icon 71 of the boat body 2.
The environment map 62 is updated in real time due to the repeated detection of the position information by the positional sensor 45 and the repeated detection of the environment information by the sensing device 46. The plurality of operating keys include a scale changing key 63. By operating the scale changing key 63, the displayed scale of the environment map 62 is enlarged or reduced.”
Where a map is being updated based on the environment information, which includes the measurement information.
In order to update the map, objects are being recognized fig. 5 and para [0061] “In step S104, the controller 41 or the FPGA 49 recognizes a shore arrival location, another boat, an obstruction, or a surrounding structure based on the environment information. The shore arrival location is, for example, a pier. The controller 41 or the FPGA 49 recognizes another boat or an obstruction based on the shape of the object detected by the sensing device 46. For example, the controller 41 or the FPGA 49 recognizes the shore arrival location and the surrounding structure based on the height and length of the object detected by the sensing device 46.”
In order to recognize an object, some initial/prior data on the object must be used.
}
Regarding Claim 15, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 3.
Akuzawa further teaches wherein the processing circuitry is further configured to calculate the updated characteristic information using the characteristic information before updating and the calculated characteristic information.
{Para [0063-0064] “The shapes of the shore arrival location, the obstructions, and the surrounding structures recognized by the controller 41 are displayed on the environment map 62. While not illustrated in FIG. 9, other boats recognized by the controller 41 are also displayed on the environment map 62. The controller 41 displays the current position and the current bearing of the boat body 2 obtained from the position information on the environment map 62 with an icon 71 of the boat body 2.
The environment map 62 is updated in real time due to the repeated detection of the position information by the positional sensor 45 and the repeated detection of the environment information by the sensing device 46. The plurality of operating keys include a scale changing key 63. By operating the scale changing key 63, the displayed scale of the environment map 62 is enlarged or reduced.”
Where a map is being updated based on the environment information, which includes the measurement information.
In order to update the map, objects are being recognized fig. 5 and para [0061] “In step S104, the controller 41 or the FPGA 49 recognizes a shore arrival location, another boat, an obstruction, or a surrounding structure based on the environment information. The shore arrival location is, for example, a pier. The controller 41 or the FPGA 49 recognizes another boat or an obstruction based on the shape of the object detected by the sensing device 46. For example, the controller 41 or the FPGA 49 recognizes the shore arrival location and the surrounding structure based on the height and length of the object detected by the sensing device 46.”
In order to recognize an object, some initial/prior data on the object must be used.
}
Regarding Claim 16, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 2.
Akuzawa further teaches wherein the measurement sensor is further comprising: a rangefinder configured to perform three-dimensional ranging of the area including the object; and measurement information generation circuitry configured to generate the measurement information using a result of the three-dimensional ranging.
{Para [0055-0057] “The sensing device 46 detects the shapes of objects surrounding the boat body 2 and the positional relationship between the objects and the boat body 2. The positional relationship between the objects and the boat body 2 includes the distance between the objects and the boat body 2 and the direction in which the object is positioned with respect to the boat body 2. Objects surrounding the boat body 2 include, for example, piers, wharves, other boats, obstructions, or the like.
The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.
The environment information indicates the shape of the shore arrival location and the positional relationship between the shore arrival location and the boat body 2. The environment information may indicate the shore arrival location or other boats surrounding the boat body 2. The environment information may indicate the shore arrival location or structures or obstructions surrounding the boat body 2. The environment information is indicated, for example, by coordinates of point groups indicating the position of an object detected by the sensing device 46. Alternatively, the environment information may be the shape and position of an object captured by image recognition.”
}
Regarding Claim 17, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 3.
Akuzawa further teaches wherein the measurement sensor is further comprising: a rangefinder configured to perform three-dimensional ranging of the area including the object; and measurement information generation circuitry configured to generate the measurement information using a result of the three-dimensional ranging.
{Para [0055-0057] “The sensing device 46 detects the shapes of objects surrounding the boat body 2 and the positional relationship between the objects and the boat body 2. The positional relationship between the objects and the boat body 2 includes the distance between the objects and the boat body 2 and the direction in which the object is positioned with respect to the boat body 2. Objects surrounding the boat body 2 include, for example, piers, wharves, other boats, obstructions, or the like.
The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.
The environment information indicates the shape of the shore arrival location and the positional relationship between the shore arrival location and the boat body 2. The environment information may indicate the shore arrival location or other boats surrounding the boat body 2. The environment information may indicate the shore arrival location or structures or obstructions surrounding the boat body 2. The environment information is indicated, for example, by coordinates of point groups indicating the position of an object detected by the sensing device 46. Alternatively, the environment information may be the shape and position of an object captured by image recognition.”
}
Regarding Claim 18, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 4.
Akuzawa further teaches wherein the measurement sensor is further comprising: a rangefinder configured to perform three-dimensional ranging of the area including the object; and measurement information generation circuitry configured to generate the measurement information using a result of the three-dimensional ranging.
{Para [0055-0057] “The sensing device 46 detects the shapes of objects surrounding the boat body 2 and the positional relationship between the objects and the boat body 2. The positional relationship between the objects and the boat body 2 includes the distance between the objects and the boat body 2 and the direction in which the object is positioned with respect to the boat body 2. Objects surrounding the boat body 2 include, for example, piers, wharves, other boats, obstructions, or the like.
The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.
The environment information indicates the shape of the shore arrival location and the positional relationship between the shore arrival location and the boat body 2. The environment information may indicate the shore arrival location or other boats surrounding the boat body 2. The environment information may indicate the shore arrival location or structures or obstructions surrounding the boat body 2. The environment information is indicated, for example, by coordinates of point groups indicating the position of an object detected by the sensing device 46. Alternatively, the environment information may be the shape and position of an object captured by image recognition.”
}
Regarding Claim 19, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 16.
Akuzawa further teaches wherein the rangefinder is further comprising an optical rangefinder.
{Para [0056] “The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.”
}
Regarding Claim 20, Akuzawa in view of Hara and Lambert teaches The ship navigation assistance system of claim 17.
Akuzawa further teaches wherein the rangefinder is further comprising an optical rangefinder.
{Para [0056] “The sensing device 46 includes one type of sensor among a radar, a laser, a camera or an ultrasonic sensor, or includes a plurality of types of sensors. The sensing device 46 may include a plurality of radars, a plurality of lasers, a plurality of cameras, or a plurality of ultrasonic sensors. The radar includes a millimeter wave radar, a microwave radar, or another radar of a different wavelength. The sensing device 46 detects and outputs environment information during a below-described automatic shore arrival control.”
}
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Dake et al. (US 20210347449 A1) teaches in the abstract “A LiDAR included in this automatic docking device measures the distance to a surrounding object at each predetermined angle by irradiating the object with light and receiving the light reflected by the object. When a ship offshore is instructed to perform automatic docking, the ship navigates to some extent by automatic navigation based on satellite positioning, and is then switched to automatic navigation based on the LiDAR. Before switching to the automatic navigation based on the LiDAR, the LiDAR performs preparatory measurement for measuring the distance to an object around a docking position. In this preparatory measurement, a control unit controls to change, for example, the orientation of the ship such that light emitted from the LiDAR can be reflected by the object around the docking position and can be received by the LiDAR.”
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEXANDER MATTA whose telephone number is (571)272-4296. The examiner can normally be reached Mon - Fri 10:00-6:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James Lee can be reached on (571) 270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/A.G.M./Examiner, Art Unit 3668
/JAMES J LEE/Supervisory Patent Examiner, Art Unit 3668