DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This action is in reply to the application filed on 4/10/2024.
No claims have been amended.
No claims have been added.
No claims have been cancelled.
Claims 1-12 are currently pending and have been examined.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statement(s) (IDS(s)) submitted on 4/10/2024 has been received and considered.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-11 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Step 1
Step 1 of the Alice/Mayo framework considers whether the claims are directed to one of the four statutory classes of invention – method/process, machine/apparatus, manufacture, or composition of matter. Claim 1 is directed to a system (an apparatus). Claim 11 is directed to a method. Accordingly, claims 1 and 11 are within at least one of the four statutory categories.
Step 2A
Step 2A of the Alice/Mayo framework considers whether claims are “directed to” an abstract idea. That is, whether the claims recite an abstract idea (Prong 1) and fail to integrate the abstract idea into a practical application (Prong 2).
Step 2A Prong 1
Regarding Prong One of Step 2A of the Alice/Mayo test (which collectively includes the guidance in the January 7, 2019 Federal Register notice and the October 2019 update issued by the USPTO as now incorporated into the MPEP, as supported by relevant case law), the claim limitations are to be analyzed to determine whether, under their broadest reasonable interpretation, they “recite” a judicial exception or in other words whether a judicial exception is “set forth” or “described” in the claims. MPEP 2106.04(II)(A)(1). An “abstract idea” judicial exception is subject matter that falls within at least one of the following groupings: a) certain methods of organizing human activity, b) mental processes, and/or c) mathematical concepts. MPEP 2106.04(a).
Specifically, independent claim 1 recites the following, with the abstract idea emphasized. (Additional elements (Prong 2, to be discussed in the subsequent section) are italicized):
An information processing apparatus, comprising: a calculation unit that calculates a self-position of an own device that moves with a moving object, in accordance with a first movement state of a moving object and a second movement state of the own device, on a basis of first movement information relating to the moving object and second movement information relating to the own device.
The above limitations constitute “a mental process” because they comprise observation/evaluation/judgment/analysis that can, at the currently claimed high level of generality, be practically performed in the human mind (e.g., with pen and paper). For instance, a person could calculate a self-position of an own device that moves with a moving object mentally. Accordingly, the claim recites at least one abstract idea.
Claim 11 is an independent claim that follows substantially the same mental processes in a separate embodiment. Claim 12 also follows substantially the same mental process in a separate embodiment but has been rejected separately under 35 USC 101 as directed to non-statutory subject matter.
Step 2A Prong 2
Regarding Prong Two of Step 2A of the Alice/Mayo test, it must be determined whether the claim as a whole integrates the abstract idea into a practical application. As noted at MPEP §2106.04(II)(A)(2), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements such as merely using a computer to implement an abstract idea, adding insignificant extra solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.” MPEP §2106.05(I)(A).
For the following reasons, the above-identified additional limitations (indicated in italics), when considered as a whole with the limitations reciting the at least one abstract idea, do not integrate the above-noted at least one abstract idea into a practical application.
Regarding the following additional elements, these additional elements are all recited at a high level of generality:
An information processing apparatus, comprising: a calculation unit
in accordance with a first movement state of a moving object and a second movement state of the own device, on a basis of first movement information relating to the moving object and second movement information relating to the own device.
The following additional limitations amount to mere insignificant extra-solution activity (i.e., sending and receiving data): in accordance with a first movement state of a moving object and a second movement state of the own device, on a basis of first movement information relating to the moving object and second movement information relating to the own device.
MPEP 2106.05(f)(1) considers whether “the claim recites only the idea of a solution or outcome i.e., the claim fails to recite details of how a solution to a problem is accomplished.” The recitation of claim limitations that attempt to cover any solution to an identified problem, with no restriction on how the result is accomplished and no description of the mechanism for accomplishing the result, does not integrate a judicial exception into a practical application or provide significantly more because this type of recitation is equivalent to the words “apply it.” Here, the recitation of “[a]n information processing apparatus, comprising: a calculation unit” does not meaningfully restrict how the result is accomplished and provides no description of the mechanism for accomplishing the task, and thus does not integrate the exception into a practical application.
Thus, taken alone, the additional elements do not integrate the at least one abstract idea into a practical application. Looking at the additional limitations as an ordered combination adds nothing that is not already present when looking at the elements taken individually. MPEP §2106.05(I)(A) and §2106.04(II)(A)(2).
For these reasons, Claims 1 and 11 do not recite additional elements that integrate the judicial exception into a practical application.
Step 2B
Regarding Step 2B of the Alice/Mayo test, claims 1 and 11 do not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception for reasons the same as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application.
The claims, individually or in combination, do not include additional elements that are sufficient to amount to significantly more than the judicial exception because, as discussed with respect to Step 2A Prong Two, the additional elements of “in accordance with a first movement state of a moving object and a second movement state of the own device, on a basis of first movement information relating to the moving object and second movement information relating to the own device” are all recited at a high level of generality amounting to sending and receiving data, which is well-understood, routine, and conventional activity (see MPEP 2106.05(d)(II)).
As discussed above, the calculation unit additional element amounts to mere instructions to apply the exception. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. Use of a computer or other machinery in its ordinary capacity, or simply adding a general-purpose computer or computer components after the fact to an abstract idea, does not provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit).
Regarding the calculation unit, the recitation of claim limitations that attempt to cover any solution to an identified problem with no restriction on how the result is accomplished and no description of the mechanism for accomplishing the result, does not integrate a judicial exception into a practical application or provide significantly more because this type of recitation is equivalent to the words “apply it.” See Electric Power Group, LLC v. Alstom, S.A., 830 F.3d 1350, 1356, 119 USPQ2d 1739, 1743-44 (Fed. Cir. 2016); Intellectual Ventures I v. Symantec, 838 F.3d 1307, 1327, 120 USPQ2d 1353, 1366 (Fed. Cir. 2016); Internet Patents Corp. v. Active Network, Inc., 790 F.3d 1343, 1348, 115 USPQ2d 1414, 1417 (Fed. Cir. 2015).
Thus, claim 1 does not amount to significantly more than the judicial exception.
As noted above, the limitations of claim 11 are analogous to the limitations of claim 1 and thus the analysis of claim 1 is applied to claim 11.
Dependent Claims
Dependent claims 2-10 do not provide additional elements or a practical application that would render them eligible under 35 U.S.C. 101.
Claim 2: The information processing apparatus according to claim 1, wherein the first movement information includes a self-position of the moving object and a movement vector of the moving object, and the second movement information includes the self-position of the own device and a movement vector of the own device.
Claim 3: The information processing apparatus according to claim 1, wherein the first movement state includes at least one of movement, rotation, or stopping of the moving object, and the second movement state includes movement and stopping of the own device.
Claim 4: The information processing apparatus according to claim 3, wherein the calculation unit calculates, in a case where the moving object is moving and the own device is stopped in contact with the moving object, the self-position of the own device by subtracting a movement vector of the moving object from a movement vector of the own device.
Claim 5: The information processing apparatus according to claim 1, wherein the first movement information is acquired by an external sensor and an internal sensor mounted on the moving object, and the second movement information is acquired by an external sensor and an internal sensor mounted on the own device.
Claim 6: The information processing apparatus according to claim 5, wherein the own device is a moving object capable of flight, and the calculation unit calculates, in a case where the moving object is moving and the own device is stopped in air, the self-position of the own device by adding or reducing weighting of the internal sensor mounted on the own device.
Claim 7: The information processing apparatus according to claim 5, wherein the external sensor includes at least one of a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a ToF (Time of Flight) camera, or a stereo camera.
Claim 8: The information processing apparatus according to claim 5, wherein the internal sensor includes at least one of an IMU (Inertial Measurement Unit) or a GPS (Global Positioning System).
Claim 9: The information processing apparatus according to claim 7, further comprising an imaging correction unit that controls, in a case where the own device is in contact with the moving object, the external sensor on a basis of a vibration system of the moving object and a vibration system of the own device.
Claim 10: The information processing apparatus according to claim 9, wherein the imaging correction unit performs, in a case where a subject in contact with the moving object is imaged, control to match the vibration system of the moving object and the vibration system of the own device with each other.
These additional claim limitations recite mental processes and further narrow the abstract idea. They do not constitute a practical application of the abstract idea and do not amount to significantly more than the judicial exception. The sensor systems and position and movement data are all recited at a high level of generality. Thus, the claims generally link the use of the abstract idea to a particular technological environment and do not integrate the judicial exception into a practical application because they do not impose any meaningful limits on practicing the abstract idea. While claims 9 and 10 recite controlling a sensor, this control is similarly recited at a high level of generality, with no restriction on how the result is accomplished and no description of the mechanism for accomplishing the result, amounting to “apply it.” The claims, individually or in combination, do not include additional elements that integrate the judicial exception into a practical application at Step 2A or that provide an inventive concept at Step 2B. For these reasons, there is no inventive concept in the claims, and thus they are ineligible.
Claim 12 is rejected under 35 U.S.C. § 101 because the claimed invention is directed to non-statutory subject matter. The claim is directed to a program. As explained in U.S. Patent & Trademark Office, Subject Matter Eligibility of Computer-Readable Media, 1351 Off. Gaz. Pat. Office 212 (Feb. 23, 2010):
The United States Patent and Trademark Office (USPTO) is obliged to give claims their broadest reasonable interpretation consistent with the specification during proceedings before the USPTO. See In re Zletz, 893 F.2d 319 (Fed. Cir. 1989) (during patent examination the pending claims must be interpreted as broadly as their terms reasonably allow). The broadest reasonable interpretation of a claim drawn to a computer readable medium (also called machine readable medium and other such variations) typically covers forms of non-transitory tangible media and transitory propagating signals per se in view of the ordinary and customary meaning of computer readable media, particularly when the specification is silent. See MPEP 2111.01. When the broadest reasonable interpretation of a claim covers a signal per se, the claim must be rejected under 35 U.S.C. § 101 as covering non-statutory subject matter. See In re Nuijten, 500 F.3d 1346, 1356-57 (Fed. Cir. 2007) (transitory embodiments are not directed to statutory subject matter) and Interim Examination Instructions for Evaluating Subject Matter Eligibility Under 35 U.S.C. § 101, Aug. 24, 2009; p. 2.
The USPTO recognizes that applicants may have claims directed to computer readable media that cover signals per se, which the USPTO must reject under 35 U.S.C. § 101 as covering both non-statutory subject matter and statutory subject matter. In an effort to assist the patent community in overcoming a rejection or potential rejection under 35 U.S.C. § 101 in this situation, the USPTO suggests the following approach. A claim drawn to such a computer readable medium that covers both transitory and non-transitory embodiments may be amended to narrow the claim to cover only statutory embodiments to avoid a rejection under 35 U.S.C. § 101 by adding the limitation "non-transitory" to the claim. Cf. Animals - Patentability, 1077 Off. Gaz. Pat. Office 24 (April 21, 1987) (suggesting that applicants add the limitation "non-human" to a claim covering a multicellular organism to avoid a rejection under 35 U.S.C. § 101). Such an amendment would typically not raise the issue of new matter, even when the specification is silent, because the broadest reasonable interpretation relies on the ordinary and customary meaning that includes signals per se. The limited situations in which such an amendment could raise issues of new matter occur, for example, when the specification does not support a non-transitory embodiment because a signal per se is the only viable embodiment, such that the amended claim is impermissibly broadened beyond the supporting disclosure. See, e.g., Gentry Gallery, Inc. v. Berkline Corp., 134 F.3d 1473 (Fed. Cir. 1998).
Accordingly, claim 12 is rejected under 35 U.S.C. § 101 as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1, 3-4, 11, and 12 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Rooney (US 20110282536, hereinafter “Rooney”).
Regarding Claim 1, Rooney describes:
An information processing apparatus, comprising: a calculation unit that calculates a self-position of an own device that moves with a moving object, (Rooney ¶ 0015 lines 5-9 “A fix subsystem communicates position fix data to the robot. There are also means for determining vessel motion. […] The navigation processor is configured to determine the position of the robot on the hull,” describing determining the position of the robot (self-position of own device) on the hull of a vessel (moving object))
in accordance with a first movement state of a moving object and a second movement state of the own device, on a basis of first movement information relating to the moving object and second movement information relating to the own device. (Rooney ¶ 0015 lines 5-11 “A navigation processor onboard the robot is responsive to […] the position fix data, and the means for determining the vessel motion. The navigation processor is configured to determine the position of the robot on the hull by canceling, from the sensor subsystem output data combining both robot and vessel motion, the determined vessel motion,” describing the use of position fix data (a second movement state of an own device including information relating to the own device) and vessel motion (first movement state including information relating to the moving object) to determine the robot position )
Regarding Claim 3, Rooney describes the elements of Claim 1 as described above and further describes:
wherein the first movement state includes at least one of movement, rotation, or stopping of the moving object, and the second movement state includes movement and stopping of the own device. (Rooney ¶ 0054 lines 4-7 “Since both the robot and the hull are moving, sensor subsystem 66 outputs data to navigation processor 62 reflecting a combination of the motion of the robot on the vessel hull and the motion of the vessel hull itself,” describing motion data of both robot (own device) motion and vessel hull (moving object) motion )
Regarding Claim 4, Rooney describes the elements of Claim 3 as described above and further describes:
wherein the calculation unit calculates, in a case where the moving object is moving and the own device is stopped in contact with the moving object, (Rooney ¶ 0057 lines 1-4 “Thus, means for determining vessel motion may include either a sensing subsystem 70 on-board the vessel and/or sensor subsystem 66 of the robot when the robot is not moving,” describing the below calculation when the robot is not moving )
the self-position of the own device by subtracting a movement vector of the moving object from a movement vector of the own device. (Rooney ¶ 0055 “To address this situation, there are means for determining vessel motion, for example, a multi-axis sensing system 70 on the vessel which transmits vessel motion data to navigation subsystem processor 62 via receiver 72. Processor 62 is then configured (e.g., programmed) to subtract or cancel vessel motion from the data obtained from sensor subsystem 66 (data reflecting combined robot and vessel motion). The result is data concerning only the motion of the robot on the vessel hull. Based on the robot motion data, the position of the robot on the hull can be ascertained” describing subtracting the motion (analogous to movement vector) of the vessel from the motion of the robot)
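The subtraction Rooney describes in ¶ 0055 (canceling the vessel's movement vector from the combined robot-plus-vessel movement vector) can be sketched as follows. This is an illustrative example only; the function and variable names are hypothetical and do not appear in either the claims or the reference:

```python
# Illustrative sketch of the motion cancellation described in Rooney ¶ 0055.
# Vectors are simple (x, y) tuples for clarity; all names are hypothetical.

def robot_position(prev_position, combined_motion, vessel_motion):
    """Subtract the vessel's movement vector from the combined
    (robot + vessel) movement vector to recover robot-only motion,
    then update the robot's position on the hull."""
    robot_motion = (combined_motion[0] - vessel_motion[0],
                    combined_motion[1] - vessel_motion[1])
    return (prev_position[0] + robot_motion[0],
            prev_position[1] + robot_motion[1])

# Example: the combined sensor reading includes 3 m of vessel drift in x.
print(robot_position((0.0, 0.0), (4.0, 1.0), (3.0, 0.0)))  # (1.0, 1.0)
```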
Regarding Claim 11, Rooney describes:
An information processing method for a computer system to execute the step of: calculating a self-position of an own device that moves with a moving object, (Rooney ¶ 0015 lines 5-9 “A fix subsystem communicates position fix data to the robot. There are also means for determining vessel motion. […] The navigation processor is configured to determine the position of the robot on the hull,” describing determining the position of the robot (self-position of own device) on the hull of a vessel (moving object))
in accordance with a first movement state of a moving object and a second movement state of the own device, on a basis of first movement information relating to the moving object and second movement information relating to the own device. (Rooney ¶ 0015 lines 5-11 “A navigation processor onboard the robot is responsive to […] the position fix data, and the means for determining the vessel motion. The navigation processor is configured to determine the position of the robot on the hull by canceling, from the sensor subsystem output data combining both robot and vessel motion, the determined vessel motion,” describing the use of position fix data (a second movement state of an own device including information relating to the own device) and vessel motion (first movement state including information relating to the moving object) to determine the robot position)
Regarding Claim 12, Rooney describes:
A program (Rooney ¶ 0012 lines 7-9 “the robot is also programmed to subtract the motion of the vessel from its navigation calculations,”)
that causes a computer system to execute the step of: calculating a self-position of an own device that moves with a moving object, (Rooney ¶ 0015 lines 5-9 “A fix subsystem communicates position fix data to the robot. There are also means for determining vessel motion. […] The navigation processor is configured to determine the position of the robot on the hull,” describing determining the position of the robot (self-position of own device) on the hull of a vessel (moving object))
in accordance with a first movement state of a moving object and a second movement state of the own device, on a basis of first movement information relating to the moving object and second movement information relating to the own device. (Rooney ¶ 0015 lines 5-11 “A navigation processor onboard the robot is responsive to […] the position fix data, and the means for determining the vessel motion. The navigation processor is configured to determine the position of the robot on the hull by canceling, from the sensor subsystem output data combining both robot and vessel motion, the determined vessel motion,” describing the use of position fix data (a second movement state of an own device including information relating to the own device) and vessel motion (first movement state including information relating to the moving object) to determine the robot position)
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 2 and 5-8 are rejected under 35 U.S.C. 103 as being unpatentable over Rooney in view of Wang (US 9056676, hereinafter “Wang”).
Regarding Claim 2, Rooney describes the elements of Claim 1 as described above and further describes:
wherein the first movement information includes […] a movement vector of the moving object, (Rooney ¶ 0083 lines 1-2 “FIG. 8 shows ship's vessel motion data at 120 (based on, for example, sensing subsystem 70, FIG. 2)” describing the ship’s vessel motion data (movement of the moving object) shown in Fig 8 to be a movement vector determined by a 3-axis acceleration and angular sensor)
and the second movement information includes the self-position of the own device (Rooney ¶ 0017 “The fix subsystem typically includes at least two hull mounted transmitters and a receiver onboard the robot receiving transmissions from the at least two hull transmitters. The navigation processor is responsive to the receiver and is configured to determine, by triangulation, robot position fix data based on the transmissions transmitted by the at least two hull transmitters.”)
and a movement vector of the own device (Rooney ¶ 0083 line 3 “robot motion data as shown at 122” describing the robot’s motion data (movement of the own device) shown in Fig 8 to be a movement vector determined by a 3-axis acceleration and angular sensor)
Rooney does not teach:
a self-position of the moving object
Within the same field of endeavor as Rooney, Wang teaches:
a self-position of the moving object (Wang Col 13 lines 54-57 “The command signal may be generated in response to data about the motion of the vehicle and/or the UAV. For example, information about the position and/or velocity of the vehicle may be provided. Information about the direction of travel of the vehicle may be provided,” teaching the use of vehicle position data alongside the vehicle velocity and direction of travel data analogous to Rooney’s motion data)
Rooney and Wang are considered analogous because they both relate to control of autonomous vehicles in relation to larger vehicles. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vessel motion data of Rooney with the addition of the vehicle position information of Wang. This modification would have been made with a reasonable expectation of success, as motivated by an increased ability to define the vessel and robot locations in a context beyond the robot strictly relative to the vessel, by combining prior art elements (Rooney’s vessel motion data and Wang’s vehicle position) according to known methods to obtain predictable results (known position and velocity) per MPEP 2143(I)(A).
Regarding Claim 5, Rooney describes the elements of Claim 1 as described above and further describes:
wherein the first movement information is acquired by […] an internal sensor mounted on the moving object, (Rooney ¶ 0018 lines 1-3 “The means for determining vessel motion may include a multi-axis inertial sensing subsystem on the vessel outputting data representing motion of the vessel,” and ¶ 0055 lines 1-3 “there are means for determining vessel motion, for example, a multi-axis sensing system 70 on the vessel which transmits vessel motion data,” describing internal sensors for sensing vessel motion, interpreted as sensing internal properties)
and the second movement information is acquired by an external sensor (Rooney ¶ 0017 “The fix subsystem typically includes at least two hull mounted transmitters and a receiver onboard the robot receiving transmissions from the at least two hull transmitters. The navigation processor is responsive to the receiver and is configured to determine, by triangulation, robot position fix data based on the transmissions transmitted by the at least two hull transmitters,” describing an external sensor, interpreted as a sensor for sensing external properties)
and an internal sensor mounted on the own device. (Rooney ¶ 0054 lines 1-2 “Sensor subsystem 66 onboard robot 10 typically includes a multi-axis sensing system,” describing internal sensors for sensing robot motion, interpreted as sensing internal properties)
Rooney does not teach:
[…] an external sensor and […]
Within the same field of endeavor as Rooney, Wang teaches:
[…] wherein the first movement information is acquired by an external sensor […] (Wang Col 14 lines 19-22 “In one example, a vehicle may transmit its coordinates to the UAV in real time. The vehicle may have a location unit that may aid in determining a location of the vehicle. In one example, the location unit may utilize GPS,” and Col 66 lines 43-49 “The sensing module 1802 can utilize different types of sensors that collect information relating to the movable objects in different ways. Different types of sensors may sense different types of signals or signals from different sources. For example, the sensors can include inertial sensors, GPS sensors, proximity sensors (e.g., lidar), or vision/image sensors (e.g., a camera),” teaching determination of vehicle coordinates by GPS and an alternative of using LIDAR or cameras to determine motion data)
[…] and the second movement information is acquired by an external sensor […] (Wang Col 14 lines 25-32 “In some instances, there may be some error to the GPS coordinates so additional aids may be provided for landing the UAV on the vehicle. For example, a marker may be provided as described in greater detail elsewhere herein. The marker may be a vision based marker which will utilize a camera on board the UAV to provide more accurate positioning. The marker may be any other type of marker as described elsewhere herein,” teaching the use of a camera to determine relative UAV positioning)
Rooney and Wang are considered analogous because they both relate to control of autonomous vehicles in relation to larger vehicles. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the multi-axis motion sensors for the vessel and robot of Rooney with the addition of the lidar- or camera-based vision positioning of Wang. This modification would have been made with a reasonable expectation of success, as motivated by an increased ability to define the vessel and robot locations in a context beyond the robot's position strictly relative to the vessel, by combining prior art elements (Rooney’s vessel motion data and Wang’s vehicle position) according to known methods to obtain predictable results (known position and velocity) according to MPEP 2143(I)(A).
Regarding Claim 6, the combination of Rooney and Wang teaches the elements of Claim 5 as described above. Rooney further describes:
wherein the own device is a moving object […] and the calculation unit calculates, in a case where the moving object is moving and the own device is stopped […] the self-position of the own device by adding or reducing weighting of the internal sensor mounted on the own device. (Rooney ¶ 0057 lines 1-7 “Thus, means for determining vessel motion may include either a sensing subsystem 70 on-board the vessel and/or sensor subsystem 66 of the robot when the robot is not moving. Vessel motion data, in one example, can be output by both subsystem 70 and subsystem 66 and compared to verify both subsystems and to compensate for any errors or differences between the two subsystems,” emphasis added, describing a compensation analogous to a change in weighting between sensors to verify the robot position.)
Rooney does not teach:
[…]capable of flight, […]
[…] in air, […]
Within the same field of endeavor as Rooney, Wang teaches:
[…]capable of flight, […]
[…] in air, […] (Wang Col 2 lines 15-17 “In some implementations, the UAV may be a rotorcraft. The altitude of the UAV can be decreased by decreasing the speed of rotation of one or more rotors,” teaching an unmanned aerial vehicle which flies)
Rooney and Wang are considered analogous because they both relate to control of autonomous vehicles in relation to larger vehicles. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the comparison and compensation of vessel and robot sensor data while the surface-bound robot of Rooney is stationary with the substitution of the UAV of Wang for the robot of Rooney. This modification would have been made with a reasonable expectation of success, as motivated by an increased ability to more accurately determine the relative position between the vessel and the aerial robot, by applying a known technique (Wang’s use of a UAV docking to a vehicle) to a known method (Rooney’s vessel and robot motion data compensation when the surface-bound robot is stopped) ready for improvement (use of a flying robot expands the capabilities of Rooney’s surface-bound robot) to yield predictable results (accurate UAV positioning in relation to Rooney’s vessel) according to MPEP 2143(I)(D).
Regarding Claim 7, the combination of Rooney and Wang teaches the elements of Claim 5 as described above. Rooney does not teach:
wherein the external sensor includes at least one of a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a ToF (Time of Flight) camera, or a stereo camera.
Within the same field of endeavor as Rooney, Wang teaches:
wherein the external sensor includes at least one of a LiDAR (Light Detection and Ranging, Laser Imaging Detection and Ranging), a ToF (Time of Flight) camera, or a stereo camera.
(Wang Col 66 lines 43-49 “The sensing module 1802 can utilize different types of sensors that collect information relating to the movable objects in different ways. Different types of sensors may sense different types of signals or signals from different sources. For example, the sensors can include[…] proximity sensors (e.g., lidar), or vision/image sensors (e.g., a camera),”)
Rooney and Wang are considered analogous because they both relate to control of autonomous vehicles in relation to larger vehicles. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the multi-axis motion sensors for the vessel and robot and the triangulation position fix sensor for the robot of Rooney with the addition or substitution of the lidar- or camera-based vision positioning of Wang. This modification would have been made with a reasonable expectation of success, as motivated by an increased ability to define the vessel and robot locations in a context beyond the robot's position strictly relative to the vessel, by combining prior art elements (Rooney’s vessel motion data and Wang’s vehicle position) according to known methods to obtain predictable results (known position and velocity) according to MPEP 2143(I)(A).
Regarding Claim 8, the combination of Rooney and Wang teaches the elements of Claim 5 as described above. Rooney further describes:
wherein the internal sensor includes at least one of an IMU (Inertial Measurement Unit) or a GPS (Global Positioning System). (Rooney ¶ 0018 lines 1-3 “The means for determining vessel motion may include a multi-axis inertial sensing subsystem on the vessel outputting data representing motion of the vessel,” and ¶ 0055 lines 1-3 “there are means for determining vessel motion, for example, a multi-axis sensing system 70 on the vessel which transmits vessel motion data,” describing navigation using an inertial multi-axis sensing system, equivalent to an IMU)
Claim(s) 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Rooney in view of Wang and further in view of Kamiyama (JP H01211408, hereinafter “Kamiyama,” all citations and excerpts taken from the attached machine translation).
Regarding Claim 9, the combination of Rooney and Wang teaches the elements of Claim 7 as described above. Rooney further teaches:
further comprising […] controls, in a case where the own device is in contact with the moving object, the […] sensor […] (Rooney ¶ 0057 lines 1-7 “Thus, means for determining vessel motion may include either a sensing subsystem 70 on-board the vessel and/or sensor subsystem 66 of the robot when the robot is not moving. Vessel motion data, in one example, can be output by both subsystem 70 and subsystem 66 and compared to verify both subsystems and to compensate for any errors or differences between the two subsystems,” emphasis added, describing a control of compensation between sensors of the robot and vessel while the robot is in contact on the vessel)
Rooney does not teach:
[…] an imaging correction unit that […]
[…] external […]
[…] on a basis of a vibration system of the moving object
and a vibration system of the own device.
Within the same field of endeavor as Rooney, Kamiyama teaches:
further comprising an imaging correction unit that controls, […] the external sensor on a basis of a vibration system of the moving object and a vibration system of the own device. (Kamiyama Pg 5 lines 1-11 “According to this configuration, even if the imaging means vibrates together with the vibration of the traveling vehicle, the correction circuit in the image processing device restores the image information that has been blurred or deteriorated by the vibration to the original clear image […], and since this restoration work is performed by an electrical correction circuit, the correction contents can be freely designed, and the image as an instantaneous value can be quickly corrected. The image data corrected and restored in this way allows accurate calculations to identify the position of the planting crop, and subsequent image processing to detect the arrangement status, such as the position and direction, relative to the row of planted seedlings can be performed accurately, and ultimately automatic steering control and other processing can also be performed accurately,” teaching imaging compensation for vibration between the imaging sensor and the vehicle it is traveling on, which can be applied to both the robot and the vessel of Rooney with the camera systems of Wang)
Rooney, Wang, and Kamiyama are all considered analogous because they all relate to vehicle positioning systems. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the comparison and compensation of vessel and robot sensor data of Rooney, together with the camera-based positioning of Wang, with the addition of Kamiyama’s vibration compensation applied to Wang’s camera positioning within Rooney’s compensation between positioning systems. This modification would have been made with a reasonable expectation of success, as motivated by quickly correcting images to allow accurate calculations identifying position and direction and by improved accuracy in automatic steering control and processing (Kamiyama Pg 5 lines 6-11).
Regarding Claim 10, the combination of Rooney, Wang, and Kamiyama teaches the elements of Claim 9 as described above. Rooney further teaches:
wherein the imaging correction unit performs, in a case where a subject in contact with the moving object is imaged, control to match the vibration system of the moving object and the vibration system of the own device with each other. (Rooney ¶ 0057 lines 1-7 “Thus, means for determining vessel motion may include either a sensing subsystem 70 on-board the vessel and/or sensor subsystem 66 of the robot when the robot is not moving. Vessel motion data, in one example, can be output by both subsystem 70 and subsystem 66 and compared to verify both subsystems and to compensate for any errors or differences between the two subsystems,” emphasis added, describing a control of compensation between sensors of the robot and vessel while the robot is in contact on the vessel, applied to the vibration correction system of Kamiyama)
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZACHARY E GLADE whose telephone number is (703)756-1502. The examiner can normally be reached 4-5-9 7:30-16:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kito Robinson, can be reached at (571) 270-3921. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ZACHARY E. F. GLADE/Examiner, Art Unit 3664
/KITO R ROBINSON/Supervisory Patent Examiner, Art Unit 3664