DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This is a Non-Final rejection on the merits of this application. Claims 1-18 are currently pending, as discussed below.
Examiner notes that the rejections are based on the broadest reasonable interpretation of the claim language. Applicant is kindly invited to consider each reference as a whole. References are to be interpreted as they would be by one of ordinary skill in the art rather than by a novice. See MPEP 2141. Therefore, the relevant inquiry when interpreting a reference is not what the reference expressly discloses on its face but what the reference would teach or suggest to one of ordinary skill in the art.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of Japanese Application No. JP2023-200673, filed on 11/28/2023, has been received.
Information Disclosure Statement
The information disclosure statements (IDS) filed on 11/22/2024 and 05/20/2025 have been considered by the examiner.
Claim Objections
Claim 5 is objected to because of the following informalities: Claim 5, line 2: “comprising further comprising” should read --comprising--.
Appropriate correction is required.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1-2, 4-10, 12 and 16-18 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Weinheber (US 2024/0248477 A1).
Regarding Claim 1 (similarly claims 16-18), Weinheber teaches An information processing system (see at least Fig. 2A Abstract [0077]: BVLOS system for controlling drones and/or other autonomous vehicles) comprising:
a mobile object; (see at least Fig. 2A-3C Abstract [0047, 0077]: The first drone and/or the one or more imaging sensors of the first drone are operated based on analysis of the first image stream to track the second drone around a center of a field of view (FOV) of the one or more imaging sensors of the first drone. Moreover, the second drone and/or the one or more imaging sensors of the second drone are operated based on the analysis of the second image stream to track the first drone around a center of a FOV of the one or more imaging sensors of the second drone.)
a mobile image capturing apparatus that captures an image of the mobile object; (see at least Fig. 2A-3C Abstract [0047, 0077]: The first drone and/or the one or more imaging sensors of the first drone are operated based on analysis of the first image stream to track the second drone around a center of a field of view (FOV) of the one or more imaging sensors of the first drone. Moreover, the second drone and/or the one or more imaging sensors of the second drone are operated based on the analysis of the second image stream to track the first drone around a center of a FOV of the one or more imaging sensors of the second drone.)
a memory device that stores a set of instructions; and at least one processor (see at least Fig. 2A-3C [0141-0144]: each drone may include a drone remote control unit, a processor and storage for storing data and/or programs.) that executes the set of instructions to:
receive a detection result indicating whether the mobile object is included in an image capturing field angle of the mobile image capturing apparatus from the mobile image capturing apparatus; (see at least Fig. 2A-3C Abstract [0047, 0077]: The first drone and/or the one or more imaging sensors of the first drone are operated based on analysis of the first image stream to track the second drone around a center of a field of view (FOV) of the one or more imaging sensors of the first drone. Moreover, the second drone and/or the one or more imaging sensors of the second drone are operated based on the analysis of the second image stream to track the first drone around a center of a FOV of the one or more imaging sensors of the second drone.) and
issue an instruction relating to steering of at least one of the mobile object and the mobile image capturing apparatus in accordance with the detection result. (see at least Fig. 2A-3C Abstract [0047, 0077, 0128-0237]: Moreover, the drone remote control engine 220 may further operate the first drone 202 A and/or the second drone 202 B to track the second drone 202 B around a center of the FOV of the imaging sensor(s) 214 of the first drone 202 A capturing the first image stream thus capturing the second drone 202 B substantially in the center of the images of the first image stream. This may be done to ensure that the surrounding of the second drone 202 B are effectively seen in the first image stream in sufficient distances in all directions of the second drone 202 B. This may be essential, since in case the second drone 202 B is tracked at the edges of the first image stream (i.e., not centered), at least some areas in close proximity to the second drone 202 B may not be effectively monitored for potential hazards, objects and/or obstacles. The drone remote control engine may dynamically adjust the selected flight information (e.g. the route, the position, the altitude, the speed, flight parameters and the like) to adjust the position of the first drone and/or position of the second drone with respect to each other.)
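By way of illustration only (offered for clarity of the record, not as a characterization of Weinheber’s actual implementation), the detection-result-based steering mapped above may be sketched as follows; all names, thresholds, and the gain are hypothetical:
```python
# Illustrative sketch only (not part of the record): a minimal controller that
# receives a detection result from the mobile image capturing apparatus and
# issues a steering instruction when the mobile object drifts from the field angle.
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetectionResult:
    object_in_field_angle: bool  # whether the mobile object is inside the FOV
    offset_x: float = 0.0        # normalized horizontal offset from FOV center
    offset_y: float = 0.0        # normalized vertical offset from FOV center

@dataclass
class SteeringInstruction:
    target: str                  # "mobile_object" or "image_capturing_apparatus"
    delta_heading_deg: float     # heading correction
    delta_pitch_deg: float       # gimbal/pitch correction

def issue_steering(result: DetectionResult, gain: float = 10.0) -> Optional[SteeringInstruction]:
    """Keep the tracked object near the center of the field of view."""
    centered = (result.object_in_field_angle
                and abs(result.offset_x) < 0.1 and abs(result.offset_y) < 0.1)
    if centered:
        return None  # already tracked around the FOV center; no instruction needed
    # Steer the capturing apparatus toward the object's last known offset.
    return SteeringInstruction(
        target="image_capturing_apparatus",
        delta_heading_deg=gain * result.offset_x,
        delta_pitch_deg=gain * result.offset_y,
    )
```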
Regarding Claim 2, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches wherein the at least one processor executes the set of instructions to issue the instruction relating to the steering of at least one of the mobile object and the mobile image capturing apparatus in accordance with the detection result so that the whole of the mobile object will be included in the image capturing field angle of the mobile image capturing apparatus. (see at least Fig. 2A-3C [0077, 0096]: Companion drones may be operated automatically to maintain VLOS with each other including dynamically adjusting one or more of the flight parameters of one or more of the companion drones, for example, position, speed, altitude and/or the like to overcome visibility limitations imposed by, for example, blocking object(s), poor visibility (e.g. low illumination, bad weather, etc.) and/or the like. Moreover, companion drones may be operated automatically to track each other around a center of field of view (FOV) of the imaging sensor(s) of the drone(s) to ensure that the tracked companion drones and their surrounding environment are effectively seen in the image streams.)
Regarding Claim 4, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches further comprising a controller configured to control the mobile object, (see at least Fig. 2A-3C [0141-0144]: each drone may include a drone remote control unit, a processor and storage for storing data and/or programs.)
wherein the controller controls a movement of the mobile object based on the instruction relating to the steering of the mobile object. (see at least Fig. 2A-3C Abstract [0047, 0077, 0128-0237]: Moreover, the drone remote control engine 220 may further operate the first drone 202 A and/or the second drone 202 B to track the second drone 202 B around a center of the FOV of the imaging sensor(s) 214 of the first drone 202 A capturing the first image stream thus capturing the second drone 202 B substantially in the center of the images of the first image stream. This may be done to ensure that the surrounding of the second drone 202 B are effectively seen in the first image stream in sufficient distances in all directions of the second drone 202 B. This may be essential, since in case the second drone 202 B is tracked at the edges of the first image stream (i.e., not centered), at least some areas in close proximity to the second drone 202 B may not be effectively monitored for potential hazards, objects and/or obstacles. The drone remote control engine may dynamically adjust the selected flight information (e.g. the route, the position, the altitude, the speed, flight parameters and the like) to adjust the position of the first drone and/or position of the second drone with respect to each other.)
Regarding Claim 5, Weinheber teaches The information processing system according to claim 1, further comprising further comprising
Weinheber further teaches a notification unit configured to notify an operator of the mobile object, (see at least Fig. 2A-3C [0084-0086, 0187, 0207]: the operator(s) at the remote control system(s) may be presented with one or more annotated image streams generated to enhance the image streams with further visual details (via screens), for example, text, icons, bounding boxes, tracked paths and/or the like marking one or more objects and/or elements, for example, the drone(s), other drone(s), potential obstacles, in-proximity objects, in-collision-course objects and/or the like.)
wherein the notification unit notifies the operator of information about the instruction related to the steering of the mobile object. (see at least Fig. 2A-3C [0084-0088, 0187, 0207-0212]: The route and/or one or more flight parameters of one or more of the drones may be monitored in the image stream(s) depicting the respective drone(s) and compared to the predefined route and/or flight parameters as defined by the mission plan of the respective drone(s), and alert(s) may be generated in case a deviation is detected.)
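For clarity only, a minimal sketch of the deviation-alert behavior cited above, assuming hypothetical parameter names and a relative tolerance; it does not purport to reproduce Weinheber’s monitoring logic:
```python
# Illustrative sketch only: generate operator alerts when a monitored flight
# parameter deviates from the mission plan beyond a relative tolerance.
def check_deviation(observed: dict, planned: dict, tolerance: float = 0.05) -> list[str]:
    """Return alert messages for parameters deviating beyond tolerance."""
    alerts = []
    for key, planned_value in planned.items():
        observed_value = observed.get(key)
        if observed_value is None:
            continue  # parameter not currently monitored in the image stream
        if abs(observed_value - planned_value) > tolerance * abs(planned_value):
            alerts.append(f"Deviation in {key}: observed {observed_value}, planned {planned_value}")
    return alerts
```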
Regarding Claim 6, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches wherein the memory device and the at least one processor are incorporated into the mobile image capturing apparatus, and the mobile image capturing apparatus issues the instruction relating to the steering to the mobile object. (see at least Fig. 1-2B [0141-0147, 0172-0184]: one or more of the drones may include a drone remote control unit 206 comprising a communication interface for communicating with the remote control system, a processor(s) 224 for executing the process 100 and/or part thereof, and a storage 226 for storing data and/or program)
Regarding Claim 7, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches wherein the memory device and the at least one processor are incorporated into the mobile object, and the mobile object issues the instruction relating to the steering to the mobile image capturing apparatus. (see at least Fig. 1-2B [0141-0147, 0172-0184]: one or more of the drones may include a drone remote control unit 206 comprising a communication interface for communicating with the remote control system, a processor(s) 224 for executing the process 100 and/or part thereof, and a storage 226 for storing data and/or program)
Regarding Claim 8, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches wherein the instruction relating to the steering of the mobile object is an instruction to change a speed of the mobile object. (see at least Fig. 2A-3C Abstract [0047, 0077, 0128-0237]: Moreover, the drone remote control engine 220 may further operate the first drone 202 A and/or the second drone 202 B to track the second drone 202 B around a center of the FOV of the imaging sensor(s) 214 of the first drone 202 A capturing the first image stream thus capturing the second drone 202 B substantially in the center of the images of the first image stream. This may be done to ensure that the surrounding of the second drone 202 B are effectively seen in the first image stream in sufficient distances in all directions of the second drone 202 B. This may be essential, since in case the second drone 202 B is tracked at the edges of the first image stream (i.e., not centered), at least some areas in close proximity to the second drone 202 B may not be effectively monitored for potential hazards, objects and/or obstacles. The drone remote control engine may dynamically adjust the selected flight information (e.g. the route, the position, the altitude, the speed, flight parameters and the like) to adjust the position of the first drone and/or position of the second drone with respect to each other.)
Regarding Claim 9, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches wherein the instruction relating to the steering of the mobile object is an instruction to change a route of the mobile object. (see at least Fig. 2A-3C Abstract [0047, 0077, 0128-0237]: Moreover, the drone remote control engine 220 may further operate the first drone 202 A and/or the second drone 202 B to track the second drone 202 B around a center of the FOV of the imaging sensor(s) 214 of the first drone 202 A capturing the first image stream thus capturing the second drone 202 B substantially in the center of the images of the first image stream. This may be done to ensure that the surrounding of the second drone 202 B are effectively seen in the first image stream in sufficient distances in all directions of the second drone 202 B. This may be essential, since in case the second drone 202 B is tracked at the edges of the first image stream (i.e., not centered), at least some areas in close proximity to the second drone 202 B may not be effectively monitored for potential hazards, objects and/or obstacles. The drone remote control engine may dynamically adjust the selected flight information (e.g. the route, the position, the altitude, the speed, flight parameters and the like) to adjust the position of the first drone and/or position of the second drone with respect to each other.)
Regarding Claim 10, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches wherein the instruction related to the steering of the mobile image capturing apparatus is an instruction to change a route of the mobile image capturing apparatus. (see at least Fig. 2A-3C Abstract [0047, 0077, 0128-0237]: Moreover, the drone remote control engine 220 may further operate the first drone 202 A and/or the second drone 202 B to track the second drone 202 B around a center of the FOV of the imaging sensor(s) 214 of the first drone 202 A capturing the first image stream thus capturing the second drone 202 B substantially in the center of the images of the first image stream. This may be done to ensure that the surrounding of the second drone 202 B are effectively seen in the first image stream in sufficient distances in all directions of the second drone 202 B. This may be essential, since in case the second drone 202 B is tracked at the edges of the first image stream (i.e., not centered), at least some areas in close proximity to the second drone 202 B may not be effectively monitored for potential hazards, objects and/or obstacles. The drone remote control engine may dynamically adjust the selected flight information (e.g. the route, the position, the altitude, the speed, flight parameters and the like) to adjust the position of the first drone and/or position of the second drone with respect to each other.)
Regarding Claim 12, Weinheber teaches The information processing system according to claim 1, wherein the at least one processor executes the set of instructions to:
Weinheber further teaches determine whether image capturing according to an image capturing scenario in which an image capturing condition and a location-and-orientation condition of the mobile image capturing apparatus to obtain a desired image capturing cut are described can be performed while the mobile image capturing apparatus performs the image capturing according to the image capturing scenario; and issue the instruction relating to the steering in a case where it is determined that the image capturing according to the image capturing scenario cannot be performed. (see at least [0095-0097, 0127-0156]: Companion drones may be operated automatically to maintain VLOS with each other including dynamically adjusting one or more of the flight parameters of one or more of the companion drones, for example, position, speed, altitude and/or the like to overcome visibility limitations imposed by, for example, blocking object(s), poor visibility (e.g. low illumination, bad weather, etc.) and/or the like. Moreover, companion drones may be operated automatically to track each other around a center of field of view (FOV) of the imaging sensor(s) of the drone(s) to ensure that the tracked companion drones and their surrounding environment are effectively seen in the image streams. One or more of the drones may include one or more gimbal mounted imaging sensors which may be dynamically positioned and adjusted to face their companion drone such that the companion drone is in the FOV of the imaging sensors. One or more of the drones may include wide FOV imaging sensors configured to monitor and capture imagery data (image stream) of a wide portion of the environment of the respective drone including their companion drone. One or more of the drones may be operated and/or instructed to fly in a position to bring and/or put their companion drone in the FOV of their imaging sensor.)
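By way of illustration only, a minimal sketch of the scenario-feasibility determination mapped above, with hypothetical scenario fields (a standoff distance and a bearing) standing in for the claimed image capturing condition and location-and-orientation condition:
```python
# Illustrative sketch only (not part of the record): determine whether image
# capturing according to a scenario can be performed, and issue a steering
# instruction only when it cannot.
from dataclasses import dataclass

@dataclass
class CapturingScenario:
    required_distance_m: float    # image capturing condition: standoff distance
    required_bearing_deg: float   # location-and-orientation condition

@dataclass
class ApparatusState:
    distance_to_object_m: float
    bearing_to_object_deg: float

def scenario_feasible(scenario: CapturingScenario, state: ApparatusState,
                      dist_tol: float = 2.0, bearing_tol: float = 5.0) -> bool:
    """Check the scenario's conditions against the apparatus's current state."""
    return (abs(state.distance_to_object_m - scenario.required_distance_m) <= dist_tol
            and abs(state.bearing_to_object_deg - scenario.required_bearing_deg) <= bearing_tol)

def steering_if_infeasible(scenario: CapturingScenario, state: ApparatusState):
    """Issue a corrective steering instruction only when the scenario cannot be performed."""
    if scenario_feasible(scenario, state):
        return None
    return {
        "delta_distance_m": scenario.required_distance_m - state.distance_to_object_m,
        "delta_bearing_deg": scenario.required_bearing_deg - state.bearing_to_object_deg,
    }
```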
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 3 is rejected under 35 U.S.C. 103 as being unpatentable over Weinheber in view of Chan et al. (US 2017/0301109 A1 hereinafter Chan).
Regarding Claim 3, Weinheber teaches The information processing system according to claim 1,
It may be alleged that Weinheber does not explicitly teach wherein the at least one processor executes the set of instructions to issue the instruction relating to the steering of at least one of the mobile object and the mobile image capturing apparatus in accordance with the detection result so that the mobile object is framed out from the image capturing field angle of the mobile image capturing apparatus.
Chan is directed to a system and method for dynamically planning and operating an autonomous system using image observations. Chan teaches wherein the at least one processor executes the set of instructions to issue the instruction relating to the steering of at least one of the mobile object and the mobile image capturing apparatus in accordance with the detection result so that the mobile object is framed out from the image capturing field angle of the mobile image capturing apparatus. (see at least Fig. 4-8C [0046-0071]: The object model can be initialized using an initial image for which a detected object is confirmed, and the vision processing module can continuously update the object model while the object is still detected within newly acquired images, halt updating the object model upon a determination that the object of interest has left the field of view of the imaging devices as reflected in the object's disappearance from newly acquired images, and resume updating upon a determination that the object has re-entered the field of view of the imaging device. The object model ceases updating during track loss when the track confidence level is low and resumes updating once a track confidence level is reached after the object of interest is reacquired.)
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Weinheber’s image-based autonomous control system to incorporate a detection-based steering mechanism that steers the mobile image capturing apparatus such that the tracked object is framed out of the image capturing field angle when tracking confidence is low, as taught by Chan, with a reasonable expectation of success, because replanning can work to reduce the standoff distance from the object of interest or to improve viewing-angle diversity, thereby improving confidence and enhancing the identification quality of the observation (Chan, [0047]).
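For illustration of the proposed combination only: a minimal sketch in which low tracking confidence halts object-model updates (per Chan) and triggers a frame-out/replan action; the thresholds and interface are hypothetical.
```python
# Illustrative sketch only: suspend model updates during track loss, steer so
# the object is framed out for replanning, and resume once reacquired.
def update_tracking(object_model: dict, detection: dict,
                    low_conf: float = 0.3, reacquire_conf: float = 0.7) -> str:
    """Return the action taken for one tracking step."""
    confidence = detection.get("confidence", 0.0)
    if object_model.get("suspended"):
        if confidence >= reacquire_conf:
            object_model["suspended"] = False  # object reacquired: resume updates
            return "resume_updates"
        return "hold"                           # keep the last model unchanged
    if confidence < low_conf:
        object_model["suspended"] = True        # halt updates during track loss
        return "frame_out_and_replan"           # steer object out of field angle; replan view
    object_model["last_detection"] = detection  # normal case: keep updating the model
    return "update_model"
```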
Claim(s) 11 is rejected under 35 U.S.C. 103 as being unpatentable over Weinheber in view of Allard et al. (US 2019/0055015 A1 hereinafter Allard).
Regarding Claim 11, Weinheber teaches The information processing system according to claim 1, Weinheber further teaches an operation control instruction for a movable mechanism. (see at least Fig. 2A-3C Abstract [0047, 0077, 0128-0237]: The drone remote control engine may dynamically adjust the selected flight information (e.g. the route, the position, the altitude, the speed, flight parameters and the like) to adjust the position of the first drone and/or position of the second drone with respect to each other.)
It may be alleged that Weinheber does not explicitly teach wherein the instruction related to the steering of the mobile object includes a light control instruction for a lamp of the mobile object and an operation control instruction for a movable mechanism.
Allard is directed to a system and method for intelligent inspection and interaction between a vehicle and a drone. Allard teaches wherein the instruction related to the steering of the mobile object includes a light control instruction for a lamp of the mobile object and an operation control instruction for a movable mechanism. (see at least Fig. 6 [0108-0118]: The drone control platform coordinates activation of at least one of the one or more sensors of the vehicle. The drone can be programmed to signal the vehicle to activate and test a selected sensor of the vehicle when the drone is positioned at the test location. The interaction function may include a vehicle maintenance function, in which the drone control platform determines, via the drone device, that the vehicle has a need for a maintenance supply item (e.g., window washer fluid, oil, light bulb, etc.), and the determination of a need for a maintenance item can be based on a visual survey of the vehicle. That is, the UAV is used for interactive autonomous diagnosis and inspection of an autonomous vehicle, where the UAV can initiate testing of vehicle sensors: testing a turn signal lamp necessarily requires issuing a control instruction to activate the lamp such that the illumination output can be observed and verified, and similarly, opening a vehicle door necessarily requires issuing an operation control instruction to a movable mechanism associated with the door.)
Accordingly, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified Weinheber’s image-based autonomous control system to incorporate the diagnostic and actuation capabilities taught by Allard, including lamp activation and movable mechanism control, with a reasonable expectation of success, to enable automated verification of vehicle components and improve the robustness and operational functionality of autonomous systems.
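By way of illustration of the combined teaching only, a minimal sketch of an instruction set including a light control instruction for a lamp and an operation control instruction for a movable mechanism; the API shown is hypothetical.
```python
# Illustrative sketch only: assemble the control instructions the drone signals
# to the vehicle under test, per the combination with Allard discussed above.
from enum import Enum

class LampState(Enum):
    OFF = 0
    ON = 1

def build_inspection_instructions(test_turn_signal: bool, open_door: bool) -> list[dict]:
    """Return a list of control instructions for the vehicle under inspection."""
    instructions = []
    if test_turn_signal:
        # Light control instruction: activate the lamp so its output can be observed.
        instructions.append({"type": "light_control", "lamp": "turn_signal", "state": LampState.ON})
    if open_door:
        # Operation control instruction for a movable mechanism (the door actuator).
        instructions.append({"type": "mechanism_control", "mechanism": "door", "command": "open"})
    return instructions
```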
Claim(s) 13-15 are rejected under 35 U.S.C. 103 as being unpatentable over Weinheber.
Regarding Claim 13, Weinheber teaches The information processing system according to claim 1,
Weinheber further teaches wherein, in a case where image capturing according to an image capturing scenario in which an image capturing condition and a location-and-orientation condition of the mobile image capturing apparatus to obtain a desired image capturing cut are described cannot be performed while the mobile image capturing apparatus performs the image capturing according to the image capturing scenario even after issuing the instruction relating to the steering of at least one of the mobile object and the mobile image capturing apparatus because steering control of at least one of the mobile object and the mobile image capturing apparatus is restricted, (see at least [0095-0097, 0127-0156]: Companion drones may be operated automatically to maintain VLOS with each other including dynamically adjusting one or more of the flight parameters of one or more of the companion drones, for example, position, speed, altitude and/or the like to overcome visibility limitations imposed by, for example, blocking object(s), poor visibility (e.g. low illumination, bad weather, etc.) and/or the like. Moreover, companion drones may be operated automatically to track each other around a center of field of view (FOV) of the imaging sensor(s) of the drone(s) to ensure that the tracked companion drones and their surrounding environment are effectively seen in the image streams. One or more of the drones may include one or more gimbal mounted imaging sensors which may be dynamically positioned and adjusted to face their companion drone such that the companion drone is in the FOV of the imaging sensors. One or more of the drones may include wide FOV imaging sensors configured to monitor and capture imagery data (image stream) of a wide portion of the environment of the respective drone including their companion drone. One or more of the drones may be operated and/or instructed to fly in a position to bring and/or put their companion drone in the FOV of their imaging sensor.)
the at least one processor executes the set of instructions (see at least Fig. 1-2C) to:
change the image capturing scenario, and redetermine an instruction relating to the steering of at least one of the mobile object and the mobile image capturing apparatus based on the image capturing scenario after change. (see at least [0095-0097, 0127-0156]: Companion drones may be operated automatically to maintain VLOS with each other including dynamically adjusting one or more of the flight parameters of one or more of the companion drones, for example, position, speed, altitude and/or the like to overcome visibility limitations imposed by, for example, blocking object(s), poor visibility (e.g. low illumination, bad weather, etc.) and/or the like. Moreover, companion drones may be operated automatically to track each other around a center of field of view (FOV) of the imaging sensor(s) of the drone(s) to ensure that the tracked companion drones and their surrounding environment are effectively seen in the image streams. One or more of the drones may include one or more gimbal mounted imaging sensors which may be dynamically positioned and adjusted to face their companion drone such that the companion drone is in the FOV of the imaging sensors. One or more of the drones may include wide FOV imaging sensors configured to monitor and capture imagery data (image stream) of a wide portion of the environment of the respective drone including their companion drone. One or more of the drones may be operated and/or instructed to fly in a position to bring and/or put their companion drone in the FOV of their imaging sensor.)
It may be alleged that Weinheber does not expressly recite determining that a desired image capturing scenario cannot be performed even after steering due to steering restrictions, and changing the image capturing scenario and redetermining steering instructions based on the changed scenario. Examiner notes that Weinheber discloses an autonomous imaging and control system in which companion drones are operated automatically based on analysis of image streams to maintain effective visual tracking. Weinheber teaches analyzing whether a tracked mobile object is effectively visible (centered in the FOV of the imaging sensor) in an image stream; issuing steering instructions to adjust vehicle position, speed, altitude, and/or imaging sensor orientation (via gimbal mount); addressing operational limitations such as blocking objects, environmental conditions, and visual-line-of-sight constraints; and employing alternative imaging and operational strategies, including gimbal-mounted sensors, wide-FOV sensors, and repositioning of the drones. Thus, the scope of Weinheber includes adaptive control and dynamic imaging strategy selection in response to constrained operating conditions. One having ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to modify the system of Weinheber to explicitly determine when steering alone cannot achieve the desired image capture and to change the image capturing scenario accordingly, because autonomous systems must operate safely and effectively under constraints such as obstacles, limited maneuverability, and visibility restrictions, and must adapt when steering adjustments are insufficient to achieve effective imaging.
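For illustration only, a minimal sketch of scenario change and steering redetermination under steering restrictions; the helper callables and the in-order selection strategy are assumptions, not Weinheber’s disclosure.
```python
# Illustrative sketch only: when the current scenario cannot be performed even
# after steering, try alternative scenarios and redetermine the instruction.
def replan_capture(scenarios: list, state, is_feasible, make_instruction):
    """Try each candidate scenario in order; return (scenario, instruction) or (None, None)."""
    for scenario in scenarios:
        if is_feasible(scenario, state):
            # Redetermine the steering instruction based on the scenario after change.
            return scenario, make_instruction(scenario, state)
    return None, None  # no scenario achievable under the current steering restrictions
```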
Regarding Claim 14, Weinheber teaches The information processing system according to claim 13,
Weinheber further teaches wherein the at least one processor executes the set of instructions to instruct the mobile image capturing apparatus to stop the image capturing in a case where the image capturing according to the image capturing scenario cannot be performed. (see at least [0095-0097, 0127-0156]: Companion drones may be operated automatically to maintain VLOS with each other including dynamically adjusting one or more of the flight parameters of one or more of the companion drones, for example, position, speed, altitude and/or the like to overcome visibility limitations imposed by, for example, blocking object(s), poor visibility (e.g. low illumination, bad weather, etc.) and/or the like. Moreover, companion drones may be operated automatically to track each other around a center of field of view (FOV) of the imaging sensor(s) of the drone(s) to ensure that the tracked companion drones and their surrounding environment are effectively seen in the image streams. One or more of the drones may include one or more gimbal mounted imaging sensors which may be dynamically positioned and adjusted to face their companion drone such that the companion drone is in the FOV of the imaging sensors. One or more of the drones may include wide FOV imaging sensors configured to monitor and capture imagery data (image stream) of a wide portion of the environment of the respective drone including their companion drone. One or more of the drones may be operated and/or instructed to fly in a position to bring and/or put their companion drone in the FOV of their imaging sensor.)
It may be alleged that Weinheber does not expressly recite instructing the mobile image capturing apparatus to stop image capturing when image capturing according to the image capturing scenario cannot be performed. Examiner notes that Weinheber discloses an autonomous information processing system in which companion drones are operated based on analysis of image streams to maintain effective visibility of a mobile object. The reference teaches determining when effective image capturing cannot be achieved due to visibility limitations, environmental conditions, or operational constraints, and modifying system operation accordingly. One having ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to modify the system of Weinheber to stop capturing when effective image capture cannot be performed, because continuing to capture unusable images wastes computational resources and provides no operational benefit.
Regarding Claim 15, Weinheber teaches The information processing system according to claim 14, wherein the at least one processor executes the set of instructions to:
Weinheber further teaches obtain current states of the mobile image capturing apparatus and the mobile object as current information after the change of the image capturing scenario; and instruct the mobile image capturing apparatus to restart the image capturing in a case where it is determined that the image capturing according to the image capturing scenario after the change can be performed based on the current information. (see at least [0095-0097, 0127-0156]: Companion drones may be operated automatically to maintain VLOS with each other including dynamically adjusting one or more of the flight parameters of one or more of the companion drones, for example, position, speed, altitude and/or the like to overcome visibility limitations imposed by, for example, blocking object(s), poor visibility (e.g. low illumination, bad weather, etc.) and/or the like. Moreover, companion drones may be operated automatically to track each other around a center of field of view (FOV) of the imaging sensor(s) of the drone(s) to ensure that the tracked companion drones and their surrounding environment are effectively seen in the image streams. One or more of the drones may include one or more gimbal mounted imaging sensors which may be dynamically positioned and adjusted to face their companion drone such that the companion drone is in the FOV of the imaging sensors. One or more of the drones may include wide FOV imaging sensors configured to monitor and capture imagery data (image stream) of a wide portion of the environment of the respective drone including their companion drone. One or more of the drones may be operated and/or instructed to fly in a position to bring and/or put their companion drone in the FOV of their imaging sensor.)
It may be alleged that Weinheber does not expressly recite obtaining current information after a change in the image capturing scenario and restarting image capturing when the image capturing scenario after the change can be performed. Examiner notes that Weinheber discloses an autonomous information processing system that continuously obtains current state information of autonomous drones and tracked mobile objects, including position, orientation, and relative location, and determines whether effective image capture is achievable (e.g., ensuring the companion drone is placed in the center of the FOV with clear visibility of its surroundings while avoiding environmental obstructions by adjusting the drone’s operating parameters).
One having ordinary skill in the art before the effective filing date of the claimed invention would have been motivated to modify the system of Weinheber to obtain updated state information and restart image capturing after modifying an image capturing strategy, because reassessing the system state and resuming sensing once capture becomes feasible would increase coverage and mission efficiency without operator intervention.
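For illustration only, a minimal sketch of obtaining current information after a scenario change and restarting capture upon feasibility; all interfaces (state getter, feasibility check, camera object) are hypothetical.
```python
# Illustrative sketch only: poll current states after the scenario change and
# restart the image capturing that was previously stopped, once feasible.
def maybe_restart_capture(get_current_states, is_feasible, scenario, camera) -> bool:
    """Poll current information once; restart capturing if the changed scenario is feasible."""
    apparatus_state, object_state = get_current_states()
    if is_feasible(scenario, apparatus_state, object_state):
        camera.start_capture()  # restart the image capturing stopped earlier
        return True
    return False                # remain stopped; the caller may poll again later
```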
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANA F ARTIMEZ whose telephone number is (571)272-3410. The examiner can normally be reached M-F: 9:00 am-3:30 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris S. Almatrahi can be reached at (313) 446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DANA F ARTIMEZ/Examiner, Art Unit 3667
/FARIS S ALMATRAHI/Supervisory Patent Examiner, Art Unit 3667