Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Foreign Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 05/08/25, 02/11/25, 01/27/25, and 11/13/24 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
3.) Claims 1-4 and 7-20 are rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as being anticipated by Sato et al. (US Pub. No. 2021/0218935 A1).
Regarding Claim 1, Sato et al. disclose an information processing apparatus controlling a mobile image capturing apparatus equipped with an image capturing apparatus (Drone system 100 includes a drone 30 equipped with a camera 52, Paragraph 0042; Abstract and Figure 3), the information processing apparatus comprising:
a memory device that stores a set of instructions (Memory storing instructions, Paragraphs 0059-0060; Figures 3-4); and
at least one processor that executes the set of instructions (General purpose computer 80 includes a CPU 81 that executes the programs, Paragraphs 0059-0060; Figures 3-4) to:
obtain steering instruction information about a mobile object with steering (Vehicle 10 with steering) (In the drone system according to the present disclosure, the traveling information of the vehicle may include speed information, acceleration information, braking information, and steering information. The communication device 16 sends, to the drone 30, the traveling information including the speed information, the acceleration information, the braking information, and the steering information input from the vehicle controller 11, together with the navigation information including the latitude and longitude data of the current position of the vehicle 10 and the traveling direction data input from the navigation device 21, via the communication line 60, Paragraphs 0014, 0044, 0050-0052; Figures 1-3);
control image capturing by the image capturing apparatus (The drone 30 includes a controller 31 for controlling flight and controlling the camera 52, and a communication device 45 connected to the vehicle controller 31 and communicating with the communication line 60, Paragraph 0051; Figures 1-3); and
control a location and an orientation of the mobile image capturing apparatus by driving the mobile image capturing apparatus based on the steering instruction information (The vehicle information acquisition unit 32 of the controller 31 acquires, from the communication device 45, the vehicle information including the traveling information that includes the speed information, the acceleration information, the braking information, and the steering information, and the navigation information that includes the latitude and longitude data of the current position and the traveling direction data of the vehicle 10 received by the communication device 45 from the vehicle 10 via the communication line 60, and outputs the information to the traveling position estimation unit 33. The traveling position estimation unit 33 estimates a future traveling position of the vehicle 10 based on the vehicle information input from the vehicle information acquisition unit 32 and outputs latitude and longitude information of the estimated traveling position to the flight path calculation unit 34. The storage unit 35 stores data for flight control and camera control, Paragraphs 0051-0054; Figures 1-4).
With regard to Claim 2, Sato et al. disclose the information processing apparatus according to claim 1, wherein the at least one processor executes the set of instructions to: obtain and hold an image capturing scenario list including a plurality of image capturing scenarios (scene information sets) in which conditions related to image capturing by the image capturing apparatus are described; and select one of the plurality of the image capturing scenarios in accordance with the steering instruction information (In the case of capturing such a video, a scene information set 36a of Number 001 in the scene information database 36 shown in FIG. 5 stores the scene 71 shown in FIG. 6, relative positions in the vehicle front-and-rear direction, in the vehicle right-and-left direction, and in the vehicle height direction constituting the relative position d1 shown in FIG. 11, and a duration of AA seconds of the scene 71. The scene information set 36a of Number 001 stores, as the relative positions in the directions constituting the relative position d1, data indicating that the relative position in the vehicle front-and-rear direction is located 100 m in front of the vehicle 10, the relative position in the vehicle right-and-left direction is located at the center of the vehicle 10, and that the relative position in the vehicle height direction is located at an altitude of 10 m from the road surface.
Similarly, a scene information set 36a of Number 002 stores the scene 72 shown in FIG. 7, relative positions in the vehicle front-and-rear direction, in the vehicle right-and-left direction, and in the vehicle height direction constituting the relative position d2 shown in FIG. 11, and a duration of BB seconds of the scene 72. The scene information set 36a of Number 002 stores, as the relative positions in the directions constituting the relative position d2, data indicating that the relative position in the vehicle front-and-rear direction is located 30 m in front of the vehicle 10, that the relative position in the vehicle right-and-left direction is located at the center of the vehicle 10, and that the relative position in the vehicle height direction is located at an altitude of 10 m from the road surface. Similarly, scene information sets 36a of Numbers 003 to 005 store the scenes 73 to 75 shown in FIGS. 8 to 10, relative positions in the vehicle front-and-rear direction, relative positions in the vehicle right-and-left direction, and relative positions in the vehicle height direction constituting the relative positions d3 to d5 shown in FIG. 11, and durations of CC to EE seconds of the scenes 73 to 75. The flight path calculation unit 34 of the controller 31 reads out the scene information sets 36a of Number 001 and Number 002 from the scene information database 36 stored in the storage unit 35, Paragraphs 0054, 0067-0070; Figures 5-11).
In regard to Claim 3, Sato et al. disclose the information processing apparatus according to claim 2, wherein an image capturing start condition of the image capturing apparatus and a location-and-orientation condition of the mobile image capturing apparatus (location and orientation of the vehicle) are described in each of the plurality of image capturing scenarios (The storage unit 35 stores data for flight control and camera control. The storage unit 35 also stores a scene information database 36. As shown in FIG. 5, the scene information database 36 contains a plurality of scene information sets 36a arranged in time series of a video to be captured, and in each of the scene information sets 36, a scene of the video, a relative position with respect to the vehicle 10 in capturing the scene, and a duration of the scene are associated with one another. The scene information database 36 will be described later with reference to FIGS. 5 to 11. It is also possible to adopt a scene information database 36 that stores a plurality of scene information sets 36a arranged in time series of a video to be captured, in each of which no data of a scene of the video is included, and a relative position with respect to the vehicle 10 in capturing the scene of the video and a duration of the scene are associated with each other. A specific example of the scene information database 36 will now be described. Here, a description will be given for the case where a video including five scenes 71 to 75 which are as shown in FIGS. 6 to 10 is captured. The video starts from the scene 71 where a road 70 and the vehicle 10 traveling on the road 70 are looked down on from the sky in front of the vehicle 10 as shown in FIG. 6, and then proceeds to the scene 72 in which the vehicle 10 approaches as shown in FIG. 7, Paragraphs 0054, 0063; Figures 5-11).
Regarding Claim 4, Sato et al. disclose the information processing apparatus according to claim 3, wherein the at least one processor executes the set of instructions to select an image capturing scenario of which the image capturing start condition matches the steering instruction information obtained from among the plurality of image capturing scenarios (Image capturing is performed such that the scenes 73 to 75 start to be captured successively at intervals that are their respective predetermined durations. Here, the relative position d4 is a position located 12 m in front of the vehicle 10, 1 m on the left side of the vehicle 10, and at an altitude of 1.5 m, and the relative position d5 is a position located 13 m in front of the center of the vehicle 10 and at an altitude of 1.5 m. As such, the drone 30 is caused to fly to pass through the relative positions d1 to d5 of the drone 30 with respect to the vehicle 10 at intervals that are the respective durations of the scenes 71 to 75, and the drone 30 can thus capture a video in which the scenes 71 to 75 shown in FIGS. 6 to 10 are successive, Paragraphs 0062-0070; Claims 1 and 7).
With regard to Claim 7, Sato et al. disclose the information processing apparatus according to claim 3, wherein the location-and-orientation condition includes a relative position between the mobile image capturing apparatus and the mobile object (In each of the scene information sets, a relative position with respect to the vehicle in capturing a scene of the video and a duration of the scene are associated with each other, a vehicle information acquisition unit that receives the vehicle information from the vehicle, a traveling position estimation unit that estimates a future traveling position of the vehicle based on the received vehicle information, and a flight path calculation unit that, based on the future traveling position of the vehicle estimated by the traveling position estimation unit and the scene information database, calculates, for each of the scenes, a flight path that passes through the relative position with respect to the vehicle, Paragraphs 0006-0008; Figure 5), and wherein the at least one processor executes the set of instructions to start recording a moving image by the image capturing apparatus in a case where the location and orientation of the mobile image capturing apparatus controlled matches the relative position described in the image capturing scenario selected (The drone operation center includes a server, the server includes a storage unit that stores a scene information database containing a plurality of scene information sets arranged in time series of a video to be captured, in each of the scene information sets, a relative position with respect to the vehicle in capturing a scene of the video and a duration of the scene are associated with each other, a vehicle information acquisition unit that receives the vehicle information from the vehicle, a traveling position estimation unit that estimates a future traveling position of the vehicle based on the received vehicle information, and a flight path calculation unit that, based 
on the current flight position of the drone received from the drone, the future traveling position of the vehicle estimated by the traveling position estimation unit, and the scene information database, calculates, for each of the scenes, a flight path that passes through the relative position with respect to the vehicle and sends the result to the drone, and the drone captures an image of the vehicle with the camera while flying autonomously according to the flight path received from the server, Paragraphs 0019-0020, 0054-0055; Figures 5-11).
In regard to Claim 8, Sato et al. disclose the information processing apparatus according to claim 7, wherein the location-and-orientation condition further includes a camera work of the image capturing apparatus, and wherein the at least one processor executes the set of instructions to control the location and orientation of the mobile image capturing apparatus in accordance with the camera work described in the image capturing scenario selected while the moving image is recorded by the image capturing apparatus (The storage unit 35 stores data for flight control and camera control. The storage unit 35 also stores a scene information database 36. As shown in FIG. 5, the scene information database 36 contains a plurality of scene information sets 36a arranged in time series of a video to be captured, and in each of the scene information sets 36, a scene of the video, a relative position with respect to the vehicle 10 in capturing the scene, and a duration of the scene are associated with one another. The scene information database 36 will be described later with reference to FIGS. 5 to 11. It is also possible to adopt a scene information database 36 that stores a plurality of scene information sets 36a arranged in time series of a video to be captured, in each of which no data of a scene of the video is included, and a relative position with respect to the vehicle 10 in capturing the scene of the video and a duration of the scene are associated with each other. The camera control unit 38 controls the direction and angle of view of the camera 52 based on the flight information including the speed data, the altitude data, and the azimuth data input from the flight control unit 37 and the scene information sets 36a of the scene information database 36 stored in the storage unit 35. The operation of the camera control unit 38 will be described later with reference to FIGS. 12 to 14, Paragraphs 0054-0059; Figures 5-11).
Regarding Claim 9, Sato et al. disclose the information processing apparatus according to claim 7, wherein each of the plurality of the image capturing scenarios further includes an image capturing end condition of the image capturing apparatus (duration of a scene), and wherein the at least one processor executes the set of instructions to finish recording the moving image by the image capturing apparatus in a case where the steering instruction information matches the image capturing end condition described in the image capturing scenario selected (The scene information database 36 contains the plurality of scene information sets 36a arranged in time series of a video to be captured, and in each of the scene information sets 36, a scene of the video to be captured, a relative position with respect to the vehicle 10 in capturing the scene, and a duration of the scene are associated with one another. Here, the relative position with respect to the vehicle 10 means a group of relative positions including a relative position of the drone 30 with respect to the vehicle 10 in the vehicle front-and-rear direction, a relative position of the drone 30 with respect to the vehicle 10 in the vehicle right-and-left direction, and a relative position of the drone 30 with respect to the vehicle 10 in the vehicle height direction. As shown in FIG. 5, each scene information set 36a is a set including a number, a scene, relative positions of the drone 30 with respect to the vehicle 10 in the vehicle front-and-rear direction, in the vehicle right-and-left direction, and in the vehicle height direction in capturing the scene, and a duration of the scene. The numbers are given in time series of scenes of the video to be captured. The time t2 is end time of the scene 71 or start time of the scene 72, and the times t1 and t2 are times at which traveling positions are estimated, Paragraphs 0062-0075; Figure 5).
In regard to Claim 10, Sato et al. disclose the information processing apparatus according to claim 7, wherein each of the plurality of the image capturing scenarios further includes a main object to be captured by the image capturing apparatus (traveling vehicle 10 from the scene 71), and wherein the at least one processor executes the set of instructions to control the location and orientation of the mobile image capturing apparatus so that the main object described in the image capturing scenario selected will be included in a field angle of the image capturing apparatus while the moving image is recorded by the image capturing apparatus (The camera control unit 38 captures a video of the traveling vehicle 10 from the scene 71 until before the scene 72 while adjusting the direction of the camera 52 toward the vehicle 10 based on flight information of the drone 30 including, for example, the flight direction and the flying speed input from the flight control unit 37 and the information about the relative positions d1 and d2 with respect to the vehicle 10 stored in the scene information database 36, Paragraphs 0076-0079; Figures 5 and 13).
With regard to Claim 11, Sato et al. disclose the information processing apparatus according to claim 3, wherein the at least one processor executes the set of instructions to: obtain location information about the mobile object (traveling vehicle); repeat to obtain the steering instruction information until one image capturing start condition of the plurality of the image capturing scenarios matches the steering instruction information; and perform tracking movement of the mobile object based on the location information (If the flight control unit 37 makes a determination of “NO” in step S208, the flight control unit 37, the camera control unit 38, and the navigation device 21 of the vehicle 10 repeat steps S202 to S207 in FIG. 13. They receive, from the flight path calculation unit 34, data of the flight path for capturing a video from the scene 73 of Number 003 until before the scene 74 of Number 004 and the flight path for capturing a video from the scene 74 of Number 004 until before the scene 75 of Number 005 in the scene information database 36, capture videos of the scenes, and display the results on the display 27 of the vehicle 10 in real time, Paragraphs 0082-0085; Figure 13).
In regard to Claim 12, Sato et al. disclose the information processing apparatus according to claim 1, wherein the image capturing apparatus captures an image of the mobile object (The drone can thus capture images of the traveling vehicle from various angles in a more reliable manner, Paragraph 0013. An image of the traveling vehicle 10 is captured from the side of the vehicle 10 to the scenes 74 and 75 captured while the drone 30 gradually moves around to the front of the vehicle 10, Paragraphs 0062-0063; Figures 6-10).
With regard to Claim 13, Sato et al. disclose the information processing apparatus according to claim 1, wherein the steering instruction information is related to a steering instruction for at least one of a location and a motion of the mobile object (traveling position of vehicle 10) (The vehicle information acquisition unit 32 of the controller 31 acquires, from the communication device 45, the vehicle information including the traveling information that includes the speed information, the acceleration information, the braking information, and the steering information, and the navigation information that includes the latitude and longitude data of the current position and the traveling direction data of the vehicle 10 received by the communication device 45 from the vehicle 10 via the communication line 60, and outputs the information to the traveling position estimation unit 33. The traveling position estimation unit 33 estimates a future traveling position of the vehicle 10 based on the vehicle information input from the vehicle information acquisition unit 32 and outputs latitude and longitude information of the estimated traveling position to the flight path calculation unit 34. The storage unit 35 stores data for flight control and camera control, Paragraphs 0051-0058, 0070, 0086; Figures 1-4).
Regarding Claim 14, Sato et al. disclose the information processing apparatus according to claim 13, wherein the steering instruction information is related to at least one of steering instructions about a speed or a direction of the mobile object, an operation of a movable mechanism included in the mobile object, and notification by sound or light by a notification mechanism included in the mobile object (The vehicle information acquisition unit 32 of the controller 31 acquires, from the communication device 45, the vehicle information including the traveling information that includes the speed information, the acceleration information, the braking information, and the steering information, and the navigation information that includes the latitude and longitude data of the current position and the traveling direction data of the vehicle 10 received by the communication device 45 from the vehicle 10 via the communication line 60, and outputs the information to the traveling position estimation unit 33. The traveling position estimation unit 33 estimates a future traveling position of the vehicle 10 based on the vehicle information input from the vehicle information acquisition unit 32 and outputs latitude and longitude information of the estimated traveling position to the flight path calculation unit 34. The storage unit 35 stores data for flight control and camera control. The drone system 100 can therefore quickly calculate future traveling positions of the vehicle 10, and even when the traveling speed of the vehicle 10 is fast, the drone 30 can be ensured to fly to pass through the relative positions d1 to d5 or the absolute positions ad1 to ad5 with respect to the vehicle 10 in capturing images of the scenes of the video, Paragraphs 0051-0058, 0070, 0086; Figures 1-4).
In regard to Claim 15, Sato et al. disclose the information processing apparatus according to claim 1 that is included in the mobile image capturing apparatus (See controller 31 that is implemented by a computer, Paragraphs 0059-0060; Figures 2-4).
With regard to Claim 16, Sato et al. disclose the information processing apparatus according to claim 1 that is included in the mobile object (See vehicle controller 11 and general purpose computer, Paragraphs 0060-0061; Figures 2-4).
Regarding Claim 17, Sato et al. disclose the information processing apparatus according to claim 1 that is included in a server that is communicable with the mobile image capturing apparatus and the mobile object (As shown in FIGS. 16 and 17, in the drone system 200, a server 191 of the drone operation center 190 has a storage unit 195 for storing the scene information database 36. A flight path calculation unit 194 of the server 191 calculates a flight path based on vehicle information acquired from the vehicle 10 and the scene information database 36 and transmits the result to the drone 130. The drone 130 captures a video of the traveling vehicle 10 while flying according to the flight path transmitted from the server 191, Paragraphs 0094, 0098-0101).
With regard to Claim 18, Sato et al. disclose an information processing system (Drone system 100 includes a drone 30 equipped with a camera 52, Paragraph 0042; Abstract and Figure 3), comprising:
a mobile object with steering (Vehicle 10 with steering) (In the drone system according to the present disclosure, the traveling information of the vehicle may include speed information, acceleration information, braking information, and steering information. The communication device 16 sends, to the drone 30, the traveling information including the speed information, the acceleration information, the braking information, and the steering information input from the vehicle controller 11, together with the navigation information including the latitude and longitude data of the current position of the vehicle 10 and the traveling direction data input from the navigation device 21, via the communication line 60, Paragraphs 0014, 0044, 0050-0052; Figures 1-3);
a mobile image capturing apparatus equipped with an image capturing apparatus that captures an image of the mobile object (Drone system 100 includes a drone 30 equipped with a camera 52, Paragraph 0042; Abstract and Figure 3); and
an information processing apparatus configured to control the mobile image capturing apparatus, the information processing apparatus (General purpose computer 80 includes a CPU 81 that executes the programs, Paragraphs 0059-0060; Figures 3-4) comprising:
a memory device that stores a set of instructions (Memory storing instructions, Paragraphs 0059-0060; Figures 3-4); and
at least one processor that executes the set of instructions (General purpose computer 80 includes a CPU 81 that executes the programs, Paragraphs 0059-0060; Figures 3-4) to:
obtain steering instruction information about the mobile object (In the drone system according to the present disclosure, the traveling information of the vehicle may include speed information, acceleration information, braking information, and steering information. The communication device 16 sends, to the drone 30, the traveling information including the speed information, the acceleration information, the braking information, and the steering information input from the vehicle controller 11, together with the navigation information including the latitude and longitude data of the current position of the vehicle 10 and the traveling direction data input from the navigation device 21, via the communication line 60, Paragraphs 0014, 0044, 0050-0052; Figures 1-3);
control image capturing by the image capturing apparatus (The drone 30 includes a controller 31 for controlling flight and controlling the camera 52, and a communication device 45 connected to the vehicle controller 31 and communicating with the communication line 60, Paragraph 0051; Figures 1-3); and
control a location and an orientation of the mobile image capturing apparatus by driving the mobile image capturing apparatus based on the steering instruction information (The vehicle information acquisition unit 32 of the controller 31 acquires, from the communication device 45, the vehicle information including the traveling information that includes the speed information, the acceleration information, the braking information, and the steering information, and the navigation information that includes the latitude and longitude data of the current position and the traveling direction data of the vehicle 10 received by the communication device 45 from the vehicle 10 via the communication line 60, and outputs the information to the traveling position estimation unit 33. The traveling position estimation unit 33 estimates a future traveling position of the vehicle 10 based on the vehicle information input from the vehicle information acquisition unit 32 and outputs latitude and longitude information of the estimated traveling position to the flight path calculation unit 34. The storage unit 35 stores data for flight control and camera control, Paragraphs 0051-0054; Figures 1-4).
Method Claim 19 and computer program storage Claim 20 correspond to apparatus Claim 1 and are rejected as discussed in the above rejection of apparatus Claim 1 (Also see Paragraph 0060).
4.) Allowable Subject Matter
Claims 5-6 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PRITHAM DAVID PRABHAKHER whose telephone number is (571)270-1128. The examiner can normally be reached Monday to Friday 8:00 am to 5:00 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lin Ye, can be reached at 571-272-7372. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
Pritham David Prabhakher
Patent Examiner
Pritham.Prabhakher@uspto.gov
/PRITHAM D PRABHAKHER/Primary Examiner, Art Unit 2638