DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant's arguments filed December 8, 2025 have been fully considered but they are not persuasive.
On pages 8-9, applicants argue, with respect to claim 1, that the cited references, and any combination thereof, fail to teach or suggest the claimed "determining a range for providing notification of information regarding a specific object on the basis of a moving direction of the specific object detected on the basis of an image, a position where the image has been captured, and an elapsed time since the image has been captured." Specifically, applicants contend that while HIEIDA uses the time-series information to estimate the movement range of objects, HIEIDA does not use it to determine a range for providing notification of information regarding the objects, and that HIEIDA therefore fails to teach or suggest "determining a range for providing notification of information regarding a specific object on the basis of an elapsed time since the image has been captured."
In response, the examiner respectfully disagrees. As applicants recognize, HIEIDA discloses estimating a moving range of each of the objects on the basis of at least either the information associated with the contact region of the corresponding object or the time-series information associated with the contact region of the corresponding object (HIEIDA, [0122]). HIEIDA further discloses in page 10, [0138]: "In a case where the danger level determination unit 209 determines that there is a danger of a collision between the pedestrian A and the own vehicle, the drive assist control unit 210 plans an action of the own vehicle for avoiding a collision between the own vehicle and the pedestrian A, and supplies data indicating the planned action of the own vehicle to the acceleration/deceleration control unit 172 and the direction control unit 173 to achieve damage reduction braking. Moreover, the drive assist control unit 210 may also be configured to give warning, such as output of audio data containing a warning sound, a warning message, or the like from the output unit 106, as well as damage reduction braking." From the above passages, it is clear that the claimed "determining a range for providing notification of information regarding a specific object on the basis of a moving direction of the specific object detected on the basis of an image, a position where the image has been captured, and an elapsed time since the image has been captured" is met by the information processing system 200, which has a function of estimating a moving range of an object such as a pedestrian and a bicycle, as stated in the last Office Action.
As discussed above, HIEIDA does disclose all the limitations of claim 1.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-8, 11 and 15-16 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by HIEIDA et al. (US 2022/0169245 A1).
In considering claim 1, HIEIDA et al. disclose all the claimed subject matter. Note: 1) the claimed at least one memory storing instructions, and at least one processor configured to execute the instructions is met by the storage unit 111 and the drive control unit 107 (Fig. 1, page 5, paragraph #0079 to paragraph #0084); 2) the claimed determine a range for providing notification of information regarding a specific object on the basis of a moving direction of the specific object detected on the basis of an image, a position where the image has been captured, and an elapsed time since the image has been captured is met by the information processing system 200, which has a function of estimating a moving range of an object such as a pedestrian and a bicycle (i.e., a range of possible running out) on the basis of image information indicating surroundings of the own vehicle and captured by an in-vehicle camera, for example, such that drive assistance such as warning to the driver, brake assist operation, or control of automatic operation is achievable on the basis of an estimation result obtained by the information processing system 200 (Fig. 2, page 8, paragraph #0113 to page 9, paragraph #0124); and 3) the claimed transmit the information regarding the specific object to a device corresponding to the range is met by transmitting the information to the danger level determination unit 209, which determines a danger level of a collision with the own vehicle for each of the objects (Fig. 2, page 8, paragraph #0122 to page 9, paragraph #0126).
In considering claim 2, the claimed wherein the at least one processor is configured to transmit information regarding the specific object to a device corresponding to a determined range when the elapsed time is a first elapsed time, and then transmit information regarding the specific object to a device corresponding to a determined range when the elapsed time is a second elapsed time after the first elapsed time is met by the time-series information associated with the contact region of each of the objects, which is represented as category information associated with the contact region of the corresponding object for each predetermined interval (time interval or distance interval) (first and second elapsed time) (Fig. 2, page 8, paragraph #0122 to page 9, paragraph #0126).
In considering claim 3, the claimed wherein the at least one processor is further configured to determine a range for providing notification of information regarding the specific object on the basis of transportation means of the specific object is met by the exterior information detection unit 141, which performs a detection process for detecting information associated with the outside of the own vehicle, where examples of the object as a detection target include a vehicle, a human, an obstacle, a structure, a road, a traffic light, a traffic sign, and a road sign (Fig. 1, page 5, paragraph #0085 to page 6, paragraph #0089).
In considering claim 4, the claimed wherein the at least one processor is further configured to determine a range for providing notification of information regarding the specific object on the basis of a moving speed of the specific object is met by the time-series information associated with the contact region of each of the objects, which contains speed information associated with the object, such that the object moving range estimation unit 207 may estimate the moving range in consideration of the speed information associated with the object (Fig. 2, page 8, paragraph #0121 to page 9, paragraph #0126).
In considering claim 5, the claimed wherein the at least one processor is further configured to determine a range for providing notification of information regarding the specific object on the basis of a degree of congestion of a road on which the specific object moves is met by the route planning unit 161, which changes the route as necessary on the basis of a traffic jam, an accident, traffic restriction, a situation of construction or the like, a physical condition of the driver, and the like (Fig. 1, page 6, paragraph #0097 to page 7, paragraph #0101).
In considering claim 6, the claimed wherein the at least one processor is further configured to determine a range for providing notification of information regarding the specific object on the basis of information indicating a signal switching time of a traffic light of a road on which the specific object moves is met by the traffic rule recognition unit 152, which performs a recognition process for recognizing traffic rules around the own vehicle, including a behavior of a dynamic object around the own vehicle, a change of a traffic light state, and a change of an environment such as weather (Fig. 1, page 6, paragraph #0092 to paragraph #0097).
In considering claim 7, the claimed wherein the at least one processor is configured to transmit information regarding the specific object to a wireless communication terminal located in a base station installed in the determined range is met by the communication unit 103, which communicates with an apparatus (e.g., an application server or a control server) (Fig. 1, page 4, paragraph #0073 to paragraph #0075).
In considering claim 8, the claimed wherein the at least one processor is configured to, in a case where the specific object is a person, transmit a processed image in which at least a part of a face area of the person in the image is processed is met by the data acquisition unit 102, which includes an imaging device for imaging a driver, a biosensor for detecting biological information associated with the driver, and a microphone for collecting sounds in the vehicle interior (Fig. 1, page 4, paragraph #0072 and page 5, paragraph #0085 to paragraph #0087).
In considering claim 11, the claimed wherein the at least one processor is configured to transmit information for increasing a display period of a signal for prohibiting progress in the moving direction to a traffic light corresponding to the moving direction is met by the conduct planning unit 162, which plans a start, a stop, a traveling direction (e.g., forward movement, backward movement, left turn, right turn, and direction change), a traveling lane, a traveling speed, passing, and the like (Fig. 1, page 5, paragraph #0078 and page 7, paragraph #0100).
Claims 15 and 16 are rejected for the same reasons as discussed in claim 1 above.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over HIEIDA et al. (US 2022/0169245 A1) in view of Takamatsu et al. (US Patent No. 10,891,495 B2).
In considering claim 9, HIEIDA et al. disclose all the limitations of the instant invention as discussed in claim 1 above, except for the claimed wherein the at least one processor is configured to, in a case where the specific object is a vehicle, transmit a processed image in which at least a part of a license plate area of the vehicle in the image is processed. Takamatsu et al. teach that the identification information may be an identification ID sent from the vehicle or the sensor apparatus, an automobile registration number obtained by performing image recognition on the license plate, or the like (Figs. 4-5, col. 8, line 65 to col. 9, line 18). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the license plate recognition taught by Takamatsu et al. into HIEIDA et al.'s system in order to accurately track the vehicle.
Allowable Subject Matter
7. Claims 10 and 12-14 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
8. THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRANG U TRAN whose telephone number is (571)272-7358. The examiner can normally be reached M-F 10:00AM-6:00PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JOHN W. MILLER can be reached at 571-272-7353. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
February 24, 2026
/TRANG U TRAN/Primary Examiner, Art Unit 2422