DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Remarks
This communication is in response to the Applicant’s Amendment filed on 01/22/2026. Claims 1-20 were pending. Claims 1, 3, and 19-20 have been amended. Claims 2, 7-8, 10-11 and 13-14 are cancelled. Claims 1, 3-6, 9, 12, and 15-20 are currently pending.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1, 3-6, 9, 12, and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over KUREHASHI et al. (US 20220406179 A1, hereinafter “KUREHASHI”) in view of LEE et al. (US 20180295474 A1, hereinafter “LEE”).
Regarding claim 1. (Currently Amended) KUREHASHI discloses an information processing device provided in a moving body, the information processing device comprising:
an acquiring unit which acquires information detected by an image-capturing device provided in a moving body (0015-0018, 0023 and 0049; Figures 1-3 and 5; wherein the control apparatus 64 of vehicle 60 determines the risk area using the sensor 69, which includes a camera);
a predicting unit which predicts at least one of a movement or a moving direction of a moving object which exists in a vicinity of the moving body based on the information acquired by the acquiring unit (0016-0019; Figures 1-2);
a warning control unit which performs control to output a warning based on at least one of the movement or the moving direction of the moving object predicted by the predicting unit (0023; Figure 2; “[0021] FIG. 2 illustrates a state when the vehicle 20 has reached a position close to the area 110. When the vehicle 20 reaches a position in front of the position of the area 110 by a predetermined distance, the control apparatus 24 transmits risk area information including the stored position information of the area 110 by broadcasting using a wireless signal. When the risk area information transmitted from the control apparatus 24 is received, in a case where the current position of the terminal 82 is in the risk area, the terminal 82 outputs warning information through the human machine interface (HMI) function of the terminal 82. In addition, as response information to the risk area information, the terminal 82 transmits response information including the current position of the terminal 82 using a wireless signal. …”); and
a transmission control unit which performs control to transmit information indicating at least one of the movement or the moving direction of the moving object predicted by the predicting unit to another moving body around the moving body (0018-0023; Figure 2; “[0023] In this manner, the control apparatus 24 can receive and store the position information of the risk area from the other vehicle 60 or the server 52 in advance, and can transmit the risk area information when the vehicle 20 reaches a position close to the risk area. With this configuration, even when the control apparatus 24 does not have a function of recognizing the risk area by sensing means such as a camera, the control apparatus 24 can output the warning to the terminal 82 or perform the travelling assistance of the vehicle 20 by using the risk area received from the other vehicle 60 or the server 52 through the wireless communication function. ...”), wherein
the acquiring unit acquires an image captured by the image-capturing device (0015, 0023 and 0049; Figures 1-2 and 5; “[0015] The vehicle 20 and the vehicle 60 are one example of a movable object. The vehicle 20 includes a control apparatus 24. The control apparatus 24 includes a communication function. The vehicle 60 includes a sensor 69 and a control apparatus 64. The sensor 69 is configured by including a camera. The control apparatus 64 includes a processing function of information obtained by the sensor 69, and a communication function.”).
KUREHASHI does not disclose a predicting unit which predicts at least one of a movement or a moving direction of a moving object which is external to the moving body and exists in a vicinity of the moving body based on the information acquired by the acquiring unit;
a warning control unit which performs control to output a warning based on at least one of the movement or the moving direction of the moving object predicted by the predicting unit; and
a transmission control unit which performs control to transmit information indicating at least one of the movement or the moving direction of the moving object predicted by the predicting unit to another moving body around the moving body, the another moving body being different from the moving body and the moving object, wherein
the predicting unit determines an attitude of the moving object based on the image acquired by the acquiring unit and predicts at least one of the movement or the moving direction of the moving object based on the attitude of the moving object that is determined.
LEE, however, in the same field of endeavor, shows an information processing device provided in a moving body, the information processing device comprising:
a predicting unit which predicts at least one of a movement or a moving direction of a moving object which is external to the moving body and exists in a vicinity of the moving body based on the information acquired by the acquiring unit (0017-0018, 0023, 0071; Figure 4; “[0017] The transmission frequency may be calculated based on a velocity of the VUE and a velocity of another VUE or a pedestrian UE (PUE).”, “[0071] The V2X communication described in the embodiments of the present application includes four types of V2X applications, such as (i) Vehicle to Vehicle (V2V), (ii) Vehicle to Infrastructure (V2I), (iii) Vehicle to Network (V2N), and (iv) Vehicle to Pedestrian (V2P).”);
a warning control unit which performs control to output a warning based on at least one of the movement or the moving direction of the moving object predicted by the predicting unit (0090; Figures 5a-c; “[0090] Referring to FIG. 5(a), in the V2P use case situations, there are two different ways of warning notifications (1) warning to pedestrian and (2) warning to vehicle. The warning to pedestrian can help pedestrians (or vulnerable road users) avoid potential risks associated with vehicles moving towards or around them, such as collision, and the warning to vehicle can help drivers avoid potential risk associated with pedestrians (or vulnerable road users) walking or moving around their vehicle, such as collision.”); and
a transmission control unit which performs control to transmit information indicating at least one of the movement or the moving direction of the moving object predicted by the predicting unit to another moving body around the moving body, the another moving body being different from the moving body and the moving object (0090-0094; Figures 5a-c; “[0093] If the vehicle in FIG. 5(b) and that in FIG. 5 (c) have the same transmission power and transmission frequency for a V2X warning message, the user in FIG. 5 (c) (vehicle velocity is 25 miles/hr) may be exposed to higher level of risk of collision because her/his ample time to prepare is clearly shorter than that the user in FIG. 5 (b) has (vehicle velocity is 25 miles/hr). The user in FIG. 5 (b) and that in FIG. 5 (c) may have the similar ample time to prepare if the transmission range of vehicle in FIG. 5 (b) is twice as big as that in FIG. 5 (c).”), wherein
the predicting unit determines an attitude of the moving object based on the image acquired by the acquiring unit and predicts at least one of the movement or the moving direction of the moving object based on the attitude of the moving object that is determined (0090-0093; Figures 5a-c; “[0090] Referring to FIG. 5(a), in the V2P use case situations, there are two different ways of warning notifications (1) warning to pedestrian and (2) warning to vehicle. The warning to pedestrian can help pedestrians (or vulnerable road users) avoid potential risks associated with vehicles moving towards or around them, such as collision, and the warning to vehicle can help drivers avoid potential risk associated with pedestrians (or vulnerable road users) walking or moving around their vehicle, such as collision.”).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to incorporate the risk calculation for a potential accident or collision between the vehicle and an external moving body (another vehicle or a pedestrian), as taught by LEE, into the risk analysis for moving objects of KUREHASHI in order to accurately and effectively predict possible accidents or collisions and yield predictable results.
Regarding claim 3. (Currently Amended) KUREHASHI discloses the information processing device according to claim 1, wherein
the acquiring unit acquires the image captured by the image-capturing device (0015, 0023 and 0049; Figures 1-2 and 5; “[0015] The vehicle 20 and the vehicle 60 are one example of a movable object. The vehicle 20 includes a control apparatus 24. The control apparatus 24 includes a communication function. The vehicle 60 includes a sensor 69 and a control apparatus 64. The sensor 69 is configured by including a camera. The control apparatus 64 includes a processing function of information obtained by the sensor 69, and a communication function.”),
the predicting unit determines an attitude of the moving object based on the image acquired by the acquiring unit and predicts as the movement of the moving object whether the moving object will move toward a path of the moving body based on the attitude of the moving object that is determined (0034 and 0039-0040; Figures 2-3; Claims 3-4, 9 and 11-13; “[0034] When the determination unit 230 determines that the vehicle 20 is in the vicinity of the risk area, the transmission control unit 250 performs the control to transmit information related to the presence of the vehicle 20. ...”).
Regarding claim 4. (Original) KUREHASHI discloses the information processing device according to claim 1, wherein
the predicting unit predicts a movement velocity of the moving object based on the information acquired by the acquiring unit (0026, 0033-0034 and 0039-0040; Figure 3; “[0026] The sensor 29 includes a GNSS reception unit 25 and a vehicle speed sensor 26. The GNSS reception unit 25 is configured to receive radio waves emitted from a global navigation satellite system (GNSS) satellite. The GNSS reception unit 25 generates information indicating a current position of the vehicle 20 based on a signal received from the GNSS satellite. ...”).
Regarding claim 5. (Original) KUREHASHI discloses the information processing device according to claim 4,
wherein the warning control unit performs control to output the warning to an occupant of the moving body when it is determined, based on the movement velocity of the moving object predicted by the predicting unit, that the moving object will get into a path of the moving body before the moving body reaches a location of the moving object in a direction of travel of the moving body (0037, 0052 and 0063-0066; Figures 2 and 4-6; “[0037] The control unit 208 may execute driver assistance of the vehicle 20 or an alert for an occupant of the vehicle 20 based on the response information. … when the information output apparatus 40 includes a head-up display, the control unit 208 may cause the head-up display of the vehicle 20 to output light for forming a mark as warning information indicating that a pedestrian is present in the risk area. ...”).
Regarding claim 6. (Original) KUREHASHI discloses the information processing device according to claim 5,
wherein the transmission control unit performs control to transmit to surroundings of the moving body information indicating at least one of the movement or the moving direction of the moving object predicted by the predicting unit, even when the warning control unit determines that the moving object will not get into the path of the moving body before the moving body reaches the location of the moving object in the direction of travel of the moving body (0055 and 0063-0066; Figures 5-6; “[0055] … the terminal 82 determines whether the current position of the terminal 82 is in the risk area represented by the coordinates included in the risk area information (S426). When the current position of the terminal 82 is in the risk area, in S428, the terminal 82 transmits response information indicating the presence of the pedestrian in the risk area to the vehicle 20. …”).
Regarding claim 9. (Original) Claim 9 has similar limitations as to claim 4 treated in the rejection above, and is met by the references as discussed above, and has been rejected for the same reason of obviousness as used in the rejection to claim 4 above.
Regarding claim 12. (Original) Claim 12 has similar limitations as to claim 5 treated in the rejection above, and is met by the references as discussed above, and has been rejected for the same reason of obviousness as used in the rejection to claim 5 above.
Regarding claim 15. (Original) Claim 15 has similar limitations as to claim 6 treated in the rejection above, and is met by the references as discussed above, and has been rejected for the same reason of obviousness as used in the rejection to claim 6 above.
Regarding claim 16. (Original) KUREHASHI discloses the information processing device according to claim 1, wherein the moving body is a vehicle (0015; Figure 1; ‘vehicle 20’ or ‘vehicle 60’).
Regarding claims 17-18. (Original) KUREHASHI discloses a moving body comprising the information processing device according to claim 1 or claim 16 (0017-0028; Figures 1-4; ‘vehicle 20’ or ‘vehicle 60’).
Regarding claim 19. (Currently Amended) Claim 19 is drawn to an information processing method of using the corresponding information processing device claimed in claim 1. Therefore, method claim 19 corresponds to device claim 1 and is rejected for the same reasons of obviousness as set forth above.
Regarding claim 20. (Currently Amended) Claim 20 is drawn to a non-transitory computer-readable storage medium storing a program for implementing the device claimed in claim 1. Therefore, storage medium claim 20 corresponds to device claim 1 and is rejected for the same reasons of obviousness as set forth above.
Response to Arguments
Applicant’s arguments with respect to claims 1, 3-6, 9, 12, and 15-20 have been considered but are moot because the arguments do not apply to the combination of references being used in the current rejection.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Bai et al. (US 9786178 B1, hereinafter “Bai”).
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ASMAMAW TARKO whose telephone number is (571)272-9205. The examiner can normally be reached Monday -Friday 9:00AM-5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Kelley can be reached at (571) 272-7331. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ASMAMAW G TARKO/ Patent Examiner, Art Unit 2482