DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
No additional information disclosure statement(s) (IDS) were submitted for consideration.
Priority
Acknowledgment is made of applicant's claim for foreign priority based on an application filed in the Republic of Hungary on 04/08/2022.
Status of Application
Claims 1-11 and 13-16 are pending.
Claims 8-11 and 13-14 are amended.
Claims 1-7 and 15-16 are withdrawn from consideration.
Claim 12 is cancelled.
No claims are added.
Claims 1 and 8 are independent claims.
Claims 8-11 and 13-14 will be examined.
This Final Office action is in response to “Applicant Arguments/Remarks” and “Amended Claims” dated 11/26/2025.
Response to Arguments
Applicant’s Remarks/Arguments and amended claims, filed 11/26/2025, with respect to claims 8-11 and 13-14, have been fully considered, and Applicant’s remarks will be addressed in sequential order as they were presented.
Regarding the Objection to Specification, the applicant’s response and amended specification have been fully considered and are persuasive. Therefore, the Objection to Specification is withdrawn.
Regarding Claim Interpretations, the applicant’s response and amended claims have been fully considered and are persuasive. Therefore, the interpretation under 35 U.S.C. § 112(f) of the claim 8 limitation “map-matching module adapted for calculating map-matched location information,” the claim 11 limitation “object prioritization module adapted for determining objects,” and the claim 14 limitations “map-aware application,” “forward collision warning module,” and “intersection movement assist module,” is withdrawn.
Regarding Claim Objections, the applicant’s response and amended specification have been fully considered and are persuasive. Therefore, the Claim Objection is withdrawn.
Regarding the Rejections under 35 U.S.C. 112(a), the applicant’s response and amended claims 11 and 14 have been fully considered and are persuasive. Therefore, the rejections under 35 U.S.C. 112(a) are withdrawn.
Regarding the Rejections under 35 U.S.C. 112(b), the applicant’s response and amended claims 11-14 have been fully considered and are persuasive. Therefore, the rejections under 35 U.S.C. 112(b) are withdrawn.
Regarding the Rejection under 35 U.S.C. 101, and the remarks that “amended claim 8 (e.g., map-matched object data, filtering rule application, generation of binary decision variables, etc.) amount to an improvement to the technical field of vehicle safety systems,” the Office respectfully disagrees. The amended limitation of “map matched object data” remains a mental process which can practically be performed in the human mind. Furthermore, the “filtering rule application” (wherein computer filters are fundamentally based on mathematical data and operations, using algorithms to manipulate digital signals) and the “generation of binary decision variables” are both mathematical concepts recited with generic computer components (i.e., “a computer, the computer adapted to execute instructions”). Furthermore, the receiving steps from the sensors and from the external source (i.e., “receiving digital map data, and data indicative of a location of an object”) are recited at a high level of generality (i.e., as a general means of gathering vehicle and road condition data for use in the evaluating step) and amount to mere data gathering, which is a form of insignificant extra-solution activity. Therefore, it is the Office’s position that the rejection under 35 U.S.C. 101 is not withdrawn. The grounds for rejection in view of the amended claims are provided below.
Regarding the Rejections under 35 U.S.C. 102/103, the Applicant’s arguments that “Tran does not perform classification based on filtering rules or binary decision variables, nor does Tran apply map-aware topology-based logic,” and that the “rules are not taught by the combination of Tran and Bush,” have been fully considered and are persuasive. Therefore, the rejection of claims 8 and 10-14 under 35 U.S.C. § 102/103 is withdrawn. However, upon further consideration, and based on applicant’s own amendments, a new ground(s) of rejection is made in view of newly found prior art reference(s) AINE, US 20180079420, ZAIDI, US 20200072969, LEE et al., US 20230228588, and previously disclosed prior art reference(s) TRAN and BUSH. The grounds for rejection in view of the amended claims are provided below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 8-11 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over TRAN, US 20210108926, herein further known as Tran, in view of AINE, US 20180079420, herein further known as Aine, further in view of ZAIDI, US 20200072969, herein further known as Zaidi, further in view of QI et al., US 20220292974, herein further known as Qi.
Regarding claim 8, Tran discloses a system for providing map-related information (¶¶ [0071-0072], [0168]) for a traffic safety application (¶ [0068]), the system adapted for receiving: digital map data (¶¶ [0024], [0030], [0032], [0069], [0071-0077], 3D model, high definition map), data indicative of a location of an object, the object being a vehicle or other traffic participant (¶¶ [0066], sensed geographic location of the vehicle, evaluations of individual objects and/or features in the environment, [0072], accurate location of the vehicle, [0073-0074], map objects that are stored in the geometric map, [0115], HD map stores objects or data structures, [0116], [0131], sign (i.e. object) will be located in its sensor data, [0147], camera capable of capturing an image of an object and representing that image in the form of digital data, [0168-0169], data fusion based on sensor-based detection of objects, see also FIGS. 7A-7H, detection of objects outside of the vehicle, see also FIGS. 8A-8F, [0176-0177], detected objects), the system comprising a computer (¶¶ [0013], [0015], [0017], [0019], [0022], [0025-0026], a processor, [0065], system is controlled by a processor 202, see also FIG. 2A), the computer adapted to execute instructions (¶¶ [0003], program instructions, [0066], algorithm, [0199], computer program products) for calculating map-matched location information (¶¶ [0067], computer vision system to map the environment, track objects, [0069], recognition of objects, and generating high definition maps, [0182], processes (i.e. calculating) obstacle locations) based on the data indicative of the location of the object and the digital map data (¶¶ [0067], computer vision system to map the environment, track objects, [0107], [0127], depth map… of a physical object, [0110], [0112], HD map… of the road and all physical objects around the road, [0182], matching to road in digital map), the system further comprising an object database that stores the data indicative of the location of the object (¶¶ [0066], [0072-0074], [0115-0116], [0131], [0147], [0168-0169], [0176-0177]) and the map-matched information of the object (¶¶ [0072-0074]), wherein the computer is further adapted to apply filtering rules (¶¶ [0066], sensor fusion algorithm 744 may include, for example, a Kalman filter… to provide various assessments based on the data from the sensor system, [0118], scene filtering) to classify remote objects for which data is stored in the object database (¶¶ [0106], applying one or more models for classifying, classified attributes may be stored, [0116-0117], the HD map uses image-based classification, classified attributes may be stored in the map, [0122], image classification model, [0127], image classification model, [0152], [0154-0155], any pixel in an image can be classified as an “object” pixel… [0171], determining a classification and a state of the detected object, [0176-0177], the classification of the detected object includes the type of automobile, for each classification, the object data may also contain behavior information…), and to select at least a subset of the remote objects that satisfy at least one of the filtering rules (¶¶ [0072], type of landmark, [0106], plurality of classified attributes, [0109], semantic information, type of lane and type of all signage, [0116-0117], [0133], HD map selects the closest point that falls into each 2D cell, [0151-0154], image-analysis system 806 determines the position and/or motion of object (i.e. object selected), [0157-0158], object can be determined… [0176-0177], type of automobile, type of object), and applying the filtering rules generates filtered object data that is provided to an application accessible by the computer (¶¶ [0022], implementation of data being transferred includes… software application, an augmented reality, or a virtual reality application).
However, Tran does not explicitly state binary decision variables comprising: a binary decision variable indicating whether a remote object, of the remote objects, is ahead or behind the object, a binary decision variable indicating whether the remote object and the object are driving in a same or in an opposite direction, and a binary decision variable indicating whether the remote object approaches the object from left or right.
Aine teaches a binary decision variable indicating whether a remote object, of the remote objects, is ahead or behind the object (¶ [0024]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention, with a reasonable expectation of success, to incorporate into Tran the binary decision variable indicating whether a remote object, of the remote objects, is ahead or behind the object, as taught by Aine.
One would be motivated to modify Tran in view of Aine for the reasons stated in Aine paragraph [0003]: a more robust system and method to maneuver vehicles in a far safer and more efficient manner.
Furthermore, Zaidi teaches a binary decision variable indicating whether the remote object and the object are driving in a same or in an opposite direction (¶ [0020], directional indicator may be considered a binary decision).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention, with a reasonable expectation of success, to incorporate into Tran the binary decision variable indicating whether the remote object and the object are driving in a same or in an opposite direction, as taught by Zaidi.
One would be motivated to modify Tran in view of Zaidi for the reasons stated in Zaidi paragraph [0046]: a more robust system and method wherein rules may update as the vehicle drives to enhance the safety mechanism.
Furthermore, Qi teaches a binary decision variable indicating whether the remote object approaches the object from left or right (¶ [0155], binary classifiers model 3, and model 4, lateral predictions).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention, with a reasonable expectation of success, to incorporate into Tran the binary decision variable indicating whether the remote object approaches the object from left or right, as taught by Qi.
One would be motivated to modify Tran in view of Qi for the reasons stated in Qi paragraph [0003]: a more robust system and method wherein the aim is to improve the accuracy of prediction, namely, to reduce the likelihood of taking unnecessary actions (e.g., emergency reporting) based on a false prediction.
Regarding claim 9, the combination of Tran, Aine, Zaidi, and Qi discloses all elements of claim 8 above.
However, Tran does not explicitly state that the binary decision variables further comprise one or more binary decision variables indicating whether a particular remote object and the object are driving in a same lane or how many lanes separate the particular remote object from the object.
Aine teaches binary decision variables further comprising one or more binary decision variables indicating whether a particular remote object and the object are driving in a same lane or how many lanes separate the particular remote object from the object (¶ [0024], decisions are determined for the vehicles (i.e. object) in the target lane (i.e. same lane)).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention, with a reasonable expectation of success, to incorporate into Tran the binary decision variables further comprising one or more binary decision variables indicating whether a particular remote object and the object are driving in a same lane, as taught by Aine.
One would be motivated to modify Tran in view of Aine for the reasons stated in Aine paragraph [0003]: a more robust system and method to maneuver vehicles in a far safer and more efficient manner.
Regarding claim 10, the combination of Tran, Aine, Zaidi, and Qi discloses all elements of claim 8 above.
Tran further discloses that the location of the object is provided by a remote vehicle and/or by an ego-positioning sub-system connected to the object database and/or to the map-matching module (¶ [0168], see also FIG. 7A, process checks sensors for object detection, then checks for confirmations from other vehicles over V2V communication).
Regarding claim 11, the combination of Tran, Aine, Zaidi, and Qi discloses all elements of claim 8 above.
Tran further discloses that applying the filtering rules applies at least one of: a geometric distance-based filtering rule (¶ [0125], depth map is a 2D map which comprises a plurality of points with each point describing a distance of a physical object to the detection and ranging sensor), a road topology aided distance-based filtering rule, a road-relative filtering rule, a directional filtering rule, and a filtering rule based on object characteristics including at least one of speed, hard-braking behavior, and identification as a special vehicle.
Regarding claim 13, the combination of Tran, Aine, Zaidi, and Qi discloses all elements of claim 8 above.
Tran further discloses that classification of the remote objects (¶¶ [0106], applying one or more models for classifying, classified attributes may be stored, [0116-0117], the HD map uses image-based classification, classified attributes may be stored in the map, [0122], image classification model, [0127], image classification model, [0152], [0154-0155], any pixel in an image can be classified as an “object” pixel… [0171], determining a classification and a state of the detected object, [0176-0177], the classification of the detected object includes the type of automobile, for each classification, the object data may also contain behavior information…) by the filtering rules is performed based on a distance between the location of the remote object and the location of the object (¶¶ [0125], [0152]).
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over the combination of Tran, Aine, Zaidi, and Qi, further in view of ZHANG et al., US 20200210461, herein further known as Zhang.
Regarding claim 14, the combination of Tran, Aine, Zaidi, and Qi discloses all elements of claim 8 above.
Tran discloses that the object database (¶¶ [0066-0067], data from the sensor system, [0073], objects that are stored in the geometric map, [0115], HD map stores objects or data structures, [0177]) is in communication with the application executed by the computer (¶¶ [0022], implementation of data being transferred includes… software application, an augmented reality, or a virtual reality application), the application being configured to perform at least one map-aware safety function (¶ [0077], map data may include obstacles… safety buffers to protect bicyclists from parked and moving vehicles), including: a function that evaluates whether a forward collision warning condition is present (¶ [0065], prevent collision with a vehicle ahead, issue braking or steering commands to minimize the damage resulting from a collision) based on the map-matched location information of the object and map-matched location information (¶¶ [0072-0074]), and motion parameters of at least one remote object located in front of the object in a same lane (¶¶ [0014], determining an obstacle in the lane (i.e. same lane) and changing the vehicle’s path to avoid the obstacle, [0072], [0136], [0153], analyzing sequences of images allows image-analysis system to reconstruct 3D motion of object, [0188], [0190], [0195]).
However, Tran does not explicitly state a function that evaluates whether the object or at least one of the remote objects is at or approaching an intersection region using the map-matched location information of the object and the map-matched location information of the remote objects, including information indicating whether the remote objects are approaching the intersection, located on a crossing road, or performing a turning movement, and a function that applies the one or more filtering rules to the remote objects, the one or more filtering rules comprising at least one of: a geometric distance-based filtering rule, a road topology aided distance based filtering rule, a road-relative filtering rule, a directional filtering rule, and a filtering rule based on object characteristics including at least one of speed, hard-braking behavior, and identification as a special vehicle, wherein the function that applies one or more filtering rules is to classify the remote objects into groups including top priority objects, high priority objects, medium priority objects, or low priority objects.
Zhang teaches a function that applies one or more filtering rules to classify the remote objects into groups including top priority objects, high priority objects, medium priority objects, or low priority objects (¶¶ [0043-0044]).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the invention, with a reasonable expectation of success, to incorporate into Tran the function that applies one or more filtering rules to classify the remote objects into groups including top priority objects, high priority objects, medium priority objects, or low priority objects, as taught by Zhang.
One would be motivated to modify Tran in view of Zhang for the reasons stated in Zhang paragraph [0003]: driving safety of vehicles is always expected to be fully guaranteed, and robust sensing of a variety of conditions in the surrounding environment that may threaten safe driving, together with greater provision of danger warning for the vehicle, may help to ensure traffic safety and improve traffic efficiency.
Conclusion
The prior art made of record on the attached PTO-892 form, and not relied upon, is considered pertinent to applicant's disclosure as described below. Prior art Lee discloses a route provision apparatus outputting map information that provides a route to a vehicle. Furthermore, Lee discloses (¶ [0263]) that eHorizon may be understood as software, a module, a device or a system that performs the role of generating a vehicle's forward route information using high-definition (HD) map data, and transmitting the configured data to an application (e.g., an ADAS application, a map application, etc.) of the vehicle or in the vehicle requiring map information.
THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Terry Buse whose telephone number is (313) 446-6647. The examiner can normally be reached Monday through Friday, 8:00 AM to 5:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott Browne can be reached at (571) 270-0151. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TERRY C BUSE/ Examiner, Art Unit 3666
/SCOTT A BROWNE/ Supervisory Patent Examiner, Art Unit 3666