DETAILED ACTION
Status of Claims
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is in response to the applicant’s filing on August 27, 2024. Claims 1-20 are pending and examined below.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 6, 8-12, 14-15, 17 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Takada, JP 2005-227947 A (see English translation – IDS filed on 8/27/2024) in view of Tsuda, US 11,273,847 B2.
As to claim 1, Takada teaches a vehicle navigation method, performed by an intelligent vehicle terminal, the method comprising (abstract and Fig. 3):
displaying a navigation interface, the navigation interface comprising a virtual map, the virtual map presenting a road scene of an environment in which a current vehicle is located, and the navigation interface comprising a vehicle identification object identifying the current vehicle, a warning sign display region for displaying a warning sign being set around the vehicle identification object, when the environment in which the current vehicle is located comprises an obstacle, the navigation interface further comprising an obstacle identification object identifying the obstacle (¶ 28-34 and Figs. 3-4; e.g. position mark 44 identifying the host vehicle 40, a warning sign display region 52 that displays a warning sign 45, the host vehicle 40 surrounded by the obstacle 50, and the obstacle 50 identified by a geometric shape);
setting a display attribute of the warning sign display region when the obstacle enters an early warning region of the current vehicle, to display a warning sign about the obstacle in the navigation interface (¶ 23-34, 39 and Figs. 3-4; e.g. displaying the warning sign 45 when the obstacle is within region 52); and
outputting the warning sign, the warning sign indicating at least one of the following: that a distance between the current vehicle and the obstacle is less than or equal to an early warning distance corresponding to the early warning region, or a directional relationship between the current vehicle and the obstacle (¶ 25, 39, 42-43 and Figs. 3-4; e.g. outputting a notification when the distance between the host vehicle and the obstacle is within a predetermined distance).
Takada does not specifically teach causing a vehicle navigation system of the current vehicle to control a travel status of the current vehicle according to a warning sign. However, Tsuda teaches this matter by maneuvering the current vehicle based on warning information (column 15 line 63 – column 16 line 13), such as stopping the vehicle when a red signal is displayed. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Tsuda’s teaching of controlling vehicle operation in response to warning signs into Takada’s vehicle navigation system in order to enhance safe and timely maneuvering of the vehicle.
As to claim 6, Takada further teaches the navigation interface is displayed based on a world coordinate system; and the method further comprises: obtaining position data of the obstacle in a vehicle coordinate system when the environment in which the current vehicle is located comprises the obstacle, the vehicle coordinate system being a coordinate system established using the current vehicle as a coordinate origin; performing coordinate transformation on the position data of the obstacle in the vehicle coordinate system, to obtain position data of the obstacle in the world coordinate system; and displaying, according to the position data of the obstacle in the world coordinate system, an obstacle identification object for identifying the obstacle in the navigation interface (¶ 32; e.g. obtaining the absolute position data of the obstacle).
As to claim 8, Takada teaches the displaying, according to the position data of the obstacle in the world coordinate system, an obstacle identification object for identifying the obstacle in the navigation interface comprises: identifying an obstacle type of the obstacle; and displaying an obstacle identification object in the obstacle type in the navigation interface according to the position data of the obstacle in the world coordinate system, obstacle identification objects in different obstacle types being different (¶ 29, 32, 39 and Fig. 4; e.g. different shape corresponds to different obstacle type).
As to claim 9, Takada teaches the performing coordinate transformation on the position data of the obstacle in the vehicle coordinate system, to obtain position data of the obstacle in the world coordinate system comprises: obtaining position data of the current vehicle in the world coordinate system; and performing coordinate transformation on the position data of the obstacle in the vehicle coordinate system according to the position data of the current vehicle in the world coordinate system, to obtain the position data of the obstacle in the world coordinate system (¶ 32).
As to claim 10, Takada teaches wherein the performing coordinate transformation on the position data of the obstacle in the vehicle coordinate system according to the position data of the current vehicle in the world coordinate system, to obtain the position data of the obstacle in the world coordinate system comprises: obtaining a transformation relationship between the world coordinate system and the vehicle coordinate system; performing calculation on the position data of the obstacle in the vehicle coordinate system based on the transformation relationship, to obtain a position variation of the obstacle relative to the current vehicle in the world coordinate system; and determining the position data of the obstacle in the world coordinate system according to the position data of the current vehicle in the world coordinate system and the position variation (¶ 32).
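For ease of reading, the coordinate transformation recited in claims 9 and 10 may be sketched as follows. This is an illustrative simplification only: the function name, the planar (2-D) rotation model, and the use of a heading angle as the “transformation relationship” are assumptions of the sketch, not taken from the claims or from Takada.

```python
import math

def obstacle_world_position(vehicle_world_xy, vehicle_heading_rad, obstacle_vehicle_xy):
    """Illustrative sketch of claims 9-10 (hypothetical; not from the
    cited references): rotate the obstacle's offset in the vehicle
    coordinate system by the vehicle's heading (the transformation
    relationship) to obtain the position variation of the obstacle
    relative to the vehicle in the world coordinate system, then add
    the vehicle's position in the world coordinate system."""
    vx, vy = vehicle_world_xy
    ox, oy = obstacle_vehicle_xy
    c, s = math.cos(vehicle_heading_rad), math.sin(vehicle_heading_rad)
    # position variation of the obstacle relative to the current vehicle,
    # expressed along the world coordinate axes
    dx = c * ox - s * oy
    dy = s * ox + c * oy
    return (vx + dx, vy + dy)
```

For example, a vehicle at world position (10, 5) heading due “north” (π/2) with an obstacle one unit ahead in the vehicle frame places the obstacle at world position (10, 6) under this model.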
As to claim 11, Takada teaches determining a collision detection region of the obstacle; performing intersection detection on the early warning region of the current vehicle and the collision detection region of the obstacle; and if the early warning region of the current vehicle intersects with the collision detection region of the obstacle, determining that the obstacle enters the early warning region of the current vehicle (¶ 31-32, 42-43 and Figs. 3-4).
As to claim 12, Takada teaches wherein the performing intersection detection on the early warning region of the current vehicle and the collision detection region of the obstacle comprises: obtaining early warning region data of the early warning region of the current vehicle in the vehicle coordinate system, the vehicle coordinate system being a coordinate system established using the current vehicle as a coordinate origin; obtaining detection region data of the collision detection region of the obstacle in the vehicle coordinate system; and if the early warning region data and the detection region data satisfy a preset condition, determining that the early warning region of the current vehicle does not intersect with the collision detection region of the obstacle (¶ 31-32, 39 and Fig. 4; e.g. determining if the obstacle within the region 51 is approaching the host vehicle or moving away).
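The intersection detection of claims 11 and 12 may be sketched as follows, again for illustration only. The sketch assumes both regions are axis-aligned rectangles in the vehicle coordinate system and treats axis separation as the “preset condition” for non-intersection; the claims and references do not require rectangular regions.

```python
def regions_intersect(region_a, region_b):
    """Illustrative sketch of claims 11-12 (hypothetical geometry):
    each region is given as an axis-aligned box (xmin, ymin, xmax, ymax)
    in the vehicle coordinate system. Separation along either axis is
    taken as the preset condition indicating no intersection."""
    ax1, ay1, ax2, ay2 = region_a
    bx1, by1, bx2, by2 = region_b
    separated = ax2 < bx1 or bx2 < ax1 or ay2 < by1 or by2 < ay1
    return not separated
```

Under this sketch, an obstacle is determined to have entered the early warning region when its collision detection region returns true against the vehicle’s early warning region.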
As to claim 14, Takada teaches wherein the collision detection region of the obstacle is determined according to an enclosed region of the obstacle; and the obtaining detection region data of the collision detection region of the obstacle in the vehicle coordinate system comprises: obtaining position data of a feature point of the enclosed region of the obstacle in an obstacle coordinate system, the obstacle coordinate system being a coordinate system established using the obstacle as a coordinate origin; performing coordinate transformation on the position data of the feature point of the enclosed region of the obstacle in the obstacle coordinate system, to obtain position data of the feature point of the enclosed region of the obstacle in the vehicle coordinate system; and determining the detection region data of the collision detection region of the obstacle in the vehicle coordinate system according to the position data of the feature point of the enclosed region of the obstacle in the vehicle coordinate system (¶ 32; e.g. relative position and absolute position).
As to claim 15, Takada teaches wherein the early warning region of the current vehicle is formed by enlarging an enclosed region of the current vehicle according to an early warning distance; and the obtaining early warning region data of the early warning region of the current vehicle in the vehicle coordinate system comprises: obtaining position data of a feature point of the enclosed region of the current vehicle in the vehicle coordinate system; and determining the early warning region data of the early warning region of the current vehicle in the vehicle coordinate system according to the position data of the feature point of the enclosed region of the current vehicle in the vehicle coordinate system and the early warning distance (¶ 31-32, 39, 42 and Figs. 3-4).
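The enlargement recited in claim 15 may likewise be sketched as a uniform outward expansion of an axis-aligned enclosed region by the early warning distance. The rectangular model is an assumption of the sketch, not a characterization of Takada’s disclosure.

```python
def enlarge_region(enclosed_box, warning_distance):
    """Illustrative sketch of claim 15 (hypothetical geometry): form the
    early warning region by enlarging the vehicle's enclosed region,
    given as (xmin, ymin, xmax, ymax) in the vehicle coordinate system,
    outward by the early warning distance on every side."""
    xmin, ymin, xmax, ymax = enclosed_box
    d = warning_distance
    return (xmin - d, ymin - d, xmax + d, ymax + d)
```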
Claims 17 and 19 are rejected based on the same rationale as used for claim 1 above.
Claims 2, 18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Takada, JP 2005-227947 A (see English translation – IDS filed on 8/27/2024) in view of Tsuda, US 11,273,847 B2, and in further view of Li et al., CN 113734048 A (see English translation with annotated page and paragraph numbers – attached PTO-892).
As to claim 2, Takada in view of Tsuda teaches an early warning region as discussed in claim 1 above. Takada in view of Tsuda does not specifically teach that the early warning region of the current vehicle comprises early warning regions of N warning levels, the early warning regions of the N warning levels being formed by enlarging an enclosed region of the current vehicle according to N different early warning distances; and the early warning region that the obstacle enters is an early warning region of an ith warning level in the N warning levels, N being a positive integer, and i being a positive integer less than or equal to N; and the setting a display attribute of the warning sign display region, to display a warning sign about the obstacle in the navigation interface comprises: setting the display attribute of the warning sign display region according to the ith warning level, to display a warning sign corresponding to the ith warning level in the navigation interface, to indicate that the distance between the current vehicle and the obstacle is less than or equal to an ith early warning distance, different warning levels corresponding to warning signs with different colors.
Li, on the other hand, teaches all these limitations (page 8 ¶ 2 - page 9 ¶ 11 and Fig. 3) except that it does not expressly specify that the various displayed warning signs are in different colors. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the feature of displaying the warning signs with different colors into Li’s teaching of displaying different warning signs based on the different warning levels, so as to provide an alternative sensory mode of warning notification to the vehicle operator. Furthermore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the modified teachings of Li into the navigation system of Takada in view of Tsuda to effectively provide obstacle warning to the vehicle operator.
Claims 18 and 20 are rejected based on the same rationale as used for claim 2 above.
Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Takada, JP 2005-227947 A (see English translation – IDS filed on 8/27/2024) in view of Tsuda, US 11,273,847 B2, and in further view of Gogna et al., US 12,055,942 B1.
As to claim 16, Takada in view of Tsuda teaches a vehicle navigation system as discussed in claim 1 above. Takada in view of Tsuda does not specifically teach the additional feature of obtaining speed information of the current vehicle, predicting a motion trajectory of the current vehicle according to the speed information of the current vehicle, and displaying a mapped trajectory of the motion trajectory in the navigation interface. However, Gogna teaches these features (column 6 line 52 – column 7 line 11 and column 10 lines 43-56 and Figs. 1-2). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the motion-trajectory prediction taught by Gogna into the navigation system of Takada in view of Tsuda to enhance vehicle maneuvering.
Allowable Subject Matter
Claims 3-5, 7 and 13 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Mary Cheung whose telephone number is (571) 272-6705. The examiner can normally be reached on Monday, Tuesday and Thursday from 10:00 AM to 7:00 PM. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christian Chace, can be reached at (571) 272-4190.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
The fax phone numbers for the organization where this application or proceedings is assigned are as follows:
(571) 273-8300 (Official Communications; including After Final Communications labeled “BOX AF”)
(571) 273-6705 (Draft Communications)
/MARY CHEUNG/ Primary Examiner, Art Unit 3665 February 9, 2026