DETAILED ACTION
Status of the Application
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of the Claims
This action is in response to the applicant’s filing on April 03, 2024. Claims 1–20 are pending and examined below.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on April 03, 2024 has been considered by the Examiner.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. § 102 and 103 (or as subject to pre-AIA 35 U.S.C. § 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. § 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. § 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 3, 5–7, 13, 15, and 19–20 are rejected under 35 U.S.C. § 103 as being unpatentable over U.S. Patent Application Publication No. US 2021/0233390 A1 to GEORGIOU et al. (hereinafter "Georgiou") in view of U.S. Patent Application Publication No. US 2020/0081445 A1 to STETSON et al. (hereinafter "Stetson").
(Note: Claim language is in bold typeface, and the Examiner’s comments and cited passages from the prior art reference(s) are in normal typeface.)
As to Claim 1,
Georgiou’s system for updating maps based on traffic object detection discloses a computer-implemented method for controlling motion of an autonomous vehicle (see ¶0160 ~ a method of controlling autonomous vehicles 102 relative to determination of discrepancy between the traffic light data of each of plurality of autonomous vehicles and the known traffic light data), the method comprising:
an edge between the first node and the second node indicative of a relationship between the first object and the traffic element (see ¶0147 ~ "In an autonomous vehicle environment..., the environment... continuously changes over time... Traffic objects... include traffic signs, traffic signals, traffic lights..., or any other traffic object", and ¶0151 ~ "at step 401… searching for a predetermined traffic sign within a predetermined geographic location by analyzing a region of an image frame associated with the traffic sign"; thus teaching edge determination and distinction between the first node and the second node per [0029] of the disclosure, indicative of a relationship between the first object and the traffic element per [0028] of the disclosure);
wherein the interaction type is predicted from a predetermined set of discrete interaction types (see ¶0142 ~ differentiation of interaction types by discriminant analysis; see also ¶0148, ¶0151, and ¶0160);
processing the interaction type between the first node and the second node to predict a trajectory of the first object (see Figs. 4A–4B ~ process flow charts for managing traffic signal location discrepancies; see also ¶0133 ~ traffic elements such as lane markers can be predicted, which, in combination with the local map generation of process step S350 described in ¶0135 and the described assisted cruise control, teaches prediction of a trajectory of the first object); and
controlling motion of the autonomous vehicle based on a motion plan determined based on the trajectory of the first object. (See Fig. 1 and ¶0019 ~ autonomous vehicle motion is controlled based upon constraints imposed by objects in the environment.)
However, Georgiou does not explicitly disclose an autonomous vehicle motion control method comprising obtaining graph data associated with a first node indicative of a first object in an environment of the autonomous vehicle and a second node indicative of a traffic element in the environment of the autonomous vehicle, nor processing the graph data to predict an interaction type for the edge between the first node and the second node.
On the other hand, Stetson’s system for graph-based AI training discloses obtaining graph data associated with a first node indicative of a first object in an environment of the autonomous vehicle and a second node indicative of a traffic element in the environment of the autonomous vehicle (see Fig. 8 and ¶0115 ~ "FIG. 8… set of training data for an autonomous vehicle deployment during an unprotected left as a traffic light turns red developed with a graph interface system… incorporated into a graph, with each parameter being represented by nodes… linked in any of a number of ways… to generate hybrid scenarios"), and processing the graph data to predict an interaction type for the edge between the first node and the second node. (See Figs. 8–9 ~ process flow charts wherein traffic scene data is acquired and a knowledge graph is generated; see Fig. 14 ~ process flow chart wherein, in process steps 1420–1430, Stetson teaches acquiring a second graph and "merging a known graph and the second graph," thus teaching wherein the graph data predicts an interaction type for the edge between the first node and the second node; and see ¶0118 ~ "nodes in the knowledge graph reflect parameters within scenarios" characterizing interaction types for edges between a plurality of nodes.)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify Georgiou with the graph data acquisition taught by Stetson, with a reasonable expectation of success, where the resultant combination would map objects in the autonomous vehicle environment to traffic elements, thereby enabling benefits including, but not limited to, higher precision mapping and more reliable vehicle navigation.
As to Claim 3,
Georgiou/Stetson discloses the method of claim 1, wherein:
the first object is a first vehicle (see Figs. 4A–4B and ¶0147; Georgiou); and
the traffic element is a red traffic light, a yellow traffic light, a green traffic light, an unknown traffic light, a stop sign, or a yield sign. (See ¶0147 ~ traffic signs and/or lights; Georgiou.)
As to Claim 5,
Georgiou/Stetson discloses the method of claim 1, wherein the graph data comprises
map data of an area in the environment surrounding the traffic element. (See ¶0071; Georgiou ~ a map being generated for a region in an environment surrounding the traffic element).
As to Claim 6,
Georgiou/Stetson discloses the method of claim 5,
wherein the map data includes lane boundary data, left turn region data, right turn region data, motion path data, drivable area data, or intersection data. (See ¶0147; Georgiou ~ lane boundary information or any other traffic object).
As to Claim 7,
Georgiou/Stetson discloses the method of claim 5, further comprising:
updating the graph data to include data associated with the interaction type between the first node and the second node. (See Figs. 8 – 9, 14, and ¶0118; Stetson).
As to Claim 13,
Georgiou discloses a computing system for an autonomous vehicle (see Fig. 1 ~ illustrates a general arrangement of an autonomous vehicle motion control system that navigates a traffic environment world as further described in ¶0019), the computing system comprising:
one or more processors (see Fig. 5, ¶0168 ~ internal computing system 110 comprises a processor 510, and ¶0176 ~ processor); and
one or more non-transitory computer-readable medium storing instructions for execution by the one or more processors to cause the computing system to perform operations (see ¶0176 ~ non-transitory computer readable medium), the operations comprising:
an edge between the first node and the second node indicative of a relationship between the first object and the traffic element (see ¶0147 ~ "In an autonomous vehicle environment..., the environment... continuously changes over time... Traffic objects... include traffic signs, traffic signals, traffic lights..., or any other traffic object", and ¶0151 ~ "at step 401… searching for a predetermined traffic sign within a predetermined geographic location by analyzing a region of an image frame associated with the traffic sign"; thus teaching edge determination and distinction between the first node and the second node per [0029] of the disclosure, indicative of a relationship between the first object and the traffic element per [0028] of the disclosure);
wherein the interaction type is predicted from a predetermined set of discrete interaction types (see ¶0142 ~ differentiation of interaction types by discriminant analysis; see also ¶0148, ¶0151, and ¶0160);
processing the interaction type between the first node and the second node to predict a trajectory of the first object (see Figs. 4A–4B ~ process flow charts for managing traffic signal location discrepancies; see also ¶0133 ~ traffic elements such as lane markers can be predicted, which, in combination with the local map generation of process step S350 described in ¶0135 and the described assisted cruise control, teaches prediction of a trajectory of the first object); and
controlling motion of the autonomous vehicle based on a motion plan determined based on the trajectory of the first object. (See Fig. 1 and ¶0019 ~ autonomous vehicle motion is controlled based upon constraints imposed by objects in the environment.)
Stetson is then relied upon to disclose an autonomous vehicle motion control method comprising obtaining graph data associated with a first node indicative of a first object in an environment of the autonomous vehicle and a second node indicative of a traffic element in the environment of the autonomous vehicle (see Fig. 8 and ¶0115; Stetson ~ "FIG. 8… set of training data for an autonomous vehicle deployment during an unprotected left as a traffic light turns red developed with a graph interface system… incorporated into a graph, with each parameter being represented by nodes… linked in any of a number of ways… to generate hybrid scenarios"), and processing the graph data to predict an interaction type for the edge between the first node and the second node (see Figs. 8–9; Stetson ~ process flow charts wherein traffic scene data is acquired and a knowledge graph is generated; Fig. 14; Stetson ~ process flow chart wherein, in process steps 1420–1430, Stetson teaches acquiring a second graph and "merging a known graph and the second graph," thus teaching wherein the graph data predicts an interaction type for the edge between the first node and the second node; and ¶0118; Stetson ~ "nodes in the knowledge graph reflect parameters within scenarios" characterizing interaction types for edges between a plurality of nodes).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify Georgiou with the graph data acquisition taught by Stetson, with a reasonable expectation of success, where the resultant combination would map objects in the autonomous vehicle environment to traffic elements, thereby enabling benefits including, but not limited to, higher precision mapping and more reliable vehicle navigation.
As to Claim 15,
Georgiou/Stetson discloses the computing system of claim 13, wherein:
the first object is a first vehicle (see Figs. 4A - 4B and ¶0147; Georgiou); and the traffic element is a red traffic light, a yellow traffic light, a green traffic light, an unknown traffic light, a stop sign, or a yield sign. (See ¶0147; Georgiou).
As to Claim 19,
Georgiou/Stetson discloses the computing system of claim 13,
wherein the graph data includes a directional edge between the first node and the second node, the directional edge defining a relative position and velocity of the first object in relation to the traffic element. (See Fig. 8 and ¶0118; Stetson).
As to Claim 20,
Georgiou discloses an autonomous vehicle (see Fig. 1 ~ illustrates a general arrangement of an autonomous vehicle motion control system that navigates a traffic environment world as further described in ¶0019) comprising:
one or more processors (see Fig. 5 and ¶0168 ~ processor); and
one or more non-transitory computer-readable medium storing instructions for execution by the one or more processors (see ¶0176 ~ non-transitory computer readable medium) to cause the one or more processors to perform operations, the operations comprising:
an edge between the first node and the second node indicative of a relationship between the first object and the traffic element; processing the graph data to predict an interaction type for the edge between the first node and the second node (see ¶0147 ~ "In an autonomous vehicle environment..., the environment... continuously changes over time... Traffic objects... include traffic signs, traffic signals, traffic lights..., or any other traffic object", and ¶0151 ~ "at step 401… searching for a predetermined traffic sign within a predetermined geographic location by analyzing a region of an image frame associated with the traffic sign"; thus teaching edge determination and distinction between the first node and the second node per [0029] of the disclosure, indicative of a relationship between the first object and the traffic element per [0028] of the disclosure),
wherein the interaction type is predicted from a predetermined set of discrete interaction types (see ¶0142 ~ differentiation of interaction types by discriminant analysis; see also ¶0148, ¶0151, and ¶0160);
processing the interaction type between the first node and the second node to predict a trajectory of the first object (see Figs. 4A–4B ~ process flow charts for managing traffic signal location discrepancies; see also ¶0133 ~ traffic elements such as lane markers can be predicted, which, in combination with the local map generation of process step S350 described in ¶0135 and the described assisted cruise control, teaches prediction of a trajectory of the first object); and
controlling motion of the autonomous vehicle based on a motion plan determined based on the trajectory of the first object. (See Fig. 1 and ¶0019 ~ autonomous vehicle motion is controlled based upon constraints imposed by objects in the environment.)
Stetson is then introduced to disclose obtaining graph data associated with a first node indicative of a first object in an environment of the autonomous vehicle and a second node indicative of a traffic element in the environment of the autonomous vehicle (see Fig. 8 and ¶0115 ~ "FIG. 8… set of training data for an autonomous vehicle deployment during an unprotected left as a traffic light turns red developed with a graph interface system… incorporated into a graph, with each parameter being represented by nodes… linked in any of a number of ways… to generate hybrid scenarios”).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify Georgiou with the graph data acquisition taught by Stetson, with a reasonable expectation of success, where the resultant combination would map objects in the autonomous vehicle environment to traffic elements, thereby enabling benefits including, but not limited to, higher precision mapping and more reliable vehicle navigation.
Allowable Subject Matter
Claims 2, 4, 8 – 12, 14, and 16 – 18 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
In particular, the available prior art appears to be silent in disclosing the method of claim 1, further comprising:
processing the graph data to predict an interaction type for the edge between the first node and the second node
in response to determining the distance between the first object and the traffic element to be less than a predefined distance; and
wherein processing the interaction type between the first node and the second node to predict a trajectory of the first object comprises processing the interaction type using a machine-learned interaction prediction model to determine the trajectory of the first object. (Emphasis added.)
The prior art does not appear to explicitly teach or disclose the above recited claim limitations.
To that end, although further search and consideration would need to be performed based upon any amendments submitted by the Applicant, it is the Examiner’s position that incorporating the above recited claim limitations into independent claims 1, 13, and 20 could advance prosecution.
Conclusion
Any inquiry concerning this communication or earlier communications from the Examiner should be directed to ASHLEY L. REDHEAD, JR., whose telephone number is (571) 272-6952. The Examiner can normally be reached on weekdays, Monday through Thursday, between 7 a.m. and 5 p.m.
If attempts to reach the Examiner by telephone are unsuccessful, the Examiner’s Supervisor, Peter Nolan, can be reached Monday through Friday, between 9 a.m. and 5 p.m., at (571) 270-7016. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ASHLEY L REDHEAD JR./Examiner, Art Unit 3661