DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 03/29/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “a plurality of sensors configured to capture sensor data…” in claims 1 and 20.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
Applicant has provided sufficient structure for the claimed sensors in paragraph 21 of the specification as “sensors 202 may include various sensors such as, for example, … cameras”.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-5, 7-16, and 18-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Kassel et al. (U.S. Publication No. 2024/0029446; hereinafter Kassel).
Regarding claim 1, Kassel teaches a system for infrastructure and environmental mapping and automated vehicle behavior modification, the system comprising: a plurality of sensors of an autonomous vehicle (Kassel: Par. 372; i.e., the host vehicle may include one or more cameras onboard the host vehicle),
the plurality of sensors configured to capture sensor data representing an environment in which the autonomous vehicle is operating (Kassel: Par. 114; i.e., one or more of image capture devices 122, 124, and 126 may be configured to acquire image data from an environment in front of vehicle 200, behind vehicle 200, to the sides of vehicle 200, or combinations thereof);
and an autonomy computing system comprising a processor and a memory, the processor programmed to (Kassel: Par. 372; i.e., The navigation system for the host vehicle may include at least one processor comprising circuitry and a memory):
receive, from the plurality of sensors, first sensor data representing the environment in which the autonomous vehicle is operating at a first time, the first sensor data including image or video data (Kassel: Par. 372; i.e., as shown in step 2810 of process 2800, the memory may include instructions that when executed by the circuitry cause the at least one processor to receive at least one image from a camera on the host vehicle);
detect a difference between the first sensor data and historical sensor data at a location within the environment, the historical sensor data captured within the environment over a preceding period of time (Kassel: Par. 255; i.e., the new data indicates that a previously recognized landmark at a specific location no longer exists, or is replaced by another landmark; Par. 376; i.e., at step 2840, the navigation system for the host vehicle may compare the generated feature vector to a plurality of feature vectors stored in a database);
based on the detected difference, store, in the memory, an incident record indexed to the location relative to a stored map of the environment (Kassel: Par. 377; i.e., at step 2850, the navigation system for the host vehicle may, in response to a determination that the generated feature vector may not match an entry in the database, send the generated feature vector to a server; Par. 364; i.e., vehicle may provide to the server other information relative to the detected sign (e.g., the location of the sign));
and when one or more adjustment criteria associated with the location are satisfied, initiate one or more remedial actions associated with operation of the autonomous vehicle within the environment (Kassel: Par. 389; i.e., at step 3250, the navigation system may cause at least one navigational action to be taken by the host vehicle based on the identified traffic sign type).
Regarding claim 2, Kassel teaches the system according to claim 1. Kassel further teaches wherein the processor is further programmed to detect the difference by: executing a trained machine learning model using the first sensor data as input; and processing output from the trained machine learning model (Kassel: Par. 369; i.e., FIG. 27B provides a diagrammatic representation of a signature neural network system… Pixel data for traffic sign object 2720 may be input into a trained neural network 2730. The output of trained neural network 2730 may be a feature vector (e.g., signature vector) representative of the traffic sign object 2720… These generated feature vectors may be compared to feature vectors stored in a traffic sign database 2750 to determine whether the generated feature vectors represent recognized sign classes).
Regarding claim 3, Kassel teaches the system according to claim 2. Kassel further teaches wherein the processor is further programmed to train the trained machine learning model on a training dataset including the historical sensor data (Kassel: Par. 361; i.e., training of the neural networks in this case may involve providing sample images to the network, where the sample images include representations of traffic signs; Par. 374; i.e., neural network may be trained using a dataset containing traffic sign examples).
Regarding claim 4, Kassel teaches the system according to claim 2. Kassel further teaches wherein the processor is further programmed to initiate the remedial action by: re-training the trained machine learning model using the incident record (Kassel: Par. 360; i.e., the disclosed systems may implement a feature vector methodology. Such a methodology may allow for sign identification capabilities of a system to be updated; the model is updated to identify new signs).
Regarding claim 5, Kassel teaches the system according to claim 1. Kassel further teaches wherein the processor is further programmed to: localize the detected difference relative to the stored map of the environment; and generate the incident record including the localization (Kassel: Par. 377; i.e., at step 2850, the navigation system for the host vehicle may, in response to a determination that the generated feature vector may not match an entry in the database, send the generated feature vector to a server; Par. 364; i.e., vehicle may provide to the server other information relative to the detected sign (e.g., the location of the sign)).
Regarding claim 7, Kassel teaches the system according to claim 1. Kassel further teaches wherein the one or more adjustment criteria include one of a number of incidents relative to the location, a type of incident relative to the location, or a magnitude of incident relative to the location (Kassel: Par. 389; i.e., at step 3250, the navigation system may cause at least one navigational action to be taken by the host vehicle based on the identified traffic sign type; the navigation action is determined based on the type of traffic sign).
Regarding claim 8, Kassel teaches the system according to claim 1. Kassel further teaches wherein the processor is further programmed to initiate the remedial action by: updating the stored map of the environment (Kassel: Par. 259; i.e., the server may identify model changes, such as … new signs, removal of signs, etc.… The server may continuously or periodically or instantaneously update the model upon receiving new data from the vehicles; Par. 342; i.e., the autonomous road navigation model may be sparse map 800, and server 1230 may update the sparse map).
Regarding claim 9, Kassel teaches the system according to claim 8. Kassel further teaches wherein the processor is further programmed to transmit the updated map to a plurality of other autonomous vehicles operating in the environment (Kassel: Par. 315; i.e., the at least one processor 2315 of the hub vehicle may transmit the autonomous vehicle road navigation model or the update to the model to other vehicles for providing autonomous navigation guidance; Par. 327; i.e., the updated road navigation model and/or sparse map may be distributed to a plurality of autonomous vehicles).
Regarding claim 10, Kassel teaches the system according to claim 1. Kassel further teaches wherein the processor is further programmed to initiate the remedial action by: changing operation of the autonomous vehicle within the environment (Kassel: Par. 369; i.e., information associated with the recognized sign classes may be used in determining appropriate navigational actions for the host vehicle (e.g., maintain speed within a 70 mph limit, expect passing vehicles, initiate braking to prepare for an approaching bend in the road, etc.)).
Regarding claim 11, Kassel teaches the system according to claim 10. Kassel further teaches wherein the operation includes one of a route, a speed, and a lane selection (Kassel: Par. 369; i.e., information associated with the recognized sign classes may be used in determining appropriate navigational actions for the host vehicle (e.g., maintain speed within a 70 mph limit, expect passing vehicles, initiate braking to prepare for an approaching bend in the road, etc.)).
Regarding claim 12, Kassel teaches a computer-implemented method for infrastructure and environmental mapping and autonomous vehicle behavior modification, the method implemented by an autonomy computing system of an autonomous vehicle, the autonomy system including a processor and a memory, the method comprising (Kassel: Par. 68; i.e., an autonomous vehicle may use information obtained while navigating; Par. 372; i.e., the navigation system for the host vehicle may include at least one processor comprising circuitry and a memory):
receiving, from a plurality of sensors of the autonomous vehicle, first sensor data representing an environment in which the autonomous vehicle is operating at a first time, the first sensor data including image or video data (Kassel: Par. 114; i.e., one or more of image capture devices 122, 124, and 126 may be configured to acquire image data from an environment in front of vehicle 200, behind vehicle 200, to the sides of vehicle 200, or combinations thereof; Par. 372; i.e., as shown in step 2810 of process 2800, the memory may include instructions that when executed by the circuitry cause the at least one processor to receive at least one image from a camera on the host vehicle);
detecting a difference between the first sensor data and historical sensor data at a location within the environment, the historical sensor data captured within the environment over a preceding period of time (Kassel: Par. 255; i.e., the new data indicates that a previously recognized landmark at a specific location no longer exists, or is replaced by another landmark; Par. 376; i.e., at step 2840, the navigation system for the host vehicle may compare the generated feature vector to a plurality of feature vectors stored in a database);
based on the detected difference, storing, in the memory, an incident record indexed to the location relative to a stored map of the environment (Kassel: Par. 377; i.e., at step 2850, the navigation system for the host vehicle may, in response to a determination that the generated feature vector may not match an entry in the database, send the generated feature vector to a server; Par. 364; i.e., vehicle may provide to the server other information relative to the detected sign (e.g., the location of the sign));
and when one or more adjustment criteria associated with the location are satisfied, initiating one or more remedial actions associated with operation of the autonomous vehicle within the environment (Kassel: Par. 389; i.e., at step 3250, the navigation system may cause at least one navigational action to be taken by the host vehicle based on the identified traffic sign type).
Regarding claim 13, Kassel teaches the method according to claim 12. Kassel further teaches wherein detecting the difference comprises: executing a trained machine learning model using the first sensor data as input; and processing output from the trained machine learning model (Kassel: Par. 369; i.e., FIG. 27B provides a diagrammatic representation of a signature neural network system… Pixel data for traffic sign object 2720 may be input into a trained neural network 2730. The output of trained neural network 2730 may be a feature vector (e.g., signature vector) representative of the traffic sign object 2720… These generated feature vectors may be compared to feature vectors stored in a traffic sign database 2750 to determine whether the generated feature vectors represent recognized sign classes).
Regarding claim 14, Kassel teaches the method according to claim 13. Kassel further teaches the method further comprising training the trained machine learning model on a training dataset including the historical sensor data (Kassel: Par. 361; i.e., training of the neural networks in this case may involve providing sample images to the network, where the sample images include representations of traffic signs; Par. 374; i.e., neural network may be trained using a dataset containing traffic sign examples).
Regarding claim 15, Kassel teaches the method according to claim 13. Kassel further teaches wherein initiating the remedial action comprises re-training the trained machine learning model using the incident record (Kassel: Par. 360; i.e., the disclosed systems may implement a feature vector methodology. Such a methodology may allow for sign identification capabilities of a system to be updated; the model is updated to identify new signs).
Regarding claim 16, Kassel teaches the method according to claim 12. Kassel further teaches localizing the detected difference relative to the stored map of the environment; and generating the incident record including the localization (Kassel: Par. 377; i.e., at step 2850, the navigation system for the host vehicle may, in response to a determination that the generated feature vector may not match an entry in the database, send the generated feature vector to a server; Par. 364; i.e., vehicle may provide to the server other information relative to the detected sign (e.g., the location of the sign)).
Regarding claim 18, Kassel teaches the method according to claim 12. Kassel further teaches wherein initiating the remedial action comprises: updating the stored map of the environment (Kassel: Par. 259; i.e., the server may identify model changes, such as … new signs, removal of signs, etc.… The server may continuously or periodically or instantaneously update the model upon receiving new data from the vehicles; Par. 342; i.e., the autonomous road navigation model may be sparse map 800, and server 1230 may update the sparse map);
and transmitting the updated map to a plurality of other autonomous vehicles operating in the environment (Kassel: Par. 315; i.e., the at least one processor 2315 of the hub vehicle may transmit the autonomous vehicle road navigation model or the update to the model to other vehicles for providing autonomous navigation guidance; Par. 327; i.e., the updated road navigation model and/or sparse map may be distributed to a plurality of autonomous vehicles).
Regarding claim 19, Kassel teaches the method according to claim 12. Kassel further teaches wherein initiating the remedial action comprises changing operation of the autonomous vehicle within the environment (Kassel: Par. 369; i.e., information associated with the recognized sign classes may be used in determining appropriate navigational actions for the host vehicle (e.g., maintain speed within a 70 mph limit, expect passing vehicles, initiate braking to prepare for an approaching bend in the road, etc.)).
Regarding claim 20, Kassel teaches an autonomous vehicle comprising: a plurality of sensors configured to capture sensor data representing an environment in which the autonomous vehicle is operating (Kassel: Par. 68; i.e., an autonomous vehicle may use information obtained while navigating; Par. 114; i.e., one or more of image capture devices 122, 124, and 126 may be configured to acquire image data from an environment in front of vehicle 200, behind vehicle 200, to the sides of vehicle 200, or combinations thereof);
and an autonomy computing system comprising a processor and a memory, the processor programmed to (Kassel: Par. 372; i.e., The navigation system for the host vehicle may include at least one processor comprising circuitry and a memory):
receive, from the plurality of sensors, first sensor data representing the environment in which the autonomous vehicle is operating at a first time, the first sensor data including image or video data (Kassel: Par. 372; i.e., as shown in step 2810 of process 2800, the memory may include instructions that when executed by the circuitry cause the at least one processor to receive at least one image from a camera on the host vehicle);
detect a difference between the first sensor data and historical sensor data at a location within the environment, the historical sensor data captured within the environment over a preceding period of time (Kassel: Par. 255; i.e., the new data indicates that a previously recognized landmark at a specific location no longer exists, or is replaced by another landmark; Par. 376; i.e., at step 2840, the navigation system for the host vehicle may compare the generated feature vector to a plurality of feature vectors stored in a database);
based on the detected difference, store, in the memory, an incident record indexed to the location relative to a stored map of the environment (Kassel: Par. 377; i.e., at step 2850, the navigation system for the host vehicle may, in response to a determination that the generated feature vector may not match an entry in the database, send the generated feature vector to a server; Par. 364; i.e., vehicle may provide to the server other information relative to the detected sign (e.g., the location of the sign));
and when one or more adjustment criteria associated with the location are satisfied, initiate one or more remedial actions associated with operation of the autonomous vehicle within the environment (Kassel: Par. 389; i.e., at step 3250, the navigation system may cause at least one navigational action to be taken by the host vehicle based on the identified traffic sign type).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 6 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Kassel in view of Wheeler et al. (U.S. Publication No. 2021/0254983; hereinafter Wheeler).
Regarding claim 6, Kassel teaches the system according to claim 1, but does not explicitly teach wherein the processor is further programmed to: identify a timestamp in the first sensor data associated with the detected difference; and store the incident record including the timestamp.
However, in the same field of endeavor, Wheeler teaches wherein the processor is further programmed to: identify a timestamp in the first sensor data associated with the detected difference; and store the incident record including the timestamp (Wheeler: Par. 93; i.e., a mismatch record includes a mismatch record type, the current location (e.g., latitude and longitude coordinates) of the vehicle 150, and a current timestamp. A mismatch record is associated with raw sensor data (e.g., raw sensor data related to the unverified detected object or its location)).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the system of Kassel to have further incorporated wherein the processor is further programmed to: identify a timestamp in the first sensor data associated with the detected difference; and store the incident record including the timestamp, as taught by Wheeler. Doing so would allow the system to improve the accuracy of landmark maps which improves overall safety (Wheeler: Par. 79; i.e., the map update module 420 updates existing landmark maps to improve the accuracy of the landmark maps, and to thereby improve passenger and pedestrian safety).
Regarding claim 17, Kassel teaches the method according to claim 12, but does not explicitly teach identifying a timestamp in the first sensor data associated with the detected difference; and storing the incident record including the timestamp.
However, in the same field of endeavor, Wheeler teaches identifying a timestamp in the first sensor data associated with the detected difference; and storing the incident record including the timestamp (Wheeler: Par. 93; i.e., a mismatch record includes a mismatch record type, the current location (e.g., latitude and longitude coordinates) of the vehicle 150, and a current timestamp. A mismatch record is associated with raw sensor data (e.g., raw sensor data related to the unverified detected object or its location)).
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Kassel to have further incorporated identifying a timestamp in the first sensor data associated with the detected difference; and storing the incident record including the timestamp, as taught by Wheeler. Doing so would allow the system to improve the accuracy of landmark maps which improves overall safety (Wheeler: Par. 79; i.e., the map update module 420 updates existing landmark maps to improve the accuracy of the landmark maps, and to thereby improve passenger and pedestrian safety).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Additional prior art deemed pertinent in the art of detecting differences between current and historical sensor data includes Harrison (U.S. Publication No. 2025/0014460), Bangalore Ramaiah et al. (U.S. Publication No. 2022/0350995), Mizrachi et al. (U.S. Publication No. 2020/0282999), Williams et al. (U.S. Publication No. 2018/0361584), and Xu et al. (U.S. Patent No. 12,209,869).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDON Z WILLIS whose telephone number is (571)272-5427. The examiner can normally be reached on weekdays from 8:00 to 5:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ramya P Burgess, can be reached at (571) 272-6011. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRANDON Z WILLIS/Examiner, Art Unit 3661