Prosecution Insights
Last updated: April 19, 2026
Application No. 18/494,252

COMPUTER SYSTEM FOR STORING DATA

Office Action: Non-Final, with rejections under §101, §102, §103, and §112

Filed: Oct 25, 2023
Examiner: JHA, ABDHESH K
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Aptiv Technologies AG
OA Round: 1 (Non-Final)

Grant Probability: 80% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 5m
Grant Probability with Interview: 99%

Examiner Intelligence

Career Allow Rate: 80% (328 granted / 408 resolved; +28.4% vs TC avg, above average)
Interview Lift: +18.3% in resolved cases with an interview (a strong lift)
Typical Timeline: 2y 5m average prosecution; 24 applications currently pending
Career History: 432 total applications across all art units

Statute-Specific Performance (overcome rates by statute)

§101: 10.0% (-30.0% vs TC avg)
§102: 20.4% (-19.6% vs TC avg)
§103: 47.2% (+7.2% vs TC avg)
§112: 13.4% (-26.6% vs TC avg)

Tech Center averages are estimates; figures are based on career data from 408 resolved cases.
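The headline figures above are simple ratios over the examiner's career counts; a quick sketch of how they can be derived (variable names are ours, not from any analytics tool):

```python
# Recompute the dashboard's headline examiner metrics from the raw counts
# shown above. The counts come from the report itself; the helper names
# are illustrative, not from any real analytics API.
granted, resolved = 328, 408

career_allow_rate = granted / resolved   # fraction of resolved cases granted
delta_vs_tc = 28.4                       # percentage points above TC avg (from the report)
tc_average = career_allow_rate * 100 - delta_vs_tc

print(f"Career allow rate: {career_allow_rate:.1%}")  # -> 80.4%, displayed as 80%
print(f"Implied TC average: {tc_average:.1f}%")       # -> 52.0%
```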

Office Action

§101 §102 §103 §112
DETAILED ACTION

Claims 1-15, dated 10/25/2023, are considered in this Office action. Claims 1-15 are pending examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first-inventor-to-file provisions of the AIA.

Drawings

The drawings, Figures 1-3, are objected to under 37 CFR 1.83(a) because they fail to show detail of the elements as described in the specification. Any structural detail that is essential for a proper understanding of the disclosed invention should be shown in the drawing. MPEP § 608.02(d). Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as "amended." If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either "Replacement Sheet" or "New Sheet" pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.

The examiner suggests labeling the empty boxes: for example, in Fig. 1, box 104 should be labeled "Vehicle entity," box 102 "Sensor entity," and so on. Similar corrections are suggested for the other figures.
Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term "means" or "step" or a term used as a substitute for "means" that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term "means" or "step" or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word "for" (e.g., "means for") or another linking word or phrase, such as "configured to" or "so that"; and

(C) the term "means" or "step" or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word "means" (or "step") in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word "means" (or "step") in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word "means" (or "step") are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Conversely, claim limitations in this application that do not use the word "means" (or "step") are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word "means," but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: sensor entity, vehicle entity, log entity, sample entity, stream entity and algorithm entity in claims 1-12.

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION. — The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

The terms "sensor entity", "vehicle entity", "log entity", "stream entity", "sample entity" and "algorithm entity" used in claims 1, 2, 5, and 11 are vague and unclear and fail to clearly describe the claim limitations in a way that would be understood by an ordinary person skilled in the art. In particular, it is not clear whether these expressions relate to, e.g., physical entities, computer means, or abstract data structures. For instance, from the wording of claim 1, it is unclear whether the term "sensor entity" refers to a physical sensor, computer means related to a sensor, or a data structure for storing sensor data. Similarly, the terms "validator", "pre-hook" and "post-hook" used in claim 13 are vague and unclear and leave the reader in doubt as to the meaning of the technical features to which they refer, thereby rendering the definition of the subject matter of said claim unclear.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-15 are rejected under 35 U.S.C. 101 because the claimed inventions are directed to a judicial exception (abstract ideas/mental processes) without significantly more.
101 Analysis: Step 1

Claims 1-15 are directed to a system. Therefore, claims 1-15 fall into at least one of the four statutory categories.

101 Analysis: Step 2A, Prong I (MPEP § 2106.04)

Prong I: Does the claim recite a judicial exception? Step 2A, Prong I of the 2019 Patent Eligibility Guidance (PEG) analyzes the claims to determine whether they recite subject matter that falls into one of the following groups of abstract ideas:

a) mathematical concepts
• mathematical relationships, mathematical formulas or equations, mathematical calculations

b) certain methods of organizing human activity
• fundamental economic principles or practices (including hedging, insurance, mitigating risk)
• commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations)
• managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)

c) mental processes
• concepts performed in the human mind (including an observation, evaluation, judgment, opinion)

The following claims include limitations that recite an abstract idea and will be used to represent additional claims that merely elaborate on the recited abstract ideas for the remainder of the 35 U.S.C. 101 rejection.

Claim 1 recites the following abstract ideas: the claim recites a computer system for storing data related to driver-assistance systems (ADAS) of vehicles, comprising sensor, vehicle, log, stream and sample entities that store configuration data with identifiers and references.
Specifically:

"a sensor entity, wherein the sensor entity is configured to store, for each of at least one sensor mounted on a vehicle, respective sensor configuration data comprising a respective sensor identifier;

a vehicle entity, wherein the vehicle entity is configured to store, for each of at least one vehicle, respective vehicle configuration data comprising a respective vehicle identifier and a reference to at least one sensor identifier;

a log entity, wherein the log entity is configured to store, for each of at least one log, respective logging configuration data comprising a respective log identifier and a reference to at least one vehicle identifier;

a stream entity, wherein the stream entity is configured to store, for each of at least one stream, respective stream configuration data comprising a respective stream identifier, and a reference to at least one log identifier, wherein at least one of the streams is a sensor stream, wherein the stream configuration data of the sensor stream further comprises a reference to at least one sensor identifier; and

a sample entity, wherein the sample entity is configured to store, for each of at least one sample, respective sample configuration data, comprising a respective sample identifier and a reference to at least one stream identifier."

These elements describe a process of organizing and storing data in a relational structure, where data (e.g., sensor, vehicle, log, stream, and sample information) is categorized and linked via identifiers and references. This process can be performed mentally or with pen and paper, as one could manually create tables or ledgers to organize sensor IDs, vehicle IDs, logs, streams, and samples and establish relationships between them. Such activities fall under mental processes as defined in the 2019 Revised Patent Subject Matter Eligibility Guidance.

Dependent Claims 2-15 further elaborate upon the recited abstract ideas in Claim 1.
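For readers unfamiliar with the claimed arrangement, the entity/identifier/reference structure recited in claim 1 can be sketched as a set of plain records. This is an illustrative reconstruction from the claim language only; the class and field names are ours, and none of this code appears in the application:

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of the entity/reference structure recited in claim 1.
# Field names are paraphrased from the claim wording.

@dataclass
class Sensor:
    sensor_id: str                      # "respective sensor identifier"

@dataclass
class Vehicle:
    vehicle_id: str
    sensor_ids: List[str]               # reference to at least one sensor identifier

@dataclass
class Log:
    log_id: str
    vehicle_ids: List[str]              # reference to at least one vehicle identifier

@dataclass
class Stream:
    stream_id: str
    log_ids: List[str]                  # reference to at least one log identifier
    sensor_ids: List[str] = field(default_factory=list)  # populated for a "sensor stream"

@dataclass
class Sample:
    sample_id: str
    stream_ids: List[str]               # reference to at least one stream identifier

# A minimal instance: one sensor on one vehicle, logged into one stream.
cam = Sensor("sensor-1")
car = Vehicle("vehicle-1", [cam.sensor_id])
log = Log("log-1", [car.vehicle_id])
stream = Stream("stream-1", [log.log_id], [cam.sensor_id])
sample = Sample("sample-1", [stream.stream_id])
```

The chain of references (sample → stream → log → vehicle → sensor) is exactly the linkage the examiner maps onto the NuScenes schema in the §102 rejection below.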
Accordingly, Claims 1-15 recite at least one abstract idea.

101 Analysis: Step 2A, Prong II (MPEP § 2106.04)

Prong II: Practical application? Step 2A, Prong II of the 2019 PEG analyzes the claims to determine whether the claim recites any additional limitations that integrate the abstract idea into a practical application. The following claims recite additional limitations:

Claim 1 recites the following additional limitations: a computer system for storing data of driver-assistance systems of vehicles.

With regard to the computer system, the claim describes a generic computer system without specifying specialized hardware; hence, using a general-purpose computer to perform data storage and organization does not constitute a practical application, as it merely applies the abstract idea in a technological environment. With regard to ADAS, the application to driver-assistance systems adds a field of use but does not improve the technology itself (e.g., there is no enhancement to ADAS performance such as faster collision detection or reduced latency). Hence, collecting and analyzing data for a specific application without technological improvement is not a practical application. Furthermore, the use of entities with identifiers and references is a standard database technique and not a novel technical solution. The combination of elements (a generic computer, ADAS, and a relational database) does not integrate the abstract idea into a practical application, as it merely applies data organization to a specific field using standard technology.
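The "standard database technique" the rejection refers to is ordinary primary-key/foreign-key linking. A generic illustration, with table and column names invented for this sketch (they come neither from the application nor from NuScenes):

```python
import sqlite3

# Generic primary-key/foreign-key linking: the "standard database technique"
# the rejection refers to. Table and column names are invented for this
# illustration.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sensor  (sensor_id TEXT PRIMARY KEY);
    CREATE TABLE vehicle (vehicle_id TEXT PRIMARY KEY,
                          sensor_id  TEXT REFERENCES sensor(sensor_id));
""")
con.execute("INSERT INTO sensor  VALUES ('front-lidar')")
con.execute("INSERT INTO vehicle VALUES ('car-1', 'front-lidar')")

# Resolving a reference between entities is a plain join.
row = con.execute("""
    SELECT v.vehicle_id, s.sensor_id
    FROM vehicle v JOIN sensor s ON v.sensor_id = s.sensor_id
""").fetchone()
print(row)  # -> ('car-1', 'front-lidar')
```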
The additional limitations do not:

• Reflect an improvement in the functioning of a computer, or to any other technology or technical field (MPEP § 2106.05(a));
• Apply or use a judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition;
• Apply the judicial exception with, or by use of, a particular machine (MPEP § 2106.05(b));
• Effect a transformation or reduction of a particular article to a different state or thing (MPEP § 2106.05(c));
• Apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is more than a drafting effort designed to monopolize the exception (MPEP § 2106.05(e)).

Dependent claims 2-15 further elaborate upon the recited abstract ideas in claim 1, but do not provide additional elements, and so do not integrate the abstract ideas into a practical application. Therefore, claims 1-15 do not integrate the recited abstract ideas into a practical application.

101 Analysis: Step 2B (MPEP § 2106.05)

Step 2B: Significantly more? Step 2B of the Revised Guidance analyzes the claims to determine if the claims recite additional limitations that amount to significantly more than the judicial exception. When considered individually or in combination, the additional limitations of claims 1-15 do not amount to significantly more than the judicial exception, for the same reasons discussed above as to why the additional limitations do not integrate the abstract idea into a practical application. The additional elements outlined in Step 2A, performing their functions as designed, simply accomplish execution of the abstract ideas. Further, the additional limitations in claim 1 do not amount to significantly more (there is no inventive concept in the claim). Therefore, the additional limitations of claims 1-15 do not amount to significantly more than the judicial exception.
Thus, claims 1-15 recite abstract ideas with additional elements recited at a high level of generality, resulting in claims that do not integrate the abstract idea into a practical application or amount to significantly more than the judicial exception.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1, 3-6, 8-13 and 15 are rejected under 35 U.S.C. 102(a)(1) based upon a public use or sale or other public availability of the invention: NuScenes (June 15, 2021), listed in the IDS dated 01/08/2024 in the NPL section, cite no. 2 (https://web.archive.org/web/20210615132757/https://www.nuscenes.org/nuscenes#lidarseg), hereinafter referred to as NuScenes.

Regarding Claim 1, NuScenes discloses a computer system for storing data of driver-assistance systems of vehicles (p. 1, first paragraph and fourth paragraph; p. 6 and 7: "nuScenes schema"), the system comprising: a sensor entity (p. 7, boxes "sensor" and "calibrated_sensor*"), wherein the sensor entity is configured to store, for each of at least one sensor mounted on a vehicle (p. 4 to 6, sections "Car setup" and "Sensor calibration"; Figures on p.
5 and 6 indicating the positions of the Radar, Lidar, Cameras and IMU on the vehicle), respective sensor configuration data comprising a respective sensor identifier (p. 7, box "calibrated_sensor*"; p. 8, section "calibrated_sensor" - "sensor_token"; p. 12 and 13, section "sensor"); a vehicle entity (p. 7, column "Vehicle" and boxes "log", "calibrated_sensor" and "ego_pose"; p. 8, section "calibrated_sensor*": "Definition of a particular sensor ... as calibrated on a particular vehicle"; p. 10, section "log" - "vehicle"), wherein the vehicle entity is configured to store, for each of at least one vehicle, respective vehicle configuration data comprising a respective vehicle identifier (p. 10, section "log" - "vehicle") and a reference to at least one sensor identifier (p. 8, section "calibrated_sensor*": "... Definition of a particular sensor ... as calibrated on a particular vehicle ..."; p. 7, wherein column "Vehicle" comprises boxes "calibrated_sensor*" and "sensor"); a log entity (p. 7, boxes "log" and "map*"; p. 10, section "log"), wherein the log entity is configured to store, for each of at least one log, respective logging configuration data comprising a respective log identifier and a reference to at least one vehicle identifier (p. 10, section "log" - "token" and "vehicle"); a stream entity (p. 7, column "extraction" and boxes "scene*", "sample*" and "sample_data"), wherein the stream entity is configured to store, for each of at least one stream, respective stream configuration data comprising a respective stream identifier (p. 12, section "scene" - "token"), and a reference to at least one log identifier (p. 7, box "scene*" - "log_token"; p. 12, section "scene" - "log_token"), wherein at least one of the streams is a sensor stream (p. 12, section "scene": "a scene is a 20s long sequence of consecutive frames extracted from a log ...
"), wherein the stream configuration data of the sensor stream further comprises a reference to at least one sensor identifier (p. 7, arrows between the boxes "scene*", "sample*", "sample_data" and "calibrated_sensor*"); and a sample entity (p. 7, box "sample*"), wherein the sample entity is configured to store, for each of at least one sample, respective sample configuration data, comprising a respective sample identifier (p. 10 to 11, section "sample" - "token") and a reference to at least one stream identifier (p. 7, box "sample*" - "scene_token"; p. 10 to 11, section "sample" - "scene_token").

Regarding Claim 3, NuScenes teaches the computer system of claim 1. NuScenes also teaches that the sensor configuration data further comprises at least one of: sensor name data, sensor number data, sensor hardware data, sensor software data, sensor time data, sensor sample-type data, sensor status data, and/or sensor calibration data; the vehicle configuration data further comprises at least one of: vehicle name data, vehicle identification data, vehicle time data, vehicle sensor data, vehicle sensor calibration data, vehicle sensor calibration status data, vehicle sensor connection data, vehicle dimension data, and/or vehicle coordinate system data; the logging configuration data further comprises at least one of: logging origin data, logging start time data, logging end time data, and/or logging status data; the stream configuration data further comprises at least one of: stream sample data, stream status data, stream duration data, stream start time data and/or stream end time data; and/or the sample configuration data further comprises sample time data (p. 6 and 7, section "nuScenes schema"; p. 8 to 12, sections "calibrated_sensor", "instance", "log", "sample", "sample_annotation", "sample_data", "scene", "sensor").

Regarding Claim 4, NuScenes teaches the computer system of claim 1.
NuScenes also teaches that at least two of the sensor configuration data, the vehicle configuration data, the logging configuration data, the stream configuration data, and the sample configuration data comprise a common timestamp and are synchronized using the common timestamp (p. 6, section "Sensor synchronization": "... timestamp of the image ... timestamp of the LIDAR scan"; p. 7, "timestamp" in boxes "sample*", "sample_data", "ego_pose*"; p. 10, section "log" - "date_captured").

Regarding Claim 5, NuScenes teaches the computer system of claim 1. NuScenes also teaches that each of the sensor entity, the vehicle entity, the log entity, the sample entity, and the stream entity comprise at least one index, and the at least one index is used to query and/or order the sensor configuration data, the vehicle configuration data, the logging configuration data, the sample configuration data, and the stream configuration data respectively (p. 6, section "Data format": "... Every row can be identified by its unique primary key token. Foreign keys such as sample_token may be used to link to the token of the table sample ...", as well as the schema on p. 7).

Regarding Claim 6, NuScenes teaches the computer system of claim 1. NuScenes also teaches that at least one of the samples is a binary sample, the sample configuration data of the binary sample comprises a reference to binary data of the at least one sensor, preferably measurement data of the at least one sensor (p. 7, boxes "sample*" and "sample_data"; p. 11 and 12, section "sample_data" - "filename"); and the sample configuration data comprises a sample type identifier (p. 7, boxes "sample*" and "sample_data"; p. 11 and 12, section "sample_data" - "token" and "file format").

Regarding Claim 8, NuScenes teaches the computer system of claim 6.
NuScenes also teaches that the sample configuration data of the binary sample further comprises at least one of: binary name data, binary address data, binary type data, binary check data, binary size data, binary status data, binary compress data, binary compress check data, and/or binary compress size data (p. 7, boxes "sample*" and "sample_data"; p. 11 and 12, section "sample_data" - "token" and "file format"), and/or wherein the sample type identifier comprises at least one of: BSON sample type, bounding box sample type, CDC sample type, camera sample type, cuboid sample type, custom sample type, ego motion sample type, GPS sample type, host sample type, lane sample type, point cloud sample type, radar sample type, semantic segmentation sample type, and/or video sample type (p. 7, boxes "sample*" and "sample_data"; p. 11 and 12, section "sample_data" - "token" and "file format").

Regarding Claim 9, NuScenes teaches the computer system of claim 1. NuScenes also teaches that the sensor configuration data further comprises a sensor type identifier, wherein the sensor type identifier comprises at least one of: camera sensor type, GPS sensor type, host sensor type, lidar sensor type, and/or radar sensor type (p. 8, section "calibrated_sensor"; p. 12 to 13, section "sensor").

Regarding Claim 10, NuScenes teaches the computer system of claim 1. NuScenes also teaches wherein at least one of the streams is a dependent stream, and the respective stream configuration data of each dependent stream further comprises a respective reference to at least one other stream (p. 7, arrows between boxes "sample_annotation*", "sample*", "instance*", "attribute" and "category*"; p. 11, section "sample_annotation" - "token" and "sample_token"; p. 9, section "instance"; p. 8, sections "attribute" and "category").

Regarding Claim 11, NuScenes teaches the computer system of claim 10. NuScenes also teaches an algorithm entity (p. 7, box "visibility*"; p. 13, section "visibility": "...
Binned into 4 bins 0-40%, 40-60%, 60-80% and 80-100%"; p. 13 to 14, section "Data annotation"; p. 7, box "ego_pose*" - "translation" and "rotation"), the algorithm entity is configured to store, for each of at least one algorithm, respective algorithm configuration data comprising an algorithm identifier (p. 13, section "visibility" - "level" and "description"; p. 7, box "ego_pose*" - "translation" and "rotation"); wherein at least one of the dependent streams is an algorithm stream, and the stream configuration data of the algorithm stream further comprises a reference to at least one algorithm identifier (p. 13, section "visibility" - "level" and "description"; p. 7, box "ego_pose*" - "translation" and "rotation"); the algorithm entity is stored in the database (p. 7, box "visibility*"; p. 7, box "ego_pose*"); the algorithm configuration data uses a timestamp for synchronizing with at least one of the sensor configuration data, the vehicle configuration data, the logging configuration data, the sample configuration data, and the stream configuration data (p. 13, section "visibility" - "token"; p. 7, box "ego_pose*" - "timestamp"); and/or the algorithm entity further comprises at least one index, wherein the at least one index is used to query and/or order the algorithm configuration data.

Regarding Claim 12, NuScenes teaches the computer system of claim 11. NuScenes also teaches wherein the algorithm configuration data further comprises at least one of: algorithm name data, algorithm description data, algorithm version data, algorithm repository data, algorithm commit data, and/or algorithm information data (p. 13, section "visibility" - "token", "level" and "description"; p. 7, box "ego_pose*" - "translation" and "rotation").

Regarding Claim 13, NuScenes teaches the computer system of claim 1. NuScenes also teaches at least one validator comprising a pre-hook (p. 2 to 3, section "scene planning"; implicit from p.
10, section "lidarseg" - "filename": "<str> -- The name of the .bin files containing ...", wherein the skilled person knows that .bin files are compressed binary files that necessarily were compressed using some form of pre-hook).

Regarding Claim 15, NuScenes teaches the computer system of claim 1. NuScenes also teaches wherein the at least one sensor comprises at least one of: a lidar sensor, a radar sensor, a camera sensor, a GPS sensor, or a host sensor (p. 4, section "Car setup", showing LIDAR, Cameras and RADARs).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 2, 7 and 14 are rejected under 35 U.S.C. 103 as being unpatentable over NuScenes in view of Simoudis (US20200364953A1), hereinafter referred to as Simoudis.

Regarding Claim 2, NuScenes teaches the computer system of claim 1. NuScenes also teaches that the sensor entity, the vehicle entity, the log entity, the sample entity, and the stream entity are stored in a database (p. 6, section "Data format"). NuScenes does not expressly teach the database including at least one of a NoSQL database or a MongoDB database. Simoudis teaches a database including at least one of a NoSQL database or a MongoDB database (Para [0094]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified NuScenes to incorporate the teachings of Simoudis to include a NoSQL database or a MongoDB database. Doing so would provide the advantage of scalability.

Regarding Claim 7, NuScenes teaches the computer system of claim 6. NuScenes also teaches that at least one portion of the binary data is stored in a storage component different to the database (p. 6, section "Data format": "... All annotations and meta data ... are covered in a relational database ..." in combination with p. 11 and 12, section "sample_data": "filename": "<str> -- Relative path to data-blob on disk"). Simoudis teaches wherein preferably the storage component is a cloud storage (Para [0094]).

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over NuScenes. Regarding Claim 14, NuScenes teaches the computer system of claim 1. NuScenes also teaches that the pre-hook initiates compressing data captured by the at least one sensor before storing the compressed data to the sample entity of the database (implicit from p.
10, section "lidarseg" - "filename": "<str> -- The name of the .bin files containing ...". The skilled person knows that .bin files are compressed binary files that necessarily were compressed using some form of pre-hook), or in a different storage component; and/or wherein the post-hook initiates deleting any stream of the stream entity comprising a reference to a log of the log entity if the log is deleted. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified NuScenes to include compressing data and deleting data. Doing so would optimize the database storage.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Ohno et al. (US 2022/0194364 A1) discloses a driver assistance system that extracts, from information relating to a peripheral situation of the vehicle, risk target information relating to a risk target that is an existence causing a collision risk to the vehicle; obtains influence factor information relating to an influence factor that is a factor existing separately from the risk target and influencing the collision risk; determines a risk value obtained by quantifying the collision risk based on the risk target information and the influence factor information; and determines, based on the risk value, a manipulated variable of an actuator for controlling movement of the vehicle so as to decrease the collision risk.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ABDHESH K JHA whose telephone number is (571) 272-6218. The examiner can normally be reached M-F: 0800-1700. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James J Lee can be reached at 571-270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /ABDHESH K JHA/Primary Examiner, Art Unit 3668
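The claim 14 behavior at issue (a pre-hook that compresses sensor data before it is written to the sample entity, blob storage referenced by a relative "filename" path as in NuScenes' sample_data table, and a post-hook that deletes any stream referencing a deleted log) can be sketched in a few lines. The following Python is a minimal illustrative stand-in only, not the applicant's or NuScenes' actual implementation; the class name, field names, and in-memory dictionaries are all assumptions made for the example.

```python
import gzip


class SampleStore:
    """Hypothetical in-memory stand-in for the entity store discussed in the
    office action: sample/stream/log entities in a database, with binary
    blobs held in a separate storage component."""

    def __init__(self):
        self.samples = {}   # sample_id -> metadata document
        self.streams = {}   # stream_id -> {"log_id": ...}
        self.logs = set()   # known log ids
        self.blobs = {}     # filename -> bytes; stand-in for separate storage

    def add_sample(self, sample_id, raw, filename):
        # Pre-hook: compress the captured sensor data before storing it.
        self.blobs[filename] = gzip.compress(raw)
        # The database row holds only metadata plus a relative path to the
        # data blob, mirroring the quoted "filename" field.
        self.samples[sample_id] = {"filename": filename}

    def delete_log(self, log_id):
        self.logs.discard(log_id)
        # Post-hook: cascade-delete every stream referencing the deleted log.
        self.streams = {
            sid: s for sid, s in self.streams.items() if s["log_id"] != log_id
        }


store = SampleStore()
store.add_sample("s1", b"\x00\x01" * 500, "samples/s1.bin")
store.logs = {"log-a", "log-b"}
store.streams = {"st1": {"log_id": "log-a"}, "st2": {"log_id": "log-b"}}
store.delete_log("log-a")
```

After `delete_log("log-a")`, only streams tied to surviving logs remain, and the stored blob round-trips through `gzip.decompress` back to the raw capture.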

Prosecution Timeline

Oct 25, 2023
Application Filed
Jul 29, 2025
Non-Final Rejection — §101, §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602959
VEHICLE STORAGE MANAGEMENT SYSTEM, STORAGE MEDIUM, AND STORAGE MANAGEMENT METHOD
2y 5m to grant • Granted Apr 14, 2026
Patent 12592100
VEHICLE-BASED DATA OPTIMIZATION
2y 5m to grant • Granted Mar 31, 2026
Patent 12572156
SYSTEMS AND METHODS FOR LANDING SITE SELECTION AND FLIGHT PATH PLANNING FOR AN AIRCRAFT USING SOARING WEATHER
2y 5m to grant • Granted Mar 10, 2026
Patent 12573250
Used car AI performance inspection system based on acoustic data analysis, and processing method therefor
2y 5m to grant • Granted Mar 10, 2026
Patent 12555419
METHOD FOR REAL-TIME ECU CRASH REPORTING AND RECOVERY
2y 5m to grant • Granted Feb 17, 2026
Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
80%
Grant Probability
99%
With Interview (+18.3%)
2y 5m
Median Time to Grant
Low
PTA Risk
Based on 408 resolved cases by this examiner. Grant probability is derived from the career allow rate.
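As a sanity check, the headline grant probability follows directly from the career counts shown above (328 granted of 408 resolved). The interview-adjusted 99% figure presumably folds in the +18.3% interview lift, but its exact formula is not shown on this page, so only the base rate is reproduced here.

```python
# Career allow rate from the counts reported in the examiner profile above.
granted, resolved = 328, 408
allow_rate = granted / resolved
print(f"{allow_rate:.1%}")  # 80.4%, rounded to the 80% shown in the dashboard
```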
