Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
2. This Office Action is in response to Applicant’s filing dated 11/19/2025.
Claims 1 and 14 have been amended. Claims 7 and 12 have been cancelled. Claims 1 and 14 are independent claims. Claims 1-6, 8-11, and 13-14 are presented in this Office Action.
Priority
3. Applicant’s claim for the benefit of a prior-filed Indian Application No. 2021131059254 filed on 12/20/2021 is acknowledged by the examiner.
Applicant’s claim for the benefit of a prior-filed PCT Application No. PCT/DE2022/200284 filed on 02/09/2022 is acknowledged by the examiner.
Response to Amendment/Arguments
4. Applicant’s amendment with respect to the rejection of claims 56-58 under 35 U.S.C. § 101 has been fully considered. As a result, the rejection has been withdrawn.
5. Applicant’s arguments with respect to the rejection of claims under 35 U.S.C. §§ 102(a)(1) and 103 have been fully considered but are moot because the arguments are directed towards the amended claims, thereby necessitating the new grounds of rejection presented in this Office Action.
Claim Rejections - 35 U.S.C. § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
6. Claims 1, 4, 5, 8-11 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Moustafa; Hassnaa (US 20220126864 A1) in view of Babb; Robert G. II (US 20070162903 A1) and further view of John; Mariam (US 20080016029 A1).
Regarding independent claim 1, Moustafa; Hassnaa (US 20220126864 A1) teaches, a method comprising: producing a database having sensor data based on a plurality of sensor recordings (Paragraphs [0182]-[0184] As discussed above, the autonomous driving stack of a vehicle may utilize a variety of sensor data (e.g., 258) generated by various sensors provided on and external to the vehicle (i.e., plurality of sensor readings are produced));
labeling the sensor data in the database (Paragraphs [0315]-[0316] the data sent by the autonomous vehicles comprises Image Data and Sensor Data and may also have some associated metadata. Both of the data sources can be used in conjunction or in isolation to extract and generate metadata/tags related to location (Examiner interprets labels as tags/ metadata));
producing dependency graphs between the labels; establishing dependencies between the labels based on the dependency graphs (Paragraphs [0315], [0316] The cumulative location-specific metadata can be information like geographic coordinates, for example: “45° 31′ 22.4256″ N and 122° 59′ 23.3880″ W”. It can also be additional environment information indicating environmental contexts such as terrain information (e.g., “hilly” or “flat”), elevation information (e.g., “59.1 m”), temperature information (e.g., “20° C.”), or weather information associated with that geolocation (e.g., “sunny”, “foggy”, or “snow”) (Examiner interprets the dependency graph, as taught by Moustafa et al, as the location data that includes a plurality of labels which are dependent on the location data));
identifying logical sequences of the labels (Paragraph [0439] a system may generate images for this scenario (e.g., by using the keywords “bicycle”, “snow”, and “highway”), but not the previous scenario. By intelligently controlling the synthetic data creation, the system may create images (for training) that would otherwise require a very long time for a vehicle to encounter in real life);
defining specific driving contexts based on the logical sequences (Paragraph [0450] if a context has keywords “bicycle”, “snow”, and “highway,” the image generator 5118 may generate one or more instances of image data each depicting a bicycle on a highway in the snow. Also see Paragraphs [0442], [0446]);
and saving the defined driving contexts in the database (Paragraph [0443] The determined context is stored in metadata/context dataset 5110 with the associated timestamp which can be used to map the context back to the raw data stream (e.g., the image data and/or the non-image sensor dataset). These stored metadata streams may tell a narrative of driving environment conditions over a period of time).
Moustafa et al fails to explicitly teach, wherein the dependency graphs are data structures that establish initial connections among the labels with one another wherein each of the connections has an associated dependency form that varies between connections.
Babb; Robert G. II (US 20070162903 A1) teaches, wherein the dependency graphs are data structures that establish initial connections among the labels with one another wherein each of the connections has an associated dependency form that varies between connections (Paragraph [0033] discloses generating a dependency directed graph, identifying dependencies by traversing the dependency directed graph, and providing a tree view, a table view, and an inverse tree view illustrating the identified dependencies).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Moustafa et al such that the dependency graphs are data structures that establish initial connections among the labels with one another, wherein each of the connections has an associated dependency form that varies between connections, as taught by Babb et al (Paragraph [0033]).
One of ordinary skill in the art would have been motivated to make this modification because, by doing so, the tree view identifies the strong components with associated dependency cycles, cyclic sets, and associated strong component and dependency arc simplification information. The tree view can be configured to further identify the exact set of causes or reasons for each dependency found, analyzed, and displayed, as taught by Babb et al (Paragraphs [0031], [0006]).
Moustafa et al and Babb et al fail to explicitly teach, and the connections define relationships between individual driving conditions or features that are represented by the labels.
John; Mariam (US 20080016029 A1) teaches, and the connections define relationships between individual driving conditions or features that are represented by the labels (Figs. 3, 4; Paragraph [0064] discloses two separate nodes with different servers, such as a first and a second server located at different locations such as Austin and Raleigh. The first server has a dependency graph node with an IP address of '155.143.153.110', and the second server has a dependency graph node with an IP address of '133.152.124.106'; both servers have an operating system installed on them, and both operating systems have a relationship represented by a label of the Linux software installed on them. Also see [0023]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Moustafa et al and Babb et al such that the connections define relationships between individual driving conditions or features that are represented by the labels, as taught by John et al (Paragraph [0064]).
One of ordinary skill in the art would have been motivated to make this modification because identifying (302) relationships among the types of nodes in such a manner advantageously allows for optimizing a query to a database using a node-based approach or a relationship-based approach, as taught by John et al (Paragraph [0065]).
Regarding dependent claim 4, Moustafa et al, Babb et al and John et al teach, the method according to claim 1.
Moustafa et al further teaches, further comprising providing the database to a driver assistance system of the ego vehicle (Paragraph [0166] FIG. 1 is a simplified illustration 100 showing an example autonomous driving environment. Vehicles (e.g., 105, 110, 115, etc.) may be provided with varying levels of autonomous driving capabilities facilitated through in-vehicle computing systems with logic implemented in hardware, firmware, and/or software to enable respective autonomous driving stacks. Such autonomous driving stacks may allow vehicles to self-control or provide driver assistance to detect roadways, navigate from one point to another, detect other vehicles and road actors (e.g., pedestrians (e.g., 135), bicyclists, etc.), detect obstacles and hazards (e.g., 120), and road conditions (e.g., traffic, road conditions, weather conditions, etc.), and adjust control and guidance of the vehicle accordingly).
Regarding dependent claim 5, Moustafa et al, Babb et al and John et al teach, the method according to claim 4.
Moustafa et al further teaches, further comprising updating the database with sensor data of the ego vehicle during the operation of the ego vehicle (Paragraph [0168] discloses updating the database with sensor data during the operation of the ego/autonomous vehicle).
Regarding dependent claim 8, Moustafa et al, Babb et al and John et al teach the method according to claim 1.
Babb et al further teaches, wherein the form comprises a first form wherein one label in a graph triggers another label and a second form wherein one label in the graph always occurs with another label (Paragraph [0033] discloses a cyclic dependency graph where one node/label occurs with another node/label and the first form triggers the second form to complete the cycle).
Regarding dependent claim 9, Moustafa et al, Babb et al and John et al teach, the method according to claim 4.
Moustafa et al further teaches, further comprising: collecting sensor data of the ego vehicle during the operation of the ego vehicle (Paragraph [0225] the connected device will only collect and transport the sensor data that meets the specified conditions, which may be updated (e.g., dynamically) as the model continues to evolve and train);
comparing the sensor data of the ego vehicle to the defined driving contexts in the database and detecting a specific driving context of the ego vehicle based on the comparison to the defined driving contexts in the database (Paragraph [0258] The environment model may then be fed into a planning system 1904 of an in-vehicle autonomous driving system, which takes the actively updated environment information and constructs a plan of action in response (which may include, e.g., route information, behavior information, prediction information, and trajectory information) to the predicted behavior of these environment conditions. The plan is then provided to an actuation system 1906, which can make the car act on said plan (e.g., by actuating the gas, brake, and steering systems of the autonomous vehicle). Also see Paragraph [0472]).
Regarding dependent claim 10, Moustafa et al, Babb et al and John et al teach, the method according to claim 9.
Moustafa et al further teaches, further comprising selecting a behavior for the driver assistance system based on the specific driving context (Paragraph [0258] The environment model may then be fed into a planning system 1904 of an in-vehicle autonomous driving system, which takes the actively updated environment information and constructs a plan of action in response (which may include, e.g., route information, behavior information, prediction information, and trajectory information) to the predicted behavior of these environment conditions. The plan is then provided to an actuation system 1906, which can make the car act on said plan (e.g., by actuating the gas, brake, and steering systems of the autonomous vehicle). Also see Paragraph [0472]).
Regarding dependent claim 11, Moustafa et al, Babb et al and John et al teach, the method according to claim 9.
Moustafa et al further teaches, wherein detecting a specific driving context of the ego vehicle based on the comparison to the defined driving contexts in the database further comprises identifying a transition point from one driving context state to another based on the defined driving contexts in the database (Paragraph [0258] The environment model may then be fed into a planning system 1904 of an in-vehicle autonomous driving system, which takes the actively updated environment information and constructs a plan of action in response (which may include, e.g., route information, behavior information, prediction information, and trajectory information) to the predicted behavior of these environment conditions. The plan is then provided to an actuation system 1906, which can make the car act on said plan (e.g., by actuating the gas, brake, and steering systems of the autonomous vehicle). Also see Paragraph [0472]).
Regarding dependent claim 13, Moustafa et al, Babb et al and John et al teach, the method according to claim 1.
Moustafa et al further teaches, wherein a label is a single characterizable observation (Paragraph [0234] the model 1305 may determine, from the inputs 1310, that the vehicle's environment is an urban environment characterized by high traffic and dynamic conditions (e.g., at 1315), well-trained highway characterized by largely static driving conditions (e.g., 1320), open country or forests characterized by largely untrained roadways and likely under-developed autonomous driving support infrastructure (e.g., 1325), among other examples. Indeed, location-based or -specific scenes or alerts (e.g., 1330) may also be detected from the sensor data 1310, such as low signal zones, accidents, abnormal obstacles or road obstructions, etc. Also see Paragraph [0202]).
Regarding independent claim 14, Moustafa; Hassnaa (US 20220126864 A1) teaches, a method of creating a database for driving context recognition for a driver assistance system of an ego vehicle, the method comprising: producing a database having sensor data based on a plurality of sensor recordings (Paragraphs [0182]-[0184] As discussed above, the autonomous driving stack of a vehicle may utilize a variety of sensor data (e.g., 258) generated by various sensors provided on and external to the vehicle (i.e., plurality of sensor readings are produced));
labeling the sensor data in the database (Paragraphs [0315]-[0316] the data sent by the autonomous vehicles comprises Image Data and Sensor Data and may also have some associated metadata. Both of the data sources can be used in conjunction or in isolation to extract and generate metadata/tags related to location (Examiner interprets labels as tags/ metadata));
wherein the labels are independent of geographic location information;
producing dependency graphs between the labels wherein the dependency graphs establish initial connections among the labels with one another;
establishing dependencies between the labels based on the dependency graphs;
identifying logical sequences of the labels (Paragraph [0439] a system may generate images for this scenario (e.g., by using the keywords “bicycle”, “snow”, and “highway”), but not the previous scenario. By intelligently controlling the synthetic data creation, the system may create images (for training) that would otherwise require a very long time for a vehicle to encounter in real life);
defining specific driving contexts based on the logical sequences (Paragraph [0450] if a context has keywords “bicycle”, “snow”, and “highway,” the image generator 5118 may generate one or more instances of image data each depicting a bicycle on a highway in the snow. Also see Paragraphs [0442], [0446]);
and saving the defined driving contexts in the database; utilizing the driving contexts to operate a driver assist system in an ego vehicle (Paragraph [0443] The determined context is stored in metadata/context dataset 5110 with the associated timestamp which can be used to map the context back to the raw data stream (e.g., the image data and/or the non-image sensor dataset). These stored metadata streams may tell a narrative of driving environment conditions over a period of time).
Moustafa et al fails to explicitly teach, wherein the dependency graphs are data structures that establish initial connections among the labels with one another wherein each of the connections has an associated dependency form that varies between connections.
Babb; Robert G. II (US 20070162903 A1) teaches, wherein the dependency graphs are data structures that establish initial connections among the labels with one another wherein each of the connections has an associated dependency form that varies between connections (Paragraph [0033] discloses generating a dependency directed graph, identifying dependencies by traversing the dependency directed graph, and providing a tree view, a table view, and an inverse tree view illustrating the identified dependencies).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Moustafa et al such that the dependency graphs are data structures that establish initial connections among the labels with one another, wherein each of the connections has an associated dependency form that varies between connections, as taught by Babb et al (Paragraph [0033]).
One of ordinary skill in the art would have been motivated to make this modification because, by doing so, the tree view identifies the strong components with associated dependency cycles, cyclic sets, and associated strong component and dependency arc simplification information. The tree view can be configured to further identify the exact set of causes or reasons for each dependency found, analyzed, and displayed, as taught by Babb et al (Paragraphs [0031], [0006]).
Moustafa et al and Babb et al fail to explicitly teach, and the connections define relationships between individual driving conditions or features that are represented by the labels.
John; Mariam (US 20080016029 A1) teaches, and the connections define relationships between individual driving conditions or features that are represented by the labels (Figs. 3, 4; Paragraph [0064] discloses two separate nodes with different servers, such as a first and a second server located at different locations such as Austin and Raleigh. The first server has a dependency graph node with an IP address of '155.143.153.110', and the second server has a dependency graph node with an IP address of '133.152.124.106'; both servers have an operating system installed on them, and both operating systems have a relationship represented by a label of the Linux software installed on them. Also see [0023]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Moustafa et al and Babb et al such that the connections define relationships between individual driving conditions or features that are represented by the labels, as taught by John et al (Paragraph [0064]).
One of ordinary skill in the art would have been motivated to make this modification because identifying (302) relationships among the types of nodes in such a manner advantageously allows for optimizing a query to a database using a node-based approach or a relationship-based approach, as taught by John et al (Paragraph [0065]).
7. Claims 2, 3 and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Moustafa; Hassnaa (US 20220126864 A1) in view of Babb; Robert G. II (US 20070162903 A1), John; Mariam (US 20080016029 A1) and in further view of AGARWAL; Puneet (US 20170109653 A1).
Regarding dependent claim 2, Moustafa et al, Babb et al and John et al teach, the method according to claim 1.
Moustafa et al further teaches, wherein the establishing comprises establishing the dependencies between the labels.
Moustafa et al, Babb et al, and John et al fail to explicitly teach, discretized time slots.
AGARWAL; Puneet (US 20170109653 A1) teaches, wherein the establishing comprises establishing the dependencies between the labels based on discretized time slots (Paragraph [0028] The sensor devices 104 may send sensor data to the system 102 via the network 106. The system 102 is caused to analyze the sensor data to summarize machine usage. Herein, the sensor data that is received from multiple sensors for the specified time-period may be referred to as ‘multi-sensor data’. A sensor's behavior over a period of operation can be represented by a histogram that can capture the distribution of different values of that sensor data for a specified time-period over which the machine runs. The time period can be a single run of the machine, a day, a week, and so on (i.e., plurality of sensor data/ labels is based within a specified time period/ discretized time slots such as every day, every week, and so on). Also see Fig. 3, Paragraph [0064]).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Moustafa et al, Babb et al and John et al by providing discretized time slots, as taught by AGARWAL et al (Paragraph [0028]).
One of ordinary skill in the art would have been motivated to make this modification because, by doing so, the disclosed methods and systems may facilitate analyzing whether a particular usage pattern of an engine corresponds to the type of equipment it is installed in, or whether a certain driving behavior is peculiar to certain models of vehicles, or geographies, and so on, as taught by AGARWAL et al (Paragraph [0018]).
Regarding dependent claim 3, Moustafa et al, Babb et al, John et al and AGARWAL et al teach, the method according to claim 2.
AGARWAL et al further teaches, wherein the discretized time slots are clustered (Paragraph [0020] the system disclosed herein first clusters days according to each sensor separately. Also see Paragraph [0050] the processor 202 is configured to, with the content of the memory 204, and optionally with other components described herein, to cause the system 200 to selectively merge two or more sensor-clusters of a sensor from the first plurality of sensor-clusters so as to include more days in a sensor-cluster collaboratively based on behavior of other sensors for the same days (i.e., the discretized time slots are clustered)).
Regarding dependent claim 6, Moustafa et al, Babb et al, John et al and AGARWAL et al teach, the method according to claim 2.
AGARWAL et al further teaches, further comprising performing a histogram analysis determination of the discretized time (Paragraph [0029] In an embodiment, the system 102 is caused to compute multiple histogram (or intensity profiles) from the sensor data. The system 102 is caused to compute histograms representative of each of the sensors' behavior for each day. An example of a plurality of histograms corresponding to multiple sensors for multiple days is described further with reference to FIG. 3. The system 102 is caused to systematically summarize the multi-sensor data to determine machine behavior. An example implementation of the system 102 for summarizing the multi-sensor data is described further with reference to FIG. 2).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Moustafa et al, Babb et al, and John et al by performing a histogram analysis determination of the discretized time, as taught by AGARWAL et al (Paragraph [0029]).
One of ordinary skill in the art would have been motivated to make this modification because, by doing so, the disclosed methods and systems may facilitate analyzing whether a particular usage pattern of an engine corresponds to the type of equipment it is installed in, or whether a certain driving behavior is peculiar to certain models of vehicles, or geographies, and so on, as taught by AGARWAL et al (Paragraph [0018]).
Closest Prior Art
8. The prior art made of record and not relied upon is considered pertinent to the applicant’s disclosure.
RASOULI; Amir (US 20220156576 A1) teaches, a time series of sensor data (i.e., sensor data over a sequence of time steps) may be preprocessed into a time series of feature data (i.e., feature data over the same sequence of time steps). For example, sensor data generated by an environmental sensor 110 over a plurality of time steps may be preprocessed into a plurality of 2D maps (e.g., OGMs) representing the location of classified objects (i.e. objects classified into one of the plurality of object categories) over the time steps. In some examples, preprocessing may be used to generate at least one time series of feature data used at next step 504, and another time series of feature data may not require preprocessing (Paragraph [0097]).
9. Examiner has pointed out particular references contained in the prior art of record in the body of this action for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. It is respectfully requested that the applicant, in preparing the response, fully consider the entire references as potentially teaching all or part of the claimed invention, as well as the context of the passages as taught by the prior art or disclosed by the examiner. It is noted that any citation to specific pages, columns, figures, or lines in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)).
Conclusion
Applicant’s amendments/arguments necessitated the new grounds of rejection presented in this Office Action. THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SUMAN RAJAPUTRA whose telephone number is (571) 272-4669. The examiner can normally be reached between 8:00 AM - 5:00 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tony Mahmoudi (571) 272-4078 can be reached. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/S. R./
Examiner, Art Unit 2163
/ALEX GOFMAN/Primary Examiner, Art Unit 2163