DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This is a Final Rejection on the Merits. Claims 1-2, 7-8, 10-12, and 14 are currently pending and are addressed below.
Response to Amendments
The amendment filed on September 10, 2025, has been considered and entered. Accordingly, claims 1 and 10-11 have been amended.
Response to Arguments
The applicant states (Amend. 8-10) that Takahashi (US 20190281430 A1) (“Takahashi”) in view of Heyl (US 20210300394 A1) (“Heyl”) in view of Petousis (US 20180261020 A1) (“Petousis”) fail to disclose the limitations of independent claim 1. The examiner respectfully disagrees. The applicant states that Petousis fails to teach “a predefined development priority of each ADS feature to the other ADS features” and “generation of an arbitration signal for allocating in-vehicle platform resources based on that development priority scheme together with platform constraints, requirements, and the current scene.” Petousis teaches generating a priority order for sensor data. The sensor data can include any type of data from a vehicle such that the data can be from the vehicle’s ADS features (See at least Petousis Paragraph 27). Furthermore, the claims and specification of the instant application do not provide a limiting definition of an “arbitration signal”, with the published specification stating in paragraph 84 “Here, the first arbitration signal is indicative of a resource allocation of the platform of the vehicle for transmission of input data (e.g. sensor data) for a first ADS feature”. Petousis teaches that an arbitration signal for the vehicle sensor data is determined and transmitted based on factors that include importance and available bandwidth, among others (See at least Petousis Paragraphs 47, 49, 57), such that, under the broadest reasonable interpretation (BRI) of the claim limitations, Petousis discloses the limitations.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 7-8, 10-12, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Takahashi (US 20190281430 A1) (“Takahashi”) in view of Heyl (US 20210300394 A1) (“Heyl”) in view of Petousis (US 20180261020 A1) (“Petousis”).
With respect to claim 1, Takahashi teaches a method for allocating platform resources in a vehicle for development, evaluation, and/or testing of automated driving system (ADS) features, the method comprising:
storing, during a time period, sensor data indicative of a surrounding environment of the vehicle in a data storage device of the vehicle (See at least Takahashi Paragraphs 27-28 “In order to solve this problem an information processing apparatus according to an aspect of the present disclosure causes a first recognizer to execute a first recognition process that takes sensor information as input, and a second recognizer to execute a second recognition process that takes the sensor information as input, the second recognizer having different capability conditions from the first recognizer; determines one of a transmission necessity and a transmission priority of the sensor information depending on a difference between a first recognition result of the first recognition process and a second recognition result of the second recognition process; and transmits the sensor information to a server apparatus based on the determined one of the transmission necessity and the transmission priority. This configuration makes it possible to limit situations in which the image data beneficial for the retraining cannot be transmitted to the server apparatus, which performs the retraining, due to issues with the network bandwidth or the electric power consumption, even when the amount of the data accumulated while driving is extensive. In other words, by transmitting the image data beneficial for advancing the AI on priority basis to the server apparatus when the network bandwidth and transmission time are limited, the training data is efficiently collected by the training system. This makes it possible to advance the AI with more certainty, speed up the AI update cycle in the self-driving car, and provide a safer and more pleasant self-driving car to the user early on”);
obtaining data indicative of a set of platform constraints of the vehicle (See at least Takahashi Paragraph 28 “This configuration makes it possible to limit situations in which the image data beneficial for the retraining cannot be transmitted to the server apparatus, which performs the retraining, due to issues with the network bandwidth or the electric power consumption, even when the amount of the data accumulated while driving is extensive. In other words, by transmitting the image data beneficial for advancing the AI on priority basis to the server apparatus when the network bandwidth and transmission time are limited, the training data is efficiently collected by the training system. This makes it possible to advance the AI with more certainty, speed up the AI update cycle in the self-driving car, and provide a safer and more pleasant self-driving car to the user early on” | Paragraph 33 “The information processing apparatus further determines whether a vehicle including the information processing apparatus has a surplus of computational resources greater than or equal to a predetermined amount, and may cause the second recognizer to execute a recognition process when it is determined that the vehicle has a surplus of the computational resources greater than or equal to the predetermined amount” | Paragraph 35 “The second recognizer that performs the second recognition process may be configured according to one of (i) available computational resources in the vehicle including the information processing apparatus and (ii) a training purpose for the first recognizer”);
obtaining data indicative of a set of requirements for each of a plurality of ADS features (See at least Takahashi Paragraph 29 “The second recognizer may have more computational resources than the first recognize” | Paragraphs 35-36 “The second recognizer that performs the second recognition process may be configured according to one of (i) available computational resources in the vehicle including the information processing apparatus and (ii) a training purpose for the first recognizer. In other words, one of a variety of available second recognizers is selected and used for the second recognition process depending on the amount of available computational resources. This makes it possible to effectively utilize the resources in the self-driving car. For example, the second recognition process can be executed for collecting training data suitable for implementing recognition of new objects in order to provide new functionality in the future. This encourages providing a safer and more pleasant self-driving car to the user”);
obtaining data indicative of a priority scheme for the plurality of the ADS features (See at least Takahashi Paragraph 31 “The information processing apparatus may determine one of the transmission necessity and the transmission priority depending on a numerical difference between the first recognition result and the second recognition result. The information processing apparatus may determine one of the transmission necessity and the transmission priority depending on a type difference between the first recognition result and the second recognition result” | Paragraphs 116-117 “Sensor information that fulfills the predetermined condition may be prioritized over the other sensor information and used in the object detection of screening detector or transmitted to server apparatus 103. An example of this predetermined condition includes a condition related to the time the sensor information is been generated. With this, for example, sensor information generated at a certain time during driving is processed with priority. A different example may be a condition related to the control details for allowing the car including information processing apparatus 200 to drive at the time the sensor information has been generated. For example, sensor information generated when the driver or the self-driving system performs a certain control, e.g. sudden braking, may also be processed with priority. Yet another example may be a condition related to the external conditions of the car. For example, sensor information generated while the car is driving in certain weather or places, e.g. rain, on poor roads, or in a tunnel, may also be processed with priority. 
This enables collecting sensor information under conditions in which it is difficult to collect training data and conditions in which one specifically wants to enhance the recognition accuracy, and to facilitate the retraining for improving the real-time object detection precision”) (See at least Takahashi Paragraph 71 “By filtering the data using the condition, the training data can, for example, be collected efficiently under driving conditions in which there is relatively little training data. By performing training using training data collected in such a way, object recognition precision can be boosted above a fixed level without exception regardless of the possibility of certain driving conditions occurring. A safer and more pleasant self-driving car can, therefore, be provided to the user early on” | Paragraph 116 “Sensor information that fulfills the predetermined condition may be prioritized over the other sensor information and used in the object detection of screening detector or transmitted to server apparatus 103. An example of this predetermined condition includes a condition related to the time the sensor information is been generated. With this, for example, sensor information generated at a certain time during driving is processed with priority. A different example may be a condition related to the control details for allowing the car including information processing apparatus 200 to drive at the time the sensor information has been generated. For example, sensor information generated when the driver or the self-driving system performs a certain control, e.g. sudden braking, may also be processed with priority. Yet another example may be a condition related to the external conditions of the car. For example, sensor information generated while the car is driving in certain weather or places, e.g. rain, on poor roads, or in a tunnel, may also be processed with priority”);
obtaining data indicative of a current scene or scenario in the surrounding environment of the vehicle (See at least Takahashi Paragraph 87 “ Screening detector 201 next obtains the difference between the first recognition result of the first recognition process and the second recognition result of the second recognition process both with respect to the sensor information of a scene identified by an ID, and determines the transmission priority rank of the sensor information of the scene in question depending on this difference (step S404). The determined priority rank is accumulated by detection result with priority rank accumulator 206 as a portion of the detection result with priority rank data along with the second recognition result (step S405)”); and
generating based on the platform constraints, the set of requirements, and the current scene or scenario, an arbitration signal indicative of a resource allocation of the platform of the vehicle to at least one of the plurality of the ADS features (See at least Takahashi Paragraphs 74-76 “Detection result transmission possibility verifier 205 determines whether the sensor information can be transmitted to server apparatus 103 based on predetermined information. This predetermined information relates to, for example, whether self-driving car 101 is in a parked state, whether self-driving car 101 is charged to or over a predetermined amount, or whether self-driving car 101 is being charged. This information serves as determination factors relating to whether screening detector 201 can execute the processes up to the determining of the priority rank of the sensor information to be transmitted to server apparatus 103. In other words, detection result transmission possibility verifier 205 determines whether the sensor information can be transmitted to server apparatus 103 based on whether the necessary resources (hereinafter, computational resources) are available, e.g. the processor, memory, and electric power consumed thereby necessary for the processes executed by screening detector 201. Detection result transmission possibility verifier 205 is an example of the assessor in the present embodiment. Note that since screening detector 201, as mentioned above, produces more accurate results than object detector 203, more computational resources may be required for these processes. In order to finish these processes in the shortest time frame possible, the processor may require more computational resources since the processor may operate at a higher driving frequency when actuating screening detector 201 than when actuating object detector 203. 
For example, in the above predetermined information, available information relating to whether a network with enough available bandwidth to transmit data at or above a predetermined speed is available is included. This information serves as a determination factor relating to whether the sensor information can be transmitted to server apparatus 103 based on the determined priority rank”);
based on the resource allocation indicated by the arbitration signal, performing at least one of:
transmitting a portion of the stored sensor data to a remote entity for offline processing (See at least Takahashi Paragraph 32 “By determining the transmission necessity or transmission priority based on such a difference, the training data, which is likely to be beneficial to the retraining, is transmitted to the server apparatus on priority basis. In other words, the training data is collected efficiently”);
evaluating, in accordance with the resource allocation, an output of at least one ADS feature using at least a portion of stored sensor data as input (See at least Takahashi Paragraph 56 “The object detection by screening detector 201 is not performed real-time; the object detection process is, for example, executed for 30 minutes based on one hour's worth of the image data, and provides more accurate results than object detector 203. In other words, even when screening detector 201 and object detector 203 each perform the object detection based on images of the same scene, the results may still vary. Screening detector 201 (i) compares the object detection result with the detection result indicated by detection result data 301 accumulated by detection result accumulator 204, (ii) determines the priority rank depending on the difference between both, (iii) generates the detection result with priority rank data, which is data including the object detection result to which the determined priority information is added, and (iv) accumulates the detection result with priority rank data in detection result in priority rank accumulator 206. FIG. 3B shows detection result with priority rank data 321 that is an example of a configuration of the detection result with priority rank data and is accumulated by detection result with priority rank accumulator 206. In the example in FIG. 3B, priority rank 322 in detection result with priority rank data 321 is the above priority rank determined by screening detector 201” | Paragraph 66 “The second recognizer that performs the second recognition process is configured according to one of (i) available computational resources in the vehicle including the information processing apparatus and (ii) a training purpose for the first recognizer”); and
updating, in accordance with the resource allocation, at least one ADS feature using at least one portion of the stored sensor data as input (See at least Takahashi Paragraph 38 “This enables the second recognizer mounted in the self-driving car to be updated with great flexibility regarding place or time, making it more versatile”).
Takahashi fails to explicitly disclose that the ADS features are being developed, evaluated, and/or tested; the priority scheme comprises a predefined development priority of each ADS feature relative to the other ADS features of the plurality of ADS features; and that the arbitration signal is generated also based on the priority scheme.
Heyl teaches that the ADS features are being developed, evaluated, and/or tested (See at least Heyl Paragraph 46 “In order to determine whether the vehicle system 102 has sufficient robustness, i.e. can correctly detect objects in the environment of the vehicle 100 under different environmental conditions, based on the sensor data 112 the evaluation unit 110 determines a probability of existence for each of the detected objects indicating the probability with which the detected object, for example a model of the vehicle ahead 113, corresponds to a real object, here the actual vehicle ahead 113. Furthermore, the evaluation unit 110 determines a probability of detection, indicating the probability of the sensor system detecting an object in the environment of the vehicle 100, here the vehicle ahead 113, at all.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Takahashi to include that the ADS features are being developed, evaluated, and/or tested, as taught by Heyl as disclosed above, in order to ensure that the resource allocation goes to important aspects of vehicle safety (Heyl Paragraph 5 “Embodiments of the present disclosure allow in an advantageous manner the estimation of the robustness of a vehicle system with multiple sensors for environment detection on the basis of sensor-specific probabilities of existence and detection. As a result, false positives, false negatives, or other incorrect results when detecting objects can be avoided.”).
Takahashi in view of Heyl fail to explicitly disclose that the priority scheme comprises a predefined development priority of each ADS feature relative to the other ADS features of the plurality of ADS features; and that the arbitration signal is generated also based on the priority scheme.
Petousis teaches that the priority scheme comprises a predefined priority of each ADS feature relative to the other ADS features of the plurality of ADS features based on various factors (See at least Petousis Paragraphs 28-30 “Prioritizing the vehicle sensor data functions to determine a level of importance of the received vehicle sensor data and prepare the vehicle sensor data for scheduling. Prioritizing vehicle sensor data preferably includes determining (or selecting) a prioritization scheme, and then prioritizing the vehicle sensor data according to the determined prioritization scheme. Prioritizing the vehicle sensor data is preferably performed by a prioritization module of the vehicle system but can alternatively be performed by any suitable portion or module of the vehicle system. Prioritizing the vehicle sensor data can be based on characteristics of the vehicle sensor data itself (e.g., block size, packet content, compressibility, type, etc.), a prioritization request (e.g., a remote query specifies a type of data or combination of data), an internal criterion (e.g., time of day, preset schedule, etc.), vehicle sensor data analysis (e.g., by the system, a third-party application executing on the vehicle, etc.), derivative data (e.g., recognized object classes), and/or any other suitable criteria. Prioritizing vehicle sensor data can additionally or alternatively include rule-based prioritization, prioritizing based on data classification, prioritizing based on heuristics, and prioritizing based on probability and/or stochasticity … Determining the prioritization scheme functions to establish the criteria against which the vehicle sensor data is prioritized. For example, the prioritization scheme specifies the rules and/or algorithms used to determine the priority (e.g., importance), a piece of data should have. 
Determining the prioritization scheme can be based on a remote query (e.g., a remote query specifies a prioritization scheme or range of possible priorities for each application), the data contents (e.g., the data type, the data values, etc.), a predetermined set of rules, or otherwise determined. The prioritization scheme can be determined automatically (e.g., trained on historical data priorities, such as for contexts with similar data parameters), manually (e.g., specified by a remote query), extracted from a remote query (e.g., a remote query specifies operations level data should be prioritized over application level data, but internally derived parameters prevent the remote query from overriding critical data having the highest prioritization level), or otherwise determined. The prioritization scheme can be determined based on categorical rules, data message or block size, data compressibility, a remote query, or any other suitable basis for prioritization. The prioritization scheme can additionally or alternatively be determined in any suitable manner. Determining the priority can additionally include determining the priority based on a combination of a remote query and data contents. For example, determining the priority can include ranking the vehicle sensor data according to a remote query, and selectively overruling the ranking specified by the remote query according to the data category (e.g., data in the critical category is given a higher ranking than a user preference of high resolution video data). In particular, the vehicle sensor data may include a plurality of different vehicle sensor data types (e.g., vehicle data from different vehicle sensors, vehicle data collected at different times, etc.) 
and the vehicle may function to prioritize the vehicle sensor data by ranking the varying vehicle sensor data types within the vehicle sensor data according to a determined level of importance of the data contents of the vehicle sensor data types to the vehicle and/or according to external request by a remote computing system.”); and
that the arbitration signal is generated also based on the priority scheme (See at least Petousis Abstract “A system and method that includes collecting vehicle sensor data, wherein prioritizing vehicle sensor data includes identifying a level of importance for each of a plurality of vehicle sensor data types included in the vehicle sensor data; generating a vehicle sensor data schedule, wherein generating the vehicle data schedule includes one or more of (i) identifying a transmission order for each of the plurality of vehicle sensor data types and (ii) identifying a storage scheme selected from a hierarchy of data storage types for each of the plurality of vehicle sensor data types” | Paragraph 48 “ In variations, scheduling can be performed by a scheduler (e.g., scheduling module). The scheduler preferably operates at a specified frequency (e.g., every 100 ms), but can alternatively operate in response to a trigger, at a non-periodic frequency (e.g., asynchronously), or in any other suitable manner. The vehicle sensor data received at and/or transferred to the scheduler preferably includes properties such as importance (e.g., priority, prioritization, weight) and size, but can additionally or alternatively include any suitable properties and/or characteristics.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Takahashi in view of Heyl to include that the priority scheme comprises a predefined priority of each ADS feature relative to the other ADS features of the plurality of ADS features based on various factors and that the arbitration signal is generated also based on the priority scheme, as taught by Petousis as disclosed above, such that the priority scheme comprises a predefined development priority, in order to ensure optimal resource allocation (Petousis Paragraph 2 “This invention relates generally to the autonomous vehicle field, and more specifically to a new and useful method for processing sensor data generated by vehicles.”).
With respect to claim 2, and similarly claim 12, Takahashi in view of Heyl in view of Petousis teaches evaluating the current scene or scenario in order to determine a score indicative of a potential development gain of using at least a portion of the stored sensor data for each of the plurality of ADS features; and generating, based on the platform constraints and the set of requirements, the arbitration signal indicative of a resource allocation of the platform of the vehicle for at least one of the plurality of ADS features in accordance with the determined score and the priority scheme (See at least Takahashi Paragraph 63 | Paragraph 117 “This enables collecting sensor information under conditions in which it is difficult to collect training data and conditions in which one specifically wants to enhance the recognition accuracy, and to facilitate the retraining for improving the real-time object detection precision” | Paragraphs 39-40 “The information processing apparatus may prioritize sensor information that fulfills a predetermined condition over other sensor information, and cause the second recognizer to execute the second recognition process. By filtering the data using the condition, the training data can, for example, be collected efficiently under driving conditions in which there is relatively little training data. 
By performing the training using training data collected in such a way, object recognition accuracy can be boosted above a fixed level without exception regardless the possibility of certain driving conditions occurring. A safer and more pleasant self-driving car can, therefore, be provided to the user early on”).
With respect to claim 7, Takahashi in view of Heyl in view of Petousis teaches that the set of platform constraints include at least one of available power, available computational resources, available data storage capacity, and available bandwidth for data transmission (See at least Takahashi Paragraph 74 “Detection result transmission possibility verifier 205 determines whether the sensor information can be transmitted to server apparatus 103 based on predetermined information. This predetermined information relates to, for example, whether self-driving car 101 is in a parked state, whether self-driving car 101 is charged to or over a predetermined amount, or whether self-driving car 101 is being charged. This information serves as determination factors relating to whether screening detector 201 can execute the processes up to the determining of the priority rank of the sensor information to be transmitted to server apparatus 103. In other words, detection result transmission possibility verifier 205 determines whether the sensor information can be transmitted to server apparatus 103 based on whether the necessary resources (hereinafter, computational resources) are available, e.g. the processor, memory, and electric power consumed thereby necessary for the processes executed by screening detector 201. Detection result transmission possibility verifier 205 is an example of the assessor in the present embodiment.”).
With respect to claim 8, Takahashi in view of Heyl in view of Petousis teaches that the set of requirements for each of the plurality of the ADS features comprises an estimated power consumption, an estimated computational resource need, an estimated data storage need, and an estimated bandwidth need (See at least Takahashi Paragraph 66 “The second recognizer that performs the second recognition process is configured according to one of (i) available computational resources in the vehicle including the information processing apparatus and (ii) a training purpose for the first recognizer” | Paragraph 103 “Detector generator 701 generates detector 702 based on object detector 203 (step S801). In the generation of detector 702 based on object detector 203, a detector is realized with better recognition performance or processing performance than object detector 203 by updating the capability conditions of object detector 203, e.g. using larger (high-resolution) input images than when object detector 203 is operating, increasing the bit depth of the parallel computing of the processer, or boosting the frequency of the processor. Computational resources that are not available during self-driving, such as hardware resources (e.g. computing power or memory used by a control system application during self-driving), electric power, or the like may additionally be used to realize a detector with enhanced recognition performance or processing performance”).
With respect to claim 10, Takahashi teaches a non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an in-vehicle processing system, the one or more programs comprising instructions for performing a method for allocating platform resources in a vehicle for development, evaluation, and/or testing of automated driving system (ADS) features, the method comprising:
storing, during a time period, sensor data indicative of a surrounding environment of the vehicle in a data storage device of the vehicle (See at least Takahashi Paragraphs 27-28 “In order to solve this problem an information processing apparatus according to an aspect of the present disclosure causes a first recognizer to execute a first recognition process that takes sensor information as input, and a second recognizer to execute a second recognition process that takes the sensor information as input, the second recognizer having different capability conditions from the first recognizer; determines one of a transmission necessity and a transmission priority of the sensor information depending on a difference between a first recognition result of the first recognition process and a second recognition result of the second recognition process; and transmits the sensor information to a server apparatus based on the determined one of the transmission necessity and the transmission priority. This configuration makes it possible to limit situations in which the image data beneficial for the retraining cannot be transmitted to the server apparatus, which performs the retraining, due to issues with the network bandwidth or the electric power consumption, even when the amount of the data accumulated while driving is extensive. In other words, by transmitting the image data beneficial for advancing the AI on priority basis to the server apparatus when the network bandwidth and transmission time are limited, the training data is efficiently collected by the training system. This makes it possible to advance the AI with more certainty, speed up the AI update cycle in the self-driving car, and provide a safer and more pleasant self-driving car to the user early on”);
obtaining data indicative of a set of platform constraints of the vehicle (See at least Takahashi Paragraph 28 “This configuration makes it possible to limit situations in which the image data beneficial for the retraining cannot be transmitted to the server apparatus, which performs the retraining, due to issues with the network bandwidth or the electric power consumption, even when the amount of the data accumulated while driving is extensive. In other words, by transmitting the image data beneficial for advancing the AI on priority basis to the server apparatus when the network bandwidth and transmission time are limited, the training data is efficiently collected by the training system. This makes it possible to advance the AI with more certainty, speed up the AI update cycle in the self-driving car, and provide a safer and more pleasant self-driving car to the user early on” | Paragraph 33 “The information processing apparatus further determines whether a vehicle including the information processing apparatus has a surplus of computational resources greater than or equal to a predetermined amount, and may cause the second recognizer to execute a recognition process when it is determined that the vehicle has a surplus of the computational resources greater than or equal to the predetermined amount” | Paragraph 35 “The second recognizer that performs the second recognition process may be configured according to one of (i) available computational resources in the vehicle including the information processing apparatus and (ii) a training purpose for the first recognizer”);
obtaining data indicative of a set of requirements for each of a plurality of ADS features (See at least Takahashi Paragraph 29 “The second recognizer may have more computational resources than the first recognize” | Paragraphs 35-36 “The second recognizer that performs the second recognition process may be configured according to one of (i) available computational resources in the vehicle including the information processing apparatus and (ii) a training purpose for the first recognizer. In other words, one of a variety of available second recognizers is selected and used for the second recognition process depending on the amount of available computational resources. This makes it possible to effectively utilize the resources in the self-driving car. For example, the second recognition process can be executed for collecting training data suitable for implementing recognition of new objects in order to provide new functionality in the future. This encourages providing a safer and more pleasant self-driving car to the user”);
obtaining data indicative of a priority scheme for the plurality of the ADS features (See at least Takahashi Paragraph 31 “The information processing apparatus may determine one of the transmission necessity and the transmission priority depending on a numerical difference between the first recognition result and the second recognition result. The information processing apparatus may determine one of the transmission necessity and the transmission priority depending on a type difference between the first recognition result and the second recognition result” | Paragraphs 116-117 “Sensor information that fulfills the predetermined condition may be prioritized over the other sensor information and used in the object detection of screening detector or transmitted to server apparatus 103. An example of this predetermined condition includes a condition related to the time the sensor information is been generated. With this, for example, sensor information generated at a certain time during driving is processed with priority. A different example may be a condition related to the control details for allowing the car including information processing apparatus 200 to drive at the time the sensor information has been generated. For example, sensor information generated when the driver or the self-driving system performs a certain control, e.g. sudden braking, may also be processed with priority. Yet another example may be a condition related to the external conditions of the car. For example, sensor information generated while the car is driving in certain weather or places, e.g. rain, on poor roads, or in a tunnel, may also be processed with priority. 
This enables collecting sensor information under conditions in which it is difficult to collect training data and conditions in which one specifically wants to enhance the recognition accuracy, and to facilitate the retraining for improving the real-time object detection precision”) (See at least Takahashi Paragraph 71 “By filtering the data using the condition, the training data can, for example, be collected efficiently under driving conditions in which there is relatively little training data. By performing training using training data collected in such a way, object recognition precision can be boosted above a fixed level without exception regardless of the possibility of certain driving conditions occurring. A safer and more pleasant self-driving car can, therefore, be provided to the user early on” | Paragraph 116 “Sensor information that fulfills the predetermined condition may be prioritized over the other sensor information and used in the object detection of screening detector or transmitted to server apparatus 103. An example of this predetermined condition includes a condition related to the time the sensor information is been generated. With this, for example, sensor information generated at a certain time during driving is processed with priority. A different example may be a condition related to the control details for allowing the car including information processing apparatus 200 to drive at the time the sensor information has been generated. For example, sensor information generated when the driver or the self-driving system performs a certain control, e.g. sudden braking, may also be processed with priority. Yet another example may be a condition related to the external conditions of the car. For example, sensor information generated while the car is driving in certain weather or places, e.g. rain, on poor roads, or in a tunnel, may also be processed with priority”);
obtaining data indicative of a current scene or scenario in the surrounding environment of the vehicle (See at least Takahashi Paragraph 87 “Screening detector 201 next obtains the difference between the first recognition result of the first recognition process and the second recognition result of the second recognition process both with respect to the sensor information of a scene identified by an ID, and determines the transmission priority rank of the sensor information of the scene in question depending on this difference (step S404). The determined priority rank is accumulated by detection result with priority rank accumulator 206 as a portion of the detection result with priority rank data along with the second recognition result (step S405)”); and
generating based on the platform constraints, the set of requirements and the current scene or scenario, an arbitration signal indicative of a resource allocation of the platform of the vehicle to at least one of the plurality of the ADS features (See at least Takahashi Paragraphs 75-76 “Note that since screening detector 201, as mentioned above, produces more accurate results than object detector 203, more computational resources may be required for these processes. In order to finish these processes in the shortest time frame possible, the processor may require more computational resources since the processor may operate at a higher driving frequency when actuating screening detector 201 than when actuating object detector 203. For example, in the above predetermined information, available information relating to whether a network with enough available bandwidth to transmit data at or above a predetermined speed is available is included. This information serves as a determination factor relating to whether the sensor information can be transmitted to server apparatus 103 based on the determined priority rank”);
based on the resource allocation indicated by the arbitration signal, performing at least one of:
transmitting a portion of the stored sensor data to a remote entity for offline processing (See at least Takahashi Paragraph 32 “By determining the transmission necessity or transmission priority based on such a difference, the training data, which is likely to be beneficial to the retraining, is transmitted to the server apparatus on priority basis. In other words, the training data is collected efficiently”);
evaluating, in accordance with the resource allocation, an output of at least one ADS feature using at least a portion of stored sensor data as input (See at least Takahashi Paragraph 56 “The object detection by screening detector 201 is not performed real-time; the object detection process is, for example, executed for 30 minutes based on one hour's worth of the image data, and provides more accurate results than object detector 203. In other words, even when screening detector 201 and object detector 203 each perform the object detection based on images of the same scene, the results may still vary. Screening detector 201 (i) compares the object detection result with the detection result indicated by detection result data 301 accumulated by detection result accumulator 204, (ii) determines the priority rank depending on the difference between both, (iii) generates the detection result with priority rank data, which is data including the object detection result to which the determined priority information is added, and (iv) accumulates the detection result with priority rank data in detection result in priority rank accumulator 206. FIG. 3B shows detection result with priority rank data 321 that is an example of a configuration of the detection result with priority rank data and is accumulated by detection result with priority rank accumulator 206. In the example in FIG. 3B, priority rank 322 in detection result with priority rank data 321 is the above priority rank determined by screening detector 201” | Paragraph 66 “The second recognizer that performs the second recognition process is configured according to one of (i) available computational resources in the vehicle including the information processing apparatus and (ii) a training purpose for the first recognizer”); and
updating, in accordance with the resource allocation, at least one ADS feature using at least one portion of the stored sensor data as input (See at least Takahashi Paragraph 38 “This enables the second recognizer mounted in the self-driving car to be updated with great flexibility regarding place or time, making it more versatile”).
Takahashi fails to explicitly disclose that the ADS features are being developed, evaluated, and/or tested; the priority scheme comprises a predefined development priority of each ADS feature relative to the other ADS features of the plurality of ADS features; and that the arbitration signal is generated also based on the priority scheme.
Heyl teaches that the ADS features are being developed, evaluated, and/or tested (See at least Heyl Paragraph 46 “In order to determine whether the vehicle system 102 has sufficient robustness, i.e. can correctly detect objects in the environment of the vehicle 100 under different environmental conditions, based on the sensor data 112 the evaluation unit 110 determines a probability of existence for each of the detected objects indicating the probability with which the detected object, for example a model of the vehicle ahead 113, corresponds to a real object, here the actual vehicle ahead 113. Furthermore, the evaluation unit 110 determines a probability of detection, indicating the probability of the sensor system detecting an object in the environment of the vehicle 100, here the vehicle ahead 113, at all.”).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the method of Takahashi to include that the ADS features are being developed, evaluated, and/or tested, as taught by Heyl as disclosed above, in order to ensure that the resource allocation goes to important aspects of vehicle safety (Heyl Paragraph 5 “Embodiments of the present disclosure allow in an advantageous manner the estimation of the robustness of a vehicle system with multiple sensors for environment detection on the basis of sensor-specific probabilities of existence and detection. As a result, false positives, false negatives, or other incorrect results when detecting objects can be avoided.”).
Takahashi in view of Heyl fail to explicitly disclose that the priority scheme comprises a predefined development priority of each ADS feature relative to the other ADS features of the plurality of ADS features; and that the arbitration signal is generated also based on the priority scheme.
Petousis teaches that the priority scheme comprises a predefined priority of each ADS feature relative to the other ADS features of the plurality of ADS features based on various factors (See at least Petousis Paragraphs 28-30 “Prioritizing the vehicle sensor data functions to determine a level of importance of the received vehicle sensor data and prepare the vehicle sensor data for scheduling. Prioritizing vehicle sensor data preferably includes determining (or selecting) a prioritization scheme, and then prioritizing the vehicle sensor data according to the determined prioritization scheme. Prioritizing the vehicle sensor data is preferably performed by a prioritization module of the vehicle system but can alternatively be performed by any suitable portion or module of the vehicle system. Prioritizing the vehicle sensor data can be based on characteristics of the vehicle sensor data itself (e.g., block size, packet content, compressibility, type, etc.), a prioritization request (e.g., a remote query specifies a type of data or combination of data), an internal criterion (e.g., time of day, preset schedule, etc.), vehicle sensor data analysis (e.g., by the system, a third-party application executing on the vehicle, etc.), derivative data (e.g., recognized object classes), and/or any other suitable criteria. Prioritizing vehicle sensor data can additionally or alternatively include rule-based prioritization, prioritizing based on data classification, prioritizing based on heuristics, and prioritizing based on probability and/or stochasticity … Determining the prioritization scheme functions to establish the criteria against which the vehicle sensor data is prioritized. For example, the prioritization scheme specifies the rules and/or algorithms used to determine the priority (e.g., importance), a piece of data should have. 
Determining the prioritization scheme can be based on a remote query (e.g., a remote query specifies a prioritization scheme or range of possible priorities for each application), the data contents (e.g., the data type, the data values, etc.), a predetermined set of rules, or otherwise determined. The prioritization scheme can be determined automatically (e.g., trained on historical data priorities, such as for contexts with similar data parameters), manually (e.g., specified by a remote query), extracted from a remote query (e.g., a remote query specifies operations level data should be prioritized over application level data, but internally derived parameters prevent the remote query from overriding critical data having the highest prioritization level), or otherwise determined. The prioritization scheme can be determined based on categorical rules, data message or block size, data compressibility, a remote query, or any other suitable basis for prioritization. The prioritization scheme can additionally or alternatively be determined in any suitable manner. Determining the priority can additionally include determining the priority based on a combination of a remote query and data contents. For example, determining the priority can include ranking the vehicle sensor data according to a remote query, and selectively overruling the ranking specified by the remote query according to the data category (e.g., data in the critical category is given a higher ranking than a user preference of high resolution video data). In particular, the vehicle sensor data may include a plurality of different vehicle sensor data types (e.g., vehicle data from different vehicle sensors, vehicle data collected at different times, etc.) and the vehicle may function to prioritize the vehicle sensor data by ranking the varying vehicle sensor data types within the vehicle sensor data according to a determined level of importance”).