Prosecution Insights
Last updated: April 19, 2026
Application No. 18/766,278

VEHICLE DEVICE AND INFORMATION INTEGRATION METHOD

Final Rejection: §102, §103

Filed: Jul 08, 2024
Examiner: ALGEHAIM, MOHAMED A
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: DENSO CORPORATION
OA Round: 2 (Final)

Grant Probability: 59% (Moderate)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 3y 3m
Grant Probability with Interview: 81%

Examiner Intelligence

Career Allow Rate: 59% (grants 59% of resolved cases; 122 granted / 207 resolved; +6.9% vs TC avg)
Interview Lift: strong, +21.9% for resolved cases with interview
Typical Timeline: 3y 3m avg prosecution; 37 applications currently pending
Career History: 244 total applications across all art units

Statute-Specific Performance

§101: 14.8% (-25.2% vs TC avg)
§103: 49.6% (+9.6% vs TC avg)
§102: 15.6% (-24.4% vs TC avg)
§112: 15.3% (-24.7% vs TC avg)
Tech Center averages are estimates. Based on career data from 207 resolved cases.

Office Action

Rejections under §102 and §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1-9 & 11-17 of U.S. Application No. 18/766,278 filed on 12/31/2025 have been examined. This Office Action is in response to the Applicant's amendments and remarks filed 12/31/2025. Claims 1-3, 6, & 11-12 are presently amended. Claim 10 is cancelled and Claims 13-17 are newly added. Claims 1-9 & 11-17 are presently pending and are presented for examination.

Response to Arguments

In regards to the previous claim objections: the amendments to the claims overcome the previous claim objection(s). Therefore, the previous claim objection(s) is/are withdrawn.

In regards to the previous claim interpretation under 35 U.S.C. § 112(f): Applicant does not provide separate remarks regarding the previous claim interpretation under 35 U.S.C. § 112(f). Accordingly, the previous 35 U.S.C. 112(f) claim interpretation is maintained.

In regards to the previous rejections under 35 U.S.C. § 101: the amendments to the claims overcome the previous 35 USC § 101 rejection. Therefore, the previous 35 USC § 101 rejection is withdrawn.

In regards to the previous rejection under 35 U.S.C. § 102: Applicant's arguments filed 12/31/2025 have been fully considered but they are not persuasive. Applicant argues that the prior art does not explicitly disclose the limitations, "an integration unit configured to integrate, as integration information, the information that is acquired from a plurality of different sources and related to the identical target based on a determination result by the determination unit; and an adjustment unit configured to adjust an information amount of the integration information according to a predetermined selection criterion;". Applicant further argues on pages 10-11 of the Remarks, "As described in Takamatsu, however, the first sensor information and the second sensor information are treated as individual pieces of information and compared with each other. Takamatsu fails to teach or suggest a configuration in which the first sensor information and the second sensor information are integrated. In addition, since the second sensor information serves as a comparative reference against the first sensor information, there is no technical significance in integrating the second sensor information into the first sensor information… As such, Takamatsu fails to teach or suggest a configuration in which information about the identical object, i.e., an object detected by both the own device and the external device, is integrated. Takamatsu additionally fails to teach or suggest adjusting an information amount of the integration information according to a predetermined selection criterion. With specific reference to the language of claim 1, Takamatsu fails to teach or suggest an integration unit configured to integrate, as integration information, the information that is acquired from a plurality of different sources and related to the identical target based on a determination result by the determination unit, and an adjustment unit configured to adjust an information amount of the integration information according to a predetermined selection criterion, as recited by claim 1."

Examiner respectfully disagrees. Applicant is reminded that claims must be given their broadest reasonable interpretation. Takamatsu discloses the idea of an information processing apparatus and handling a difference between detection results of sensors. Takamatsu receives sensor information about a mobile object, and the sensor information includes a plurality of parameters (see at least Takamatsu, para. [0063]).
Further, Takamatsu discloses the sensor apparatus comparing first sensor information and second sensor information about the same mobile object and outputting a comparison result (see at least Takamatsu, para. [0068-0071]). Takamatsu further discloses that, when the coordinates of the mobile object detected by the first and second sensors fall within a predetermined distance, it is determined that the mobile target is the same in both sensor information (see at least Takamatsu, para. [0072]). Further, Takamatsu integrates the first and second sensor information, accumulating the values of the differences from both sensor information to adjust a parameter and perform calibrations for the sensor information (see at least Takamatsu, para. [0085-0086] & para. [0094]). The sensor information being updated and adjusted based on a difference of the sensor information is interpreted as integrating both sensor information.

In view of the arguments above, the § 102 rejection is maintained.

In regards to the previous rejection under 35 U.S.C. § 103: Applicant's remaining arguments with respect to the claim(s) have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. A new ground of rejection is made in view of US 2019/0361436A1 ("Ueda") & US 2019/0333386A1 ("Horita").

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C.
112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term "means" or "step" or a term used as a substitute for "means" that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term "means" or "step" or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word "for" (e.g., "means for") or another linking word or phrase, such as "configured to" or "so that"; and

(C) the term "means" or "step" or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word "means" (or "step") in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word "means" (or "step") in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word "means" (or "step") are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word "means" (or "step") are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word "means," but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: "an acquisition unit configured to acquire…", "a comparison unit configured to compare…", "a determination unit configured to determine…", "an integration unit configured to integrate…perform…adds", and "an adjustment unit configured to adjust…" in claims 1-11.
A review of the specification shows that the following appears to be the corresponding structure for the above limitation described in the specification (see at least Applicant Specification, para. [0035]: The controller 10 is configured as a computer system including a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), an input-output interface, and the like (not shown), and controls the entire vehicle device 1 by executing computer programs stored in the storage 11.).

Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 1-9, 11-12, & 15-17 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US 2021/0124956A1 ("Takamatsu").

As per claim 1, Takamatsu discloses A vehicle device comprising (see at least Takamatsu, para. [0035]: For example, a sensor apparatus (onboard sensor apparatus 100 described with reference to FIG. 2) mounted on the vehicle 10A and a sensor apparatus (environment installation sensor apparatus 200 described with reference to FIG. 3) mounted on the traffic light 20A perform sensing regarding the vehicle 10A or a driver of the vehicle 10A.): an acquisition unit configured to acquire internal information detected by a sensor mounted on a subject vehicle (see at least Takamatsu, para. [0063]: For example, sensor information can include information regarding a mobile object. The sensor information can include, for example, at least one of the position, size, type, speed, acceleration, moving direction, detection accuracy, and detection time of the mobile object. The sensor information can include the above-described information of each of a plurality of mobile objects.
Here, the mobile object may be an own vehicle, a different vehicle, or any mobile object such as a pedestrian.); a comparison unit configured to compare the internal information with external information that is acquired outside the subject vehicle via a communication unit mounted on the subject vehicle (see at least Takamatsu, para. [0068]: The onboard sensor apparatus 100 (e.g., control section 170) may control processing based on a comparison result of first sensor information and second sensor information detected for the same mobile object. & para. [0071]: A person 10C is walking on the crosswalk. In such a situation, for example, sensor apparatuses (onboard sensor apparatuses 100 or environment installation sensor apparatuses 200) mounted on the vehicles 10A and 10B and the traffic light 20 share sensor information regarding the person 10C that is a mobile object.); a determination unit configured to determine whether both of the internal information and the external information include information related to an identical target based on a comparison result by the comparison unit (see at least Takamatsu, para. [0072]: For example, in the case where the coordinates of mobile objects targeted by the respective pieces of sensor information fall within a predetermined distance, it can be determined that the mobile targets are the same. In addition, in the case where the same identification information is recognized from mobile objects targeted by the respective pieces of sensor information, it can be determined that the mobile objects are the same.); and an integration unit configured to integrate, as integration information, the information that is acquired from a plurality of different sources and related to the identical target based on a determination result by the determination unit (see at least Takamatsu, para. [0071-0072]: A sharing result of the sensor information is illustrated in FIG. 5.
Coordinates 40A represent the position of the person 10C detected by an onboard sensor apparatus 100A mounted on the vehicle 10A. Coordinates 40B represent the position of the person 10C detected by an onboard sensor apparatus 100B mounted on the vehicle 10B. Coordinates 40C represent the position of the person 10C detected by the environment installation sensor apparatus 200 mounted on the traffic light 20… For example, in the case where the coordinates of mobile objects targeted by the respective pieces of sensor information fall within a predetermined distance, it can be determined that the mobile targets are the same. In addition, in the case where the same identification information is recognized from mobile objects targeted by the respective pieces of sensor information, it can be determined that the mobile objects are the same. & para. [0085-0086]: For example, the onboard sensor apparatus 100 (e.g., detection section 110 and control section 170) may adjust the detection section 110 to decrease an erroneous difference between first sensor information and second sensor information. Specifically, the onboard sensor apparatus 100 performs calibration by adjusting a parameter of each sensor or recognition module included in the detection section 110. For example, the onboard sensor apparatus 100 accumulates evaluation values of differences calculated with the equation (1) or equation (2) above, and performs calibration to make a change in the direction in which the accumulated evaluation values decrease. The degree of reliability may be taken into consideration for calibration…For example, the onboard sensor apparatus 100 (e.g., control section 170) may select sensor information to be used. For example, instead of first sensor information detected by the detection section 110 determined to be abnormal, the onboard sensor apparatus 100 may use the corresponding second sensor information (detected by the same type of sensor). & para.
[0094]); and an adjustment unit configured to adjust an information amount of the integration information according to a predetermined selection criterion (see at least Takamatsu, para. [0073]: The onboard sensor apparatus 100 may use only sensor information having the degree of reliability greater than or equal to a predetermined value as a comparison target. & para. [0085]: For example, the onboard sensor apparatus 100 accumulates evaluation values of differences calculated with the equation (1) or equation (2) above, and performs calibration to make a change in the direction in which the accumulated evaluation values decrease. The degree of reliability may be taken into consideration for calibration.); wherein a speed of the subject vehicle is controlled based on at least one of the internal information, the external information, or the integration information (see at least Takamatsu, para. [0089]: Therefore, the onboard sensor apparatus 100 (e.g., driving control section 150 and control section 170) may control an own vehicle such that the own vehicle stays longer in another sensor apparatus's detection area of second sensor information. For example, the onboard sensor apparatus 100 controls an own vehicle such that the own vehicle decreases speed, selectively travels on a road having a large number of other sensor apparatuses, and turns at a corner where the environment installation sensor apparatus 200 is installed.).

As per claim 2, Takamatsu discloses wherein the integration unit is configured to integrate the information related to a mobile object (see at least Takamatsu, para. [0112-0114]: As illustrated in FIG. 11, the control section 170 first pairs first sensor information and second sensor information detected for the same mobile object (step S302).
In the case where second sensor information is acquired from a plurality of other sensor apparatuses, a plurality of pairs are generated…The control section 170 then determines that a sensor which detects sensor information whose proportion of pairs having evaluation values greater than or equal to the threshold Y2 is greater than or equal to a threshold Y3 is abnormal (step S308).).

As per claim 3, Takamatsu discloses wherein the integration unit is configured to integrate the information related to the subject vehicle (see at least Takamatsu, para. [0112-0114]: As illustrated in FIG. 11, the control section 170 first pairs first sensor information and second sensor information detected for the same mobile object (step S302). In the case where second sensor information is acquired from a plurality of other sensor apparatuses, a plurality of pairs are generated…The control section 170 then determines that a sensor which detects sensor information whose proportion of pairs having evaluation values greater than or equal to the threshold Y2 is greater than or equal to a threshold Y3 is abnormal (step S308).).

As per claim 4, Takamatsu discloses wherein the integration unit is configured to perform integration that enables utilization for controlling traveling of the subject vehicle (see at least Takamatsu, para. [0121-0122]: In the case where it is determined that the majority of the sensors are abnormal (step S602/YES), the driving control section 150 performs automated driving on the basis of second sensor information (step S604). The control section 170 then causes an own vehicle to stop in another sensor apparatus's detection area of second sensor information (step S606)…In contrast, in the case where it is determined that the majority of the sensors are normal (step S602/NO), the driving control section 150 performs automated driving on the basis of first sensor information and second sensor information (step S608).).
As per claim 5, Takamatsu discloses wherein the integration unit is configured to perform integration that enables utilization for notification to a user (see at least Takamatsu, para. [0102]: The control section 170 then compares the first sensor information and the second sensor information to determine a sensor abnormality (step S300), and performs calibration (step S400). Next, the control section 170 notifies a user and/or another apparatus of a warning showing an abnormality of the detection section 110 (step S500). The driving control section 150 then performs automated driving (step S600). & para. [0118]: In the case where there is a sensor that fails in calibration (step S502/YES), the control section 170 controls the first notification section 130 or the second notification section 140 such that a user or a nearby different vehicle is notified of a warning showing that a sensor abnormality occurs (step S504). In the case where there is no sensor that fails in calibration (step S502/NO), the control section 170 issues no warning.).

As per claim 6, Takamatsu discloses wherein the integration unit is configured to integrate the information based on each information reliability when the acquired internal information related to the identical target is different from the acquired external information related to the identical target (see at least Takamatsu, para. [0077]: For example, it is assumed that n pairs of first sensor information and second sensor information having degrees of reliability higher than or equal to a threshold are acquired within a predetermined time. In that case, the onboard sensor apparatus 100 calculates an evaluation value with an equation (1) below, and determines a sensor abnormality in the case where the evaluation value is greater than a threshold.).
As per claim 7, Takamatsu discloses wherein the integration unit adds, as acquired information that is not acquired by the subject vehicle, information that is not in the internal information and is in the external information to an integration target (see at least Takamatsu, para. [0093]: It is assumed that the vehicle 10A is provided with a sensor on the front side, and has a blind spot 60 because of the influence of a wall 50 provided on the inside of the corner. In that case, the onboard sensor apparatus 100 mounted on the vehicle 10A acquires second sensor information from the environment installation sensor apparatus 200 provided to a surveillance camera 20 that is provided in the blind spot 60 or has the blind spot 60 as a detection area, and complements first sensor information. This allows the onboard sensor apparatus 100 to perform automated driving that takes into consideration the presence of the vehicle 10B that is present in the blind spot 60, increasing an accident avoidance rate.).

As per claim 8, Takamatsu discloses wherein the integration unit adds, as acquired information that is not acquired by the subject vehicle, information that is in the internal information and is not in the external information to an integration target (see at least Takamatsu, Fig. 10 & para. [0109-0110]: As illustrated in FIG. 10, the control section 170 first calculates position information of a blind spot (step S212). For example, the control section 170 recognizes an obstacle such as a wall on the basis of a captured image obtained by imaging the area in front of an own vehicle, and calculates position information of the area behind the recognized obstacle on the basis of position information of the own vehicle…The control section 170 then acquires second sensor information from another sensor apparatus having position information included in the calculated position information of the blind spot among the sensor apparatuses in the vicinity (step S214).
Note that the determination of a sensor apparatus in the vicinity can be made similarly to the processing described above with reference to FIG. 9.).

As per claim 9, Takamatsu discloses wherein the determination unit is configured to determine whether the identical target exists based on at least one information of a position of an object in the internal information and the external information, a speed of the object, an orientation of the object, a three-dimensional shape of the object, or a consistency with a peripheral environment (see at least Takamatsu, para. [0071]: A sharing result of the sensor information is illustrated in FIG. 5. Coordinates 40A represent the position of the person 10C detected by an onboard sensor apparatus 100A mounted on the vehicle 10A. Coordinates 40B represent the position of the person 10C detected by an onboard sensor apparatus 100B mounted on the vehicle 10B. Coordinates 40C represent the position of the person 10C detected by the environment installation sensor apparatus 200 mounted on the traffic light 20. For example, since the coordinates 40A are apart from the coordinates 40B and 40C, the onboard sensor apparatus 100A can determine that its sensor is abnormal. & para. [0199]).

As per claim 11, Takamatsu discloses An information integration method for a vehicle and for integrating information acquired from a plurality of different sources of a vehicle (see at least Takamatsu, para. [0057]: The onboard sensor apparatus 100 may preferentially acquire second sensor information from another sensor apparatus capable of detecting the area that overlaps with the area which the onboard sensor apparatus 100 is capable of detecting. In that case, the onboard sensor apparatus 100 is capable of knowing multifaceted sensor information regarding the same area.), the method comprising: acquiring internal information detected by a sensor mounted on a subject vehicle (see at least Takamatsu, para.
[0063]: For example, sensor information can include information regarding a mobile object. The sensor information can include, for example, at least one of the position, size, type, speed, acceleration, moving direction, detection accuracy, and detection time of the mobile object. The sensor information can include the above-described information of each of a plurality of mobile objects. Here, the mobile object may be an own vehicle, a different vehicle, or any mobile object such as a pedestrian.); comparing the internal information with external information that is acquired outside the subject vehicle via a communication unit mounted on the subject vehicle (see at least Takamatsu, para. [0068]: The onboard sensor apparatus 100 (e.g., control section 170) may control processing based on a comparison result of first sensor information and second sensor information detected for the same mobile object. & para. [0071]: A person 10C is walking on the crosswalk. In such a situation, for example, sensor apparatuses (onboard sensor apparatuses 100 or environment installation sensor apparatuses 200) mounted on the vehicles 10A and 10B and the traffic light 20 share sensor information regarding the person 10C that is a mobile object.); determining whether both of the internal information and the external information include information related to an identical target (see at least Takamatsu, para. [0072]: For example, in the case where the coordinates of mobile objects targeted by the respective pieces of sensor information fall within a predetermined distance, it can be determined that the mobile targets are the same.
In addition, in the case where the same identification information is recognized from mobile objects targeted by the respective pieces of sensor information, it can be determined that the mobile objects are the same.); and integrating, as integration information, the information that is acquired from the plurality of different sources and related to the identical target based on a determination result (see at least Takamatsu, para. [0071-0072]: A sharing result of the sensor information is illustrated in FIG. 5. Coordinates 40A represent the position of the person 10C detected by an onboard sensor apparatus 100A mounted on the vehicle 10A. Coordinates 40B represent the position of the person 10C detected by an onboard sensor apparatus 100B mounted on the vehicle 10B. Coordinates 40C represent the position of the person 10C detected by the environment installation sensor apparatus 200 mounted on the traffic light 20… For example, in the case where the coordinates of mobile objects targeted by the respective pieces of sensor information fall within a predetermined distance, it can be determined that the mobile targets are the same. In addition, in the case where the same identification information is recognized from mobile objects targeted by the respective pieces of sensor information, it can be determined that the mobile objects are the same. & para. [0085-0086]: For example, the onboard sensor apparatus 100 (e.g., detection section 110 and control section 170) may adjust the detection section 110 to decrease an erroneous difference between first sensor information and second sensor information. Specifically, the onboard sensor apparatus 100 performs calibration by adjusting a parameter of each sensor or recognition module included in the detection section 110.
For example, the onboard sensor apparatus 100 accumulates evaluation values of differences calculated with the equation (1) or equation (2) above, and performs calibration to make a change in the direction in which the accumulated evaluation values decrease. The degree of reliability may be taken into consideration for calibration…For example, the onboard sensor apparatus 100 (e.g., control section 170) may select sensor information to be used. For example, instead of first sensor information detected by the detection section 110 determined to be abnormal, the onboard sensor apparatus 100 may use the corresponding second sensor information (detected by the same type of sensor). & para. [0094]); adjusting an information amount of the integration information according to a predetermined selection criterion (see at least Takamatsu, para. [0073]: The onboard sensor apparatus 100 may use only sensor information having the degree of reliability greater than or equal to a predetermined value as a comparison target. & para. [0085]: For example, the onboard sensor apparatus 100 accumulates evaluation values of differences calculated with the equation (1) or equation (2) above, and performs calibration to make a change in the direction in which the accumulated evaluation values decrease. The degree of reliability may be taken into consideration for calibration.); wherein a speed of the subject vehicle is controlled based on at least one of the internal information, the external information, or the integration information (see at least Takamatsu, para. [0089]: Therefore, the onboard sensor apparatus 100 (e.g., driving control section 150 and control section 170) may control an own vehicle such that the own vehicle stays longer in another sensor apparatus's detection area of second sensor information.
For example, the onboard sensor apparatus 100 controls an own vehicle such that the own vehicle decreases speed, selectively travels on a road having a large number of other sensor apparatuses, and turns at a corner where the environment installation sensor apparatus 200 is installed.). As per claim 12 Takamatsu discloses A vehicle device comprising (see at least Takamatsu, para. [0035]: For example, a sensor apparatus (onboard sensor apparatus 100 described with reference to FIG. 2) mounted on the vehicle 10A and a sensor apparatus (environment installation sensor apparatus 200 described with reference to FIG. 3) mounted on the traffic light 20A perform sensing regarding the vehicle 10A or a driver of the vehicle 10A.): a processor (see at least Takamatsu, para. [0142]); and a memory coupled to the processor and storing program instructions that when executed by the processor cause the processor to at least (see at least Takamatsu, para. [0142]): acquire internal information detected by a sensor mounted on a subject vehicle (see at least Takamatsu, para. [0063]: For example, sensor information can include information regarding a mobile object. The sensor information can include, for example, at least one of the position, size, type, speed, acceleration, moving direction, detection accuracy, and detection time of the mobile object. The sensor information can include the above-described information of each of a plurality of mobile objects. Here, the mobile object may be an own vehicle, a different vehicle, or any mobile object such as a pedestrian.); compare the internal information with external information that is acquired outside the subject vehicle (see at least Takamatsu, para. [0068]: The onboard sensor apparatus 100 (e.g., control section 170) may control processing based on a comparison result of first sensor information and second sensor information detected for the same mobile object. & para. [0071]: A person 10C is walking on the crosswalk. 
In such a situation, for example, sensor apparatuses (onboard sensor apparatuses 100 or environment installation sensor apparatuses 200) mounted on the vehicles 10A and 10B and the traffic light 20 share sensor information regarding the person 10C that is a mobile object.); determine whether both of the internal information and the external information include information related to an identical target based on a comparison result of the internal information with external information (see at least Takamatsu, para. [0072]: For example, in the case where the coordinates of mobile objects targeted by the respective pieces of sensor information fall within a predetermined distance, it can be determined that the mobile targets are the same. In addition, in the case where the same identification information is recognized from mobile objects targeted by the respective pieces of sensor information, it can be determined that the mobile objects are the same.); and integrate, as integration information, the information that is acquired from a plurality of different sources and related to the identical target based on a determination result of whether both of the internal information and the external information include the information (see at least Takamatsu, para. [0071-0072]: A sharing result of the sensor information is illustrated in FIG. 5. Coordinates 40A represent the position of the person 10C detected by an onboard sensor apparatus 100A mounted on the vehicle 10A. Coordinates 40B represent the position of the person 10C detected by an onboard sensor apparatus 100B mounted on the vehicle 10B. Coordinates 40C represent the position of the person 10C detected by the environment installation sensor apparatus 200 mounted on the traffic light 20… For example, in the case where the coordinates of mobile objects targeted by the respective pieces of sensor information fall within a predetermined distance, it can be determined that the mobile targets are the same. 
In addition, in the case where the same identification information is recognized from mobile objects targeted by the respective pieces of sensor information, it can be determined that the mobile objects are the same. para. [0085-0086]: For example, the onboard sensor apparatus 100 (e.g., detection section 110 and control section 170) may adjust the detection section 110 to decrease an erroneous difference between first sensor information and second sensor information. Specifically, the onboard sensor apparatus 100 performs calibration by adjusting a parameter of each sensor or recognition module included in the detection section 110. For example, the onboard sensor apparatus 100 accumulates evaluation values of differences calculated with the equation (1) or equation (2) above, and performs calibration to make a change in the direction in which the accumulated evaluation values decrease. The degree of reliability may be taken into consideration for calibration… For example, the onboard sensor apparatus 100 (e.g., control section 170) may select sensor information to be used. For example, instead of first sensor information detected by the detection section 110 determined to be abnormal, the onboard sensor apparatus 100 may use the corresponding second sensor information (detected by the same type of sensor). & para. [0094]); and adjust an information amount of the integration information according to a predetermined selection criterion (see at least Takamatsu, para. [0073]: The onboard sensor apparatus 100 may use only sensor information having the degree of reliability greater than or equal to a predetermined value as a comparison target. para. [0085]: For example, the onboard sensor apparatus 100 accumulates evaluation values of differences calculated with the equation (1) or equation (2) above, and performs calibration to make a change in the direction in which the accumulated evaluation values decrease. 
The degree of reliability may be taken into consideration for calibration.); wherein a speed of the subject vehicle is controlled based on at least one of the internal information, the external information, or the integration information (see at least Takamatsu, para. [0089]: Therefore, the onboard sensor apparatus 100 (e.g., driving control section 150 and control section 170) may control an own vehicle such that the own vehicle stays longer in another sensor apparatus's detection area of second sensor information. For example, the onboard sensor apparatus 100 controls an own vehicle such that the own vehicle decreases speed, selectively travels on a road having a large number of other sensor apparatuses, and turns at a corner where the environment installation sensor apparatus 200 is installed.). As per claim 15 Takamatsu discloses further comprising: an electronic control unit configured to control a brake system of the subject vehicle and to slow down the subject vehicle by controlling the brake system based on at least one of the internal information, the external information, or the integration information (see at least Takamatsu, para. [0089]: Therefore, the onboard sensor apparatus 100 (e.g., driving control section 150 and control section 170) may control an own vehicle such that the own vehicle stays longer in another sensor apparatus's detection area of second sensor information. For example, the onboard sensor apparatus 100 controls an own vehicle such that the own vehicle decreases speed, selectively travels on a road having a large number of other sensor apparatuses, and turns at a corner where the environment installation sensor apparatus 200 is installed. & para. 
[0139]: Note that the power generation apparatus 918, the braking apparatus 920, the steering 922, and the lamp activation apparatus 924 may come into operation on the basis of a manual operation performed by a driver or on the basis of an automatic operation performed by the electronic control unit 902.). As per claim 16 Takamatsu discloses further comprising: controlling, with an electronic control unit, a brake system of the subject vehicle to slow down the subject vehicle based on at least one of the internal information, the external information, or the integration information (see at least Takamatsu, para. [0089]: Therefore, the onboard sensor apparatus 100 (e.g., driving control section 150 and control section 170) may control an own vehicle such that the own vehicle stays longer in another sensor apparatus's detection area of second sensor information. For example, the onboard sensor apparatus 100 controls an own vehicle such that the own vehicle decreases speed, selectively travels on a road having a large number of other sensor apparatuses, and turns at a corner where the environment installation sensor apparatus 200 is installed. & para. [0139]: Note that the power generation apparatus 918, the braking apparatus 920, the steering 922, and the lamp activation apparatus 924 may come into operation on the basis of a manual operation performed by a driver or on the basis of an automatic operation performed by the electronic control unit 902.). As per claim 17 Takamatsu discloses further comprising: an electronic control unit configured to control a brake system of the subject vehicle and to slow down the subject vehicle by controlling the brake system based on at least one of the internal information, the external information, or the integration information (see at least Takamatsu, para. 
[0089]: Therefore, the onboard sensor apparatus 100 (e.g., driving control section 150 and control section 170) may control an own vehicle such that the own vehicle stays longer in another sensor apparatus's detection area of second sensor information. For example, the onboard sensor apparatus 100 controls an own vehicle such that the own vehicle decreases speed, selectively travels on a road having a large number of other sensor apparatuses, and turns at a corner where the environment installation sensor apparatus 200 is installed. & para. [0139]: Note that the power generation apparatus 918, the braking apparatus 920, the steering 922, and the lamp activation apparatus 924 may come into operation on the basis of a manual operation performed by a driver or on the basis of an automatic operation performed by the electronic control unit 902.). Claim Rejections - 35 USC § 103 The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. 
Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. Claim(s) 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Takamatsu, in view of US 2019/0361436A1 (“Ueda”). As per claim 13 Takamatsu does not explicitly disclose wherein the adjustment unit is further configured to adjust the information amount of the integration information according to a priority set in the predetermined selection criterion based on a degree of influence on a control of the subject vehicle or on a risk prediction. However Ueda teaches wherein the adjustment unit is further configured to adjust the information amount of the integration information according to a priority set in the predetermined selection criterion based on a degree of influence on a control of the subject vehicle or on a risk prediction (see at least Ueda, para. 
[0081]: Transmission data amount adjuster 114 adjusts a data amount of the sensed data to be transmitted to remote control device 50 based on the risk degree calculated by risk degree calculator 112 or the communication delay amount estimated by communication delay estimator 113. Transmission data amount adjuster 114 increases a data amount of the sensed data to be transmitted as the risk degree is higher or as the communication delay amount is smaller.). Claim(s) 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Takamatsu, in view of US 2019/0333386A1 (“Horita”). As per claim 14 Takamatsu does not explicitly disclose wherein the integration unit is further configured to perform integration that enables utilization for notification to a user, the vehicle device further comprising: a risk prediction unit configured to cause a display device or an audio device to perform the notification using a state obtained by the integration that enables the utilization for the notification to the user. However Horita teaches wherein the integration unit is further configured to perform integration that enables utilization for notification to a user, the vehicle device further comprising: a risk prediction unit configured to cause a display device or an audio device to perform the notification using a state obtained by the integration that enables the utilization for the notification to the user (see at least Horita, para. [0042]: The in-vehicle HMI apparatus 80 is constituted by a speaker, a display apparatus and the like mounted on the vehicle 2. The in-vehicle HMI apparatus 80 is configured to perform, through sound or a screen, notification to a driver about driving assistance of the vehicle 2 based on information output from the surrounding environment recognizing apparatus 10 and/or information output from the driving control apparatus 70. & para. 
[0052]: The surrounding environment recognizing apparatus 10 of the driving control system 1 in the present embodiment executes a surrounding environment recognition process like the one explained below based on information about the vehicle 2 and/or environment factors around the vehicle 2 acquired individually from the own vehicle position determining apparatus 30, external sensor group 40, vehicle sensor group 50 and map information managing apparatus 60, which are external apparatuses, and creates a risk-of-driving map of an area around the vehicle 2 like the aforementioned map. Then, it outputs the generated risk-of-driving map to the driving control apparatus 70 and/or in-vehicle HMI apparatus 80 to thereby provide driving assistance of the vehicle 2.). Conclusion Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to MOHAMED ABDO ALGEHAIM whose telephone number is (571)272-3628. The examiner can normally be reached Monday-Friday 8-5PM EST. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fadey Jabr can be reached at 571-272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /MOHAMED ABDO ALGEHAIM/Primary Examiner, Art Unit 3668
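For readers tracing the examiner's §102 mapping above, the cited Takamatsu passages (paras. [0072]-[0073]) describe two concrete mechanisms: determining that detections from different sources refer to the identical target (coordinates within a predetermined distance, or the same recognized identification), and limiting integration to sensor information whose degree of reliability meets a predetermined value. The sketch below illustrates those two heuristics only; the class and function names, the threshold values, and the reliability-weighted fusion are all assumptions for illustration, not Takamatsu's actual implementation.

```python
import math
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    """One piece of sensor information about a mobile object (hypothetical)."""
    x: float                       # position coordinates of the mobile object
    y: float
    reliability: float             # degree of reliability of the sensor information
    obj_id: Optional[str] = None   # identification information, if recognized

DISTANCE_THRESHOLD = 2.0   # the "predetermined distance" (assumed value, meters)
RELIABILITY_FLOOR = 0.5    # the "predetermined value" for reliability (assumed)

def same_target(a: Detection, b: Detection) -> bool:
    """Identical-target determination: matching recognized ID, or
    coordinates falling within the predetermined distance."""
    if a.obj_id is not None and a.obj_id == b.obj_id:
        return True
    return math.hypot(a.x - b.x, a.y - b.y) <= DISTANCE_THRESHOLD

def integrate(detections: list[Detection]) -> Optional[Detection]:
    """Integrate detections of one target, using only sensor information
    above the reliability floor (adjusting the information amount),
    combined here by a reliability-weighted mean (an assumed fusion rule)."""
    usable = [d for d in detections if d.reliability >= RELIABILITY_FLOOR]
    if not usable:
        return None
    w = sum(d.reliability for d in usable)
    return Detection(
        x=sum(d.x * d.reliability for d in usable) / w,
        y=sum(d.y * d.reliability for d in usable) / w,
        reliability=max(d.reliability for d in usable),
        obj_id=next((d.obj_id for d in usable if d.obj_id), None),
    )

# A scenario loosely modeled on Takamatsu's FIG. 5: two onboard sensors
# and one roadside sensor all detect the same pedestrian.
onboard_a = Detection(10.0, 5.0, 0.9, "person-10C")
onboard_b = Detection(10.4, 5.2, 0.8)
roadside  = Detection(10.2, 5.1, 0.3)   # below the assumed reliability floor

assert same_target(onboard_a, onboard_b)
fused = integrate([onboard_a, onboard_b, roadside])
```

In this toy scenario the two onboard detections match by distance, and the low-reliability roadside detection is excluded from the fused estimate, which is the sense in which the "information amount" of the integration information is adjusted by a selection criterion.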

Prosecution Timeline

Jul 08, 2024
Application Filed
Sep 26, 2025
Non-Final Rejection — §102, §103
Dec 12, 2025
Applicant Interview (Telephonic)
Dec 12, 2025
Examiner Interview Summary
Dec 31, 2025
Response Filed
Feb 07, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594963
DETECTING AN UNKNOWN OBJECT BY A LEAD AUTONOMOUS VEHICLE (AV) AND UPDATING ROUTING PLANS FOR FOLLOWING AVs
2y 5m to grant · Granted Apr 07, 2026
Patent 12597865
INVERTER
2y 5m to grant · Granted Apr 07, 2026
Patent 12589978
TRUCK-TABLET INTERFACE
2y 5m to grant · Granted Mar 31, 2026
Patent 12565235
DETECTING A CONSTRUCTION ZONE BY A LEAD AUTONOMOUS VEHICLE (AV) AND UPDATING ROUTING PLANS FOR FOLLOWING AVs
2y 5m to grant · Granted Mar 03, 2026
Patent 12559228
THERMAL MANAGEMENT SYSTEM FOR AN AIRCRAFT INCLUDING AN ELECTRIC PROPULSION ENGINE
2y 5m to grant · Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
59%
Grant Probability
81%
With Interview (+21.9%)
3y 3m
Median Time to Grant
Moderate
PTA Risk
Based on 207 resolved cases by this examiner. Grant probability derived from career allow rate.
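The projection figures above reduce to simple arithmetic on the examiner's career counts shown earlier on this page. A minimal sketch follows; the additive treatment of the interview lift is an assumption about how the tool combines its displayed numbers, inferred from the values shown.

```python
# Reproduce the headline projections from the examiner's career statistics.
granted, resolved = 122, 207               # career grants / resolved cases shown above
base_rate = granted / resolved             # career allow rate (~0.589)
interview_lift = 0.219                     # +21.9% lift observed with an interview

grant_probability = round(base_rate * 100)                   # 59 (%)
with_interview = round((base_rate + interview_lift) * 100)   # 81 (%)
```

This matches the displayed 59% baseline and 81% with-interview figures.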
