Prosecution Insights
Last updated: April 19, 2026
Application No. 16/525,824

ENHANCED DISCRIMINATION METHOD AND APPARATUS FOR CONTROLLING AN ACTUATABLE PROTECTION DEVICE

Final Rejection §103

Filed: Jul 30, 2019
Examiner: ALGEHAIM, MOHAMED A
Art Unit: 3668
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: ZF Friedrichshafen AG
OA Round: 6 (Final)

Grant Probability: 59% (Moderate)
OA Rounds: 7-8
To Grant: 3y 3m
With Interview: 81%

Examiner Intelligence

Career Allow Rate: 59% (122 granted / 207 resolved; +6.9% vs TC avg)
Interview Lift: +21.9% among resolved cases with interview (a strong lift)
Avg Prosecution: 3y 3m (typical timeline)
Total Applications: 244 across all art units; 37 currently pending

Statute-Specific Performance

§101: 14.8% (-25.2% vs TC avg)
§103: 49.6% (+9.6% vs TC avg)
§102: 15.6% (-24.4% vs TC avg)
§112: 15.3% (-24.7% vs TC avg)
Tech Center averages are estimates • Based on career data from 207 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

Claims 1 and 3-18 of U.S. Application No. 16/525,824 have been examined. This Office Action is in response to the Applicant's amendments and remarks filed 12/10/2025. Claim 1 is presently amended. Claims 2 and 19-21 are cancelled. Claims 1 and 3-18 are presently pending and are presented for examination.

Response to Arguments

In regards to the previous rejection under 35 U.S.C. § 103, Applicant argues that the prior art does not disclose the limitation “wherein discriminating the pole side impact crash event from the barrier side impact crash event comprises evaluating crash event indications determined from multi-axis vehicle acceleration parameters measured from a non-impact side of the vehicle”. Applicant further argues on pages 11-15 of the Remarks:

“While focused on side-impact protection, Hu notes that the disclosed teachings can apply to any structure where one region's motion is object-type dependent, and another region's motion is severity dependent (see the paragraph beginning at column 8, line 64). It is clear that Hu does not teach the use of multi-axis vehicle acceleration sensors, only structural relative-motion sensors. Hu does not use any sensed metric from a side opposite a crash-side of the vehicle to discriminate a pole crash from a barrier crash. All of the sensing in Hu is made with relative displacements sensed in the door being impacted. A person having ordinary skill in the art would not consider the method of claim 1 obvious over Yoshida in view of Hu. As admitted in the office action, Yoshida does not disclose detecting a pole side impact and discriminating the pole side impact from a barrier side impact.
Hu discloses discriminating a pole from a barrier side impact, but does so in a manner wholly different than that recited in claim 1, i.e., using a pair of door mounted relative displacement sensors. Additionally, neither Yoshida nor Hu discloses a method for discriminating a pole side impact from a barrier side impact that utilizes any metrics recorded at the non-impact side of the vehicle.”

Examiner respectfully disagrees. Applicant is reminded that claims must be given their broadest reasonable interpretation. As recited in the Office Action, Yoshida discloses a method for actuating an actuatable safety device to protect a vehicle occupant; further, when a crash event occurs, Yoshida uses different regions to indicate whether the crash event is in front of the vehicle (region 1) or on the sides of the vehicle (region 2 or region 3) (see at least Yoshida, para. [0066] & para. [0072]). Further, Yoshida senses that a collision has occurred through a satellite collision sensor that is built onto the vehicle (see at least Yoshida, para. [0053-0055]). Miyata is further incorporated to teach discriminating between different types of impacts (pole, barrier, oblique, and others) based on multi-axis acceleration parameters compared from both sides of the vehicle, the impact side and the non-impact side (see at least Miyata, para. [0160]). Applicant does not provide separate remarks regarding Miyata, which is the prior art relied upon to reject the limitation being argued, i.e., discriminating a pole from a barrier side impact using metrics recorded from the non-impact side. Miyata utilizes the widthwise acceleration parameter and the front/rear acceleration parameters. In conclusion, the § 103 rejection is maintained in view of the arguments above.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4.
Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 16, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over US 2022/0001821A1 (“Yoshida”), in view of US 6644688B1 (“Hu”), in view of US 2017/0232919A1 (“Miyata”).

As per claim 1, Yoshida discloses a method for controlling an actuatable safety device for helping to protect a vehicle occupant, the method comprising:

sensing a plurality of vehicle acceleration parameters comprising a longitudinal vehicle acceleration parameter and a lateral vehicle acceleration parameter (see at least Yoshida, para. [0035]: The satellite collision sensor 16 is configured to generate an output corresponding to the acceleration acting on the vehicle 1. Specifically, in the present embodiment, the satellite collision sensor 16 is a biaxial acceleration sensor capable of outputting a combined waveform of the longitudinal direction acceleration and the left-right direction acceleration. The left-right direction acceleration is also referred to as a lateral acceleration or an acceleration in the Y direction.);

executing one or more metrics that evaluate the multi-axis vehicle acceleration parameters to determine whether vehicle crash thresholds are exceeded and producing crash event indications in response thereto (see at least Yoshida, para. [0053-0055]: The collision detection unit 24 detects occurrence of a collision between the vehicle 1 and an object B based on the acceleration acquired by the built-in collision sensor 12a and the satellite collision sensor 16. Specifically, in the present embodiment, the collision detection unit 24 determines that a collision has occurred when both of the following two conditions are satisfied: Acceleration detection value GF from built-in collision sensor 12a > Threshold GFth; Acceleration detection value GS from satellite collision sensor 16 > Threshold GSth. & para. 
[0063]: Further, the collision detection unit 24 detects occurrence of a collision based on a combined waveform between the X-direction acceleration and the Y-direction acceleration at the satellite collision sensor 16, which is a biaxial sensor. When both the result of the collision occurrence detection by the built-in collision sensor 12a and the result of the collision occurrence detection by the satellite collision sensor 16 are affirmative, the collision detection unit 24 outputs a collision affirmative signal to drive the occupant protection device(s) 11.);

evaluating the crash event indications to identify a side impact (see at least Yoshida, para. [0066]: In addition, the collision form estimation unit 22 estimates the collision form based on the distance, direction, relative velocity, and collision probability acquired as object detection results. Specifically, the collision form estimation unit 22 estimates the collision form based on the collision probability in the first region R1, the second region R2, and the third region R3. The first region R1 is a region ahead of the vehicle 1. The second region R2 is a region shifted in the vehicle width direction so that it is on one side of the first region R1. The third region R3 is a region shifted in the vehicle width direction so that it is on the other side of the first region R1. Therefore, the collision form can be estimated with good accuracy. & para. [0072]: Specifically, the driving control device 12 determines whether any one of PC>PCth1, PR>PRth1, and PL>PLth1 is satisfied. PC is the presence probability in the first region R1. PR is the presence probability in the second region R2. PL is the presence probability in the third region R3. PCth1, PRth1, and PLth1 are reference values for determining the presence or absence of an object B that may collide with the vehicle in step 302.);

and controlling deployment of the actuatable safety device in response to identifying the side impact crash event (see at least Yoshida, para. [0064]: The driving control device 25 selects which of the occupant protection devices 11 should be driven when a collision occurs based on the collision form estimated by the collision form estimation unit 22. Further, the driving control device 25 drives one or more selected occupant protection devices 11 when it is determined that a collision has occurred, in other words, when a collision is detected and the collision detection unit 24 outputs a collision affirmative signal. & para. [0078]: In step 305, the driving control device 12, that is, the collision detection unit 24 determines whether a collision has occurred. When it is determined that a collision has not occurred (that is, step 305=NO), the process returns to step 301. On the other hand, when it is determined that a collision has occurred (that is, step 305=YES), the process proceeds to step 306. In step 306, the driving control device 12, that is, the driving control device 25 selectively drives the occupant protection device(s) 11 corresponding to the collision form estimated in step 304.).
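The two-condition detection logic the examiner cites from Yoshida (para. [0053-0055]) is easy to picture in code. The sketch below is illustrative only; the threshold values and the Python framing are invented here, not taken from Yoshida or the application:

```python
# Illustrative sketch of Yoshida's AND-gated collision detection: fire only
# when BOTH the built-in sensor and the satellite sensor exceed thresholds.
# Threshold values are hypothetical placeholders, not from the reference.

GF_TH = 20.0  # built-in collision sensor threshold (g), hypothetical
GS_TH = 15.0  # satellite collision sensor threshold (g), hypothetical

def collision_detected(gf: float, gs: float) -> bool:
    """Return True when both acceleration detection values exceed thresholds.

    gf -- acceleration detection value from the built-in sensor (12a)
    gs -- combined X/Y acceleration value from the satellite sensor (16)
    """
    return gf > GF_TH and gs > GS_TH

print(collision_detected(25.0, 18.0))  # both exceed -> True
print(collision_detected(25.0, 10.0))  # satellite below threshold -> False
```

Requiring agreement between two independently mounted sensors is what lets the cited scheme reject single-sensor noise before driving the occupant protection devices.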
Yoshida does not explicitly disclose evaluating the crash event indications to identify a pole side impact crash event, wherein identifying a pole side impact crash event comprises discriminating the pole side impact crash event from a barrier side impact crash event wherein discriminating the pole side impact crash event from the barrier side impact crash event comprises evaluating crash event indications determined from multi-axis vehicle acceleration parameters measured from a non-impact side of the vehicle; and controlling deployment of the actuatable safety device in response to identifying the pole side impact crash event.

Hu teaches evaluating the crash event indications to identify a pole side impact crash event, wherein identifying a pole side impact crash event comprises discriminating the pole side impact crash event from a barrier side impact crash event (see at least Hu, col. 3 lines 43-54: Referring to FIG. 4, one example of an associated crash type discrimination algorithm, a relative displacement 46.1 measurement from the first relative motion sensor 14.1 is compared with respective thresholds in both the positive and negative directions of relative motion, wherein the particular threshold values are dependent upon the characteristics of the particular vehicle 12. In the example of FIG. 4 for a side-impact crash, if the relative displacement 46.1 is greater than the positive threshold, then the crash type is assumed to be a pole crash; whereas if the relative displacement 46.1 is less than the negative threshold, then the crash type is assumed to be a barrier crash.), controlling deployment of the actuatable safety device in response to identifying the pole side impact crash event (see at least Hu, col. 4 lines 30-37: Referring to FIGS. 
6a and 6b, the satellite sensor detects the type of object which determines which crash severity threshold curve to use in discriminating the crash, and the principal sensor detects a measure of crash severity and compares this with the crash object dependent crash severity threshold, wherein when the measure of crash severity exceeds the crash object dependent crash severity, the associated side airbag safety restraint system is activated (fired).).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of evaluating the crash event indications to identify a pole side impact crash event, wherein identifying a pole side impact crash event comprises discriminating the pole side impact crash event from a barrier side impact crash event and controlling deployment of the actuatable safety device in response to identifying the pole side impact crash event of Hu in order to mitigate injury to an occupant from a side-impact crash (see at least Hu, col. 1 lines 66-67).

Miyata teaches wherein discriminating the pole side impact crash event from the barrier side impact crash event comprises evaluating crash event indications determined from multi-axis vehicle acceleration parameters measured from a non-impact side of the vehicle (see at least Miyata, para. [0074-0075]: The front left sensor 41 is configured to detect an acceleration in the vehicle front-rear direction acting on the front left sensor 41 itself (hereinafter referred to as “front/rear acceleration GLx”). The front/rear acceleration GLx is set to represent an acceleration toward a vehicle rear direction as a positive value. The front left sensor 41 is configured to further detect an acceleration in the vehicle widthwise direction acting on the front left sensor 41 itself (hereinafter referred to as “widthwise direction acceleration GLy” or “first lateral acceleration GLy”). 
The widthwise direction acceleration GLy is set to represent an acceleration toward an inside of the vehicle (namely, a right direction with respect to a forward direction of the vehicle) as a positive value. & para. [0081]: The airbag ECU 45 (hereinafter also simply referred to as “ECU 45”) is fixed to the floor constituting the cabin. The ECU 45 is connected to the front left sensor 41, the front right sensor 42 and the floor sensor 43, and is configured to receive the respective accelerations detected by those sensors. para. [0113]: The head-on collision threshold defines a threshold for the front/rear acceleration (Gx) corresponding to the velocity decrease amount (Vx), is experimentally determined in advance and is stored in the ROM. The head-on collision threshold is one kind of collision determination thresholds, and is set so as to start changing synchronously with a time point at which the actual velocity decrease amount Vx calculated by the velocity decrease amount calculation unit 50 increases from “0”. It should be noted that each of collision determination thresholds described later (that is, a pole collision threshold, an offset collision threshold, a small overlap collision threshold, and an oblique collision determination threshold) also defines a threshold for the front/rear acceleration (Gx) corresponding to the velocity decrease amount(Vx). Those thresholds are also experimentally defined in advance, are stored in the ROM, and are set so as to start changing synchronously with the time point at which the actual velocity decrease amount Vx calculated by the velocity decrease amount calculation unit 50 increases from “0”. para. 
[0160]: When the value of the counter Cnt becomes equal to or more than the collision form decision threshold C decision, the CPU makes a determination of “Yes” in Step 1165, and proceeds to Step 1175 to decide the collision form determined for this time (that is, the collision form based on the determination in Step 1145 carried out immediately before) as a final collision form. [Examiner Note: The counter collision is the side of the non-impact side that is evaluated and counted]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of wherein discriminating the pole side impact crash event from the barrier side impact crash event comprises evaluating crash event indications determined from multi-axis vehicle acceleration parameters measured from a non-impact side of the vehicle of Miyata in order to more precisely discriminate between collision forms, thereby carrying out more appropriate activation control (see at least Miyata, para. [0009]).

As per claim 16, Yoshida discloses a vehicle safety system comprising: one or more vehicle safety devices (see at least Yoshida, para. [0025]: The vehicle 1 is provided with a plurality of occupant protection devices 11. Specifically, corresponding to the front seats, a driver's seat front airbag 11a, a passenger's seat front airbag 11b, a driver's seat side airbag 11c, and a passenger's seat side airbag 11d are provided. The driver's seat front airbag 11a is provided to be deployed in front of the upper body of the occupant seated in the driver's seat.); and a controller configured to execute the method recited in claim 1 and to actuate the one or more vehicle safety devices in response thereto (see at least Yoshida, para. [0029-0030]: The occupant protection system 10 includes a driving control device 12 in addition to the multiple occupant protection devices 11 described above. The driving control device 12 is an in-vehicle microcomputer which may also be referred to as an airbag ECU or a protection device ECU, and it is configured to control the driving of the occupant protection device 11.).

As per claim 18, Yoshida discloses wherein the one or more vehicle safety devices comprise at least one of a side airbag and a curtain airbag (see at least Yoshida, para. [0025]: The vehicle 1 is provided with a plurality of occupant protection devices 11. Specifically, corresponding to the front seats, a driver's seat front airbag 11a, a passenger's seat front airbag 11b, a driver's seat side airbag 11c, and a passenger's seat side airbag 11d are provided. The driver's seat front airbag 11a is provided to be deployed in front of the upper body of the occupant seated in the driver's seat.).

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Yoshida, in view of Hu, in view of Miyata, further in view of US 6170864B1 (“Fujita”).

As per claim 3, Yoshida discloses a satellite safety sensor (see at least Yoshida, para. [0126]: Specifically, for example, a satellite collision sensor 16 may be provided at each of the right and left side-members 6.). However, Yoshida does not explicitly disclose wherein discriminating the pole side impact crash event from the barrier side impact crash event comprises: measuring via a biaxial sensor (SS) a vehicle X-axis acceleration (BS_X) and a vehicle Y-axis acceleration (BS_Y); determining from the BS_X a vehicle X-axis relative velocity (BS_X_Rel_Vel); determining from the BS_Y a vehicle Y-axis relative velocity (BS_Y_Rel_Vel); and comparing the BS_X_Rel_Vel to the BS_Y_Rel_Vel to classify a side impact crash event as being either a pole side impact crash event or a barrier side impact crash event.
Fujita teaches wherein discriminating the pole side impact crash event from the barrier side impact crash event comprises: measuring via a biaxial sensor a vehicle X-axis acceleration (BS_X) and a vehicle Y-axis acceleration (BS_Y) (see at least Fujita, col. 25 lines 34-40: The biaxial sensor 90 detects the direction of an impact applied to the vehicle. More concretely the biaxial sensor 90 always measures a deceleration Gx applied in the direction of the length of the vehicle 46 (hereinafter referred to as the direction x) and a deceleration Gy applied in the direction of the width of the vehicle 46 (hereinafter referred to as the direction y) as shown in FIG. 23.); determining from the BS_X a vehicle X-axis relative velocity (see at least Fujita, col. 25 lines 50-60: As shown in FIG. 22, the integration unit 94 in the threshold variation pattern changing unit 92 integrates the measurements Gx and Gy output from the biaxial sensor 90 (that is, the decelerations in the directions x and y) once with respect to the time t, so as to yield an integral intg.Gxdt in the direction x and an integral intg.Gydt in the direction y. The value obtained by integrating the deceleration once with respect to the time t represents the velocity v of a non-stationary object in the vehicle as mentioned above, and the integrals .intg.Gxdt and .intg.Gydt thus respectively denote the velocities of the non-stationary object in the direction x and in the direction y.); determining from the BS_Y a vehicle Y-axis relative velocity (see at least Fujita, col. 25 lines 50-60: As shown in FIG. 22, the integration unit 94 in the threshold variation pattern changing unit 92 integrates the measurements Gx and Gy output from the biaxial sensor 90 (that is, the decelerations in the directions x and y) once with respect to the time t, so as to yield an integral intg.Gxdt in the direction x and an integral intg.Gydt in the direction y. 
The value obtained by integrating the deceleration once with respect to the time t represents the velocity v of a non-stationary object in the vehicle as mentioned above, and the integrals .intg.Gxdt and .intg.Gydt thus respectively denote the velocities of the non-stationary object in the direction x and in the direction y.); comparing the BS_X_Rel_Vel to the BS_Y_Rel_Vel to classify an impact crash event as an oblique impact crash event or a barrier side impact crash event (see at least Fujita, col. 26 lines 8-24: FIGS. 24(a) and 24(b) are characteristic charts showing the integrals .intg.Gxdt and .intg.Gydt in the directions x and y obtained by the integration unit 94 of FIG. 22 in a rectangular coordinate system. The integral .intg.Gxdt in the direction x is plotted as ordinate and the integral .intg.Gydt in the direction y as abscissa. FIG. 24(a) shows integral curves in the case of an oblique collision of a vehicle S1 against a vehicle S0 and in the case of an oblique side collision of a vehicle S2 against the vehicle S0. M1 denotes an integral curve in the case of an oblique collision of the vehicle S1, and M2 denotes an integral curve in the case of an oblique side collision of the vehicle S2. N1 represents the direction of an impact applied to the vehicle S0 when the vehicle S1 collides against the vehicle S0, and N2 represents the direction of an impact applied to the vehicle S0 when the vehicle S2 collides against the vehicle S0.). 
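Read together, the cited passages suggest a discrimination flow for claim 3: Fujita integrates the biaxial accelerations into X- and Y-direction velocities, and a Hu-style threshold comparison then labels the crash. The sketch below is a hedged illustration; the time step, sample data, ratio threshold, and the direction of the pole/barrier rule are all invented here, not taken from either reference:

```python
# Hedged sketch of a claim-3 style discrimination assembled from the cited
# teachings: Fujita-style integration of biaxial accelerations into relative
# velocities, plus a threshold decision in the spirit of Hu. All numeric
# values and the decision rule's direction are invented placeholders.

def rel_vel(accel_samples, dt):
    """Relative velocity as the running integral of acceleration over time."""
    return sum(a * dt for a in accel_samples)

def classify_side_impact(gx_samples, gy_samples, dt=0.001, ratio_th=0.5):
    """Compare BS_X_Rel_Vel to BS_Y_Rel_Vel to label a side impact."""
    vx = abs(rel_vel(gx_samples, dt))  # X-axis (longitudinal) velocity change
    vy = abs(rel_vel(gy_samples, dt))  # Y-axis (lateral) velocity change
    # Placeholder rule: a strongly lateral-dominated signature is called a pole.
    return "pole" if vx < ratio_th * vy else "barrier"

print(classify_side_impact([1.0] * 10, [8.0] * 10))  # lateral dominates -> pole
print(classify_side_impact([6.0] * 10, [7.0] * 10))  # comparable axes -> barrier
```

The point of the comparison is that integrating once turns noisy acceleration traces into velocity changes, whose relative magnitudes along the two axes are a more stable signature of the struck object than instantaneous acceleration.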
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of measuring via a biaxial sensor (SS) a vehicle X-axis acceleration (BS_X) and a vehicle Y-axis acceleration (BS_Y), determining from the BS_X a vehicle X-axis relative velocity, and determining from the BS_Y a vehicle Y-axis relative velocity of Fujita in order to enable a passive vehicle occupant restraint to be activated with high accuracy, irrespective of the type of a collision (see at least Fujita, col. 2 lines 24-25).

Hu teaches classifying a side impact crash event as being either a pole side impact crash event or a barrier side impact crash event (see at least Hu, col. 3 lines 43-54: Referring to FIG. 4, one example of an associated crash type discrimination algorithm, a relative displacement 46.1 measurement from the first relative motion sensor 14.1 is compared with respective thresholds in both the positive and negative directions of relative motion, wherein the particular threshold values are dependent upon the characteristics of the particular vehicle 12. In the example of FIG. 4 for a side-impact crash, if the relative displacement 46.1 is greater than the positive threshold, then the crash type is assumed to be a pole crash; whereas if the relative displacement 46.1 is less than the negative threshold, then the crash type is assumed to be a barrier crash.).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of classifying a side impact crash event as being either a pole side impact crash event or a barrier side impact crash event of Hu in order to mitigate injury to an occupant from a side-impact crash (see at least Hu, col. 1 lines 66-67).

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Yoshida, in view of Hu, in view of Miyata, in view of US 2018/0111574A1 (“Okamura”).

As per claim 13, Yoshida does not disclose wherein identifying the pole side impact crash event comprises discriminating a rear pole side impact crash event from a front side pole or front barrier impact crash event.

Okamura teaches wherein identifying the pole side impact crash event comprises discriminating a rear pole side impact crash event from a front side pole or front barrier impact crash event (see at least Okamura, para. [0070-0071]: As shown in FIG. 5, in the present embodiment, a plurality of areas concerning the collision position is set on the map in response to various combinations (i.e., the directions of various vectors A) of a value showing the size of behaviors of the vehicle M and a value showing the deformation amount of the vehicle M. The plurality of areas includes, for example, an engine room area B1, a front seat occupant area B2, a rear seat occupant area B3, a fuel tank area B4, a trunk room area B5 and the like. And, the collision determination section 12 determines the collision position in the vehicle M based on a combination of the value showing the size of behaviors of the vehicle M (value based on the detection result of the X-direction acceleration sensor 21b or the value based on the detection result of the yaw rate sensor 28c) and the value showing the deformation amount of the vehicle M (value based on the detection result of the Y-direction acceleration sensor 21a). Namely, the collision determination section 12 determines the collision position in the vehicle M by the direction (inclination) of the vector A on the map shown by the combination of the value showing the size of behaviors of the vehicle M and the value showing the deformation amount of the vehicle M. 
For example, in the case where the combination (the end point of the vector A) of the value showing the size of behaviors of the vehicle M and the value showing the deformation amount of the vehicle M is located in the front seat occupant area B2 on the map, the collision determination section 12 determines that the collision has occurred near a front seat in the vehicle M.).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of wherein identifying the pole side impact crash event comprises discriminating a rear pole side impact crash event from a front side pole or front barrier impact crash event of Okamura in order for occupants to be protected at a higher level by effectively performing the operation of the occupant protection member (see at least Okamura, para. [0023]).

Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Yoshida, in view of Hu, in view of Miyata, in view of Okamura, in view of Fujita, further in view of US 2016/0137152A1 (“Park”).

As per claim 14, Yoshida does not explicitly disclose wherein discriminating the rear pole side impact crash event from a front side impact crash event and from a barrier side impact crash event comprises: measuring via a satellite safety sensor (SSS) a vehicle Y-axis acceleration (SSS_Y); measuring via an airbag ECU (ACU) a vehicle Y-axis acceleration (ACU_Y); determining from the SSS_Y a vehicle Y-axis relative velocity (SSS_Y_Rel_Vel); determining from the ACU_Y a vehicle Y-axis relative velocity (ACU_Y_Rel_Vel); and comparing the SSS_Y_Rel_Vel to the ACU_Y_Rel_Vel to classify a side impact crash event as rear pole side impact crash or a front side impact crash event.
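As recited, claim 14 compares a Y-axis relative velocity derived at the satellite safety sensor (SSS_Y_Rel_Vel) against one derived at the airbag ECU (ACU_Y_Rel_Vel). A minimal sketch of that comparison follows; the integration step, the 2.0 factor, and the mapping of the two output labels are invented placeholders, with only the compared quantities taken from the claim text:

```python
# Minimal sketch of the comparison recited in claim 14. Only the quantities
# SSS_Y_Rel_Vel and ACU_Y_Rel_Vel come from the claim; the decision rule,
# factor, and label assignment are assumptions made for illustration.

def rel_vel(accel_samples, dt=0.001):
    """Y-axis relative velocity as the running integral of acceleration."""
    return sum(a * dt for a in accel_samples)

def classify_impact(sss_y_samples, acu_y_samples, factor=2.0):
    """Compare SSS_Y_Rel_Vel to ACU_Y_Rel_Vel (placeholder decision rule)."""
    sss_v = abs(rel_vel(sss_y_samples))  # from the satellite safety sensor
    acu_v = abs(rel_vel(acu_y_samples))  # from the centrally mounted airbag ECU
    # Assumption: a rear pole strike loads the satellite sensor location much
    # harder than the ECU location, so a large ratio reads as a rear pole hit.
    if sss_v > factor * acu_v:
        return "rear pole side impact"
    return "front side impact"

print(classify_impact([9.0] * 10, [2.0] * 10))  # rear pole side impact
print(classify_impact([3.0] * 10, [2.0] * 10))  # front side impact
```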
Miyata teaches wherein discriminating the rear pole side impact crash event from a front side impact crash event and from a barrier side impact crash event comprises: measuring via an airbag ECU (ACU) a vehicle Y-axis acceleration (ACU_Y) (see at least Miyata, para. [0072]: The first device includes a front left sensor (front left side acceleration sensor) 41, a front right sensor (front right side acceleration sensor) 42, a floor sensor (floor acceleration sensor) 43, other sensors 44 (for example, a vehicle velocity sensor, not shown in FIG. 1) and an airbag ECU (activation control ECU) 45. & para. [0074-0075]: The front left sensor 41 is configured to detect an acceleration in the vehicle front-rear direction acting on the front left sensor 41 itself (hereinafter referred to as “front/rear acceleration GLx”). The front/rear acceleration GLx is set to represent an acceleration toward a vehicle rear direction as a positive value. The front left sensor 41 is configured to further detect an acceleration in the vehicle widthwise direction acting on the front left sensor 41 itself (hereinafter referred to as “widthwise direction acceleration GLy” or “first lateral acceleration GLy”). The widthwise direction acceleration GLy is set to represent an acceleration toward an inside of the vehicle (namely, a right direction with respect to a forward direction of the vehicle) as a positive value. & para. [0081]: The airbag ECU 45 (hereinafter also simply referred to as “ECU 45”) is fixed to the floor constituting the cabin. The ECU 45 is connected to the front left sensor 41, the front right sensor 42 and the floor sensor 43, and is configured to receive the respective accelerations detected by those sensors.); comparing the SLy_Rel_Vel to the SRy_Rel_Vel to classify a side impact crash event as rear pole side impact crash or a front side impact crash event (see at least Miyata, Fig. 4A & para. 
[0103-0105]: As described above, the locus of the point P draws a waveform specific to each of the collision forms. The first device is configured to identify the collision form based on this viewpoint. That is, a “discrimination map A (collision form identification map)” illustrated in FIG. 4A is generated in advance, and the discrimination map A is stored in the ROM of the ECU 45. This discrimination map A is a map having the same axes as those of the discrimination map A of FIG. 3, and regions corresponding to the collision forms are set in advance in the discrimination map A. It should be noted that a front pole collision, a right side offset pole collision and a left side offset pole collision are treated as “pole collision” in the first device. Further, the right side offset collision and the left side offset collision are treated as “offset collision”. The right side small overlap collision and the left side small overlap collision are treated as “small overlap collision”, and the right side oblique collision and the left side oblique collision are treated as “oblique collision”. The ECU 45 is configured to monitor in which region of the discrimination map A the point P= (SLy,SRy) exists, and determine that the collision form is “a collision corresponding to the region in which the point P exists”). 
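Miyata's discrimination map A can be pictured as a region lookup on the point P = (SLy, SRy), built from the left- and right-side lateral velocity changes, i.e., from both the impact side and the non-impact side. The region boundaries below are invented placeholders; Miyata's are experimentally determined and stored in ROM:

```python
# Hedged sketch of Miyata's discrimination-map lookup: the point P = (SLy,
# SRy) is tested against pre-defined regions and named after the region it
# falls in. Region shapes and bounds here are invented placeholders.

REGIONS = [
    # (collision form, predicate on (sly, sry)) -- checked in order
    ("pole collision",          lambda sly, sry: sly > 4.0 and sry > 4.0),
    ("offset collision",        lambda sly, sry: sly > 4.0 and sry <= 1.0),
    ("small overlap collision", lambda sly, sry: sly > 2.0 and sry <= 2.0),
    ("oblique collision",       lambda sly, sry: sly > 1.0 and sry > 1.0),
]

def collision_form(sly: float, sry: float) -> str:
    """Return the collision form whose map region contains P = (SLy, SRy)."""
    for name, contains in REGIONS:
        if contains(sly, sry):
            return name
    return "no decision"

print(collision_form(5.0, 5.0))  # both sides decelerate hard -> pole collision
print(collision_form(5.0, 0.5))  # one-sided signature -> offset collision
```

A map lookup like this is why the non-impact side matters: two collision forms with similar impact-side traces can still land in different regions once the far-side response is plotted as the second coordinate.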
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of wherein discriminating the rear pole side impact crash event from a front side impact crash event and from a barrier side impact crash event comprises measuring via an airbag ECU (ACU) a vehicle Y-axis acceleration (ACU_Y), comparing the SLy_Rel_Vel to the SRy_Rel_Vel to classify a side impact crash event as rear pole side impact crash or a front side impact crash event of Miyata in order to more precisely discriminate between collision forms, thereby carrying out more appropriate activation control (see at least Miyata, para. [0009]). Fujita teaches measuring via a BS_Y a vehicle Y-axis acceleration (BS_Y) (see at least Fujita, col. 25 lines 50-60: As shown in FIG. 22, the integration unit 94 in the threshold variation pattern changing unit 92 integrates the measurements Gx and Gy output from the biaxial sensor 90 (that is, the decelerations in the directions x and y) once with respect to the time t, so as to yield an integral ∫Gx dt in the direction x and an integral ∫Gy dt in the direction y. The value obtained by integrating the deceleration once with respect to the time t represents the velocity v of a non-stationary object in the vehicle as mentioned above, and the integrals ∫Gx dt and ∫Gy dt thus respectively denote the velocities of the non-stationary object in the direction x and in the direction y.); determining from the BS_Y a vehicle Y-axis relative velocity (BS_Y_Rel_Vel) (see at least Fujita, col. 25 lines 50-60: As shown in FIG.
22, the integration unit 94 in the threshold variation pattern changing unit 92 integrates the measurements Gx and Gy output from the biaxial sensor 90 (that is, the decelerations in the directions x and y) once with respect to the time t, so as to yield an integral ∫Gx dt in the direction x and an integral ∫Gy dt in the direction y. The value obtained by integrating the deceleration once with respect to the time t represents the velocity v of a non-stationary object in the vehicle as mentioned above, and the integrals ∫Gx dt and ∫Gy dt thus respectively denote the velocities of the non-stationary object in the direction x and in the direction y.); It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of measuring via a BS_Y a vehicle Y-axis acceleration (BS_Y), determining from the BS_Y a vehicle Y-axis relative velocity (BS_Y_Rel_Vel) of Fujita in order to enable a passive vehicle occupant restraint to be activated with high accuracy, irrespective of the type of a collision (see at least Fujita, col. 2 lines 24-25). Park teaches determining from the ACU_Y a vehicle Y-axis relative velocity (ACU_Y_Rel_Vel) (see at least Park, para. [0015-0016]: The control unit may calculate a speed and a speed change of the first side two-axis sensor using a Y-axis acceleration value obtained by the first side two-axis sensor, calculates speed of the second side two-axis sensor using a Y-axis acceleration value obtained by the second side two-axis sensor, and determines whether the vehicle passenger is in the broadside collision situation, using the speed and the speed change of the first side two-axis sensor and the speed of the second side two-axis sensor.
The control unit may calculate a speed and a speed change of the first side two-axis sensor using a Y-axis acceleration value obtained by the first side two-axis sensor, calculates speed of the second side two-axis sensor using the Y-axis acceleration value obtained by the second side two-axis sensor, calculates a plural threshold values using the speed of the second side two-axis sensor, and determines that the vehicle passenger is in the broadside collision situation when the speed and the speed change of the first side two-axis sensor are equal to or higher than the threshold values, and average deceleration of the front two-axis sensor or average deceleration of the second side two-axis sensor are higher than predetermined safing threshold value. & para. [0080-0081]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of determining from the ACU_Y a vehicle Y-axis relative velocity (ACU_Y_Rel_Vel) of Park in order to more accurately sense a collision and operate a passenger protection unit (see at least Park, para. [0028]). Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Yoshida, in view of Hu, in view of Miyata, in view of US 2015/0266439A1 (“Foo”). As per claim 17, Yoshida discloses further comprising: a satellite safety sensor (SSS) configured to be mounted in a roof of the vehicle along a vehicle Y-axis above rear row seating in a vehicle (see at least Yoshida, para. [0126]: Specifically, for example, a satellite collision sensor 16 may be provided at each of the right and left side-members 6. Further, the satellite collision sensor 16 may be provided in the middle part in the vehicle overall length direction, for example, at or near a position corresponding to the B pillar (not shown).
Further, the satellite collision sensor 16 may be provided in the rear part in the vehicle overall length direction, for example, at or near a position corresponding to the C-pillar (not shown).); and an airbag control unit (ACU) configured to be mounted in an instrument panel of the vehicle along the vehicle Y-axis (see at least Yoshida, para. [0032]: The driving control device 12 has a box-shaped housing and positioned on the vehicle center line L in a plan view.), wherein the controller is implemented in the ACU and wherein the SSS is configured to communicate with the ACU (see at least Yoshida, para. [0032]: The driving control device 12 includes a built-in collision sensor 12a inside the housing. The built-in collision sensor 12a is a collision sensor built in the driving control device 12, and is configured to detect a collision between the vehicle 1 and an object B. The built-in collision sensor 12a is a uniaxial acceleration sensor, also called a floor G sensor, and is configured to generate an output corresponding to the longitudinal acceleration acting on the vehicle 1. & para. [0035-0036]: Further, the vehicle 1 is provided with a satellite collision sensor 16 and an electromagnetic wave radar sensor 17. The satellite collision sensor 16 is a collision sensor provided separately from the driving control device 12, and is connected to the driving control device 12 via the above mentioned in-vehicle safety system network. The electromagnetic wave radar sensor 17 is connected to the driving control device 12 via an in-vehicle network conforming to a certain communication standard such as CAN.). 
Yoshida does not explicitly disclose a left B-pillar side impact sensor (LBP_SIS) configured to be mounted on a left B-pillar of the vehicle; a right B-pillar side impact sensor (RBP_SIS) configured to be mounted on a right B-pillar of the vehicle; wherein the controller is implemented in the ACU and wherein the LBP_SIS, RBP_SIS, and SSS are configured to communicate with the ACU. Foo discloses a left B-pillar side impact sensor (LBP_SIS) configured to be mounted on a left B-pillar of the vehicle (see at least Foo, Fig. 1 & para. [0030-0033]: The MAS sensor 72 includes two side-impact-satellite (“SIS”) crash acceleration sensors for sensing crash acceleration in the X-direction (sensor 74) and the Y-direction (sensor 78). The SIS sensor 74 provides a crash acceleration signal designated as LBX-SIS and the SIS sensor 78 provides a crash acceleration signal designated as LBY-SIS, both having frequency and amplitude characteristics indicative of crash acceleration in the X-axis direction and the Y-axis, respectively. These output signals are also connected to the ACU 40 for processing and evaluation.); a right B-pillar side impact sensor (RBP_SIS) configured to be mounted on a right B-pillar of the vehicle (see at least Foo, Fig. 1 & para. [0030-0033]: A remote located passenger's multi-axis satellite sensor (“MAS”) 80 is mounted on the passenger's side of the vehicle such as at the passenger's side B-pillar and includes an X-direction side-impact-satellite (“SIS”) sensor 82 and a Y-direction side-impact-satellite (“SIS”) sensor 83. The SIS sensor 82 provides a crash acceleration signal designated as RBX-SIS having frequency and amplitude characteristics indicative of crash acceleration in the X-direction. The SIS sensor 83 provides a crash acceleration signal designated as RBY-SIS having frequency and amplitude characteristics indicative of crash acceleration in the Y-direction.
These output signals are also connected to the ACU 40 for processing and evaluation.); wherein the controller is implemented in the ACU and wherein the LBP_SIS, RBP_SIS, and SSS are configured to communicate with the ACU (see at least Foo, Fig. 1 & para. [0030-0033]: These output signals of the CZS 64, 66, 68, and 70 are connected to the ACU 40 for processing and evaluation. The SIS sensor 74 provides a crash acceleration signal designated as LBX-SIS and the SIS sensor 78 provides a crash acceleration signal designated as LBY-SIS, both having frequency and amplitude characteristics indicative of crash acceleration in the X-axis direction and the Y-axis, respectively. These output signals are also connected to the ACU 40 for processing and evaluation. The SIS sensor 83 provides a crash acceleration signal designated as RBY-SIS having frequency and amplitude characteristics indicative of crash acceleration in the Y-direction. These output signals are also connected to the ACU 40 for processing and evaluation.). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Yoshida to incorporate the teaching of a left B-pillar side impact sensor (LBP_SIS) configured to be mounted on a left B-pillar of the vehicle, a right B-pillar side impact sensor (RBP_SIS) configured to be mounted on a right B-pillar of the vehicle, wherein the controller is implemented in the ACU and wherein the LBP_SIS, RBP_SIS, and SSS are configured to communicate with the ACU of Foo in order to provide multi-region enhanced discrimination of vehicle crash events using an event classification arrangement that can discriminate a high speed frontal rigid barrier impact event, an offset deformable barrier impact event, an oblique angular frontal rigid barrier impact event, and a small/narrow overlap impact event (see at least Foo, para. [0002]).
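The single time-integration that Fujita and Park are cited for above, turning a sampled Y-axis acceleration such as ACU_Y into a relative velocity such as ACU_Y_Rel_Vel, can be sketched as follows. The trapezoidal method, sample values, and units here are illustrative assumptions, not details from either reference; production airbag ECUs typically do this in fixed-point arithmetic.

```python
# Sketch of the single time-integration cited from Fujita (col. 25) and
# Park: integrating a sampled acceleration once over time yields a
# relative velocity (e.g. ACU_Y -> ACU_Y_Rel_Vel). The trapezoidal rule,
# sample rate, and values are hypothetical.

def relative_velocity(accel_samples, dt):
    """Trapezoidal integration of acceleration samples (m/s^2) over a
    fixed time step dt (s), returning the running velocity trace."""
    vel = 0.0
    trace = []
    prev = accel_samples[0]
    for a in accel_samples[1:]:
        # Area under the acceleration curve for one sample interval.
        vel += 0.5 * (prev + a) * dt
        trace.append(vel)
        prev = a
    return trace
```

A downstream comparator would then feed such velocity traces (e.g. SLy_Rel_Vel versus SRy_Rel_Vel) into the region-based classification and threshold checks that the rejection maps to Miyata and Park.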
Allowable Subject Matter

Claims 4-12, & 15 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MOHAMED ABDO ALGEHAIM whose telephone number is (571)272-3628. The examiner can normally be reached Monday-Friday 8-5PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fadey Jabr can be reached at 571-272-1516. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /MOHAMED ABDO ALGEHAIM/Primary Examiner, Art Unit 3668

Prosecution Timeline

Jul 30, 2019
Application Filed
Feb 11, 2022
Non-Final Rejection — §103
May 18, 2022
Response Filed
Jun 17, 2022
Final Rejection — §103
Sep 22, 2022
Response after Non-Final Action
Dec 21, 2022
Request for Continued Examination
Dec 22, 2022
Response after Non-Final Action
Dec 30, 2022
Non-Final Rejection — §103
Apr 13, 2023
Response Filed
Jul 14, 2023
Final Rejection — §103
Oct 20, 2023
Notice of Allowance
Dec 20, 2023
Response after Non-Final Action
Dec 31, 2023
Response after Non-Final Action
Mar 07, 2024
Response after Non-Final Action
May 09, 2024
Response after Non-Final Action
May 13, 2024
Response after Non-Final Action
May 14, 2024
Response after Non-Final Action
May 14, 2024
Response after Non-Final Action
May 28, 2025
Response after Non-Final Action
Jul 29, 2025
Request for Continued Examination
Aug 03, 2025
Response after Non-Final Action
Sep 05, 2025
Non-Final Rejection — §103
Dec 10, 2025
Response Filed
Jan 10, 2026
Final Rejection — §103
Apr 14, 2026
Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594963
DETECTING AN UNKNOWN OBJECT BY A LEAD AUTONOMOUS VEHICLE (AV) AND UPDATING ROUTING PLANS FOR FOLLOWING AVs
2y 5m to grant Granted Apr 07, 2026
Patent 12597865
INVERTER
2y 5m to grant Granted Apr 07, 2026
Patent 12589978
TRUCK-TABLET INTERFACE
2y 5m to grant Granted Mar 31, 2026
Patent 12565235
DETECTING A CONSTRUCTION ZONE BY A LEAD AUTONOMOUS VEHICLE (AV) AND UPDATING ROUTING PLANS FOR FOLLOWING AVs
2y 5m to grant Granted Mar 03, 2026
Patent 12559228
THERMAL MANAGEMENT SYSTEM FOR AN AIRCRAFT INCLUDING AN ELECTRIC PROPULSION ENGINE
2y 5m to grant Granted Feb 24, 2026
Based on 5 most recent grants.

Prosecution Projections

7-8
Expected OA Rounds
59%
Grant Probability
81%
With Interview (+21.9%)
3y 3m
Median Time to Grant
High
PTA Risk
Based on 207 resolved cases by this examiner. Grant probability derived from career allow rate.
