DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Effective Filing Date
The actual filing date of the instant application is 09/06/2024. However, the instant application claims the benefit of two provisional applications: 63/541,556, filed on 09/29/2023, and 63/541,581, also filed on 09/29/2023. Provisional 63/541,581 is directed to chemistry and is therefore not related to the current application. Provisional 63/541,556 has a 38-page specification and is related to the current application. As such, the effective filing date of each of the instant application’s claims under examination may be as recent as the instant application’s actual filing date of 09/06/2024, or as early as 09/29/2023, depending on whether there is appropriate specification support for each particular claim in the earlier-filed specification. If a prior art rejection of one or more claims made in an Office action during prosecution of the instant application includes one or more prior art references that fall between 09/29/2023 and 09/06/2024 (an "intervening" reference), that is because the examiner did not see sufficient support for those particular claims in the earlier-filed application that pre-dates those references. If that occurs, and Applicant can specifically identify appropriate specification support for each of those claims in the earlier-filed application, then the Examiner may determine that one or more of those prior art rejections will need to be withdrawn.
Drawings
Applicant’s amendment to the specification filed on 12/31/2025 has been fully considered and overcomes the previous drawing objection.
Status of Claims
Amended: 1 – 4, 6, and 8 – 20
Rejected: 1 – 4, 6, and 8 – 22
Cancelled: 5 and 7
New: 21 – 22
Response to Arguments
Applicant’s arguments on pages 11 – 13 regarding the 101 rejection have been fully considered but are not persuasive. After further analyzing the claims following the interview on December 17th, 2025, the examiner notes that claims 1 – 4, 6, 8, and 17 – 20 (plus newly added claims 21 – 22) remain rejected under 35 U.S.C. 101 because they are still directed to a mental process. Sending data is not a practical application because the vehicle is merely transmitting information after an event. The vehicle is not physically performing an action, and the claims are therefore still directed to a mental process.
Applicant’s arguments on pages 13 – 17 pertaining to the prior art rejections have been fully considered but are moot in view of the new grounds of rejection based at least on Kim (US Pub No: 2022/0048502 A1), which were necessitated by amendment.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
101 Analysis: Step 1
Claims 1 – 4, 6, 8, and 17 – 22 are rejected under 35 U.S.C. 101 because the claimed subject matter is drawn to an abstract idea without significantly more, and the abstract idea as a judicial exception is not integrated into a practical application. With regard to step 1, independent claim 1 is directed to a machine (a computing system), which is a statutory category.
101 Analysis: Step 2A, Prong 1
For step 2A, prong 1, the claims are to be analyzed under MPEP 2106.04 to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claim 1 includes limitations that recite an abstract idea (emphasized below in bold text).
Claim 1 recites:
A computing system for on-vehicle event detection using image data from a rear-facing vehicle camera, the computing system comprising:
one or more processors, and
memory operably connected to the one or more processors, the memory storing computer-executable instructions that, when executed by the one or more processors, cause the computing system to perform operations including:
receiving image data captured by a rear-facing camera installed on a vehicle operating in a driving environment;
storing the image data within a storage unit on the vehicle;
determining, based at least in part on an on-vehicle analysis of the image data, a characteristic of a second vehicle located behind the vehicle;
detecting an event during operation of the vehicle within a driving environment; and
in response to detecting the event in the driving environment transmitting the image data and the characteristic of the second vehicle from the storage unit to a remote off-vehicle server.
These limitations, as drafted, under their broadest reasonable interpretation, cover performance of the limitations as a mental process. That is, nothing in the claim elements precludes the steps from practically being performed in the mind.
101 Analysis: Step 2A, Prong 2
Regarding Prong 2 of the Step 2A analysis in the MPEP 2106.04(d), the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the MPEP 2106.04(d), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a "practical application.”
In the present case, the additional elements beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional elements” while the bolded portions continue to represent the “abstract idea”):
Claim 1 recites:
A computing system for on-vehicle event detection using image data from a rear-facing vehicle camera, the computing system comprising:
one or more processors, and
memory operably connected to the one or more processors, the memory storing computer-executable instructions that, when executed by the one or more processors, cause the computing system to perform operations including:
receiving image data captured by a rear-facing camera installed on a vehicle operating in a driving environment;
storing the image data within a storage unit on the vehicle;
determining, based at least in part on an on-vehicle analysis of the image data, a characteristic of a second vehicle located behind the vehicle;
detecting an event during operation of the vehicle within a driving environment; and
in response to detecting the event in the driving environment transmitting the image data and the characteristic of the second vehicle from the storage unit to a remote off-vehicle server.
For the following reason(s), the examiner submits that the above identified additional elements do not integrate the above-noted abstract idea into a practical application.
Regarding the elements of “A computing system,” “the computing system comprising: one or more processors, and memory operably connected to the one or more processors, the memory storing computer-executable instructions that, when executed by the one or more processors, cause the computing system to perform operations including,” and “transmitting the image data and the characteristic of the second vehicle from the storage unit to a remote off-vehicle server”: these limitations merely describe generic computing components that allow the abstract idea to be applied on a computer, or merely use a computer as a tool to perform the abstract idea (MPEP § 2106.05(f)). Thus, taken alone, these additional elements do not integrate the abstract idea into a practical application.
Regarding the additional elements of “for on-vehicle event detection using image data from a rear-facing vehicle camera,” “based at least in part on an on-vehicle analysis of the image data,” “based at least in part on determining the event,” and “in response to detecting the event in the driving environment”: these elements merely indicate a field of use or technological environment in which to apply the judicial exception. They do not amount to significantly more than the exception itself and cannot integrate the judicial exception into a practical application.
101 Analysis: Step 2B
Regarding Step 2B in MPEP 2106.05, independent claim 1 does not include additional elements, considered both individually and as an ordered combination, that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to the determination that the claim does not integrate the abstract idea into a practical application.
As discussed above, the additional elements of “A computing system,” “the computing system comprising: one or more processors, and memory operably connected to the one or more processors, the memory storing computer-executable instructions that, when executed by the one or more processors, cause the computing system to perform operations including,” and “transmitting the image data to a remote server” each amount to mere instructions to apply the exception. Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general-purpose computer or computer components after the fact to an abstract idea does not provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Automotive, LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit).
Independent claim 17’s analysis and rejection are considered analogous to the analysis and rejection of independent claim 1 above, and are not being repeated for the sake of brevity.
Dependent claims 2 – 4, 6, 8, and 18 – 22 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or contain well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application.
Claims 9 – 16 are not rejected under 35 U.S.C. 101 because a control operation is initiated to control the vehicle. The examiner suggests adding a similar limitation to the other independent claims.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 – 3, 6, 8 – 9, 12 – 13, and 15 – 22 are rejected under 35 U.S.C. 103 as being unpatentable over Kiefer et al. (US Pub No: 2020/0286308 A1, hereinafter Kiefer) in view of Kim (US Pub No: 2022/0048502 A1, hereinafter Kim).
Regarding Claim 1:
Kiefer discloses:
A computing system for on-vehicle event detection using image data from a rear-facing vehicle camera, the computing system comprising: one or more processors, and memory operably connected to the one or more processors, the memory storing computer-executable instructions that, when executed by the one or more processors, cause the computing system to perform operations including. Paragraph [0029] describes a processor 210 and a memory 260.
receiving image data captured by a rear-facing camera installed on a vehicle operating in a driving environment. Paragraph [0032] describes a rear mounted camera on a motor vehicle. These cameras are on the vehicle and therefore operate in a driving environment.
storing the image data within a storage unit on the vehicle. Paragraph [0027] describes storing image data collected via various cameras. Paragraph [0028] describes that data can be combined with vehicle telemetry and camera images.
detecting an event during operation of the vehicle within a driving environment. Paragraph [0032] describes a near contact event being determined in response to a contact detection algorithm. If no contact event is detected (310), some of the data stored in the buffer memory is deleted or overwritten and more recent data is stored.
and in response to detecting the event in the driving environment, transmitting the image data and the characteristic of the second vehicle from the storage unit to a remote off-vehicle server. Paragraph [0033] describes that data can be transmitted by the vehicle to a remote server in response to a request for data.
Kiefer does not disclose determining a characteristic of a second vehicle located behind the vehicle.
Kim, in an analogous field of endeavor, teaches:
determining, based at least in part on an on-vehicle analysis of the image data, a characteristic of a second vehicle located behind the vehicle. Paragraph [0007] describes an image device that records an event occurrence and identifies an object in the rear. The system determines the speed and distance of the object in the rear and displays the information.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Kiefer to incorporate the teachings of Kim to determine a characteristic of a second vehicle located behind the vehicle. One would have been motivated to do so to identify the exact cause of an accident based on information before and after the accident (Abstract of Kim).
Kiefer and Kim teach:
and in response to detecting the event in the driving environment, transmitting the image data and the characteristic of the second vehicle from the storage unit to a remote off-vehicle server. Paragraph [0033] of Kiefer describes that data can be transmitted by the vehicle to a remote server in response to a request for data. Paragraph [0051] of Kim describes a central server 20 that automatically stores and transmits images to a cloud server through software.
Claims 9 and 17 are substantially similar to claim 1 and are rejected on the same grounds.
Regarding Claim 2:
Kiefer discloses:
The vehicle-based computing system of claim 1, wherein detecting the event comprises at least one of: determining a collision or a near-miss collision involving the vehicle; determining damage to the vehicle; determining a high-risk driving scenario within the driving environment; or determining an expiration of a periodic timer associated with the rear-facing camera. Paragraph [0032] describes a near contact event being determined in response to a contact detection algorithm. If no contact event is detected (310), some of the data stored in the buffer memory is deleted or overwritten and more recent data is stored.
Regarding Claim 3:
Kiefer discloses:
The vehicle-based computing system of claim 1, wherein detecting the event comprises: receiving additional data associated with the image data, from at least one of: a telematics system of the vehicle; a front-facing or side-facing camera of the vehicle; or a road condition or weather condition data source associated with the vehicle; and determining the event, based at least in part on the additional data. Paragraph [0032] describes front, side and rear mounted cameras. Paragraph [0027] describes storing image data collected via various cameras. Paragraph [0028] describes that data can be combined with vehicle telemetry and camera images.
Claims 16 and 20 are substantially similar to claim 3 and are rejected on the same grounds.
Regarding Claim 18:
Kiefer and Kim teach:
The vehicle-based computing system of claim 1, wherein transmitting the image data and the characteristic of the second vehicle to the remote off-vehicle server comprises at least one of: transmitting the image data and the characteristic of the second vehicle via an Internet connection associated with the vehicle; transmitting the image data and the characteristic of the second vehicle via a wireless network connection associated with the rear-facing camera; or transmitting the image data and the characteristic of the second vehicle via a mobile device of an occupant of the vehicle. Paragraph [0033] of Kiefer describes a request for data that uses the internet. Paragraph [0028] of Kiefer describes transmitting information via a wireless communications channel or network via V2V, V2I, or V2X. Paragraph [0007] of Kim describes an image device that records an event occurrence and identifies an object in the rear. The system determines the speed and distance of the object in the rear and displays the information. Paragraph [0051] of Kim describes a central server 20 that automatically stores and transmits images to a cloud server through software.
The reason to combine Kim with Kiefer is the same as in claim 1.
Regarding Claim 6:
Kim teaches:
The vehicle-based computing system of claim 1, wherein the characteristic of the second vehicle comprises at least one of: a vehicle identifier of the second vehicle; a following distance of the second vehicle; a model of the second vehicle; a speed of the second vehicle; an acceleration of the second vehicle; or a behavior of a driver of the second vehicle. Paragraph [0007] describes an image device that records an event occurrence and identifies an object in the rear. The system determines the speed and distance of the object in the rear and displays the information.
The reason to combine Kim with Kiefer is the same as in claim 1.
Claims 13, 15 and 19 are substantially similar to claim 6 and are rejected on the same grounds.
Regarding Claim 8:
Kiefer discloses:
The vehicle-based computing system of claim 1, wherein storing the image data comprises: causing the image data to be retained within the storage unit for a duration of time, based upon determining a driving condition of the vehicle. Paragraph [0032] describes a near contact event being determined in response to a contact detection algorithm. If no contact event is detected (310), some of the data stored in the buffer memory is deleted or overwritten and more recent data is stored.
Regarding Claim 12:
Kiefer discloses:
The vehicle-based computing system of claim 9, wherein initiating the control operation on the vehicle comprises: determining a deceleration value for a braking maneuver performed by the vehicle. Paragraph [0002] describes when vehicle to vehicle contact event is detected, vehicle speed, various accelerations, braking system engagement and steering information can be stored.
Regarding Claim 21:
Kim teaches:
The vehicle-based computing system of claim 1, wherein receiving the image data captured by the rear-facing camera comprises: automatically activating the rear-facing camera to capture the image data, based at least in part on an activation criteria, wherein the activation criteria comprises at least one of: a speed threshold of the vehicle; an acceleration threshold of the vehicle; a braking threshold of the vehicle; or a detection of a bump or jerk during operation of the vehicle. Paragraph [0031] describes an event occurrence determinator 12 can determine the occurrence of an event when a relative speed exceeds a certain speed. Paragraph [0037] describes that when an event occurs, the information is stored in the storage 16 in real time.
Regarding Claim 22:
Kim teaches:
The vehicle-based computing system of claim 1, wherein storing the image data within the storage unit comprises: automatically retaining the captured image data within storage unit, based at least in part on a storage criteria, wherein the storage criteria comprises at least one of: a speed threshold of the vehicle; an acceleration threshold of the vehicle; a braking threshold of the vehicle; or a detection of a bump or jerk during operation of the vehicle. Paragraph [0031] describes an event occurrence determinator 12 can determine the occurrence of an event when a relative speed exceeds a certain speed. Paragraph [0037] describes that when an event occurs, the information is stored in the storage 16 in real time.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Kiefer in view of Kim, and further in view of Vollmer (US Pub No: 2017/0072949 A1, hereinafter Vollmer).
Regarding Claim 4:
Kiefer and Kim teach the above limitations of claim 3. Kiefer and Kim do not teach determining a subset of additional data based on the impact surface of the vehicle.
Vollmer, in an analogous field of endeavor, in combination with Kim, teaches:
The vehicle-based computing system of claim 3, wherein the event comprises a collision on an impact surface of the vehicle, and wherein the operations further include: determining a subset of the additional data, based at least in part on the impact surface of the vehicle; and transmitting the subset of the additional data to the remote off-vehicle server. Paragraph [0006] of Vollmer describes a collision site between a vehicle and an object. The data set includes a first subset of a plurality of possible impact regions with a first collision response instruction. Paragraph [0051] of Kim describes a central server 20 that automatically stores and transmits images to a cloud server through software.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Kiefer to incorporate the teachings of Vollmer to determine a subset of additional data based on the impact surface of the vehicle. One would have been motivated to do so to help minimize injuries during sudden vehicle stops ([0002] of Vollmer).
The reason to combine Kim with Kiefer is the same as in claim 1.
Claims 10 – 11 are rejected under 35 U.S.C. 103 as being unpatentable over Kiefer in view of Kim, and further in view of Oboril et al. (US Pub No: 2025/0074464 A1, hereinafter Oboril).
Regarding Claim 10:
Kiefer and Kim teach the above limitations of claim 9. Kiefer and Kim do not teach initiating a steering maneuver by the vehicle.
Oboril, in an analogous field of endeavor, teaches:
The vehicle-based computing system of claim 9, wherein initiating the control operation on the vehicle comprises at least one of: activating a brake light or hazard light on the vehicle; initiating an acceleration maneuver by the vehicle; or initiating a steering maneuver by the vehicle. Paragraph [0052] describes helping a driver negotiate a high-risk area by adding torque to a steering wheel in a certain direction.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Kiefer to incorporate the teachings of Oboril to initiate a steering maneuver by the vehicle. One would have been motivated to do so to allow the driver to negotiate a high-risk area ([0052] of Oboril).
Regarding Claim 11:
Oboril teaches:
The vehicle-based computing system of claim 9, wherein initiating the control operation on the vehicle comprises: transmitting a notification identifying the high-risk driving condition, via an audio or visual system of the vehicle, to a driver of the vehicle. Paragraph [0058] describes a driving instruction module 316 that provides an audio and/or haptic instruction.
The reason to combine Oboril with Kiefer is the same as in claim 10.
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Kiefer in view of Kim, and further in view of Tanaka et al. (US Pub No: 2019/0193728 A1, hereinafter Tanaka).
Regarding Claim 14:
Kiefer and Kim teach the above limitations of claim 1. Kiefer and Kim do not teach a backup camera that is activated when the vehicle is put into reverse.
Tanaka, in an analogous field of endeavor, teaches:
The vehicle-based computing system of claim 1, wherein the rear-facing camera comprises an integrated backup camera of the vehicle, wherein integrated backup camera is configured to activate the rear-facing camera when the vehicle is put into reverse. Paragraph [0066] describes a vehicle 11 equipped with a rear-view camera 39. The rear-view camera 39 is activated when the shift lever is moved to reverse.
Therefore, it would have been prima facie obvious to one of ordinary skill in the art before the effective filing date, with a reasonable expectation of success, to have modified Kiefer to incorporate the teachings of Tanaka to provide a backup camera that is activated when the vehicle is put into reverse. One would have been motivated to do so to supply visual information in the blind spot of a driver ([0066] of Tanaka).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Salerno (US Pub No: 2023/0078631 A1): Systems and methods are provided to detect a motion event corresponding to at least one of a vehicle tow assembly, a winch assembly, or a gear guard cable. With the use of one or more cameras, the system captures images of a cable (e.g., a gear guard cable, a winch cable, or a tow cable). With the use of processing circuitry, a cable is identified in the captured images. The movement of the cable is tracked in the captured images. A motion event of the cable is identified based on the movement. In response to detecting the motion event, a vehicle comprised of the system and the cable performs an action.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAY KHANDPUR whose telephone number is (571)272-5090. The examiner can normally be reached Monday - Friday 8:30 - 6:30.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Worden, can be reached at (571) 272-4876. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JAY KHANDPUR/Primary Patent Examiner, Art Unit 3658