Prosecution Insights
Last updated: April 18, 2026
Application No. 17/965,701

PROPERTY LOSS PREVENTION

Final Rejection: §101, §103
Filed: Oct 13, 2022
Examiner: KNUDSON, ELLE ROSE
Art Unit: 3667
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Toyota Motor North America, Inc.
OA Round: 4 (Final)

Grant Probability: 73% (Favorable)
OA Rounds: 5-6
To Grant: 2y 10m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (11 granted / 15 resolved; +21.3% vs TC avg; above average)
Interview Lift: +44.4% on resolved cases with interview (strong)
Avg Prosecution: 2y 10m (typical timeline); 27 currently pending
Total Applications: 42 across all art units (career history)
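The card arithmetic above is easy to reproduce. A minimal sketch follows; the function names are illustrative, and since the with/without-interview split behind the +44.4% figure is not shown on the card, only the formula is sketched:

```python
# Reproduce the examiner-card arithmetic: 11 granted of 15 resolved cases.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Percentage-point allowance lift for cases resolved with an interview."""
    return rate_with - rate_without

print(f"Career allow rate: {allow_rate(11, 15):.0f}%")  # Career allow rate: 73%
```

The 73% headline is the rounded 11/15 ratio; the lift is a simple difference of the two conditional allowance rates.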

Statute-Specific Performance

§101: 26.7% (-13.3% vs TC avg)
§103: 46.2% (+6.2% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 14.1% (-25.9% vs TC avg)

Tech Center averages are estimates. Based on career data from 15 resolved cases.
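Each statute card gives the examiner's rate together with its delta against the Tech Center average, so the baseline can be recovered by subtraction (tc_avg = rate - delta). A quick check, using only the numbers shown above:

```python
# Recover the implied Tech Center baseline from each statute card.
# Each card reports (examiner_rate, delta), where delta = rate - tc_avg.
cards = {
    "§101": (26.7, -13.3),
    "§103": (46.2, +6.2),
    "§102": (11.1, -28.9),
    "§112": (14.1, -25.9),
}

implied_tc_avg = {s: round(rate - delta, 1) for s, (rate, delta) in cards.items()}
print(implied_tc_avg)
# Every card implies the same ~40.0% Tech Center baseline.
```

The four cards are internally consistent: each one implies a 40.0% Tech Center average.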

Office Action

Rejection bases: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Response to Amendment

This FINAL action is in response to the amendment filed on 12/18/2025. Claims 1, 2, 5, 7, 8, 9, 12, 14, 15, 16, and 20 are amended. Claims 3, 4, 6, 10, 11, 13, 17, 18, and 19 are previously presented.

Information Disclosure Statement

The information disclosure statements (IDS) submitted on 11/23/2025 and 03/03/2026 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.

Claim Rejections - 35 USC § 101

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed inventions are directed to a judicial exception without significantly more, as determined by the Subject Matter Eligibility Test detailed below.

Step 1

Step 1 of the Subject Matter Eligibility Test entails considering whether the claimed subject matter falls within the four statutory categories of patentable subject matter identified by 35 U.S.C. 101: process, machine, manufacture, or composition of matter. Independent claims 1, 8, and 15 are directed towards a method, an apparatus, and a non-transitory computer-readable storage medium, respectively.
Therefore, each of the independent claims 1, 8, and 15, and the corresponding dependent claims 2-7, 9-14, and 16-20, is directed to a statutory category of invention under Step 1.

Step 2A, Prong 1

If the claim recites a statutory category of invention, the claim requires further analysis in Step 2A. Step 2A of the Subject Matter Eligibility Test is a two-prong inquiry. In Prong 1, examiners evaluate whether the claim recites a judicial exception. Regarding Prong 1, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.

Independent claim 1 recites abstract limitations, including those shown in bold below.

A method comprising:
    capturing image data from a cabin of a vehicle by a software application;
    determining an object is present in the vehicle based on a proximity of the object to an occupant identified within the image data and a timestamp of when the object is identified;
    determining a type of the object based on execution of a neural network on the image data by the software application;
    storing the type of the object and the timestamp in a table;
    detecting an opening of a door of the vehicle;
    generating a notification comprising an image of the object and the timestamp from the table; and
    transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door.

These limitations, as drafted, describe a process that, under its broadest reasonable interpretation, covers performance of the limitations in the mind, or by a human using pen and paper, and therefore recites mental processes.
For example, “determining an object is present in the vehicle based on a proximity of the object to an occupant identified and a timestamp of when the object is identified” may be interpreted as a mental determination made according to observable data, such as an occupant of the vehicle looking around the cabin of a vehicle and identifying that another occupant is nearby an object of their possession and noting the time at which this realization occurs. Additionally, “determining a type of the object” may be interpreted as a mental determination made according to observable data, such as a vehicle occupant looking at an object in the vehicle, noticing that it is fuzzy and two feet long, and thus determining that it is a type of pet. Additionally, “storing the type of the object and the timestamp in a table” can be considered a mental process of memorizing information through association between different data sets. Additionally, “detecting an opening of a door of the vehicle” may be interpreted as a mental determination made according to observable data, such as an occupant of a vehicle using their senses of sight and/or sound to notice that a vehicle door has been opened. Thus, the claim recites an abstract idea.

Claims 8 and 15 recite abstract limitations analogous to those identified above with respect to claim 1, and therefore recite abstract ideas per the same analysis.

Step 2A, Prong 2

If the claim recites a judicial exception in Step 2A, Prong 1, the claim requires further analysis in Step 2A, Prong 2. In Step 2A, Prong 2, examiners evaluate whether the claim recites additional elements that integrate the exception into a practical application of that exception. Regarding Prong 2, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application.
As noted in MPEP § 2106.04(d), it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking the use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application”.

Claim 1 recites additional elements including those underlined below.

A method comprising:
    capturing image data from a cabin of a vehicle by a software application;
    determining an object is present in the vehicle based on a proximity of the object to an occupant identified within the image data and a timestamp of when the object is identified;
    determining a type of the object based on execution of a neural network on the image data by the software application;
    storing the type of the object and the timestamp in a table;
    detecting an opening of a door of the vehicle;
    generating a notification comprising an image of the object and the timestamp from the table; and
    transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door.

Regarding the additional limitations of “by a software application” and “based on execution of a neural network on the image data by the software application”, the examiner submits that these limitations are an attempt to generally link additional elements to a technological environment. In particular, the determining by a software application is recited at a high level of generality and merely automates the determining steps, therefore acting as a generic computer to perform the abstract idea.
The software application is claimed generically and is operating in its ordinary capacity and does not use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception. The additional limitation is no more than mere instructions to apply the exception using a computer (the software).

The recitation of “capturing image data from a cabin of a vehicle” amounts to mere data receiving, which is a form of insignificant extra-solution activity. Furthermore, the recitation of “generating a notification comprising an image of the object and the timestamp from the table” and “transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door” amounts to sending or displaying information, which is a form of insignificant extra-solution activity. Accordingly, in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

Step 2B

If the additional elements do not integrate the exception into a practical application in Step 2A, Prong 2, then the claim is directed to the recited judicial exception, and requires further analysis under Step 2B to determine whether it provides an inventive concept (i.e., whether the additional elements amount to significantly more than the exception itself). As discussed above, “based on execution of a neural network on the image data by the software application” amounts to mere instructions to apply the exception. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.
Use of a computer or other machinery in its ordinary capacity for economic or other tasks (e.g., to receive, store, or transmit data) or simply adding a general-purpose computer or computer components after the fact to an abstract idea does not provide significantly more. See Affinity Labs v. DirecTV, 838 F.3d 1253, 1262, 120 USPQ2d 1201, 1207 (Fed. Cir. 2016) (cellular telephone); TLI Communications LLC v. AV Auto., LLC, 823 F.3d 607, 613, 118 USPQ2d 1744, 1748 (Fed. Cir. 2016) (computer server and telephone unit).

As discussed above, “capturing image data from a cabin of a vehicle by a software application” amounts to insignificant extra-solution activity. MPEP § 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto., LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here).

As discussed above, “generating a notification comprising an image of the object and the timestamp from the table; and transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door” amounts to insignificant extra-solution activity. MPEP § 2106.05(d)(II), and the cases cited therein, including Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicate that the mere displaying of data (i.e., notifying an individual of some status or characteristic of the vehicle) is a well-understood, routine, and conventional function.
Thus, even when viewed as an ordered combination, nothing in the claims adds significantly more (i.e., an inventive concept) to the abstract idea.

Claims 8 and 15 further recite “A system, comprising: a processor; and a memory, wherein the processor and the memory are communicably coupled, and the processor is configured to…” and “a non-transitory computer-readable storage medium”, respectively, which amount to merely generic components which allow the abstract idea to be applied (MPEP § 2106.05(f)(2)). The examiner submits that these elements are mere computers or other machinery used as a tool to perform the existing process.

Dependent claims 2-7, 9-14, and 16-20 do not recite any further limitations that cause the claims to be patent eligible. Rather, the various limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application (i.e., further characterizing the notification and determination steps). Therefore, dependent claims 2-7, 9-14, and 16-20 are not patent eligible under the same rationale as provided for the rejection of independent claims 1, 8, and 15.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 2, 3, 6, 7, 8, 9, 10, 13, 14, 15, 16, 17, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over US 10303961 B1 to Stoffel, Christopher John et al. (hereinafter Stoffel), in view of US 20200202148 A1 to Wright, Thomas S. et al. (hereinafter Wright), further in view of KR 20190100894 A to Kim, Soryoung (hereinafter Soryoung).

Regarding claim 1, Stoffel discloses:

A method (see Stoffel at least [col. 1, lines 55-57] a method to detect objects in a vehicle and provide one or more notifications to passengers) comprising:

capturing image data from a cabin of a vehicle by a software application (see Stoffel at least [col. 2, lines 27-30] the vehicle can include one or more sensors inside the vehicle to detect and classify objects. In some examples, the system may include, for example, one or more cameras and computer vision software);

determining an object is present in the vehicle based on a proximity of the object to an occupant identified within the image data (see Stoffel at least [col. 4, lines 39-43] the vehicle 102 can comprise a plurality of interior sensors to detect objects and passengers in the interior space 118 of the vehicle 102. To this end, in some examples, the system 100 can comprise one or more interior imagers 124 and [col. 2, lines 33-37] If a user places a drink in the cup holder of the vehicle, for example, this can be detected using a combination of sensors such as, for example, a weight sensor, a moisture sensor, and a vision system);

determining a type of the object based on execution of a neural network on the image data by the software application (see Stoffel at least [col. 8, lines 57-61] the image processor 302 can include image recognition software, for example, to enable the object classification system 300 to distinguish between objects and passengers, among other things and [col. 8, line 65-col. 9, line 5] image data can be analyzed according to one or more machine learning procedures, such as, for example, a convolutional neural network (CNN)… Such analysis may include, for example, object detection, recognition, segmentation, classification, and the like); and

detecting an opening of a door of the vehicle (see Stoffel at least [col. 12, line 65-col. 13, line 1] The sensors 124, 130, 146, 148, 150 can provide information about the location of the object, for example, and/or indicate a door 116 being opened).

Stoffel does not teach: determining a timestamp of when the object is identified; storing the type of the object and the timestamp in a table; generating a notification comprising an image of the object and the timestamp from the table; and transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door.

However, Wright teaches:

determining a timestamp of when the object is identified (see Wright at least [0054] based on the time of occupancy of a passenger in combination with the time of identification of the item 26, the controller 100 may infer that the item 26 belongs to the passenger immediately preceding the identification);

storing the type of the object (see Wright at least [0021] the detection of the substance 24, the item 26 and/or any other detection by the detection devices 18 may be referred to as the detection and/or reporting of various vehicle states. By comparing the image data in the passenger compartment captured by the one or more image sensors 20 upon a clean inspection relative to an arrival or departure of a passenger, the imaging system 18a may be configured to capture image data documenting changes in the state or condition of the passenger compartment 14 over time. As further discussed herein, such image data may be documented by the system 10 (e.g. by storing image data locally or in a remote server) in order to assign liability and/or ownership to one or more passengers associated with the vandalism, damage, and/or lost items demonstrated in the image data);

generating a notification comprising an image of the object (see Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger); and

transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door (see Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger. Again, the departure may be determined based on a detection via one or more weight sensors, a door sensor, the imaging system 18a, and/or various additional devices discussed herein and [0055] the system may be configured to ... provide an alert via the mobile device 60).

It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring method disclosed by Stoffel to include the timestamping and occupant lost-item notification of Wright. One of ordinary skill in the art would have been motivated to make this modification because detecting an object associated with a vehicle user and subsequently notifying the user facilitates reunion of the item with its owner, as suggested by Wright (see Wright at least [0055] the system 10 may provide for the valuable service of identifying the item 26 and, in some instances, may further be configured to arrange for return of the item 26).
Stoffel and Wright do not disclose: storing the timestamp; and generating a notification comprising the timestamp.

However, Soryoung teaches:

storing the timestamp (see Soryoung at least [pg. 25, para. 4, beginning with “The first memory”] The first memory may store information of the occupant, estimated loss location information and [pg. 26, para. 11, beginning with “The collection method”] The estimated loss location information may include at least one of a loss behavior occurrence time corresponding to the expected loss location); and

generating a notification comprising the timestamp (see Soryoung at least [pg. 23, para. 7, beginning with “The alarm unit 1220”] the loss information and the lost behavior information may be displayed at the estimated loss time at the corresponding location).

It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring method disclosed by Stoffel and Wright to include Soryoung's consideration of the timing at which a user loses, and the vehicle identifies, a lost object. One of ordinary skill in the art would have been motivated to make this modification because the timing of an occupant leaving behind a belonging may help to later identify where the object ended up in order to reunite object and owner, as suggested by Soryoung (see Soryoung at least [pg. 27, para. 8, beginning with “The server 1300”] By including the step of requesting and receiving movement information at the time of loss from the user terminal as described above, the present invention can increase the likelihood of recovery of lost items by anticipating the point of occurrence of the loss of belongings of the user).
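Before turning to the dependent claims, the claim 1 pipeline that the Stoffel/Wright/Soryoung combination is mapped onto can be sketched end to end. This is an illustration of the claim language only, not the applicant's or any reference's actual implementation, and every name below is hypothetical:

```python
import time

object_table = []  # the claimed "table" of object type / timestamp entries

def classify(image_data: bytes) -> str:
    """Stand-in for the claimed neural-network classifier (hypothetical)."""
    return "laptop"  # e.g., the network's predicted object type

def on_object_identified(image_data: bytes) -> None:
    # determining the object's type and storing it with a timestamp
    object_table.append({"type": classify(image_data),
                         "timestamp": time.time(),
                         "image": image_data})

def on_door_opened(notify) -> None:
    # detecting the door opening triggers the occupant notification,
    # which carries the image and timestamp from the table
    for entry in object_table:
        notify({"image": entry["image"], "timestamp": entry["timestamp"]})

on_object_identified(b"<camera frame>")
on_door_opened(lambda message: print("notify occupant's device:", message))
```

The sketch keeps the claim's ordering visible: classification and timestamping happen at identification time, while the door-open event only reads from the stored table to build the notification.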
Regarding claim 2, Stoffel, Wright, and Soryoung disclose:

The method of claim 1, further comprising determining to notify the occupant of the object being left in the vehicle based on the type of the object and the proximity of the object to the occupant within the image data (see Stoffel at least [col. 16, lines 17-22] the system 100 can receive data from the interior sensors (e.g., sensors 124, 130, 146, 148) related to whether there are any objects in the interior space 118 of the vehicle 102 with the passenger(s). This can include weight and location information provided by the various weight sensors 130, video from the interior imagers 124 and [col. 18, lines 6-15] if, on the other hand, the passenger is exiting the vehicle (e.g., door sensor 150 “open”, weight sensors—zero), but has left the object behind, the system 100 (or, the object classification system 300) can determine if the classification of the object warrants additional action. If the object is a drink or newspaper (Stage 1), for example, the system 100 may simply reset under the assumption that the passenger either intended to leave the object behind (i.e., the object is trash) or that it is simply not valuable enough for additional action).

Regarding claim 3, Stoffel, Wright, and Soryoung disclose:

The method of claim 1, wherein the transmitting comprises transmitting the notification to the device associated with the occupant while the occupant is in a process of exiting the vehicle and the object remains in the vehicle (see Stoffel at least [col. 18, lines 6-10] At 524, if, on the other hand, the passenger is exiting the vehicle (e.g., door sensor 150 “open”, weight sensors—zero), but has left the object behind, the system 100 (or, the object classification system 300) can determine if the classification of the object warrants additional action and [col. 18, lines 22-25] At 526, if, on the other hand, the object is classified such that additional action is warranted (in this case, Stage 2 or higher) the vehicle notification system 406 can provide a third reminder).

Regarding claim 6, Stoffel, Wright, and Soryoung disclose:

The method of claim 1, further comprising: determining a value of the object (see Stoffel at least [col. 16, lines 38-42] objects can be classified according to their intrinsic or personal value to the passenger. In other words, a laptop may be very expensive, while the contents of a wallet or a purse may simply be difficult, or impossible, to replace) and notifying a server associated with the vehicle when the value is greater than a threshold (see Stoffel at least [col. 14, lines 23-26] In other examples, such as for particularly valuable items, the vehicle notification system 406 may contact law enforcement instead of, or in addition to, the central control and/or maintenance facility).

Regarding claim 7, Stoffel, Wright, and Soryoung disclose:

The method of claim 1, wherein the transmitting comprises capturing an image of the object and adding the captured image of the object to the notification (see Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger). It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring method disclosed by Stoffel to include the timestamping and occupant lost-item notification of Wright.
One of ordinary skill in the art would have been motivated to make this modification because detecting an object associated with a vehicle user and subsequent notification facilitates reunion of the item with its owner, as suggested by Wright (see Wright at least [0055] the system 10 may provide for the valuable service of identifying the item 26 and, in some instances, may further be configured to arrange for return of the item 26).

Regarding claim 8, Stoffel discloses:

A system (see Stoffel at least [col. 2, lines 13-15] a system for detecting and/or classifying objects in autonomous, semi-autonomous, or manual vehicles, and notifying users when necessary), comprising:

a processor (see Stoffel at least [col. 12, lines 30-31] The electronic device 400 can also include one or more processors 410); and

a memory, wherein the processor and the memory are communicably coupled (see Stoffel at least [col. 12, lines 25-26] The electronic device 400 can comprise memory 402 configured to include computer-executable instructions and [claim 1] memory storing instructions that, when executed by one or more processors, cause the one or more processors to…), and the processor is configured to:

capture image data from a cabin of a vehicle by a software application (see Stoffel at least [col. 2, lines 27-30] the vehicle can include one or more sensors inside the vehicle to detect and classify objects. In some examples, the system may include, for example, one or more cameras and computer vision software);

determine an object is present in the vehicle based on a proximity of the object to an occupant identified within the image data (see Stoffel at least [col. 4, lines 39-43] the vehicle 102 can comprise a plurality of interior sensors to detect objects and passengers in the interior space 118 of the vehicle 102. To this end, in some examples, the system 100 can comprise one or more interior imagers 124 and [col. 2, lines 33-37] If a user places a drink in the cup holder of the vehicle, for example, this can be detected using a combination of sensors such as, for example, a weight sensor, a moisture sensor, and a vision system);

determine a type of the object based on execution of a neural network on the image data by the software application (see Stoffel at least [col. 8, lines 57-61] the image processor 302 can include image recognition software, for example, to enable the object classification system 300 to distinguish between objects and passengers, among other things and [col. 8, line 65-col. 9, line 5] image data can be analyzed according to one or more machine learning procedures, such as, for example, a convolutional neural network (CNN)… Such analysis may include, for example, object detection, recognition, segmentation, classification, and the like);

detect an opening of a door of the vehicle (see Stoffel at least [col. 12, line 65-col. 13, line 1] The sensors 124, 130, 146, 148, 150 can provide information about the location of the object, for example, and/or indicate a door 116 being opened).

Stoffel does not teach: determine a timestamp of when the object is identified; store the type of the object and the timestamp in a table; generate a notification comprising an image of the object and the timestamp from the table; and transmit the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door.
However, Wright teaches:

determine a timestamp of when the object is identified (see Wright at least [0054] based on the time of occupancy of a passenger in combination with the time of identification of the item 26, the controller 100 may infer that the item 26 belongs to the passenger immediately preceding the identification);

store the type of the object (see Wright at least [0021] the detection of the substance 24, the item 26 and/or any other detection by the detection devices 18 may be referred to as the detection and/or reporting of various vehicle states. By comparing the image data in the passenger compartment captured by the one or more image sensors 20 upon a clean inspection relative to an arrival or departure of a passenger, the imaging system 18a may be configured to capture image data documenting changes in the state or condition of the passenger compartment 14 over time. As further discussed herein, such image data may be documented by the system 10 (e.g. by storing image data locally or in a remote server) in order to assign liability and/or ownership to one or more passengers associated with the vandalism, damage, and/or lost items demonstrated in the image data);

generate a notification comprising an image of the object (see Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger); and

transmit the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door (see Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger. Again, the departure may be determined based on a detection via one or more weight sensors, a door sensor, the imaging system 18a, and/or various additional devices discussed herein and [0055] the system may be configured to ... provide an alert via the mobile device 60).

It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring system disclosed by Stoffel to include the timestamping and occupant lost-item notification of Wright. One of ordinary skill in the art would have been motivated to make this modification because detecting an object associated with a vehicle user and subsequent notification facilitates reunion of the item with its owner, as suggested by Wright (see Wright at least [0055] the system 10 may provide for the valuable service of identifying the item 26 and, in some instances, may further be configured to arrange for return of the item 26).

Stoffel and Wright do not teach: store the timestamp; and generate a notification comprising the timestamp.

However, Soryoung teaches:

store the timestamp (see Soryoung at least [pg. 25, para. 4, beginning with “The first memory”] The first memory may store information of the occupant, estimated loss location information and [pg. 26, para. 11, beginning with “The collection method”] The estimated loss location information may include at least one of a loss behavior occurrence time corresponding to the expected loss location); and

generate a notification comprising the timestamp (see Soryoung at least [pg. 23, para. 7, beginning with “The alarm unit 1220”] the loss information and the lost behavior information may be displayed at the estimated loss time at the corresponding location).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring system disclosed by Stoffel and Wright to include the consideration of timing of a user losing and vehicle identifying a lost object of Soryoung. One of ordinary skill in the art would have been motivated to make this modification because the timing of an occupant leaving behind a belonging may help to later identify where the object ended up in order to reunite object and owner, as suggested by Soryoung (see Soryoung at least [pg. 27, para. 8, beginning with “The server 1300”] By including the step of requesting and receiving movement information at the time of loss from the user terminal as described above, the present invention can increase the likelihood of recovery of lost items by anticipating the point of occurrence of the loss of belongings of the user). Regarding claim 9, Stoffel, Wright, and Soryoung disclose: The system of claim 8, wherein the processor is further configured to determine to notify the occupant of the object being left in the vehicle based on the type of the object and the proximity of the object to the occupant within the image data (see Stoffel at least [col. 16, lines 17-22] the system 100 can receive data from the interior sensors (e.g., sensors 124, 130, 146, 148) related to whether there are any objects in the interior space 118 of the vehicle 102 with the passenger(s). This can include weight and location information provided by the various weight sensors 130, video from the interior imagers 124 and [col. 18, lines 6-15] if, on the other hand, the passenger is exiting the vehicle (e.g., door sensor 150 “open”, weight sensors—zero), but has left the object behind, the system 100 (or, the object classification system 300) can determine if the classification of the object warrants additional action. 
If the object is a drink or newspaper (Stage 1), for example, the system 100 may simply reset under the assumption that the passenger either intended to leave the object behind (i.e., the object is trash) or that it is simply not valuable enough for additional action). Regarding claim 10, Stoffel, Wright, and Soryoung disclose: The system of claim 8, wherein the processor is configured to transmit the notification to the occupant while the occupant is in a process of an exit of the vehicle and the object remains in the vehicle (see Stoffel at least [col. 18, lines 6-10] At 524, if, on the other hand, the passenger is exiting the vehicle (e.g., door sensor 150 “open”, weight sensors—zero), but has left the object behind, the system 100 (or, the object classification system 300) can determine if the classification of the object warrants additional action and [col. 18, lines 22-25] At 526, if, on the other hand, the object is classified such that additional action is warranted (in this case, Stage 2 or higher) the vehicle notification system 406 can provide a third reminder). Regarding claim 13, Stoffel, Wright, and Soryoung disclose: The system of claim 8, wherein the processor is configured to determine a value of the object (see Stoffel at least [col. 16, lines 38-42] objects can be classified according to their intrinsic or personal value to the passenger. In other words, a laptop may be very expensive, while the contents of a wallet or a purse may simply be difficult, or impossible, to replace) and notify a server associated with the vehicle when the value is greater than a threshold (see Stoffel at least [col. 14, lines 23-26] In other examples, such as for particularly valuable items, the vehicle notification system 406 may contact law enforcement instead or, or in addition to, the central control and/or maintenance facility). 
Regarding claim 14, Stoffel, Wright, and Soryoung disclose: The system of claim 8, wherein the processor is configured to capture an image of the object and add the captured image of the object to the notification (See Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger). It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring method disclosed by Stoffel to include the timestamping and occupant lost-item notification of Wright. One of ordinary skill in the art would have been motivated to make this modification because detecting an object associated with a vehicle user and subsequent notification facilitates reunion of the item with its owner, as suggested by Wright (see Wright at least [0055] the system 10 may provide for the valuable service of identifying the item 26 and, in some instances, may further be configured to arrange for return of the item 26). Regarding claim 15, Stoffel discloses: A non-transitory computer-readable storage medium comprising instructions, that when read by a processor, cause the processor to (see Stoffel at least [col. 15, lines 11-13] The memory 402, removable storage 412, and non-removable storage 414 are all examples of non-transitory computer-readable media and [claim 1] memory storing instructions that, when executed by one or more processors, cause the one or more processors to…) perform: Capturing image data from a cabin of a vehicle by a software application (see Stoffel at least [col. 2, lines 27-30] the vehicle can include one or more sensors inside the vehicle to detect and classify objects. 
In some examples, the system may include, for example, one or more cameras and computer vision software); determining an object is present in the vehicle based on a proximity of the object to an occupant identified within the image data (see Stoffel at least [col. 4, lines 39-43] the vehicle 102 can comprise a plurality of interior sensors to detect objects and passengers in the interior space 118 of the vehicle 102. To this end, in some examples, the system 100 can comprise one or more interior imagers 124 and [col. 2, lines 33-37] If a user places a drink in the cup holder of the vehicle, for example, this can be detected using a combination of sensors such as, for example, a weight sensor, a moisture sensor, and a vision system); determining a type of the object based on execution of a neural network on the image data by the software application (see Stoffel at least [col. 8, lines 57-61] the image processor 302 can include image recognition software, for example, to enable the object classification system 300 to distinguish between objects and passengers, among other things and [col. 8, line 65-col. 9, line 5] image data can be analyzed according to one or more machine learning procedures, such as, for example, a convolutional neural network (CNN)… Such analysis may include, for example, object detection, recognition, segmentation, classification, and the like); and detecting an opening of a door of the vehicle (see Stoffel at least [col. 12, line 65-col. 13, line 1] The sensors 124, 130, 146, 148, 150 can provide information about the location of the object, for example, and/or indicate a door 116 being opened). 
Stoffel does not teach: determining a timestamp of when the object is identified; storing the type of the object and the timestamp in a table; generating a notification comprising an image of the object and the timestamp from the table; and transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door. However, Wright teaches: determining a timestamp of when the object is identified (see Wright at least [0054] based on the time of occupancy of a passenger in combination with the time of identification of the item 26, the controller 100 may infer that the item 26 belongs to the passenger immediately preceding the identification); storing the type of the object (see Wright at least [0021] the detection of the substance 24, the item 26 and/or any other detection by the detection devices 18 may be referred to as the detection and/or reporting of various vehicle states. By comparing the image data in the passenger compartment captured by the one or more image sensors 20 upon a clean inspection relative to an arrival or departure of a passenger, the imaging system 18a may be configured to capture image data documenting changes in the state or condition of the passenger compartment 14 over time. As further discussed herein, such image data may be documented by the system 10 (e.g. 
by storing image data locally or in a remote server) in order to assign liability and/or ownership to one or more passengers associated with the vandalism, damage, and/or lost items demonstrated in the image data); generating a notification comprising an image of the object (See Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger); and transmitting the notification to a device associated with the occupant based on the type of the object when detecting the opening of the door (See Wright at least [0054] the system 10 may communicate a message, image data depicting the item 26, and/or provide an audible or visual alert from the vehicle 12 that the item is located in the vehicle 12 following the departure of the passenger. Again, the departure may be determined based on a detection via one or more weight sensors, a door sensor, the imaging system 18a, and/or various additional devices discussed herein and [0055] the system may be configured to ... provide an alert via the mobile device 60). It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring system disclosed by Stoffel to include the timestamping and occupant lost-item notification of Wright. One of ordinary skill in the art would have been motivated to make this modification because detecting an object associated with a vehicle user and subsequent notification facilitates reunion of the item with its owner, as suggested by Wright (see Wright at least [0055] the system 10 may provide for the valuable service of identifying the item 26 and, in some instances, may further be configured to arrange for return of the item 26). 
Stoffel and Wright do not teach: storing the timestamp; and generating a notification comprising the timestamp. However, Soryoung teaches: storing the timestamp (see Soryoung at least [pg. 25, para. 4, beginning with “The first memory”] The first memory may store information of the occupant, estimated loss location information and [pg. 26, para. 11, beginning with “The collection method”] The estimated loss location information may include at least one of a loss behavior occurrence time corresponding to the expected loss location); and generating a notification comprising the timestamp (see Soryoung at least [pg. 23, para. 7, beginning with “The alarm unit 1220”] the loss information and the lost behavior information may be displayed at the estimated loss time at the corresponding location). It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring system disclosed by Stoffel and Wright to include the consideration of timing of a user losing and vehicle identifying a lost object of Soryoung. One of ordinary skill in the art would have been motivated to make this modification because the timing of an occupant leaving behind a belonging may help to later identify where the object ended up in order to reunite object and owner, as suggested by Soryoung (see Soryoung at least [pg. 27, para. 8, beginning with “The server 1300”] By including the step of requesting and receiving movement information at the time of loss from the user terminal as described above, the present invention can increase the likelihood of recovery of lost items by anticipating the point of occurrence of the loss of belongings of the user). 
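[Editor's note] Read together, the mapped limitations of claim 15 describe a simple event-driven data flow: classify a detected cabin object, store its type and identification timestamp in a table, and, on detecting a door opening, transmit a notification carrying the object's image and timestamp to the occupant's device. The sketch below is purely illustrative of that claimed flow, not the implementation of Stoffel, Wright, or Soryoung; all names (`CabinMonitor`, `notify_types`, etc.) are hypothetical:

```python
import time
from dataclasses import dataclass, field

@dataclass
class DetectedObject:
    obj_type: str     # classifier output, e.g. "laptop"
    image: bytes      # captured image of the object
    timestamp: float  # when the object was identified

@dataclass
class CabinMonitor:
    # "storing the type of the object and the timestamp in a table",
    # keyed here by occupant id
    table: dict = field(default_factory=dict)

    def record(self, occupant_id: str, obj_type: str, image: bytes) -> None:
        """Store the object's type, image, and identification timestamp."""
        self.table.setdefault(occupant_id, []).append(
            DetectedObject(obj_type, image, time.time()))

    def on_door_open(self, occupant_id: str, notify,
                     notify_types=("laptop", "wallet", "phone")) -> None:
        """On a door-open event, notify the occupant of qualifying objects."""
        for obj in self.table.get(occupant_id, []):
            if obj.obj_type in notify_types:  # gate on object type
                notify({"type": obj.obj_type,
                        "image": obj.image,
                        "timestamp": obj.timestamp})
```

A caller would invoke `record(...)` as objects are classified and wire `on_door_open(...)` to the door sensor, passing a `notify` callback that delivers the message to the occupant's device.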
Regarding claim 16, Stoffel, Wright, and Soryoung disclose: The non-transitory computer-readable storage medium of claim 15, wherein the processor is further configured to perform determining to notify the occupant of the object being left in the vehicle based on the type of the object and the proximity of the object to the occupant (see Stoffel at least [col. 16, lines 17-22] the system 100 can receive data from the interior sensors (e.g., sensors 124, 130, 146, 148) related to whether there are any objects in the interior space 118 of the vehicle 102 with the passenger(s). This can include weight and location information provided by the various weight sensors 130, video from the interior imagers 124 and [col. 18, lines 6-15] if, on the other hand, the passenger is exiting the vehicle (e.g., door sensor 150 “open”, weight sensors—zero), but has left the object behind, the system 100 (or, the object classification system 300) can determine if the classification of the object warrants additional action. If the object is a drink or newspaper (Stage 1), for example, the system 100 may simply reset under the assumption that the passenger either intended to leave the object behind (i.e., the object is trash) or that it is simply not valuable enough for additional action). Regarding claim 17, Stoffel, Wright, and Soryoung disclose: The non-transitory computer-readable storage medium of claim 15, wherein the transmitting comprises transmitting the notification to the device associated with the occupant is in a process of exiting the vehicle and the object remains in the vehicle (see Stoffel at least [col. 18, lines 6-10] At 524, if, on the other hand, the passenger is exiting the vehicle (e.g., door sensor 150 “open”, weight sensors—zero), but has left the object behind, the system 100 (or, the object classification system 300) can determine if the classification of the object warrants additional action and [col. 
18, lines 22-25] At 526, if, on the other hand, the object is classified such that additional action is warranted (in this case, Stage 2 or higher) the vehicle notification system 406 can provide a third reminder). Regarding claim 19, Stoffel, Wright, and Soryoung disclose: The non-transitory computer-readable storage medium of claim 15, wherein the processor is further configured to perform determining a value of the object (see Stoffel at least [col. 16, lines 38-42] objects can be classified according to their intrinsic or personal value to the passenger. In other words, a laptop may be very expensive, while the contents of a wallet or a purse may simply be difficult, or impossible, to replace) and notifying a server associated with the vehicle when the value is greater than a threshold (see Stoffel at least [col. 14, lines 23-26] In other examples, such as for particularly valuable items, the vehicle notification system 406 may contact law enforcement instead or, or in addition to, the central control and/or maintenance facility). Claim(s) 4, 11, 18 is/are rejected under 35 U.S.C. 103 as being unpatentable over Stoffel, in view of Wright, further in view of Soryoung, and further in view of KR 20200076972 A CHO HYUNG SOO et al. (hereinafter Cho). Regarding claim 4, Stoffel, Wright, and Soryoung disclose: The method of claim 1. Stoffel, Wright, and Soryoung do not teach: wherein the transmitting comprises transmitting the notification when it is determined that the occupant is in the vehicle and the object is not in the vehicle. However, Cho teaches: wherein the transmitting comprises transmitting the notification when it is determined that the occupant is in the vehicle and the object is not in the vehicle (see Cho at least [pg. 5, para. 
9, beginning with “The notification providing”] The notification providing unit 190 may provide a notification for the unsupplied auxiliary article when the driver boards the vehicle without the auxiliary article being brought into the vehicle). It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring method disclosed by Stoffel, Wright, and Soryoung to include the notification to a vehicle user when they enter the vehicle without their expected belonging of Cho. One of ordinary skill in the art would have been motivated to make this modification because notifying a user of objects not in the vehicle can help them remember to load their luggage into the vehicle before they drive away, as suggested by Cho (see Cho at least [pg. 5, para. 8, beginning with “The notification providing unit 190 may provide a notification for a static”] the notification providing unit 190 may provide a notification for a static object that has not been received, such as "Your baggage has not been loaded. Please check it out."). Regarding claim 11, Stoffel, Wright, and Soryoung disclose: The system of claim 8. Stoffel, Wright, and Soryoung do not teach: wherein the processor is configured to transmit the notification to the occupant while when it is determined that the occupant is in the vehicle and the object is not in the vehicle. However, Cho teaches: wherein the processor is configured to transmit the notification to the occupant while when it is determined that the occupant is in the vehicle and the object is not in the vehicle (see Cho at least [pg. 5, para. 9, beginning with “The notification providing”] The notification providing unit 190 may provide a notification for the unsupplied auxiliary article when the driver boards the vehicle without the auxiliary article being brought into the vehicle). 
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring system disclosed by Stoffel, Wright, and Soryoung to include the notification to a vehicle user when they enter the vehicle without their expected belonging of Cho. One of ordinary skill in the art would have been motivated to make this modification because notifying a user of objects not in the vehicle can help them remember to load their luggage into the vehicle before they drive away, as suggested by Cho (see Cho at least [pg. 5, para. 8, beginning with “The notification providing unit 190 may provide a notification for a static”] the notification providing unit 190 may provide a notification for a static object that has not been received, such as "Your baggage has not been loaded. Please check it out."). Regarding claim 18, Stoffel, Wright, and Soryoung disclose: The non-transitory computer-readable storage medium of claim 15. Stoffel, Wright, and Soryoung do not teach: wherein the transmitting comprises transmitting the notification when it is determined that the occupant is in the vehicle and the object is not in the vehicle. However, Cho teaches: wherein the transmitting comprises transmitting the notification when it is determined that the occupant is in the vehicle and the object is not in the vehicle (see Cho at least [pg. 5, para. 9, beginning with “The notification providing”] The notification providing unit 190 may provide a notification for the unsupplied auxiliary article when the driver boards the vehicle without the auxiliary article being brought into the vehicle). 
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring non-transitory computer-readable storage medium disclosed by Stoffel, Wright, and Soryoung to include the notification to a vehicle user when they enter the vehicle without their expected belonging of Cho. One of ordinary skill in the art would have been motivated to make this modification because notifying a user of objects not in the vehicle can help them remember to load their luggage into the vehicle before they drive away, as suggested by Cho (see Cho at least [pg. 5, para. 8, beginning with “The notification providing unit 190 may provide a notification for a static”] the notification providing unit 190 may provide a notification for a static object that has not been received, such as "Your baggage has not been loaded. Please check it out."). Claim(s) 5, 12, 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Stoffel, in view of Wright, further in view of Soryoung, and further in view of US 10311704 B1 Xu; Qijie et al. (hereinafter Xu). Regarding claim 5, Stoffel, Wright, and Soryoung disclose: The method of claim 1. Stoffel, Wright, and Soryoung do not teach: further comprising determining the object belongs to the occupant based on the proximity of the object to the occupant within the image data. However, Xu teaches: further comprising determining the object belongs to the occupant based on the proximity of the object to the occupant within the image data (see Xu at least [col. 15, lines 11-14] an association is made with the identified item and a passenger identified by a passenger identifier. The association can be based on item proximity to a passenger location within the vehicle cabin and [col. 
6, lines 40-47] Camera devices 120 and 122 may capture images of the vehicle cabin 102, and generate data 152 that can be utilized by the vehicle control unit 110 to determine an identification of an item, such as the item 130 located in the vehicle cabin passenger region 124. Once identified, the vehicle control unit 110 operates to associate an identified item, such as item 130, with the passenger 162 identified by the passenger identifier 148). It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring method disclosed by Stoffel, Wright, and Soryoung to include the association of belonging between an occupant and an item based on proximity of Xu. One of ordinary skill in the art would have been motivated to make this modification because knowledge of which item is owned by which passenger allows for appropriate notifications in the future, as suggested by Xu (see Xu at least [col. 15, lines 27-30] In this manner, a near real-time inventory can be generated as relating to items identified within vehicle cabin passenger regions for multiple passengers). Regarding claim 12, Stoffel, Wright, and Soryoung disclose: The system of claim 8. Stoffel, Wright, and Soryoung do not teach: wherein the processor is configured to determine the object belongs to the occupant based on the proximity of the object to the occupant within the image data. However, Xu teaches: wherein the processor is configured to determine the object belongs to the occupant based on the proximity of the object to the occupant within the image data (see Xu at least [col. 15, lines 11-14] an association is made with the identified item and a passenger identified by a passenger identifier. The association can be based on item proximity to a passenger location within the vehicle cabin). 
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring system disclosed by Stoffel, Wright, and Soryoung to include the association of belonging between an occupant and an item based on proximity of Xu. One of ordinary skill in the art would have been motivated to make this modification because knowledge of which item is owned by which passenger allows for appropriate notifications in the future, as suggested by Xu (see Xu at least [col. 15, lines 27-30] In this manner, a near real-time inventory can be generated as relating to items identified within vehicle cabin passenger regions for multiple passengers). Regarding claim 20, Stoffel, Wright, and Soryoung disclose: The non-transitory computer-readable storage medium of claim 15. Stoffel, Wright, and Soryoung do not teach: wherein the processor is further configured to perform determining the object belongs to the occupant based on the proximity of the object to the occupant within the image data. However, Xu teaches: wherein the processor is further configured to perform determining the object belongs to the occupant based on the proximity of the object to the occupant within the image data (see Xu at least [col. 15, lines 11-14] an association is made with the identified item and a passenger identified by a passenger identifier. The association can be based on item proximity to a passenger location within the vehicle cabin). It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the vehicle object and occupant monitoring non-transitory computer-readable storage medium disclosed by Stoffel, Wright, and Soryoung to include the association of belonging between an occupant and an item based on proximity of Xu. 
One of ordinary skill in the art would have been motivated to make this modification because knowledge of which item is owned by which passenger allows for appropriate notifications in the future, as suggested by Xu (see Xu at least [col. 15, lines 27-30] In this manner, a near real-time inventory can be generated as relating to items identified within vehicle cabin passenger regions for multiple passengers). Response to Arguments Applicant's arguments filed 12/18/2025 have been fully considered. Applicant's amendments overcome the objections to the claims. Regarding the arguments provided for the 35 U.S.C. §101 judicial exception rejection of claims 1-20, the applicant’s arguments have been considered but are not persuasive. (A) Applicant argues, “Applicant disagrees with the Office that the claims are directed to an abstract idea... Therefore, Applicant contends that the inquiry into the abstraction of Applicant’s claims should end at this first prong as Applicant’s claims do not fall into any of the enumerated groupings and therefore cannot be abstract” (from remarks pages 7-8). As to point (A), Examiner respectfully disagrees. Examiner notes that while physical components are recited in the claimed invention, many are recited at a level of generality such that they are considered additional elements of using generic computers to apply a judicial exception. The amended limitations of determining the presence of an object in a vehicle and determining a type of the object are judgments that can readily be made in the human mind. The addition of a neural network and the image data simply serves as additional elements and instructions to apply the abstract idea through a generic computer. The broad recitation of a neural network does not provide specific detail to preclude the performance of the processing in a human mind. 
Additionally, detecting a vehicle door opening is considered an abstract idea because detection of many physical interactions can be easily performed by the human mind through processing of visual and/or auditory input. (B) Applicant argues, “Applicant notes that Applicant’s claims impose meaningful, practical, and succinct operations… and thus the claims recite eligible subject matter under Section 101.” (from remarks page 9). As to point (B), Examiner respectfully disagrees. Examiner notes that while the observational steps of the claimed invention relate to the physical vehicle (e.g., the door opening), this does not preclude the detection of such events from being performed as a mental process. Additionally, there is no claimed actuation of physical vehicle states or events in response to the recited detection or determination steps. As such, the judicial exception is not integrated into a practical application. Ultimately, the determination and detection steps are surrounded by insignificant extra-solution activities of pre-solution data gathering and post-solution sending of data. A simple notification is insufficient to incorporate the recited judicial exceptions into a practical application. Regarding the arguments provided for the 35 U.S.C. §103 rejections of claims 1-20 (remarks pages 10-11), the applicant's arguments have been considered but are moot because of the new grounds of rejection. Examiner notes that while Stoffel is still cited as a primary reference, Stoffel is not cited as reciting the limitations which Applicant argues against. Examiner notes further that while Applicant states “Stoffel fails to describe or suggest, ‘detecting an opening of a door of the vehicle…’”, Applicant appears to acquiesce regarding Stoffel’s recitation of door opening detection (see remarks pg. 11 “Stoffel simply detects a door being opened”) and as such Examiner agrees that Stoffel recites detecting a door opening, as cited in the prior art rejection(s). 
Conclusion The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US 20190057595 A1 Yamamoto; Stuart Masakazu discloses notifying a vehicle occupant of rear seat objects depending on open/closed status of vehicle doors. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Any inquiry concerning this communication or earlier communications from the examiner should be directed to ELLE ROSE KNUDSON whose telephone number is (703)756-1742. The examiner can normally be reached 1000-1700 ET M-F. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hitesh Patel, can be reached at (571) 270-5442. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. 
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /ELLE ROSE KNUDSON/Examiner, Art Unit 3667 /Hitesh Patel/Supervisory Patent Examiner, Art Unit 3667 4/4/26

Prosecution Timeline

Oct 13, 2022
Application Filed
Dec 19, 2024
Non-Final Rejection — §101, §103
Feb 12, 2025
Examiner Interview Summary
Feb 12, 2025
Applicant Interview (Telephonic)
Mar 15, 2025
Response Filed
May 05, 2025
Final Rejection — §101, §103
Jun 27, 2025
Response after Non-Final Action
Aug 08, 2025
Request for Continued Examination
Aug 12, 2025
Response after Non-Final Action
Sep 22, 2025
Non-Final Rejection — §101, §103
Dec 18, 2025
Response Filed
Apr 03, 2026
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591241
OBJECT ENROLLMENT IN A ROBOTIC CART COORDINATION SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12590444
WORKING VEHICLE AND ATTACHMENT USAGE SYSTEM
2y 5m to grant Granted Mar 31, 2026
Patent 12582045
BASECUTTER AUTOMATED HEIGHT CALIBRATION FOR SUGARCANE HARVESTERS
2y 5m to grant Granted Mar 24, 2026
Patent 12558925
Method and Apparatus for Displaying Function Menu Interface of Automobile Tyre Pressure Monitoring System
2y 5m to grant Granted Feb 24, 2026
Patent 12559907
OPERATOR CONFIRMATION OF MACHINE CONTROL SCHEME
2y 5m to grant Granted Feb 24, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
73%
Grant Probability
99%
With Interview (+44.4%)
2y 10m
Median Time to Grant
High
PTA Risk
Based on 15 resolved cases by this examiner. Grant probability derived from career allow rate.
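
The headline figures above follow from simple ratios over the examiner's resolved cases; for instance, the 73% career allow rate is 11 grants out of 15 resolved cases. A minimal sketch of the arithmetic (the with/without-interview subgroup counts behind the +44.4% lift are not shown on this page, so the lift helper only illustrates the computation):

```python
# Career allow rate as displayed above: grants / resolved cases.
granted, resolved = 11, 15
allow_rate = granted / resolved
print(f"{allow_rate:.0%}")  # 73%

# Interview lift is the allow-rate gap between resolved cases with and
# without an examiner interview; the subgroup counts are not shown
# above, so any inputs here are hypothetical.
def interview_lift(with_rate: float, without_rate: float) -> float:
    return with_rate - without_rate
```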
