Prosecution Insights
Last updated: April 19, 2026
Application No. 18/203,786

INFORMATION PROCESSOR, INFORMATION PROCESSING METHOD, AND RECORDING MEDIUM

Final Rejection: §101, §103

Filed: May 31, 2023
Examiner: MILLER, PRESTON JAY
Art Unit: 3661
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Toyota Jidosha Kabushiki Kaisha
OA Round: 2 (Final)
Grant Probability: 56% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 1m
Grant Probability with Interview: 75%

Examiner Intelligence

Career Allow Rate: 56% (28 granted / 50 resolved), +4.0% vs TC avg — grants 56% of resolved cases
Interview Lift: +18.8% (strong) among resolved cases with an interview
Typical Timeline: 3y 1m average prosecution; 39 applications currently pending
Career History: 89 total applications across all art units
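The examiner statistics above can be reproduced with two small helpers. The counts (28 granted of 50 resolved) come from the page itself; the with/without-interview rates passed to the lift function below are hypothetical placeholders, since the page reports only the +18.8% lift, not the underlying split.

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate = granted / resolved cases."""
    return granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Percentage-point lift attributed to conducting an examiner interview."""
    return rate_with - rate_without

career = allow_rate(28, 50)
print(f"Career allow rate: {career:.0%}")   # Career allow rate: 56%
```

The 56% figure matches the dashboard; note that allow rate is computed over *resolved* cases only, which is why the 39 pending applications do not appear in the denominator.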

Statute-Specific Performance

§101: 17.7% (-22.3% vs TC avg)
§103: 48.0% (+8.0% vs TC avg)
§102: 15.3% (-24.7% vs TC avg)
§112: 17.0% (-23.0% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 50 resolved cases
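The "vs TC avg" deltas above imply the Tech Center baseline used for each statute. Only the rates and deltas from the table are assumed; as reported, every delta resolves to the same 40.0% baseline, consistent with the single black line in the chart.

```python
# (success rate %, delta vs TC avg %) per statute, from the table above
rates = {"§101": (17.7, -22.3), "§103": (48.0, +8.0),
         "§102": (15.3, -24.7), "§112": (17.0, -23.0)}

for statute, (rate, delta) in rates.items():
    tc_avg = rate - delta   # the baseline the delta was measured against
    print(f"{statute}: examiner {rate}% vs TC avg {tc_avg:.1f}%")
```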

Office Action

§101 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

2. Applicant's arguments filed 10/09/2025 have been fully considered but they are not persuasive.

3. Applicant argues that the amended claims overcome the 35 USC § 101 rejection of the claims. Examiner refers Applicant to the Claim Rejections - 35 USC § 101 section for the new ground of rejection under 35 USC § 101.

4. As such, this argument is unpersuasive.

5. Applicant argues that amended claims 1, 7 and 8 are allowable over Suzuki et al. (JP-2019032712-A). Applicant continues that Suzuki fails to disclose the features regarding "a central processing unit configured to detect presence or absence of an intersection in front of a driver's vehicle; upon detection of the intersection, acquire section position information for identifying an end position on an outer side in a width direction of a first section on a traveling lane for the traveling of a driver's vehicle, an end position on an outer side in a width direction of a second section on an oncoming lane at the intersection located in front of the driver's vehicle, and a boundary position between the first section and the second section; calculate a distance in a width direction from a section boundary position to the driver's vehicle position before entering the intersection; determine whether the driver's vehicle has passed a stop line and entered the intersection; sequentially acquire vehicle behavior information regarding behavior of the driver's vehicle when traveling through the intersection and store, as history information, a history of driving characteristic parameters representing the driving characteristics of a driver included in the sequentially acquired vehicle behavior information; upon determination that the driver's vehicle has entered the intersection, sequentially calculate a driver's vehicle position information for identifying the driver's vehicle position when traveling through the intersection based on the vehicle behavior information; upon determination that the driver's vehicle is turning right or left, determine which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located, based on the section position information; extract a specific driving characteristic parameter corresponding to at least one of the first section and the second section from the history information based on the determination of which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located; and evaluate the driving behavior based on the extracted specific driving characteristic parameter," as recited in amended independent claim 1 and similarly recited in amended independent claims 7 and 8.

6. However, several of the limitations that Applicant has cited above are taught by Suzuki or were previously rejected. Suzuki does not teach "calculate a distance in a width direction from a section boundary position to a driver's vehicle position before entering the intersection" or "evaluate driving behavior based on the extracted specific driving characteristic parameter using a trained model obtained through machine learning." As such, these newly amended features have necessitated the new references Agon et al. (US-20210405641-A1) and Innes et al. (US-20170291611-A1). Applicant has not provided any reasons or rationale as to why Suzuki fails to teach the remaining limitations and instead only provides conclusory statements.
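For readers tracing the dispute, the claim language quoted above describes a sequential processing flow. The sketch below is only an illustrative rendering of that flow; every function name, data structure, and numeric value is invented for readability and appears nowhere in the application or the Office Action, and the "trained model" is a trivial stand-in.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    x: float       # lateral position in the width direction
    y: float       # longitudinal position
    speed: float   # one example of a driving characteristic parameter

def determine_section(x: float, boundary_x: float) -> str:
    """Which side of the section boundary (e.g. the center line) the vehicle is on."""
    return "first" if x < boundary_x else "second"

def lateral_distance(x: float, boundary_x: float) -> float:
    """Width-direction distance from the section boundary to the vehicle."""
    return abs(boundary_x - x)

def evaluate_turn(states, boundary_x, trained_model):
    """Mirror of the claimed flow: log behavior per section, then score it."""
    history = {"first": [], "second": []}
    for s in states:                        # sequentially acquire behavior info
        section = determine_section(s.x, boundary_x)
        history[section].append(s.speed)    # store driving characteristic params
    # extract section-specific parameters and evaluate with the trained model
    return {sec: trained_model(params) for sec, params in history.items() if params}

# toy "trained model": mean speed as a stand-in score
score = evaluate_turn(
    [VehicleState(-1.0, 0.0, 30.0), VehicleState(1.5, 5.0, 20.0)],
    boundary_x=0.0,
    trained_model=lambda v: sum(v) / len(v),
)
print(score)   # {'first': 30.0, 'second': 20.0}
```

The section-wise split of the history is the part the Applicant argues Suzuki lacks; the lateral-distance and trained-model steps are the two features the examiner concedes required Agon and Innes.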
Applicant is referred to the rejection of the claims in the Claim Rejections - 35 USC § 103 section.

7. As such, this argument is unpersuasive.

8. Applicant argues that the dependent claims are patentable by virtue of their dependency on independent claim 1 and the additional features recited in the dependent claims.

9. This argument is unpersuasive, as each independent claim and dependent claim has been fully rejected for the reasons given above.

Claim Rejections - 35 USC § 101

10. 35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

11. Claims 1-9 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

12. The determination of whether a claim recites patent ineligible subject matter is a two-step inquiry.
STEP 1: Does the claim fall within one of the four statutory categories of invention (process, machine, manufacture or composition of matter)? See MPEP 2106.03.
STEP 2: Does the claim recite a judicial exception, e.g. an abstract idea, without reciting additional elements that amount to significantly more than the judicial exception? This is determined using the following analysis; see MPEP 2106.04.
STEP 2A (PRONG 1): Does the claim recite an abstract idea, law of nature, or natural phenomenon? See MPEP 2106.04(II)(A)(1).
STEP 2A (PRONG 2): Does the claim recite additional elements that integrate the judicial exception into a practical application? See MPEP 2106.04(II)(A)(2) and 2106.05(a) through (d) for explanations.
STEP 2B: Does the claim recite additional elements that amount to significantly more than the judicial exception? See MPEP 2106.05.

101 Analysis – Step 1

13. Claims 1-6 and 9 are directed to a processor (i.e. an apparatus). Therefore, claims 1-6 and 9 are within at least one of the four statutory categories.

14. Claim 7 is directed to a method (i.e. a process). Therefore, claim 7 is within at least one of the four statutory categories.

15. Claim 8 is directed to a recording medium. The claim does not recite, and the specification does not define, that the recording medium is limited to non-transitory embodiments. A claim encompassing both transitory and non-transitory embodiments, such as Applicant's claimed recording medium, does not fall within one of the four categories of patent eligible subject matter. See In re Nuijten, 500 F.3d 1346, 1356-57 (Fed. Cir. 2007) ("A transitory, propagating signal like Nuijten's is not a 'process, machine, manufacture, or composition of matter.' ... Thus, such a signal cannot be patentable subject matter."). Therefore, claim 8 is not within at least one of the four statutory categories and the claim is not patent eligible.

101 Analysis – Step 2A, Prong I

16. Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes. See MPEP 2106(A)(II)(1) and MPEP 2106.04(a)-(c).

17. Independent claim 7 includes limitations that recite an abstract idea (emphasized below [with the category of abstract idea in brackets]). Claim 1 will be used as a representative claim for the remainder of the 101 rejection.
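The two-step eligibility inquiry the examiner lays out in items 12-15 is a branching decision procedure, so it can be summarized as a small function. This is a simplification for orientation only; the boolean inputs are hypothetical stand-ins for legal determinations that are anything but boolean in practice.

```python
def eligible(statutory: bool, recites_exception: bool,
             practical_application: bool, significantly_more: bool) -> bool:
    """Compact sketch of the MPEP 2106 flow: Step 1, Step 2A (prongs 1-2), Step 2B."""
    if not statutory:            # Step 1: process/machine/manufacture/composition?
        return False             # e.g. claim 8's possibly transitory recording medium
    if not recites_exception:    # Step 2A, Prong 1: abstract idea recited?
        return True              # no judicial exception, so eligible
    if practical_application:    # Step 2A, Prong 2: integrated into a practical application?
        return True
    return significantly_more    # Step 2B: inventive concept / significantly more?

# The examiner's positions as argued in this action (inputs are illustrative):
print(eligible(True, True, False, False))   # False — claims rejected under §101
print(eligible(False, True, False, False))  # False — claim 8 fails at Step 1
```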
Claim 7 recites:

An information processor comprising a central processing unit configured to: detect presence or absence of an intersection in front of a driver's vehicle [mental process/step]; upon detection of the intersection, acquire section position information for identifying an end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle and a position of a second section on an oncoming lane at the intersection located in front of the driver's vehicle [mental process/step]; calculate a distance in a width direction from a section boundary position to a driver's vehicle position before entering the intersection [mental process/step]; determine whether the driver's vehicle has passed a stop line and entered the intersection [mental process/step]; sequentially acquire vehicle behavior information regarding behavior of the driver's vehicle when traveling through the intersection and store, as history information, a history of driving characteristic parameters representing the driving characteristics of a driver included in the sequentially acquired vehicle behavior information; upon determination that the driver's vehicle has entered the intersection, sequentially calculate a driver's vehicle position information for identifying the driver's vehicle position when traveling through the intersection based on the vehicle behavior information [mental process/step]; upon determination that the driver's vehicle is turning right or left, determine which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located, based on the section position information [mental process/step]; extract a specific driving characteristic parameter corresponding to at least one of the first section and the second section from the history information based on the determination of which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located; and evaluate driving behavior based on the extracted specific driving characteristic parameter [mental process/step] using a trained model obtained through machine learning.

18. The Examiner submits that the foregoing bolded limitations constitute a "mental process."

19. The mental processes recited in claim 7 are: "detect presence or absence of an intersection in front of a driver's vehicle," "identifying an end position on an outer side in a width direction …," "calculate a distance in a width direction," "determine whether the driver's vehicle has passed a stop line and entered the intersection," "upon determination that the driver's vehicle has entered the intersection, sequentially calculate a driver's vehicle position information for identifying the driver's vehicle position …," and "evaluate driving behavior based on the extracted specific driving characteristic ...".

20. Under the broadest reasonable interpretation, these limitations are process steps that cover mental processes, including an observation, evaluation, judgment or opinion, that could be performed in the human mind or with the aid of pencil and paper but for the recitation of a generic computer component. If a claim, under its broadest reasonable interpretation, covers a mental process but for the recitation of generic computer components, then it falls within the "Mental Process" grouping of abstract ideas. A person would readily be able to perform this process either mentally or with the assistance of pen and paper. See MPEP § 2106.04(a)(2).

21.
For example, the steps of "detect presence or absence of an intersection in front of a driver's vehicle," "identifying an end position on an outer side in a width direction …," "calculate a distance in a width direction," "determine whether the driver's vehicle has passed a stop line and entered the intersection," and "upon determination that the driver's vehicle has entered the intersection, sequentially calculate a driver's vehicle position information for identifying the driver's vehicle position …" encompass a user, such as the driver of a vehicle, making an observation, evaluation or judgment about the location of the vehicle, all of which could be carried out in one's mind. The same user, looking at the data collected, could form a simple judgment and conclude whether the vehicle is located in an oncoming lane or not and evaluate the driving behavior.

22. As such, the claims recite an abstract idea comprising both a mental process and a mathematical concept.

101 Analysis – Step 2A, Prong II

23. Regarding Prong II of the Step 2A analysis, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. See MPEP 2106.04(II)(A)(2) and MPEP 2106.04(d)(2). It must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements that merely use a computer to implement an abstract idea, add insignificant extra-solution activity, or generally link use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a "practical application."

24.
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the "additional limitations" [with a description of the additional limitations in brackets], while the bolded portions continue to represent the "abstract idea"):

An information processor comprising a central processing unit configured to [applying the abstract idea using a generic computing module, "apply it," 2106.05(f)]: detect presence or absence of an intersection in front of a driver's vehicle [mental process/step]; upon detection of the intersection, acquire section position information [pre-solution activity (data gathering), 2106.05(g), using generic sensors] for identifying an end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle and a position of a second section on an oncoming lane at the intersection located in front of the driver's vehicle [mental process/step]; calculate a distance in a width direction from a section boundary position to a driver's vehicle position before entering the intersection [mental process/step]; determine whether the driver's vehicle has passed a stop line and entered the intersection [mental process/step]; sequentially acquire vehicle behavior information regarding behavior of the driver's vehicle when traveling through the intersection and store, as history information, a history of driving characteristic parameters representing the driving characteristics of a driver included in the sequentially acquired vehicle behavior information [pre-solution activity (data gathering), 2106.05(g)]; upon determination that the driver's vehicle has entered the intersection, sequentially calculate a driver's vehicle position information for identifying the driver's vehicle position when traveling through the intersection based on the vehicle behavior information [mental process/step]; upon determination that the driver's vehicle is turning right or left, determine which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located, based on the section position information [mental process/step]; extract a specific driving characteristic parameter corresponding to at least one of the first section and the second section from the history information based on the determination of which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located [pre-solution activity (data gathering), 2106.05(g)]; and evaluate driving behavior based on the extracted specific driving characteristic parameter [mental process/step] using a trained model obtained through machine learning [particular technological environment or field of use without telling how it is accomplished].

25. For the following reasons, the examiner submits that the above-identified additional limitations do not integrate the above-noted abstract idea into a practical application.

26. Regarding the additional limitation of "an information processor comprising a central processing unit configured to," it is recited at a high level of generality and merely automates the generating, detecting and predicting steps, therefore acting as a generic computer to perform the abstract idea. The limitation of "an information processor comprising a central processing unit configured to" is claimed generically, is operating in its ordinary capacity, and does not use the judicial exception in a manner that imposes a meaningful limit on the judicial exception, such that the claim is more than a drafting effort designed to monopolize the exception. The additional limitation is no more than mere instructions to apply the exception using a computer.

27.
Regarding the additional limitations of "acquire section position information …," "sequentially acquire vehicle behavior information …," and "extract a specific driving characteristic parameter …," the examiner submits that these limitations are insignificant extra-solution activities that merely use a computer to perform the process. In particular, the "acquire section position information …" and "sequentially acquire vehicle behavior information …" steps are recited at a high level of generality (i.e. as a general means of gathering data for use in the evaluating step) and amount to mere data gathering, which is a form of insignificant extra-solution activity. The "extract a specific driving characteristic parameter …" step is also recited at a high level of generality (i.e. as a general means of providing the evaluation result from the evaluating step) and amounts to mere post-solution activity, which is a form of insignificant extra-solution activity.

28. Regarding the additional limitation of "using a trained model obtained through machine learning," the examiner submits that this limitation merely indicates a field of use or technological environment without telling how it is accomplished. Limiting the use of the idea to one particular environment does not add significantly more.

29. Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitations as an ordered combination or as a whole, the limitations add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field; apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition; implement or use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim; effect a transformation or reduction of a particular article to a different state or thing; or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception. See MPEP § 2106.05. Accordingly, the additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.

101 Analysis – Step 2B

30. Regarding Step 2B of the Revised Guidance, representative independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for reasons similar to those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above with respect to integration of the abstract idea into a practical application, the additional element of "an information processing method executed by a computer" to perform the evaluating steps amounts to nothing more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept.
As discussed above in regards to the additional limitations of "acquiring section position information …," "sequentially acquiring vehicle behavior information …," and "extracting a specific driving characteristic parameter …," the examiner submits that these limitations are insignificant extra-solution activities. Regarding the additional limitation of "using a trained model obtained through machine learning," the examiner submits that this limitation reflects a particular technological environment or field of use.

31. As established above, claim 1 is representative of all independent claims, and therefore claims 7 and 8 are rejected for the same reasons.

32. Dependent claims 2-6 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and do not integrate the judicial exception into a practical application. Therefore, dependent claims 2-6 are not patent eligible under the same rationale as provided for in the rejection of claim 1.

33. Therefore, claims 1-9 are ineligible under 35 USC § 101.

Claim Rejections - 35 USC § 103

34. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

35. Claims 1-2 and 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Suzuki et al. (JP-2019032712-A) in view of Agon et al.
(US-20210405641-A1) and further in view of Innes et al. (US-20170291611-A1).

In regards to claim 1, Suzuki teaches:

An information processor comprising a central processing unit configured to: (Fig. 1, [0013] The driver information acquisition unit generates the driver information based on past driving history data of the driver. [0022 & 0031] The system controller 20 is an example of a prediction unit and a driver information acquisition unit, and includes a central processing unit (CPU) 22.)

detect presence or absence of an intersection in front of a driver's vehicle; ([0059] The judgment position is the position where the predicted trajectory intersects with the oncoming lane and the crosswalk, as this is a position near an intersection where there is a relatively high possibility of contact with other moving bodies. Examiner notes that detecting objects and other moving bodies near an intersection means the vehicle detects presence or absence of an intersection in front of a driver's vehicle.)

upon detection of the intersection, acquire section position information for identifying an end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle and a position of a second section on an oncoming lane at the intersection located in front of the driver's vehicle; (Fig. 7A, [0053] The controller 20 detects a determination position P1 where the predicted trajectory 102 intersects with an oncoming lane 110, which is a position of a second section on an oncoming lane at an intersection located in front of the driver's vehicle. Examiner notes that, as portrayed by Figure 1 (Fig. 7A of Suzuki), the intersection is divided into two sections: the left side of the intersection is the first section and the right side of the intersection is the second section. The outer edge of the intersection on the left side is the end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle.)

determine whether the driver's vehicle has passed a stop line and entered the intersection; ([0062] It is possible to judge the risk in terms of whether the vehicle can travel stably through the intersection. Examiner notes that the vehicle judging the risk of traveling through the intersection means the vehicle determines whether the driver's vehicle has entered the intersection, which encompasses the scenario of the driver's vehicle passing a stop line and entering the intersection.)

sequentially acquire vehicle behavior information regarding behavior of the driver's vehicle when traveling through the intersection and store, as history information, a history of driving characteristic parameters representing the driving characteristics of a driver included in the sequentially acquired vehicle behavior information; ([0013] The driver information acquisition unit generates the driver information based on past driving history data of the driver. Driver information is generated based on the driver's actual driving history, which encompasses storing, as history information, a history of driving characteristic parameters representing the driving characteristics of the driver included in the sequentially acquired vehicle behavior information. [0020]-[0023] The independent positioning device 10 includes an acceleration sensor 11, an angular velocity sensor 12, and a distance sensor 13. The acceleration sensor 11 detects the acceleration of the vehicle and outputs acceleration data. The angular velocity sensor 12 is a vibration gyroscope, which detects the angular velocity of the vehicle when the vehicle changes direction, and outputs angular velocity data and relative direction data.
The distance sensor 13 measures a vehicle speed pulse, which is a pulse signal generated in accordance with the rotation of the wheels of the vehicle. The system controller 20 includes an interface 21 that performs an interface operation with the acceleration sensor 11, the angular velocity sensor 12, the distance sensor 13, and the GPS receiver 18. Information such as acceleration and vehicle speed are vehicle behaviors as defined by paragraph 45 of Applicant's specification.)

upon determination that the driver's vehicle has entered the intersection, sequentially calculate a driver's vehicle position information for identifying the driver's vehicle position when traveling through the intersection based on the vehicle behavior information; ([0010] A setting unit sets a determination position on the driving trajectory based on the driver information, and a determination unit determines the driving risk at the determination position based on the speed information and the driving trajectory. [0021] The GPS receiver 18 receives radio waves 19 carrying downlink data including positioning data from a number of GPS satellites. The positioning data is used to determine the absolute position of the vehicle. [0044] The start and end points of the predicted trajectory are set at positions that correspond to when the vehicle travels through the intersection at a standard speed. [0045] Once the initial positions of the start point and end point of the predicted trajectory have been set, the controller 20 then corrects the initial positions of the start point and end point along the direction of each road in accordance with the vehicle travel speed predicted in step S22.)

upon determination that the driver's vehicle is turning right or left ([0065] The trajectory of a vehicle turning right or left at an intersection at a constant speed is predicted, and the traveling risk of traveling along that trajectory is determined.), determine which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located, based on the section position information; (Fig. 5, [0052]-[0053] The controller 20 determines whether or not there is an oncoming lane in the intersection based on road information in the vicinity of the intersection (step S31). If there is an oncoming lane (step S31: Yes), the controller 20 detects a determination position P1, which is a position where the predicted trajectory intersects with the oncoming lane (step S32). As mentioned above, position P1 is detected. As such, the vehicle can determine whether its current location is in the first or second section based on the detected position P1.)

extract a specific driving characteristic parameter corresponding to at least one of the first section and the second section from the history information based on the determination of which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located; and ([0077] The actual driving history data of the driver is referenced to analyze the driving tendency of the driver when turning right or left at an intersection, and the transition curve or characteristics to be used are changed based on the driving tendency obtained. In this case, it is desirable to accumulate and analyze driving history data from as many intersections as possible to extract the driver's driving tendencies when turning right or left, and then change the type and characteristics of the transition curve based on the driving tendencies obtained.
This makes it possible to obtain a predicted trajectory that matches the driving tendencies of the driver not only at intersections that the driver has actually traveled through in the past, but also at intersections that the driver is traveling through for the first time. Extracting the driver's driving tendencies when turning right or left at an intersection encompasses extracting a specific driving characteristic parameter corresponding to at least one of the first section and the second section from the history information based on the result of the determination by the section determination unit.)

Suzuki does not teach: calculate a distance in a width direction from a section boundary position to a driver's vehicle position before entering the intersection; and evaluate driving behavior based on the extracted specific driving characteristic parameter using a trained model obtained through machine learning.

However, Agon teaches that at block 305, a lateral distance between the vehicle's sensor system and at least one detected lane boundary at the given time is determined ([0095], Fig. 3E). Using the same method as Agon, the lateral distance to a boundary is determined, including the distance in a width direction from a section boundary position to a driver's vehicle position before entering the intersection.

Innes teaches that car sensor information, geo-location, and machine learning from acquired image data are used to generate an evaluation of operator behavior/driving quality ([0091]), which is evaluating driving behavior based on the extracted specific driving characteristic parameter using a trained model obtained through machine learning. Examiner notes that car sensor information includes data such as vehicle speed and acceleration, which are driving characteristic parameters.
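The lateral-distance limitation the examiner maps to Agon is, in the simplest geometric reading, a perpendicular point-to-line distance in the width direction. The sketch below illustrates only that geometry; the coordinates and boundary segment are hypothetical and are not taken from Agon's disclosure.

```python
import math

def lateral_distance_to_boundary(vx: float, vy: float,
                                 p1: tuple, p2: tuple) -> float:
    """Perpendicular distance from the vehicle at (vx, vy) to the boundary
    line through points p1 and p2 (e.g. two samples of a detected center line)."""
    (x1, y1), (x2, y2) = p1, p2
    # standard point-to-line distance: |cross product| / segment length
    num = abs((y2 - y1) * vx - (x2 - x1) * vy + x2 * y1 - y2 * x1)
    return num / math.hypot(x2 - x1, y2 - y1)

# vehicle 2 m to the left of a straight north-south boundary along x = 0
print(lateral_distance_to_boundary(-2.0, 5.0, (0.0, 0.0), (0.0, 10.0)))  # 2.0
```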
It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the determination device of Suzuki, by incorporating the teachings of Agon and Innes, such that the lateral distance from the vehicle to the section boundary is calculated and the operator behavior/driving quality is determined based on car sensor information using machine learning. The motivation to modify is, as acknowledged by Agon, to diagnose an appropriate driving state of a driver when turning left and right at an intersection (Page 12), which one of ordinary skill would have recognized allows the vehicle to safely navigate through the intersection. The motivation to modify is that, as acknowledged by Innes, evaluations can be used by a variety of audiences including: 1) by parents of younger drivers, 2) by insurance companies to evaluate driver behavior and reconstruct accidents, 3) by auto and other manufacturers to gauge customer usage and behavior, and 4) by law enforcement in instances of accidents, carjacking, theft, etc. ([0019]), which one of ordinary skill would have recognized allows the owners to protect their vehicles. In regards to claim 2, Suzuki, as modified by Agon and Innes, teaches The information processor according to claim 1, wherein a boundary position between the first section and the second section is identified as a center line located between the traveling lane and the oncoming lane or a position obtained by extending a position bisecting the width of the driver's vehicle traveling way to the intersection. (Fig. 7A, [0041] The controller 20 sets the point where the vehicle enters the intersection on the running road as the initial position of the start point of the predicted trajectory.
The range of the intersection is defined based on the center position (coordinates) of the intersection and the width of the road that intersects the running road, so the point where the currently running road intersects the intersection range is determined. [0053] The controller 20 detects a determination position P1 where the predicted track 102 intersects the oncoming lane 110. The oncoming lane 110 acts as the second section on an oncoming lane at an intersection located in front of the driver's vehicle. As mentioned above, the oncoming lane 110, or the second section, is determined by the control unit. As portrayed by Figure 1, the intersection is divided into two sections. The remaining section of the intersection acts as the first section.) Figure 1 - First and second sections (Fig. 7A of Suzuki) In regards to claim 7, Suzuki teaches An information processing method executed by a computer, comprising: ([0016] A judgment method executed by a judgment device that judges driving risk before a moving body actually turns right or left at an intersection.) detecting presence or absence of an intersection in front of a driver's vehicle; ([0059] The judgment position is the position where the predicted trajectory intersects with the oncoming lane and the crosswalk, as this is a position near an intersection where there is a relatively high possibility of contact with other moving bodies. Examiner notes, detecting objects and other moving bodies near an intersection means the vehicle detects presence or absence of an intersection in front of a driver's vehicle.)
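To make the section-determination step concrete, here is a minimal illustrative sketch (editor's annotation, not part of the office action or the cited references; all names and coordinates are hypothetical): with the boundary modeled as the center line between the traveling lane and the oncoming lane, the sign of a cross product tells which side of that line, and hence which section, the vehicle position falls in.

```python
def section_of(vehicle_xy, boundary_a, boundary_b):
    """Return "first" or "second" depending on which side of the boundary
    line the vehicle position lies.

    boundary_a and boundary_b are two points on the center line between
    the lanes; the sign of the 2D cross product of (boundary direction)
    and (vector to the vehicle) distinguishes the two half-planes.
    """
    ax, ay = boundary_a
    bx, by = boundary_b
    vx, vy = vehicle_xy
    cross = (bx - ax) * (vy - ay) - (by - ay) * (vx - ax)
    return "first" if cross >= 0 else "second"
```

With the center line running along the y-axis, positions left of it map to the first section and positions right of it to the second.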
upon detection of the intersection, acquiring section position information for identifying an end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle, an end position on an outer side in a width direction of a second section on an oncoming lane at the intersection located in front of the driver's vehicle, and a boundary position between the first section and the second section; (Fig. 7A, [0053] The controller 20 detects a determination position P1 where the predicted trajectory 102 intersects with an oncoming lane 110 which is a position of a second section on an oncoming lane at an intersection located in front of the driver's vehicle. Examiner notes, as portrayed by Figure 1 (Fig. 7A of Suzuki), the intersection is divided into two sections. The left side of the intersection is the first section and the right side of the intersection is the second section. The outer edge of the intersection on the left side is the end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle. The outer edge of the intersection on the right side is the end position on an outer side in a width direction of a second section on an oncoming lane at the intersection located in front of the driver's vehicle. The section of the centerline between the first and the second section is the boundary position between the first section and the second section.) determining whether the driver's vehicle has passed a stop line and entered the intersection; ([0062] It is possible to judge the risk in terms of whether the vehicle can travel stably through the intersection. Examiner notes, the vehicle judging the risk of traveling through the intersection means the vehicle determining whether the driver's vehicle has entered the intersection which encompasses the scenario of the driver's vehicle passing a stop line and entering the intersection.)
sequentially acquiring vehicle behavior information regarding behavior of the driver's vehicle when traveling through the intersection and storing, as history information, a history of driving characteristic parameters representing the driving characteristics of a driver included in the sequentially acquired vehicle behavior information; ([0013] The driver information acquisition unit generates the driver information based on past driving history data of the driver. Driver information is generated based on the driver's actual driving history which encompasses storing, as history information, a history of driving characteristic parameters representing the driving characteristics of the driver included in the sequentially acquired vehicle behavior information. [0020]-[0023] The independent positioning device 10 includes an acceleration sensor 11, an angular velocity sensor 12, and a distance sensor 13. The acceleration sensor 11 detects the acceleration of the vehicle, and outputs acceleration data. The angular velocity sensor 12 is a vibration gyroscope, which detects the angular velocity of the vehicle when the vehicle changes direction, and outputs angular velocity data and relative direction data. The distance sensor 13 measures a vehicle speed pulse, which is a pulse signal generated in accordance with the rotation of the wheels of the vehicle. The system controller 20 includes an interface 21 that performs an interface operation with the acceleration sensor 11, the angular velocity sensor 12, the distance sensor 13, and the GPS receiver 18. Information such as acceleration and vehicle speed are vehicle behaviors as defined by paragraph 45 of Applicant’s specification.)
upon determination that the driver's vehicle has entered the intersection, sequentially calculating the driver's vehicle position information for identifying a driver's vehicle position where the driver's vehicle exists when traveling through the intersection based on the vehicle behavior information; ([0010] A setting unit that sets a determination position on the driving trajectory based on the driver information, and a determination unit that determines the driving risk at the determination position based on the speed information and the driving trajectory. [0021] The GPS receiver 18 receives radio waves 19 carrying downlink data including positioning data from a number of GPS satellites. The positioning data is used to determine the absolute position of the vehicle. [0044] The start and end points of the predicted trajectory are set at positions that correspond to when the vehicle travels through the intersection at a standard speed. [0045] Once the initial positions of the start point and end point of the predicted trajectory have been set, the controller 20 then corrects the initial positions of the start point and end point along the direction of each road in accordance with the vehicle travel speed predicted in step S22.) upon determination that the driver's vehicle is turning right or left ([0065] The trajectory of a vehicle turning right or left at an intersection at a constant speed is predicted, and the traveling risk of traveling along that trajectory is determined.), determining which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located, based on the section position information; (Fig. 5, [0052]-[0053] The controller 20 determines whether or not there is an oncoming lane in the intersection based on road information in the vicinity of the intersection (step S31). 
If there is an oncoming lane (step S31: Yes), the controller 20 detects a determination position P1, which is a position where the predicted trajectory intersects with the oncoming lane (step S32). As mentioned above, position P1 is detected. As such, the vehicle can determine whether its current location is in the first or second section based on the detected position P1.) extracting a specific driving characteristic parameter corresponding to at least one of the first section and the second section from the history information based on the determination of which of the first section and the second section the driver's vehicle position identified by the driver's vehicle position information is located; and ([0077] The actual driving history data of the driver is referenced to analyze the driving tendency of the driver when turning right or left at an intersection, and the transition curve or characteristics to be used are changed based on the driving tendency obtained. In this case, it is desirable to accumulate and analyze driving history data from as many intersections as possible to extract the driver's driving tendencies when turning right or left, and then change the type and characteristics of the transition curve based on the driving tendencies obtained. This makes it possible to obtain a predicted trajectory that matches the driving tendencies of the driver not only at intersections that the driver has actually traveled through in the past, but also at intersections that the driver is traveling through for the first time. Extracting the driver's driving tendencies when turning right or left at an intersection encompasses extracting a specific driving characteristic parameter corresponding to at least one of the first section and the second section from the history information based on the result of the determination in the determining step.)
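The extraction and evaluation steps above can be sketched as follows (editor's annotation only; the field names, weights, and the logistic scorer standing in for the "trained model obtained through machine learning" are all illustrative assumptions, not taken from Suzuki, Agon, or Innes): driving characteristic parameters recorded per section are averaged out of the history, then scored by a fixed-weight model.

```python
import math

def extract_parameters(history, section):
    """Average the driving characteristic parameters (speed, acceleration)
    recorded in the history for one section of the intersection."""
    rows = [h for h in history if h["section"] == section]
    n = len(rows)
    return {
        "speed": sum(r["speed"] for r in rows) / n,
        "accel": sum(r["accel"] for r in rows) / n,
    }

def evaluate(params, weights=(-0.2, -0.5), bias=4.0):
    """Toy logistic scorer in (0, 1); higher means smoother behavior.
    A real system would use weights learned by machine learning."""
    z = bias + weights[0] * params["speed"] + weights[1] * abs(params["accel"])
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical history of behavior records tagged with the section
# (first = own traveling lane side, second = oncoming lane side).
history = [
    {"section": "second", "speed": 18.0, "accel": 1.2},
    {"section": "second", "speed": 22.0, "accel": 0.8},
    {"section": "first", "speed": 30.0, "accel": 0.1},
]
params = extract_parameters(history, "second")  # only second-section rows
score = evaluate(params)
```

The point of the sketch is the data flow the claim recites: the section determination selects which slice of the history feeds the model, so behavior in the oncoming-lane section is evaluated separately from behavior in the traveling-lane section.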
Suzuki does not teach calculating a distance in a width direction from a section boundary position to a driver's vehicle position before entering the intersection; evaluating driving behavior based on the extracted specific driving characteristic parameter using a trained model obtained through machine learning. However, Agon teaches at block 305, a lateral distance between the vehicle's sensor system and at least one detected lane boundary at the given time is determined ([0095], Fig. 3E). Using the same method as Agon, the lateral distance to a boundary is determined including the distance in a width direction from a section boundary position to a driver's vehicle position before entering the intersection. Innes teaches car sensor information, geo-location, and machine learning from acquired image data is used to generate an evaluation of operator behavior/driving quality ([0091]) which is evaluating driving behavior based on the extracted specific driving characteristic parameter using a trained model obtained through machine learning. Examiner notes, car sensor information includes data such as vehicle speed and acceleration which are driving characteristic parameters. It would have been obvious to one of ordinary skill in the art before the effective filing date of the application to modify the determination device of Suzuki, by incorporating the teachings of Agon and Innes, such that the lateral distance from the vehicle to the section boundary is calculated and the operator behavior/driving quality is determined based on car sensor information using machine learning. The motivation to do so is the same as acknowledged by Agon in regards to claim 1. The motivation to do so is the same as acknowledged by Innes in regards to claim 1. In regards to claim 8, Suzuki teaches A recording medium having embodied thereon a program that causes a central processing unit to: (Fig.
1, [0013] The driver information acquisition unit generates the driver information based on past driving history data of the driver. [0022 & 0024 & 0031] The system controller 20 is an example of a prediction unit, a driver information acquisition unit which includes a central processing unit (CPU) 22, a read only memory (ROM) 23, and a random access memory (RAM) 24. The ROM 23 includes a non-volatile memory in which a control program for controlling the system controller 20 and the like are stored. Examiner notes, RAM and ROM are recording media.) detect presence or absence of an intersection in front of a driver's vehicle; ([0059] The judgment position is the position where the predicted trajectory intersects with the oncoming lane and the crosswalk, as this is a position near an intersection where there is a relatively high possibility of contact with other moving bodies. Examiner notes, detecting objects and other moving bodies near an intersection means the vehicle detects presence or absence of an intersection in front of a driver's vehicle.) upon detection of the intersection, acquire section position information for identifying an end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle, an end position on an outer side in a width direction of a second section on an oncoming lane at the intersection located in front of the driver's vehicle, and a boundary position between the first section and the second section; (Fig. 7A, [0053] The controller 20 detects a determination position P1 where the predicted trajectory 102 intersects with an oncoming lane 110 which is a position of a second section on an oncoming lane at an intersection located in front of the driver's vehicle. Examiner notes, as portrayed by Figure 1 (Fig. 7A of Suzuki), the intersection is divided into two sections.
The left side of the intersection is the first section and the right side of the intersection is the second section. The outer edge of the intersection on the left side is the end position on an outer side in a width direction of a first section on a traveling lane for the traveling of the driver's vehicle. The outer edge of the intersection on the right side is the end position on an outer side in a width direction of a second section on an oncoming lane at the intersection located in front of the driver's vehicle. The section of the centerline between the first and the second section is the boundary position between the first section and the second section.) determine whether the driver's vehicle has passed a stop line and entered the intersection; ([0062] It is possible to judge the risk in terms of whether the vehicle can travel stably through the intersection. Examiner notes, the vehicle judging the risk of traveling through the intersection means the vehicle determining whether the driver's vehicle has entered the intersection which encompasses the scenario of the driver's vehicle passing a stop line and entering the intersection.) sequentially acquire vehicle behavior information regarding behavior of the driver's vehicle when traveling through the intersection and store, as history information, a history of driving characteristic parameters representing the driving characteristics of a driver included in the sequentially acquired vehicle behavior information; ([0013] The driver information acquisition unit generates the driver information based on past driving history data of the driver. Driver information is generated based on the driver's actual driving history which encompasses storing, as history information, a history of driving characteristic parameters representing the driving characteristics of the driver included in the sequentially acquired vehicle behavior information.
[0020]-[0023] The independent positioning device 10 includes an acceleration sensor 11, an angular velocity sensor 12, and a distance sensor 13. The acceleration sensor 11 detects the acceleration of the vehicle, and outputs acceleration data. The angular velocity sensor 12 is a vibration gyroscope, which detects the angular velocity of the vehicle when the vehicle changes direction, and outputs angular velocity data and relative direction data. The distance sensor 13 measures a vehicle speed pulse, which is a pulse signal generated in accordance with the rotation of the wheels of the vehicle. The system controller 20 includes an interface 21 that performs an interface operation with the acceleration sensor 11, the angular velocity sensor 12, the distance sensor 13, and the GPS receiver 18. Information such as acceleration and vehicle speed are vehicle behaviors as defined by paragraph 45 of Applicant’s specification.) upon determination that the driver's vehicle has entered the intersection, sequentially calculate a driver's vehicle position information for identifying the driver's vehicle position where the driver's vehicle exists when traveling through the intersection based on the vehicle behavior information; ([0010] A setting unit that sets a determination position on the driving trajectory based on the driver information, and a determination unit that determines the driving risk at the determination position based on the speed information and the driving trajectory. [0021] The GPS receiver 18 receives radio waves 19 carrying downlink data including positioning data from a number of GPS satellites. The positioning data is used to determine the absolute position of the vehicle. [0044] The start and end points of the predicted trajectory are set at positions that correspond to when the vehicle travels through the intersection at a standard speed.)

Prosecution Timeline

May 31, 2023
Application Filed
Jun 29, 2025
Non-Final Rejection — §101, §103
Sep 15, 2025
Examiner Interview Summary
Oct 09, 2025
Response Filed
Nov 19, 2025
Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12559091
CONTROL DEVICE FOR CONTROLLING SAFETY DEVICE IN VEHICLE
2y 5m to grant Granted Feb 24, 2026
Patent 12490678
VEHICLE LOCATION WITH DYNAMIC MODEL AND UNLOADING CONTROL SYSTEM
2y 5m to grant Granted Dec 09, 2025
Patent 12466388
Method for Operating a Motor Vehicle Drive Train and Electronic Control Unit for Carrying Out Said Method
2y 5m to grant Granted Nov 11, 2025
Patent 12454806
WORK MACHINE
2y 5m to grant Granted Oct 28, 2025
Patent 12447827
Electric Vehicle Control Device, Electric Vehicle Control Method, And Electric Vehicle Control System
2y 5m to grant Granted Oct 21, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

3-4
Expected OA Rounds
56%
Grant Probability
75%
With Interview (+18.8%)
3y 1m
Median Time to Grant
Moderate
PTA Risk
Based on 50 resolved cases by this examiner. Grant probability derived from career allow rate.
