DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 3/12/2025 was filed in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
Claim elements in this application that use the word “means” (or “step for”) are presumed to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action. Similarly, claim elements that do not use the word “means” (or “step for”) are presumed not to invoke 35 U.S.C. 112(f) except as otherwise indicated in an Office action.
Claim limitations “a person detection unit that detects”, “a behavior detection unit that detects a behavior”, “a notification control unit that outputs”, and “characteristic extraction unit that extracts” have been interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because they use the generic placeholder “unit” coupled with functional language (the language following “unit”) without reciting sufficient structure to achieve the function. Furthermore, the generic placeholder is not preceded by a structural modifier.
Since these claim limitations invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, claims 1-10 have been interpreted to cover the corresponding structure described in the specification that achieves the claimed function, and equivalents thereof.
A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph limitations: the only support for the structure is in Fig. 2, where the units are shown as contained within a notification control apparatus; however, paragraph 0008 describes this arrangement as an example only. As such, there is no definite structure, and the units may be interpreted as software per se (this interpretation is the basis for the following 101 and 112 rejections).
If applicant wishes to provide further explanation or dispute the examiner’s interpretation of the corresponding structure, applicant must identify the corresponding structure with reference to the specification by page and line number, and to the drawing, if any, by reference characters in response to this Office action.
If applicant does not intend to have the claim limitations treated under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may amend the claims so that they will clearly not invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, or present a sufficient showing that the claims recite sufficient structure, material, or acts for performing the claimed function to preclude application of 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
For more information, see MPEP § 2173 et seq. and Supplementary Examination Guidelines for Determining Compliance With 35 U.S.C. 112 and for Treatment of Related Issues in Patent Applications, 76 FR 7162, 7167 (Feb. 9, 2011).
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-10 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA applications, the applicant) regards as the invention.
Claims 1-10 are also rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being incomplete for omitting essential structural cooperative relationships of elements, such omission amounting to a gap between the necessary structural connections. See MPEP § 2172.01. The omitted structural cooperative relationships are the relationships among the recited units.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
As to claims 1-10, the claimed invention is directed to non-statutory subject matter. The claims do not fall within at least one of the four categories of patent eligible subject matter because the claimed invention is software per se (see the claim interpretation above regarding the “units”).
Alice-Type Rejection – Abstract Idea (Mental Process)
As to claims 1-10, the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
101 Analysis – Step 1
Claims 1-10 are directed to a mental process of determining whether a notification is necessary (process claim 10; apparatus claims 1-9).
101 Analysis – Step 2A, Prong 1
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: (a) mathematical concepts, (b) certain methods of organizing human activity, and/or (c) mental processes.
Independent claim 1 includes limitations that recite an abstract idea – a mental process (emphasized below) – and will be used as the representative claim for the remainder of the 101 rejection. Claim 1 recites:
A notification control apparatus for a vehicle, comprising:
a person detection unit that detects, based on an image capturing a scene outside a vehicle, a person in a vicinity of the vehicle;
a behavior detection unit that detects a behavior of the person detected by the person detection unit; and
a notification control unit that outputs, when it is determined based on the behavior of the person detected by the behavior detection unit that it is necessary to notify the person from the vehicle, a notification to the person. (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”)
The examiner submits that the foregoing bolded limitation(s) constitute a “mental process” because, under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind. For example, the “determined…” limitation, in the context of this claim, encompasses a person (e.g., a navigator) looking at the collected data and forming a simple judgment (to notify). Accordingly, the claim recites at least one abstract idea – a mental process.
101 Analysis – Step 2A, Prong 2
Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are those identified above (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”).
For the following reasons, the examiner submits that the above-identified additional limitations do not integrate the above-noted abstract idea into a practical application. Claim 1 includes a processing apparatus. The additional limitations of the “units” merely describe how to generally “apply” the otherwise mental judgments in a generic or general-purpose processing environment. The processing is recited at a high level of generality and merely automates the determining steps.
101 Analysis – Step 2B
Regarding Step 2B of the 2019 PEG, representative independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for reasons similar to those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above, the additional element of using the “units” to perform the determining amounts to nothing more than applying the exception using a generic computer component. Generally applying an exception using a generic computer component cannot provide an inventive concept.
Further, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether it is more than what is well-understood, routine, conventional activity in the field. The additional limitation of processing with a processing apparatus is a well-understood, routine, and conventional activity because the specification does not provide any indication that the processing apparatus is anything other than a conventional computer. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner.
Dependent claims 2-9 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application, because they merely add to the mental processing. Therefore, dependent claims 2-9 are not patent eligible under the same rationale as provided for the rejection of independent claims 1 and 10.
Therefore, claims 1-10 are ineligible under 35 U.S.C. § 101. The examiner recommends amending the claims to recite a controlling step (e.g., controlling the vehicle based on the determination).
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-10 are rejected under 35 U.S.C. 102(a)(1)/102(a)(2) as being anticipated by US 20200223352 A1, hereinafter Toshio.
As to claim 1 and corresponding method claim 10, Toshio discloses a notification control apparatus for a vehicle, comprising:
a person detection unit that detects [Toshio: #60 computer vision system], based on an image capturing a scene outside a vehicle [Toshio: 0074-0082], a person in a vicinity of the vehicle [Toshio: 0074-0082];
a behavior detection unit that detects a behavior of the person detected by the person detection unit [Toshio: 0074-0079 e.g., body movements and gestures]; and
a notification control unit that outputs [Toshio: Control #40 with proposed avatar #20], when it is determined based on the behavior of the person detected by the behavior detection unit that it is necessary to notify the person from the vehicle [Toshio: 0095, determines; 0097-0088, the behavior is used with the computer vision module to produce a trained output based on the training.], a notification to the person [Toshio: 0091 “The same personalization could be done regarding the pedestrian gender (e.g.: “Dear lady, please cross”, “Hello, sir. I saw you!”, etc.).” Toshio outputs a message based on the scene context and the person’s need for a notification.].
As to claim 2, Toshio discloses further comprising: a characteristic extraction unit that extracts a characteristic of the person detected by the person detection unit [Toshio: 0074-0079 includes gender],
wherein the notification control unit outputs a notification including the characteristic of the person extracted by the characteristic extraction unit to the person [Toshio: 0091 “The same personalization could be done regarding the pedestrian gender (e.g.: “Dear lady, please cross”, “Hello, sir. I saw you!”, etc.).” Toshio outputs a message based on the scene context and the person’s need for a notification. Here, gender is detected and incorporated into the message.].
As to claim 3, Toshio discloses wherein, given that there are a plurality of further persons within a predetermined range around a person determined by the behavior detection unit as requiring a notification from the vehicle, the notification control unit outputs a notification including a characteristic of the person to the person [Toshio: 0148 multiple people, some having the characteristic of having crossed and one having the trait of still crossing; Toshio identifies the person by the crossing characteristic and generates a message (e.g., “I see you are still crossing. Take your time, I will wait”)].
As to claim 4, Toshio discloses wherein, given that, within a predetermined range around a person determined by the behavior detection unit as requiring a notification from the vehicle, there is another person having a similar characteristic as the person, the notification control unit outputs a notification including a characteristic of the person to the person [Toshio: 0116-0123 messages based on common characteristics are presented (“you are now safe to cross”), the characteristic being waiting to cross.].
As to claim 5, Toshio discloses further comprising: a person orientation determination unit that detects a face orientation or gaze direction of the person detected by the person detection unit and determines whether the face orientation or gaze direction of the person detected is aligned with a direction of the vehicle [Toshio: 0078 “gaze (e.g., looking to the car, looking to smartphone, distracted or looking to somewhere else);” 0141 using smart phone or without proper attention],
wherein the notification control unit outputs different notifications to the person when the face orientation or gaze direction of the person is aligned with the direction of the vehicle and when the face orientation or gaze direction of the person is not aligned with the direction of the vehicle [Toshio: 0088 the training for desired functionality should use gaze. 0141 determining without proper attention/ lack of looking].
As to claim 6, Toshio discloses the notification control apparatus, wherein the notification control unit outputs a notification that uses an expression indicating a request when the face orientation or gaze direction of the person is aligned with the direction of the vehicle [Toshio: 0104-0107 the messages are directed at a person that is looking, to convey information.] and outputs a notification that uses an expression indicating an inquiry when the face orientation or gaze direction of the person is not aligned with the direction of the vehicle [Toshio: 0146 “the avatar may change the message or even provide other type of alert (e.g., sound, flash light) highlighting that the pedestrian can cross the street.” The sounds or lights are not a request but are inquiring whether the person sees. 0078 “gaze (e.g., looking to the car, looking to smartphone, distracted or looking to somewhere else);” 0141 using smart phone or without proper attention].
As to claim 7, Toshio discloses the notification control apparatus, wherein the notification control unit outputs a notification that uses an expression for calling attention [Toshio: 0146 “the avatar may change the message or even provide other type of alert (e.g., sound, flash light) highlighting that the pedestrian can cross the street.”] when the face orientation or gaze direction of the person is aligned with the direction of the vehicle [Toshio: 0146 “the car already indicated that it will stop or the car has already stopped but the pedestrian is still waiting to cross” the car is stopped and the person has not crossed after an initial message, with 0088 training of the notifications being based partly on gaze.] and outputs a notification that uses an expression indicating an advance notice when the face orientation or gaze direction of the person is not aligned with the direction of the vehicle [Toshio: 0088 training-based design choices].
As to claim 8, Toshio discloses the notification control apparatus, wherein the notification control unit determines, when the person is likely to head in a direction of travel of the vehicle, that it is necessary to notify the person from the vehicle [Toshio: 0141].
As to claim 9, Toshio discloses the notification control apparatus, wherein the notification control unit determines, when the behavior detection unit detects that the person is stationary and is likely to head in the direction of travel of the vehicle [Toshio: 0007], that it is necessary to notify the person from the vehicle [Toshio: 0007 “As observed, pedestrians generally wait for a “human gesture” (eye contact, head nods, hand gestures) to be sure that the driver (or, in this case, the self-driving car) had perceived/recognized them. Even when technologies and algorithms are autonomously driving the cars, people still need to find a way to recreate the subtle interactions that keep them safe on the streets. Therefore, it would be desirable a solution for self-driving cars based on these human behavior, i.e., a vehicle which is able to signal intentions to the environment around the vehicle (including pedestrians, bicycles, and other vehicles) and allows interactions with (more) humanized gestures.”].
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US 10300846 B2: An image display device includes an illuminator and a detector. The illuminator is configured to send out light on a road surface frontward of a vehicle, to display an image on the road surface. The detector is configured to detect a pedestrian frontward of the vehicle. The illuminator is configured to cause a notification image to be on chronologically-changeable display on the road surface. The notification image notifies information to the pedestrian detected by the detector.
US 10943472 B2: A notification device provided in a vehicle and configured to notify pedestrians of information includes a recognition unit configured to recognize, based on a result of detection performed by an external sensor, a plurality of pedestrians, a notification unit configured to perform notification about the information with respect to an outside of the vehicle, and a notification controller configured to generate the information, about which the notification unit performs notification, for each pedestrian in response to recognizing the plurality of pedestrians by the recognition unit and to cause the notification unit to perform notification about the generated information for each pedestrian.
US 11345277 B2: Various technologies described herein pertain to controlling an autonomous vehicle to provide indicators that signal a driving intent of the autonomous vehicle. The autonomous vehicle includes a plurality of sensor systems that generate a plurality of sensor signals, a notification system, and a computing system. The computing system determines that the autonomous vehicle is to execute a maneuver that will cause the autonomous vehicle to traverse a portion of a driving environment of the autonomous vehicle. The computing system predicts that a person in the driving environment is to traverse the portion of the driving environment based upon the plurality of sensor signals. The computing system then controls the notification system to output a first indicator indicating that the autonomous vehicle plans to yield to the person or a second indicator indicating that the autonomous vehicle plans to execute the maneuver prior to the person traversing the portion of the driving environment.
The examiner has pointed out particular references contained in the prior art of record in the body of this action for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. Applicant should consider the entire prior art as applicable as to the limitations of the claims. It is respectfully requested from the applicant, in preparing the response, to consider fully the entire references as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner.
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FREDERICK M BRUSHABER whose telephone number is (313)446-4839. The examiner can normally be reached Monday-Friday 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hunter Lonsberry can be reached at (571) 272-7298. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/FREDERICK M BRUSHABER/
Primary Examiner
Art Unit 3665