Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This action is in response to the applicant’s amendment filed on 26 November 2025.
Claims 11-22 are pending and have been examined. Claims 11 and 17 are currently amended. Claims 1-10 have been cancelled by Applicant.
Response to Arguments
The rejection of claims 11 and 17 under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, is withdrawn in view of Applicant's amendment and arguments.
With respect to the rejection of claims 11-22 under AIA 35 U.S.C. §102(a)(1) as being anticipated by Jeon et al., US 2019/0011990 A1, Applicant appears to argue that Jeon does not teach the “predefinable function, which includes at least one of open/close a tailgate of the vehicle, open/close a flap of the vehicle, open/close one or more automatic doors of the vehicle, open/close one or more windows of the vehicle, activate/deactivate a function of a head-up display, and activate/deactivate a content that is output via the head-up display” as amended in independent claims 11 and 17. Examiner respectfully disagrees. Jeon teaches (Fig. 1 and related text; “smart key may be embedded in the wearable device and may be connected in hardware by a connector. According to an embodiment, the acceleration sensor may obtain the gesture information about an X-axis, a Y-axis, and a Z-axis by sensing a specific gesture of gestures that the driver makes while approaching the vehicle electronic device. According to an embodiment, the specific gesture may include at least one of an operation of holding a handle of vehicle door by extending a hand wearing the wearable device, an operation of pressing a handle button of the vehicle door, and an operation of pulling the vehicle door toward a body of the driver to open the vehicle door”, ¶10-12).
Notice re prior art available under both pre-AIA and AIA
In the event the determination of the status of the application as subject to AIA 35 U.S.C. §102 and §103 (or as subject to pre-AIA 35 U.S.C. §102 and §103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC §102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. §102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
Claims 11-22 are rejected under AIA 35 U.S.C. §102(a)(1) as being anticipated by Jeon et al., US 2019/0011990 A1.
As to claim 11, Jeon teaches a system for agile, intuitive control of vehicle functions of a vehicle (“a vehicle electronic device communicating with a wearable device includes; a smart key receiving a searching signal from the vehicle electronic device; an acceleration sensor obtaining gesture information about a specific gesture that the driver makes while approaching the vehicle electronic device; a learning processor learning the specific gesture from the gesture information; a controller determining intention of the driver depending on whether the searching signal is received; and depending on whether the obtained gesture information is recognized as the specific gesture and to control on/off of a sensor measuring the information”, abs), comprising:
a wearable device designed to capture sensor data, the sensor data comprising motion data and optical data relating to a user of the wearable device (Fig. 1 and related text; “wearable device 30”, ¶40; “if the specific gesture that the driver makes while approaching the vehicle electronic device is the learned.”, claim 13; “The wearable device of claim 1, wherein the sensor includes a photo-plethysmo graphy (PPG) sensor”, claim 9);
a computing unit configured to (Fig. 1 and related text; “vehicle electronic device 10”, ¶40):
process the captured sensor data (Fig. 1 and related text; “a vehicle electronic device an acceleration sensor obtaining gesture information about a specific gesture that the driver makes while approaching the vehicle electronic device”, abs);
determine a functional relationship of the processed sensor data with the vehicle (“smart key may be embedded in the wearable device and may be connected in hardware by a connector. According to an embodiment, the acceleration sensor may obtain the gesture information about an X-axis, a Y-axis, and a Z-axis by sensing a specific gesture of gestures that the driver makes while approaching the vehicle electronic device. According to an embodiment, the specific gesture may include at least one of an operation of holding a handle of vehicle door by extending a hand wearing the wearable device, an operation of pressing a handle button of the vehicle door, and an operation of pulling the vehicle door toward a body of the driver to open the vehicle door”, ¶10-12); and
determine an association with a predefinable vehicle function, which includes at least one of open/close a tailgate of the vehicle, open/close a flap of the vehicle, open/close one or more automatic doors of the vehicle, open/close one or more windows of the vehicle, activate/deactivate a function of a head-up display, and activate/deactivate a content that is output via the head-up display, in consideration of the determined functional relationship (Fig. 1 and related text; “smart key may be embedded in the wearable device and may be connected in hardware by a connector. According to an embodiment, the acceleration sensor may obtain the gesture information about an X-axis, a Y-axis, and a Z-axis by sensing a specific gesture of gestures that the driver makes while approaching the vehicle electronic device. According to an embodiment, the specific gesture may include at least one of an operation of holding a handle of vehicle door by extending a hand wearing the wearable device, an operation of pressing a handle button of the vehicle door, and an operation of pulling the vehicle door toward a body of the driver to open the vehicle door”, ¶10-12),
wherein the vehicle comprises a control unit designed to control or regulate the predefinable vehicle function in accordance with the determined association (Fig. 1 and related text, control device 12; “smart key may be embedded in the wearable device and may be connected in hardware by a connector. According to an embodiment, the acceleration sensor may obtain the gesture information about an X-axis, a Y-axis, and a Z-axis by sensing a specific gesture of gestures that the driver makes while approaching the vehicle electronic device. According to an embodiment, the specific gesture may include at least one of an operation of holding a handle of vehicle door by extending a hand wearing the wearable device, an operation of pressing a handle button of the vehicle door, and an operation of pulling the vehicle door toward a body of the driver to open the vehicle door”, ¶10-12).
As to claim 12, Jeon teaches a system wherein the vehicle and the wearable device each comprise a communication unit, and the vehicle and the wearable device are configured to set up a Bluetooth Low Energy connection to one another (Fig. 1 and related text; “communication devices 11, 21 and 31 may transmit and receive wireless signals including data to and from a terminal within a distance from the communication devices through communication schemes, such as … Bluetooth Low Energy (BLE)”, ¶54).
As to claim 13, Jeon teaches a system wherein the computing unit is further configured to:
determine a local relationship of the processed sensor data with the vehicle, the determined local relationship being taken into consideration for determining the association with the predefinable vehicle function in consideration of the determined functional relationship (Fig. 1 and related text, control device 12; “smart key may be embedded in the wearable device and may be connected in hardware by a connector. According to an embodiment, the acceleration sensor may obtain the gesture information about an X-axis, a Y-axis, and a Z-axis by sensing a specific gesture of gestures that the driver makes while approaching the vehicle electronic device. According to an embodiment, the specific gesture may include at least one of an operation of holding a handle of vehicle door by extending a hand wearing the wearable device, an operation of pressing a handle button of the vehicle door, and an operation of pulling the vehicle door toward a body of the driver to open the vehicle door”, ¶10-12).
As to claim 14, Jeon teaches a system wherein the computing unit is further configured to: determine a temporal relationship of the processed sensor data with the vehicle, the determined temporal relationship being taken into consideration for determining the association with the predefinable vehicle function in consideration of the determined functional relationship (Figs. 2A and 2B and related text; “in FIGS. 2A and 2B, since an operation of opening a vehicle door is nearly similar for respective drivers, a pattern of data may be similar. However, a time period during which the operation is performed may be different. It is understood that FIG. 2A is a graph illustrating information about a gesture, which is slowly made, and it is understood that FIG. 2B is a graph illustrating information about a gesture, which is quickly made.”, ¶64).
As to claim 15, Jeon teaches a system wherein the local relationship comprises the detection of a direction of the motion data relating to the user of the wearable device, the detected direction being taken into consideration for determining the association with the predefinable vehicle function in consideration of the determined functional relationship (“driver approaches the vehicle, and may allow a sensor of a wearable device to automatically turn on or off depending on the recognition of the intention to get on and off a vehicle”, ¶7).
As to claim 16, it is an apparatus claim that recites substantially the same limitations as apparatus claim 14. As such, claim 16 is rejected for substantially the same reasons given for claim 14, and those reasons are incorporated herein.
As to claims 17, 18, 19, 20 and 21, they are method claims that recite substantially the same limitations as the respective apparatus claims 11, 12, 13, 14 and 15. As such, claims 17-21 are rejected for substantially the same reasons given for claims 11-15, and those reasons are incorporated herein.
As to claim 22, it recites substantially the same limitations as claim 20. As such, claim 22 is rejected for substantially the same reasons given for claim 20, and those reasons are incorporated herein.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Examiner’s Note
The examiner has pointed out particular references contained in the prior art of record in the body of this action for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claims, other passages and figures may apply as well. Applicant should consider the entirety of the cited prior art as applicable to the limitations of the claims. In preparing the response, Applicant is respectfully requested to consider the cited references in full, as potentially teaching all or part of the claimed invention, as well as the context of each cited passage as taught by the prior art or discussed by the examiner.
Examiner’s Request
The examiner requests that, in response to this Office action, support be shown for language added to any original claims on amendment and for any new claims. That is, Applicant is requested to indicate support for amended and newly added claim language by specifically pointing to the page(s) and line number(s) in the specification and/or drawing figure(s) (MPEP 2163 I. B., New or Amended Claims). This will assist the examiner in prosecuting the application. When responding to this Office action, Applicant is advised to clearly point out the patentable novelty which he or she thinks the claims present, in view of the state of the art disclosed by the references cited or the objections made, and to show how the amendments avoid such references or objections.
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to YUEN WONG whose telephone number is (313)446-4851. The examiner can normally be reached on M-F 9-5:30 EST.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Faris Almatrahi, can be reached on (313)446-4821. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/YUEN WONG/ Primary Examiner, Art Unit 3667