DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This is in response to applicant's amendment/response filed on 11/25/2026, which has been entered and made of record. Claims 1, 5-7, 12, 16-18 and 20 have been amended.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-8 and 12-20 are rejected under 35 U.S.C. 103 as being unpatentable over Rieger (US 20080278821 A1) in view of Nguyen (US 20180174449 A1), and further in view of Hu (US 20220203930 A1).
Regarding claim 1, Rieger discloses a computer-implemented method of using augmented reality (AR) for visualizing proper fastening of a vehicle seat (Rieger [0015], “a display method of the head-mounted display system”; [0028], “a seatbelt warning of an occupant of the vehicle not being belted in his seat. All such information may be provided as a part of the display information to the occupant or driver 202 wearing the head-mounted display 204.”; [0037], “the head-mounted display achieving an augmented reality”) comprising:
receiving, by the one or more processors, underlay layer data indicative of a field of view (FOV) associated with an AR viewer device (Rieger [0025], “the line of sight may be detected for example by a line of sight sensor 212 … an eye movement sensor or camera incorporated into the head-mounted display 204 detecting the current line of sight of the eyes of the occupant relative to the head-mounted display 204 (underlay data in the line of sight/FOV).”);
generating, by the one or more processors, overlay layer data based upon the input data, the overlay layer data including an indication of a fastening of the vehicle seat (Rieger [0028], “a seatbelt warning (comprising overlay layer data including an indication of a fastening of the vehicle seat) of an occupant of the vehicle not being belted in his seat. All such information may be provided as a part of the display information to the occupant or driver 202 wearing the head-mounted display 204.”; [0035], “the display control unit 304 of the head-mounted display system 300 may generate … a display information including superposed vehicle information”);
correlating, by the one or more processors, the overlay layer data with the underlay layer data (Rieger [0045], “At step 406, display information to be displayed to a user to which the display method is applied is generated from the processed surrounding information … such display information may include a combination (a correlation by the processing unit) … the front view (exemplary overlay data) represents the main portion of the display information. Furthermore, the display information may include additional information (exemplary underlay data)”);
creating, by the one or more processors, an AR display based upon the correlation (Rieger [0045], “the head-mounted display may provide (create an AR display based upon the correlation/combination) the transparent view only for parts of the display area of the head-mounted display achieving an augmented reality, including a combination (a correlation) of the processed display information and the real view directly through the head-mounted display”); and
presenting, by the one or more processors to the AR viewer device, the AR display (Rieger [0028], “a seatbelt warning of an occupant of the vehicle not being belted in his seat. All such information may be provided as a part of the display information to the occupant or driver 202 wearing the head-mounted display 204.”).
Rieger discloses occupant data (Rieger [0022] “A line of sight sensor 212, which may be a camera observing the head movements of the occupants”; [0023], “a processing unit 302 processing the surrounding information received from the sensor 102”);
but does not disclose:
receiving, by one or more processors, input data, the input data including one or more of vehicle specification data, vehicle seat specification data, or child data.
However, Nguyen discloses:
receiving, by one or more processors, input data, the input data including one or more of vehicle specification data, vehicle seat specification data, or child data (Nguyen [0311], “elements of the VSS 610 that may be tracked by the vehicle's systems, an embedded device, or an associated device that may form at least a portion of the VSS 610 including at least one of the vehicle class 612, the vehicle specification 614, and the vehicle status 616.”; [0322], “Vehicle specification 614 may include one or more data elements, processes, or functions used for identifying or measuring … seatbelt use”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger with Nguyen to utilize vehicle specification data for a vehicle. This would have been done to verify that data collected from the vehicle complies with the specification and that the vehicle is operated in a safe manner.
Rieger in view of Nguyen does not disclose:
generating, by the one or more processors, overlay layer data based upon the input data, the overlay layer data including an indication of a proper fastening of the vehicle seat.
However, Hu discloses:
generating, by the one or more processors, overlay layer data based upon the input data, the overlay layer data including an indication of a proper fastening of the vehicle seat (Hu [0078], “A system for seatbelt localization may combine or otherwise augment determined seatbelt models together to determine a final seatbelt model”; [0081], “FIG. 7 illustrates … a modeled input image 706 (comprising overlay layer data including an indication of a proper fastening of the vehicle seat) ”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger further with Hu to generate and display proper seatbelt fastening information. This would have been done to inform the user of the proper way of wearing a seatbelt and thereby improve safety. See for example Hu [0022], “there are instances where the occupant has the seatbelt on, but is wearing it incorrectly (e.g., the seatbelt is worn behind the back). To combat this and improve traffic safety, the techniques described herein use video/images captured by a camera on board the vehicle to determine whether a seatbelt is being properly worn.”.
Regarding claim 2, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 1, wherein the AR viewer device is one of: (i) a smartphone, (ii) smart glasses, (iii) an AR headset, (iv) a virtual reality (VR) headset, or (v) a mixed reality (MR) headset (Rieger [0037], “the head-mounted display achieving an augmented reality”).
Regarding claim 3, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 1, wherein at least a portion of the underlay layer data is generated by the AR viewer device (Rieger [0035], “the display control unit 304 of the head-mounted display system 300 may generate display information”; [0038], “The display control unit 304 may then use the information from the line of sight sensor to generate display information (at least a portion of the underlay layer data is generated by the AR viewer device) according to the line of sight of the user.”).
Regarding claim 4, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 1, wherein at least a portion of the underlay layer data is generated by one or more sensors of a vehicle (Rieger [0025], “the line of sight may be detected for example by a line of sight sensor 212, which may be a camera mounted inside the vehicle monitoring the occupant's head to determine the line of sight of the occupant (at least a portion of the underlay layer data is generated by one or more sensors of a vehicle)”).
Regarding claim 5, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 1,
wherein the indication of a proper fastening of the vehicle seat includes the proper fastening of one or more fastening elements (Hu [0023], “to determine whether the seatbelt is being properly worn (e.g., the seatbelt is in the locked position (proper fastening of one or more fastening elements) and is worn diagonally and across the occupant's chest).”; [0084], “Referring to FIG. 7, a modeled input image 706 may depict, on the left, a box that indicates a passenger's seatbelt and a label “Seatbelt: ON (indication of a proper fastening of the vehicle seat)” that indicates that the passenger's seatbelt is worn and applied correctly.”), and
wherein correlating the overlay layer data with the underlay layer data includes positioning the overlay layer data at a location within the underlay layer data corresponding to where the one or more fastening elements of the vehicle seat will be positioned when properly fastened (Hu [0061], “An automatic location mask generation 110 may generate a seatbelt location mask, which may be a visual indication of a location of a seatbelt within an image … Camera localization may utilize one or more models of a vehicle to determine seatbelt anchors' coordinates or positions. 3D reconstruction may comprise processes that may calculate a relative relationship between seatbelt anchors (fastening elements of the vehicle seat)”; Hu [0082], “A seatbelt localizer 704 may analyze an input image 702 to identify and model seatbelts (underlay layer data) of the input image 702. The seatbelt localizer 704 may visualize the modeled seatbelts (overlay layer data) through a modeled input image 706 (modeled image comprises data corresponding to where the anchor/one or more fastening elements of the vehicle seat, will be positioned when properly fastened as indicated by the ON indicator).”).
Regarding claim 6, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 5, further comprising:
detecting, by the one or more processors, that the one or more fastening elements are aligned with the indication of the proper fastening of the vehicle seat (Hu [0023], “a system first analyzes a captured image to classify whether pixels are a part of a seatbelt … the system then parameterizes the seatbelt, and models the seatbelt's shape using a high order polynomial curve … The final seatbelt curve can be used to enhance occupant safety, such as to determine whether the seatbelt is being properly worn (e.g., the seatbelt is in the locked position and is worn diagonally and across the occupant's chest (detecting that one or more fastening elements are aligned with the indication of the proper fastening of the vehicle seat)).”); and
wherein creating the AR display comprises generating, by the one or more processors, a notification indicating that the vehicle seat is properly fastened (Hu [0078], “A system for seatbelt localization may combine or otherwise augment determined seatbelt models together to determine a final seatbelt model”; [0081], “FIG. 7 illustrates … a modeled input image 706 (AR display comprising a notification indicating that the vehicle seat is properly fastened) ”).
Regarding claim 7, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 5, further comprising:
detecting, by the one or more processors, that the one or more fastening elements are not aligned with the indication of the proper fastening of the vehicle seat (Hu [0086], “An input image 802 may depict a driver wearing a seatbelt incorrectly (e.g., the driver is wearing the seatbelt behind the back). A seatbelt localizer 804 may analyze an input image 802 to identify and model a seatbelt of the input image 802 (detecting that one or more fastening elements are not aligned with the indication of the proper fastening of the vehicle seat). The seatbelt localizer 804 may visualize the modeled seatbelt through a modeled input image 806.”); and
wherein creating the AR display comprises generating, by the one or more processors, a notification indicating that the vehicle seat is not properly fastened (Hu, [0087], “FIG. 8, a modeled input image 806 may depict a box that indicates a driver's seatbelt and a label “Seatbelt: OFF” that indicates that the driver's seatbelt is worn but applied incorrectly.”).
Regarding claim 8, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 7, wherein the notification is one or more of: (i) a visual notification, (ii) a textual notification, (iii) an audio notification, or (iv) a haptic notification (Hu [0081], “FIG. 7 illustrates … a modeled input image 706 (visual notification)”).
Claim 12 recites a computer system which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the computer system of claim 12.
Additionally, Rieger discloses
a computer system of using augmented reality (AR) for visualizing proper fastening of a vehicle seat comprising: one or more processors; and a non-transitory program memory coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors (Rieger [0050]).
Claim 13 recites a computer system which corresponds to the function performed by the method of claim 2. As such, the mapping and rejection of claim 2 above is considered applicable to the computer system of claim 13.
Claim 14 recites a computer system which corresponds to the function performed by the method of claim 3. As such, the mapping and rejection of claim 3 above is considered applicable to the computer system of claim 14.
Claim 15 recites a computer system which corresponds to the function performed by the method of claim 4. As such, the mapping and rejection of claim 4 above is considered applicable to the computer system of claim 15.
Claim 16 recites a computer system which corresponds to the function performed by the method of claim 5. As such, the mapping and rejection of claim 5 above is considered applicable to the computer system of claim 16.
Claim 17 recites a computer system which corresponds to the function performed by the method of claim 6. As such, the mapping and rejection of claim 6 above is considered applicable to the computer system of claim 17.
Claim 18 recites a computer system which corresponds to the function performed by the method of claim 7. As such, the mapping and rejection of claim 7 above is considered applicable to the computer system of claim 18.
Claim 19 recites a computer system which corresponds to the function performed by the method of claim 8. As such, the mapping and rejection of claim 8 above is considered applicable to the computer system of claim 19.
Claim 20 recites a tangible, non-transitory computer-readable medium which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the tangible, non-transitory computer-readable medium of claim 20.
Additionally, Rieger discloses
A tangible, non-transitory computer-readable medium storing executable instructions of using augmented reality (AR) for visualizing proper fastening of a vehicle seat, the instructions, when executed by one or more processors of a computer system, cause the computer system (Rieger [0050]).
Claims 9-11 are rejected under 35 U.S.C. 103 as being unpatentable over Rieger in view of Nguyen, further in view of Hu, and further in view of Kathiresan et al. (US 20230146434 A1).
Regarding claim 9, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 8, but does not disclose wherein the visual notification is a change in display of the indication of the one or more fastening elements.
However, Kathiresan discloses:
the visual notification is a change in display of the indication of the one or more fastening elements (Kathiresan fig. 3B-C; [0049], “FIG. 3C depicts a virtual model 314 of a seatbelt which has been composited over the frame, said virtual model of the seatbelt being fastened … the passenger may see how to fasten the seatbelt 306”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger further with Kathiresan to display a moving visual for a seatbelt. This would have been done to provide users with an informative display that would help them to properly fasten the seatbelt.
Regarding claim 10, Rieger in view of Nguyen, further in view of Hu, and further in view of Kathiresan discloses the computer-implemented method of claim 9, wherein creating the AR display comprises:
detecting, by the one or more processors, that a first fastening element of the one or more fastening elements is not aligned with the indication of the proper fastening of the vehicle seat (Hu [0086], “An input image 802 may depict a driver wearing a seatbelt incorrectly (e.g., the driver is wearing the seatbelt behind the back). A seatbelt localizer 804 may analyze an input image 802 to identify and model a seatbelt of the input image 802 (detecting that a first fastening element of the one or more fastening elements is not aligned with the indication of the proper fastening of the vehicle seat). The seatbelt localizer 804 may visualize the modeled seatbelt through a modeled input image 806.”); and
generating, by the one or more processors, an indication associated with the first fastening element having a first set of display settings and an indication of other fastening elements of the one or more fastening elements having a second set of display settings (Hu fig. 7, element 706, and fig. 8, element 806, represent an equivalent of an indication associated with the first fastening element having a first set of display settings and an indication of other fastening elements of the one or more fastening elements having a second set of display settings).
Regarding claim 11, Rieger in view of Nguyen and further in view of Hu discloses the computer-implemented method of claim 1, but does not disclose wherein the AR display includes one or more instructions on how to properly fasten the vehicle seat.
However, Kathiresan discloses:
the AR display includes one or more instructions on how to properly fasten the vehicle seat (Kathiresan fig. 3B-C; [0049], “FIG. 3C depicts a virtual model 314 of a seatbelt which has been composited over the frame, said virtual model of the seatbelt being fastened … the passenger may see how to fasten the seatbelt 306”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger further with Kathiresan to display a moving visual for a seatbelt. This would have been done to provide users with an informative display that would help them to properly fasten the seatbelt.
Response to Arguments
Applicant's arguments filed 11/25/2026 have been fully considered but they are moot in view of the amendments made to the claims. The amendments necessitated further consideration, search, and new grounds of rejection.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
See the notice of references cited (PTO-892) for prior art made of record, including art that is not relied upon but considered pertinent to applicant's disclosure.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JITESH PATEL whose telephone number is (571)270-3313. The examiner can normally be reached 8am - 5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Said A. Broome, can be reached at (571) 272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JITESH PATEL/Primary Examiner, Art Unit 2612