DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application No. 18/416,826 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because this application is a continuation of 18/416,826 and claims, in more words but in a broader manner, the invention concisely claimed in 18/416,826.
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
The claims map to each other as follows:
Instant Application, Claim 1:
A computer-implemented method of using augmented reality (AR) for validating proper fastening of a vehicle seat
via an application executing on a mobile computing device, the method comprising:
the application causing one or more
processors of the mobile computing device to obtain input data via an input interface of the mobile computing device, the input data including one or more of vehicle data, vehicle seat data, or child data;
the application causing the one or more processors to obtain underlay layer data generated by an image sensor of the mobile computing device;
the application causing the one or more processors to generate overlay layer data based upon the input data and/or the underlay layer data, the overlay layer data including an indication associated with proper fastening of the vehicle seat;
the application causing the one or more processors to correlate the overlay layer data with the underlay layer data;
the application causing the one or more processors to create an AR display based upon the correlation; and
the application causing the one or more processors to present the AR display via an AR interface of the mobile computing device.
Co-pending 18/416,826, Claim 1:
A computer-implemented method of using augmented reality (AR) for visualizing proper fastening of a vehicle seat comprising:
Claim 2, The computer-implemented method of claim 1, wherein the AR viewer device is one of: (i) a smartphone (interpreted as reading on an application executing on a smartphone).
receiving, by one or more processors, input data, the input data including one or more of vehicle specification data, vehicle seat specification data, or child data;
receiving, by the one or more processors, underlay layer data indicative of a field of view (FOV) associated with an AR viewer device;
generating, by the one or more processors, overlay layer data based upon the input data, the overlay layer data including an indication of a proper fastening of the vehicle seat;
correlating, by the one or more processors, the overlay layer data with the underlay layer data;
creating, by the one or more processors, an AR display based upon the correlation; and
presenting, by the one or more processors to the AR viewer device, the AR display.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-2, 4-7, 9, 12-14, 16-18 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Rieger (US 20080278821 A1) in view of Hu (US 20220203930 A1).
Regarding claim 1, A computer-implemented method of using augmented reality (AR) for validating proper fastening of a vehicle seat via an application executing on a mobile computing device (Rieger [0015], “a display method of the head-mounted display system”; [0028], “a seatbelt warning of an occupant of the vehicle not being belted in his seat. All such information may be provided as a part of the display information to the occupant or driver 202 wearing the head-mounted display 204 (a mobile computing device).”; [0037], “the head-mounted display achieving an augmented reality”; [0050], “program for use by … device (an application program executing on the mobile computing device).”), the method comprising:
the application causing one or more processors of the mobile computing device to obtain input data via an input interface of the mobile computing device, the input data including one or more of vehicle data, vehicle seat data, or child data (Rieger [0028], “receive information from the vehicle concerning the state of the vehicle … a seatbelt warning of an occupant of the vehicle not being belted in his seat”; [0050], “processor-containing system … program for use by … device (an application program executing on the mobile computing device).”);
the application causing the one or more processors to generate overlay layer data based upon the input data and/or the underlay layer data, the overlay layer data including an indication associated with fastening of the vehicle seat (Rieger [0028], “receive information from the vehicle concerning the state of the vehicle … a seatbelt warning (generate overlay layer data based upon the input data) of an occupant of the vehicle not being belted in his seat (an indication associated with fastening of the vehicle seat)”);
the application causing the one or more processors to obtain underlay layer data generated by an image sensor of the mobile computing device (Rieger [0025], “the line of sight may be detected for example by a line of sight sensor 212 … an eye movement sensor or camera incorporated into the head-mounted display 204 detecting the current line of sight of the eyes of the occupant relative to the head-mounted display 204 (underlay data in the line of sight/FOV).”);
the application causing the one or more processors to correlate the overlay layer data with the underlay layer data (Rieger [0045], “At step 406, display information to be displayed to a user to which the display method is applied is generated from the processed surrounding information … such display information may include a combination (a correlation by the processing unit) … the front view (exemplary overlay data) represents the main portion of the display information. Furthermore, the display information may include additional information (exemplary underlay data)”)
the application causing the one or more processors to create an AR display based upon the correlation (Rieger [0045], “the head-mounted display may provide (create an AR display based upon the correlation/combination) the transparent view only for parts of the display area of the head-mounted display achieving an augmented reality, including a combination (a correlation) of the processed display information and the real view directly through the head-mounted display”); and
the application causing the one or more processors to present the AR display via an AR interface of the mobile computing device (Rieger [0028], “a seatbelt warning of an occupant of the vehicle not being belted in his seat. All such information may be provided as a part of the display information to the occupant or driver 202 wearing the head-mounted display 204.”).
Rieger does not disclose
the overlay layer data including an indication associated with proper fastening of the vehicle seat
However, Hu discloses
the overlay layer data including an indication associated with proper fastening of the vehicle seat (Hu [0078], “A system for seatbelt localization may combine or otherwise augment determined seatbelt models together to determine a final seatbelt model”; [0081], “FIG. 7 illustrates … a modeled input image 706 (comprising overlay layer data including an indication of a proper fastening of the vehicle seat) ”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger with Hu to generate and display proper seatbelt fastening information. This would have been done to inform the user of the proper way of wearing a seatbelt and thereby improve safety. See for example Hu [0022], “there are instances where the occupant has the seatbelt on, but is wearing it incorrectly (e.g., the seatbelt is worn behind the back). To combat this and improve traffic safety, the techniques described herein use video/images captured by a camera on board the vehicle to determine whether a seatbelt is being properly worn.”
Regarding claim 2, Rieger in view of Hu discloses the computer-implemented method of claim 1, wherein the mobile computing device is communicatively coupled with one or more sensors of a vehicle in which the vehicle seat resides (Rieger fig. 2; 206, 214; [0022], “transceiver 214 located, for example, in the dash board of the vehicle 100 and the transceiver 206 may be utilized for transmitting and receiving … to and from the head-mounted display 204.”; [0025], “sensor 212, which may be a camera mounted inside the vehicle (the mobile computing device is communicatively coupled with one or more sensors of a vehicle in which the vehicle seat resides)”), and the method further comprises:
obtaining, from the one or more sensors, sensor data (Hu [0022], “A camera that is set up in a vehicle captures an image and the image is analyzed to determine whether the occupant is wearing a seatbelt properly”); and
the application causing the one or more processors to generate overlay layer data based upon the sensor data (Hu [0081], “FIG. 7 illustrates … a modeled input image 706 (comprising overlay layer data generated based on sensor data) ”).
Regarding claim 4, Rieger in view of Hu discloses the computer-implemented method of claim 1, wherein correlating the overlay layer data with the underlay layer data comprises:
the application causing the one or more processors to analyze the underlay layer data to detect one or more fastening elements of the vehicle seat and an occupant of the vehicle seat (Hu [0023], “a system first analyzes a captured image to classify whether pixels are a part of a seatbelt … the system then parameterizes the seatbelt, and models the seatbelt's shape using a high order polynomial curve … The final seatbelt curve can be used to enhance occupant safety, such as to determine whether the seatbelt is being properly worn (e.g., the seatbelt is in the locked position and is worn diagonally and across the occupant's chest (detecting that one or more fastening elements are aligned with the indication of the proper fastening of the vehicle seat)).”);
the application causing the one or more processors to determine a positioning of the one or more fastening elements relative to the occupant when properly fastened (Hu [0078], “A system for seatbelt localization may combine or otherwise augment determined seatbelt models together to determine a final seatbelt model”; [0081], “FIG. 7 illustrates … a modeled input image 706 (comprising overlay layer data including an indication of a proper fastening of the vehicle seat)”); and
the application causing the one or more processors to correlate the indication associated with the proper fastening of the vehicle seat with a position in the AR display indicative of the positioning of the one or more fastening elements relative to the occupant when properly fastened (Hu [0061], “An automatic location mask generation 110 may generate a seatbelt location mask, which may be a visual indication of a location of a seatbelt within an image … Camera localization may utilize one or more models of a vehicle to determine seatbelt anchors' coordinates or positions. 3D reconstruction may comprise processes that may calculate a relative relationship between seatbelt anchors (fastening elements of the vehicle seat)”; Hu [0082], “A seatbelt localizer 704 may analyze an input image 702 to identify and model seatbelts (underlay layer data) of the input image 702. The seatbelt localizer 704 may visualize the modeled seatbelts (overlay layer data) through a modeled input image 706 (modeled image comprises data corresponding to where the anchor/one or more fastening elements of the vehicle seat, will be positioned when properly fastened as indicated by the ON indicator).”).
Regarding claim 5, Rieger in view of Hu discloses the computer-implemented method of claim 1, wherein correlating the overlay layer data with the underlay layer data comprises:
the application causing the one or more processors to analyze the underlay layer data to detect the vehicle seat (Hu [0027], “one or more processes that identify a location, size, shape, orientation, and/or other characteristics of one or more seatbelts depicted in one or more images (e.g., an image of a … passenger seat of a vehicle (vehicle seat))”);
the application causing the one or more processors to determine a positioning of the vehicle seat when properly fastened (Hu [0023], “a system first analyzes a captured image to classify whether pixels are a part of a seatbelt … the system then parameterizes the seatbelt, and models the seatbelt's shape using a high order polynomial curve … The final seatbelt curve can be used to enhance occupant safety, such as to determine whether the seatbelt is being properly worn (e.g., the seatbelt is in the locked position and is worn diagonally and across the occupant's chest (analyzing that one or more fastening elements are aligned with the indication of the proper fastening of the vehicle seat)).”); and
the application causing the one or more processors to correlate the indication associated with the proper fastening of the vehicle seat with a position in the AR display indicative of the positioning of the vehicle seat when properly fastened (Hu [0078], “A system for seatbelt localization may combine or otherwise augment determined seatbelt models together to determine a final seatbelt model”; [0081], “FIG. 7 illustrates … a modeled input image 706 (AR display comprising a notification indicating that the vehicle seat is properly fastened) ”).
Regarding claim 6, Rieger in view of Hu discloses the computer-implemented method of claim 1, further comprising:
analyzing, by the application, the underlay layer data and the overlay layer data to detect an alignment between the vehicle seat, one or more fastening elements of the vehicle seat, and/or an orientation of an occupant of the vehicle seat (Hu [0023], “a system first analyzes a captured image to classify whether pixels are a part of a seatbelt … the system then parameterizes the seatbelt, and models the seatbelt's shape using a high order polynomial curve … The final seatbelt curve can be used to enhance occupant safety, such as to determine whether the seatbelt is being properly worn (e.g., the seatbelt is in the locked position and is worn diagonally and across the occupant's chest (detecting that one or more fastening elements are aligned with the indication of the proper fastening of the vehicle seat)).”); and
the application causing the one or more processors to update the AR interface based upon the detected alignment (Hu [0087], “Referring to FIG. 8, a modeled input image 806 may depict an input image 802 with a seatbelt and seatbelt status (e.g., orientation/position) indicated … modeled input image 806 may depict a box that indicates a driver's seatbelt and a label “Seatbelt: OFF” that indicates that the driver's seatbelt is worn but applied incorrectly (updated if the user does not wear the seatbelt correctly).”).
Regarding claim 7, Rieger in view of Hu discloses the computer-implemented method of claim 6, wherein updating the AR interface comprises:
the application causing the one or more processors to detect that the alignment between the vehicle seat, one or more fastening elements of the vehicle seat, and/or an orientation of an occupant of the vehicle seat is proper (Hu [0023], “a system first analyzes a captured image to classify whether pixels are a part of a seatbelt … the system then parameterizes the seatbelt, and models the seatbelt's shape using a high order polynomial curve … The final seatbelt curve can be used to enhance occupant safety, such as to determine whether the seatbelt is being properly worn (e.g., the seatbelt is in the locked position and is worn diagonally and across the occupant's chest (detecting that one or more fastening elements are aligned with the indication of the proper fastening of the vehicle seat)).”); and
the application causing the one or more processors to generate a notification indicating that the vehicle seat is properly fastened based upon the detected alignment (Hu [0078], “A system for seatbelt localization may combine or otherwise augment determined seatbelt models together to determine a final seatbelt model”; [0081], “FIG. 7 illustrates … a modeled input image 706 (AR display comprising a notification indicating that the vehicle seat is properly fastened based on aligning the occupant and the seatbelt)”).
Regarding claim 9, Rieger in view of Hu discloses the computer-implemented method of claim 6, wherein updating the AR interface comprises:
the application generating one or more of: (i) a visual notification, (ii) a textual notification, (iii) an audio notification, or (iv) a haptic notification (Rieger [0028], “receive information from the vehicle concerning the state of the vehicle … a seatbelt warning of an occupant of the vehicle not being belted in his seat”).
Regarding claim 12, Rieger in view of Hu discloses the computer-implemented method of claim 1, wherein the indication associated with proper fastening of the vehicle seat indicates one or more of:
(i) anchorage of the vehicle seat, (ii) orientation of an occupant in the vehicle seat, (iii) positioning of one or more fastening elements relative to an occupant’s armpits, (iv) positioning of one or more fastening elements relative to an occupant’s neckline, or (v) tightness of one or more fastening element (Hu [0061], “Camera localization may utilize one or more models of a vehicle to determine seatbelt anchors' coordinates or positions. 3D reconstruction may comprise processes that may calculate a relative relationship between seatbelt anchors and camera coordinate system.”).
Claim 13 recites a mobile computing system which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the mobile computing system of claim 13.
Additionally Rieger discloses
A mobile computing system (Rieger fig. 3)
an image sensor (Rieger [0015], “sensors may be cameras”);
one or more processors (Rieger [0050]); and
a non-transitory program memory coupled to the one or more processors and storing executable instructions that, when executed by the one or more processors (Rieger [0050]).
Claim 14 recites a mobile computing system which corresponds to the function performed by the method of claim 2. As such, the mapping and rejection of claim 2 above is considered applicable to the mobile computing system of claim 14.
Claim 16 recites a mobile computing system which corresponds to the function performed by the method of claim 4. As such, the mapping and rejection of claim 4 above is considered applicable to the mobile computing system of claim 16.
Claim 17 recites a mobile computing system which corresponds to the function performed by the method of claim 5. As such, the mapping and rejection of claim 5 above is considered applicable to the mobile computing system of claim 17.
Claim 18 recites a mobile computing system which corresponds to the function performed by the method of claim 6. As such, the mapping and rejection of claim 6 above is considered applicable to the mobile computing system of claim 18.
Claim 20 recites a tangible, non-transitory computer-readable medium which corresponds to the function performed by the method of claim 1. As such, the mapping and rejection of claim 1 above is considered applicable to the tangible, non-transitory computer-readable medium of claim 20.
Additionally Rieger discloses
A tangible, non-transitory computer-readable medium storing executable instructions of using augmented reality (AR) for validating proper fastening of a vehicle seat, the instructions, when executed by one or more processors of a mobile computing system (Rieger [0050]).
Claims 3, 8, 11, 15 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Rieger in view of Hu and further in view of Thomas et al. (US 20230026640 A1).
Regarding claim 3, Rieger in view of Hu discloses the computer-implemented method of claim 1, but does not disclose wherein correlating the overlay layer data with the underlay layer data comprises:
the application causing the one or more processors to analyze the underlay layer data to detect an occupant of the vehicle seat;
the application causing the one or more processors to determine an orientation of the occupant when properly fastened into the vehicle seat; and
the application causing the one or more processors to correlate the indication associated with the proper fastening of the vehicle seat with a position in the AR display indicative of the orientation of the occupant when properly fastened into the vehicle seat.
However, Thomas discloses
the application causing the one or more processors to analyze the underlay layer data to detect an occupant of the vehicle seat (Thomas [0051], “detect objects in the image using image processing techniques … The landmark module 64 may determine whether the objects correspond to a body part based on predetermined relationships between (i) object sizes and shapes and (ii) body parts … the landmark module 64 may identify the main body parts (e.g., head, torso, arms, legs) of the occupant.”);
the application causing the one or more processors to determine an orientation of the occupant when properly fastened into the vehicle seat (Thomas [0023], “based on a posture of the occupant as indicated by the landmark. The seatbelt routing classification module is configured to determine whether a seatbelt is routed properly around the occupant”; [0051], “The landmark module 64 may superimpose the landmarks in a size and/or shape proportional manner over the corresponding body parts of the occupant to provide a simplified representation of the geometry and posture of the occupant.”); and
the application causing the one or more processors to correlate the indication associated with the proper fastening of the vehicle seat with a position in the AR display indicative of the orientation of the occupant when properly fastened into the vehicle seat (Thomas [0049], “The vehicle control module 22 may control the user interface device 24 to generate a message indicating whether the seatbelt 14 is properly … routed around the occupant 42.”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger further with Thomas to analyze user data to identify proper usage of a seatbelt. This would have been done to generate information with respect to user safety in a vehicle.
Regarding claim 8, Rieger in view of Hu discloses the computer-implemented method of claim 6, but does not disclose wherein updating the AR interface comprises:
the application causing the one or more processors to detect that the alignment between the vehicle seat, one or more fastening elements of the vehicle seat, and/or an orientation of an occupant of the vehicle seat is improper; and
the application causing the one or more processors to generate a notification indicating that the vehicle seat is not properly fastened based upon the detected alignment.
However, Thomas discloses
the application causing the one or more processors to detect that the alignment between the vehicle seat, one or more fastening elements of the vehicle seat, and/or an orientation of an occupant of the vehicle seat is improper (Thomas [0015], “the improper seatbelt routing zone includes at least one of an incorrect side of head zone, an under arm zone, and an outside arm zone.”; [0023], “based on a posture of the occupant as indicated by the landmark. The seatbelt routing classification module is configured to determine whether a seatbelt is routed properly around the occupant”; [0051], “The landmark module 64 may superimpose the landmarks in a size and/or shape proportional manner over the corresponding body parts of the occupant to provide a simplified representation of the geometry and posture of the occupant.”); and
the application causing the one or more processors to generate a notification indicating that the vehicle seat is not properly fastened based upon the detected alignment (Thomas [0022], “the seatbelt routing classification module is configured to determine that the seatbelt is improperly routed if the seatbelt is not in front of the occupant when the seatbelt is secured to a seatbelt buckle of the vehicle seat.”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger further with Thomas to analyze user data to identify proper alignment of a seatbelt. This would have been done to generate information with respect to user safety in a vehicle.
Regarding claim 11, Rieger in view of Hu discloses the computer-implemented method of claim 1, but does not disclose the method further comprising:
detecting proper alignment between the vehicle seat, one or more fastening elements of the vehicle seat, or an orientation of an occupant of the vehicle seat with the indication associated with the proper fastening of the vehicle seat; and
in response to detecting the proper alignment, creating a second AR display configured to generate a second indication associated with proper fastening of the vehicle seat.
However, Thomas discloses
detecting proper alignment between the vehicle seat, one or more fastening elements of the vehicle seat, or an orientation of an occupant of the vehicle seat with the indication associated with the proper fastening of the vehicle seat (Thomas [0023], “based on a posture of the occupant as indicated by the landmark. The seatbelt routing classification module is configured to determine whether a seatbelt is routed properly around the occupant”; [0051], “The landmark module 64 may superimpose the landmarks in a size and/or shape proportional manner over the corresponding body parts of the occupant to provide a simplified representation of the geometry and posture of the occupant”); and
in response to detecting the proper alignment, creating a second AR display configured to generate a second indication associated with proper fastening of the vehicle seat (Thomas [0022], “the seatbelt routing classification module is configured to determine that the seatbelt is improperly routed if the seatbelt is not in front of the occupant when the seatbelt is secured to a seatbelt buckle of the vehicle seat.”; [0059], “representations 54 of FIGS. 1 and 2, in proportion to the size and/or shape of the occupant and aligned with the occupant's location as indicated by the image generated by the in-cabin sensor 20. For example, the stick figure representation 52 of the adult occupant 42 shown in FIG. 1 is taller and wider than the stick figure representation 52 of the child occupant 56 shown in FIG. 2 (child representation is interpreted as reading on generating a second indication associated with proper fastening of the vehicle seat)”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger further with Thomas to analyze user data to identify proper alignment of a seatbelt. This would have been done to generate information with respect to user safety in a vehicle.
Claim 15 recites a mobile computing system which corresponds to the function performed by the method of claim 3. As such, the mapping and rejection of claim 3 above is considered applicable to the mobile computing system of claim 15.
Claim 19 recites a mobile computing system which corresponds to the function performed by the method of claim 11. As such, the mapping and rejection of claim 11 above is considered applicable to the mobile computing system of claim 19.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Rieger in view of Hu and further in view of Kathiresan et al. (US 20230146434 A1).
Regarding claim 10, Rieger in view of Hu discloses the computer-implemented method of claim 1, but does not disclose wherein presenting the AR display via the AR interface comprises:
presenting one or more interface elements configured to detect user input indicative of proper alignment between the vehicle seat, one or more fastening elements of the vehicle seat, or an orientation of an occupant of the vehicle seat with the indication associated with the proper fastening of the vehicle seat.
However, Kathiresan discloses
presenting one or more interface elements configured to detect user input indicative of proper alignment between the vehicle seat, one or more fastening elements of the vehicle seat, or an orientation of an occupant of the vehicle seat with the indication associated with the proper fastening of the vehicle seat (Kathiresan [0049], “FIG. 3B depicts a hand 312 pointing at the fasten seatbelt sign 308. In response to detecting the hand gesture selecting the aircraft safety device (e.g., by feature vector detection and hand gesture detection), a safety instruction associated with the associated aircraft safety device may be retrieved from memory, composited over the frames of the video stream, and displayed to a head-mounted display.”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Rieger further with Kathiresan to provide a user interface for a user to input data related to proper seatbelt fastening. This would have been done to provide users with accurate information regarding correct fastening of seatbelts.
Conclusion
See the notice of references cited (PTO-892) for prior art made of record, including art that is not relied upon but considered pertinent to applicant's disclosure.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JITESH PATEL whose telephone number is (571)270-3313. The examiner can normally be reached 8am - 5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Said A. Broome can be reached at (571) 272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JITESH PATEL/Primary Examiner, Art Unit 2612