DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
2. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
3. Claims 1-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bruso et al. (US 2023/0031572 A1).
Regarding claims 1-20, Bruso discloses a physical and virtual task support and assistance system comprising:
a mobile device (mobile phone or other mobile device – see Par’s. 19, 33) adapted to capture an image of a component associated with a task (using video camera 102), transmit the task to an instructional system (AR system 200), receive content associated with the task from the instructional system and display the content on the mobile device (Par’s. 19-21 – camera captures image, system identifies task 110 and queries knowledge base to determine instructions which are displayed to the user on the mobile device as an “AR pattern” which can include an avatar, video, pictures, and written and spoken instructions);
wherein the instructional system is adapted to access a database (knowledge base) that includes content associated with the task, receive component information (i.e. components in the image such as a hammer and nail), receive task information (hammer the nail), retrieve from the database an instruction for completing the task, transmit the instruction to the mobile device (Par’s. 19-21), receive a task status, complete a task order according to the task status being complete, establish a connection with a remote computer system (e.g. server 502 in embodiment using remote server – Par. 32) according to a task status being attempted-incomplete (see e.g. Par’s. 17, 24 – determine if an error was made indicative of attempted-incomplete, and/or determine task is partially complete); and,
wherein the remote computer system is in communication with the mobile device and adapted to receive the image from the mobile device, display the image, receive a notation to the image, and transmit the notation to the mobile device (notation in the form of AR pattern content such as an avatar alongside or overlaying the image, or pictures, instructions, etc. – Par. 21) (as per claim 1),
the instructional system is adapted to determine component attributes according to the image (e.g. identify a nail and hammer from the image – Par. 20) (as per claim 2),
the mobile device is adapted to capture component information and transmit the component information to the instructional system (in remote server embodiment – Par. 32); and, the instructional system is adapted to determine component attributes according to the component information (Par. 20) (as per claim 3),
the component information is a component information image (Par. 20) (as per claim 4),
the mobile device is adapted to display the content on the mobile device adjacent to a view of the component from the mobile device (Par. 24) (as per claim 5),
the mobile device is a first mobile device (e.g. I/O device 310) and a second mobile device (display device 308) adapted to display the content from the instructional system (see Par’s. 30-31) (as per claim 6),
a physical and virtual task support and assistance system comprising:
a mobile device adapted to stream video (live video translated into digital video data – Par. 19) of a component associated with a task, transmit the video to an instructional system, receive content associated with the task from the instructional system and display the content on the mobile device; and,
wherein the instructional system is adapted to receive the task, access a database that includes content associated with the task, determine component information from the received video, retrieve from the database an instruction for completing the task, transmit the instruction to the mobile device (see Par’s. 19-21, as detailed with respect to claim 1 above), receive a task status, and complete a task order according to the task status being complete (user completes each step as indicated until the task is complete – Par. 24) (as per claim 7),
the instructional system is adapted to establish a connection with a remote computer device according to a task status being attempted-incomplete (error, or not all steps completed – Par’s. 17, 24); and, a remote computer system in communication with the mobile device and adapted to receive the video from the mobile device, derive an image from the video, receive a notation (AR pattern) to the image, and transmit the notation and image to the mobile device (display AR pattern received from server – Par. 21) (as per claim 8),
the notation is a real time dynamic notation, and the mobile device is adapted to display the real time dynamic notation in close proximity to the component associated with the task (Par’s. 19, 24) (as per claim 9),
the notation is a real time dynamic notation and is overlaid with a view of the component associated with the task (user movement transposed onto avatar’s movement – Par’s. 19, 24) (as per claim 10),
the instructional system is adapted to determine component attributes according to the video (Par. 20) (as per claim 11),
the instructional system is adapted to receive feedback directed to the content, determine a modification to the content and modify the content thereby providing subsequent users with modified content (Par. 22) (as per claim 12),
the mobile device is a first mobile device (e.g. I/O device 310) and a second mobile device (display device 308) adapted to display the content from the instructional system (see Par’s. 30-31) (as per claim 13),
a physical and virtual task support and assistance system comprising:
a mobile device adapted to stream video of a component associated with a task, transmit the video to an instructional system, receive content associated with the task from the instructional system and display the content on the mobile device in proximity to the component in combination with a view of the component; and, wherein the instructional system is adapted to receive the task, access a database that includes content associated with the task, determine component information from the mobile device, retrieve from the database an instruction for completing the task, transmit the instruction to the mobile device, receive a task status, and complete a task order according to the task status being complete (see Par’s. 17-21 and 24 as discussed above) (as per claim 14),
a remote computer system is in communication with the mobile device and adapted to receive the video from the mobile device, display the video, receive a notation to the video, and transmit the notation to the mobile device (in the remote server arrangement – Par’s. 21, 32) (as per claim 15),
the notation is a real time dynamic notation and is overlaid with a view of the component associated with the task (Par’s. 19, 24) (as per claim 16),
the mobile device is a first mobile device (e.g. I/O device 310) and a second mobile device (display device 308) adapted to display the content from the instructional system (see Par’s. 30-31) (as per claim 17),
the mobile device is adapted to set the task status to complete, and the instructional system is adapted to receive the task status (Par. 24) (as per claim 18),
the instructional system is adapted to modify the content according to the task status associated with the task (Par. 24) (as per claim 19), and
the mobile device is adapted to transmit notes (feedback or user comments) to the instructional system and the instructional system is adapted to modify the content according to the notes (Par. 22) (as per claim 20).
Conclusion
4. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See attached PTO-892.
5. Any inquiry concerning this communication or earlier communications from the examiner should be directed to PETER EGLOFF whose telephone number is (571)270-3548. The examiner can normally be reached on Monday - Friday 9:00 am - 5:00 pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Xuan Thai, can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Peter R Egloff/
Primary Examiner, Art Unit 3715