DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
2. Claims 1-4, 6-8, and 15 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Geisert et al. (US Patent/PGPub. No. 20230259194).
Regarding Claim 1,
Geisert et al. teach
an information processing apparatus ([0065], FIG. 14, i.e. computer system 1400) comprising:
a processor ([0067], FIG. 14, i.e. processor 1402), wherein the processor (i.e. please see above citation(s)),
moves ([0041], FIG. 6, i.e. the field of view 165a, 165b moves to cover different portions of the real-world environment 150) a self-area ([0048], FIG. 8, i.e. VR environment 160) set by the information processing apparatus user in accordance with the user's movement (i.e. please see above citation(s)),
determines the presence of obstacles ([0046], FIG. 8A-8D, i.e. second user 102b wearing a second VR display device 135b) that should be warned ([0046], FIG. 8A-8D, i.e. proximity warning 180) based on the self-area (i.e. please see above citation(s)).
Regarding Claim 2,
Geisert et al. teach
the information processing apparatus according to claim 1,
wherein the processor, when the user moves ([0046], FIG. 8A, i.e. Based on the second user 102b direction of movement 175 and approach toward the first user 102a … one or more of the VR display device 135 and/or the controllers 106 may … alert the user 102 of an impending collision with another user or obstacle), changes the shape of the self-area ([0047], FIG. 8A, i.e. proximity warning 180 may be rendered … (and to the right) to the field of view 165a (Note that 180 is added and alters the shape of 160 as the position/posture of the user changes)) according to the user's movement speed ([0046], FIG. 8A, i.e. speed greater than a threshold speed).
Regarding Claim 3,
Geisert et al. teach
the information processing apparatus according to claim 1, wherein the processor,
changes the shape of the self-area ([0047], FIG. 8A, i.e. proximity warning 180 may be rendered … (and to the right) to the field of view 165a (Note that 180 is added and alters the shape of 160 as the position/posture of the user changes)) according to the posture of the user (i.e. please see above citation(s)).
Regarding Claim 4,
Geisert et al. teach
the information processing apparatus according to claim 1,
wherein the processor,
when the user moves, update to self-area that overlapping the self-area at the past time and the self-area at the current time ([0046], FIG. 8A, i.e. if the second VR display device 135b approaches within a predetermined distance … to provide a proximity warning indicating … is approaching the first VR display device 135a (Please note that the VR environment 160 would have been the same, or overlapped, before and after the proximity warning; the only difference is the proximity warning 180)).
Regarding Claim 6,
Geisert et al. teach
the information processing apparatus according to claim 1, wherein the processor,
when an obstacle ([0046], FIG. 8A-8D, i.e. second VR display device 135b approaches within a predetermined distance to the first VR display device 135a) enters the self-area (i.e. please see above citation(s)), determines that there is an obstacle that should be warned ([0046], FIG. 8A-8D, i.e. proximity warning 180).
Regarding Claim 7,
Geisert et al. teach
the information processing apparatus according to claim 1, wherein the processor,
in advance sets an obstacle area ([0045], FIG. 8A-8D, i.e. threshold distance (e.g., 1 meter)) around the obstacle,
when at least part of the obstacle area is within the self-area ([0045], FIG. 8A-8D, i.e. approaches within a threshold), determines that there is an obstacle that should be warned ([0046], FIG. 8A-8D, i.e. proximity warning 180).
Regarding Claim 8,
Geisert et al. teach
the information processing apparatus according to claim 1, further comprising an external camera ([0025], FIG. 1A, i.e. forward-facing cameras 105A and 105B) to capture images of real space ([0025], FIG. 1A, i.e. capture images and videos of the real-world environment),
wherein the processor,
updates the self-area of the range ([0038], FIG. 8, i.e. VR environment 160) that overlaps with (FIG. 6 & 8, i.e. since the VR environment 160 is set to 1 meter around the user, the ranges of 160 and 165 overlap as shown in the figure(s)) the shooting range of the external camera ([0039], FIG. 6, i.e. field of view 165a and 165b; [0029], FIG. 1B, i.e. cameras 105A-B may share an overlapping field of view).
Regarding Claim 15,
Geisert et al. teach
the information processing apparatus according to claim 1, further comprising a distance measuring device ([0043], FIG. 1, i.e. position and orientation sensors); and
a display ([0025], FIG. 1, i.e. head-mounted VR display device 135),
wherein the processor,
acquires the distance information to obstacle ([0043], FIG. 1, i.e. to determine a distance and orientation of each VR display device 135a, 135b to the anchor point), based on the output from the distance measuring device,
determines the presence of obstacles that should be warned ([0046], FIG. 8A, i.e. Based on the second user 102b direction of movement 175 and approach toward the first user 102a), further using the distance information (i.e. please see above citation(s)),
when it is determined that there is an obstacle ([0046], FIG. 8A, i.e. 102b direction of movement 175 and approach toward the first user 102a) that should be warned, outputs that fact ([0046], FIG. 8A-8D, i.e. proximity warning 180) to the display (i.e. please see above citation(s)).
Allowable Subject Matter
3. Claims 5 and 9-14 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
4. The following is an examiner’s statement of reasons for allowance:
Geisert et al. (US Patent/PGPub. No. 20230259194) teach a method that includes capturing, by a first VR display device, one or more frames of a shared real-world environment. The first VR display device identifies one or more anchor points within the shared real-world environment from the one or more frames. The first VR display device receives localization information with respect to a second VR display device in the shared real-world environment and determines a pose of the first VR display device with respect to the second VR display device based on the localization information. A first output image is rendered for one or more displays of the first VR display device. The rendered image may comprise a proximity warning with respect to the second VR display device based on determining that the pose of the first VR display device with respect to the second VR display device is within a threshold distance.
Dorn et al. (US PGPUB./Pat. No. 20230384592) teach methods and systems for integrating media cues into virtual reality scenes presented on a head-mounted display (HMD). The method includes presenting a virtual reality scene on a display of an HMD. The method further includes receiving sensor data from one or more sensors in a real-world space in which the HMD is located, and then identifying an object location of an object in the real-world space that produces a sound. The method includes generating a media cue in the virtual reality scene presented in the HMD. The media cue is presented at a virtual location that is correlated to the object location of the object in the real-world space.
The subject matter of the independent claims was neither found in nor suggested by the prior art of record. The subject matter not found was a display device including
“…wherein the processor,
when the user moves, update to self-area that overlapping the self-area of the past time from the present to a predetermined time ago and the self-area of the present time.” (Claim 5),
“…further comprising a location information acquisition unit is configured to using a GPS receiver,
wherein the processor,
estimates the user's movement speed, based on the time variation of the location information acquired by the location information acquisition unit,
changes the shape of the self-area, based on the estimated movement speed.” (Claim 9),
“…further comprising an external camera to capture images of real space,
wherein the processor,
extracts feature points from the image information that the external camera captures and acquires, and acquires position information relative to the extracted feature points,
estimates the user's movement speed, based on the time variation of the acquired location information,
changes the shape of the self-area, based on the estimated speed of movement.” (Claim 10),
“…further comprising an external camera to capture images of real space,
wherein the processor,
acquires the position information of the user's hand or arm, based on the image information acquired by the external camera,
changes the shape of the self-area, based on the acquired position information of the user's hand or arm.” (Claim 11),
“…further comprising a memory unit,
wherein the processor,
stores in memory unit self-area at a predetermined time during the user's movement,
at the time of the user's movement, updates to self-area that overlapping the self-area of the past time stored in the memory unit and the self-area of the present time.” (Claim 12),
“…further comprising a memory unit,
wherein the processor,
stores in memory unit self-area at a predetermined time during the user's movement,
at the time of the user's movement, updates to self-area that overlapping self-area of the past time from the current time to a predetermined time ago stored in the memory unit, and the self-area of the present time.” (Claim 13),
in combination with the other elements (or steps) of the device or apparatus and method recited in the claims.
Any comments considered necessary by applicant must be submitted no later than the payment of the issue fee and, to avoid processing delays, should preferably accompany the issue fee. Such submissions should be clearly labeled “Comments on Statement of Reasons for Allowance.”
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VINH TANG LAM whose telephone number is (571) 270-3704. The examiner can normally be reached Monday to Friday 8:00 AM to 5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Nitin K Patel can be reached at (571) 272-7677. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VINH T LAM/Primary Examiner, Art Unit 2628