DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-24 are rejected under 35 U.S.C. 103 as being unpatentable over Dascola et al. (US Patent Publication No. 2022/0214743; hereinafter Dascola) and Reichelt et al. (US Patent Publication No. 2017/0043995; hereinafter Reichelt).
With reference to claim 1, Dascola discloses a device (101) for extended reality (see paragraph 56; Figs. 1-3) comprising:
a sensor configured to generate image data representative of a physical environment (100) of the device (101) (see paragraph 79; Figs. 1-2, 7);
a tracking sensor (244) configured to determine movement of the device within the physical environment (see paragraph 79; Fig. 1);
a display (120) configured to show a view of an extended reality environment (see paragraph 72; Fig. 3);
a processor (110) configured to control the view of the extended reality environment based on the image data, the determined movement of the device, and a position of the device in the physical environment (see paragraphs 71-72; Fig. 1); and
a control mechanism configured to obtain user input effective to interact with the extended reality environment (see paragraph 161; Figs. 7E-F).
While disclosing the display as described above, Dascola fails to specifically disclose a circular display as intended by the claims.
Reichelt discloses a display device (18), wherein a substantially circular portion of the display is visible to the user (see paragraph 63; Figs. 1, 4-10, 12-14).
Therefore it would have been obvious to one of ordinary skill in the art to allow a circular shaped display similar to that which is taught by Reichelt to be carried out in a system similar to that which is taught by Dascola to thereby provide an alternative housing arrangement to be viewed by the user (see Reichelt; paragraph 63).
With reference to claim 2, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the processor is configured to generate, based on the image data, an extended reality environment as a virtual representation of the physical environment (see paragraph 56; Figs. 7).
With reference to claim 3, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the tracking sensor includes at least one of an accelerometer, a magnetometer, a gyroscope, or a magnetic compass (see paragraph 85).
With reference to claim 4, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the tracking sensor includes a global positioning system (see paragraph 74).
With reference to claim 5, Dascola and Reichelt disclose the device of claim 1, wherein Dascola further discloses an eye tracking device (243) configured to track at least one of a location, an orientation, or movement of an eye of a user to orient the view of the extended reality environment shown on the display (see paragraph 79).
With reference to claim 6, Dascola and Reichelt disclose the device of claim 5, and Dascola further discloses wherein: the extended reality environment comprises a virtual character including an eye (in teaching the viewpoint of a front-facing head position of virtual participants; see paragraphs 277-278; Fig. 7V); and the eye tracking device is configured to track at least one of the location, the orientation, or the movement of the eye of the user to adjust a direction of a gaze of the eye of the virtual character to maintain apparent eye contact with the user (see paragraphs 277-278; Fig. 7V).
With reference to claim 7, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses wherein: the extended reality environment comprises a virtual character (7203) (see paragraph 276; Figs. 7U-V); and the device further comprises a speaker configured to output audible sounds representative of speech of the virtual character (see paragraphs 307-308; Figs. 7Y-Z).
With reference to claim 8, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the sensor includes a camera (see paragraph 72).
With reference to claim 9, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses a hand-tracking sensor (245) configured to track movement of a hand of a user in the physical environment to permit interaction of the user with a virtual object in the extended reality environment (see paragraph 79; Fig. 4).
With reference to claim 10, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the processor is configured to maintain a particular orientation of the extended reality environment regardless of movement of the device (see paragraph 66).
With reference to claim 11, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the display is touch sensitive and the display is configured to show an on-screen user interface that a user may interact with via touch (see paragraphs 244, 282).
With reference to claim 12, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses wherein: the display is configured to show an on-screen user interface that a user may interact with; and a position of the on-screen user interface on the display is based on a handedness of a user (see paragraphs 139-140; Fig. 7A).
With reference to claim 13, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the control mechanism includes a button configured to indicate an interaction of a user with the extended reality environment (see paragraph 132).
With reference to claim 14, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses that the control mechanism permits interaction of a user with the extended reality environment based on a rotational angle of the device (see paragraph 281).
With reference to claim 15, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses a rechargeable battery (in teaching that the device can be a device well known to include a rechargeable battery; see paragraphs 72, 275).
With reference to claim 16, Dascola and Reichelt disclose the device of claim 1, and Dascola further discloses wherein: the extended reality environment comprises a virtual object (see paragraphs 143-144; Figs. 7A-B); and the processor is configured to modify a depiction of the virtual object within the extended reality environment based on the user input to permit manipulation of the virtual object within the extended reality environment (see paragraphs 136, 139-141; Figs. 7).
With reference to claim 17, Dascola discloses a system (101) (see Figs. 1-3) comprising:
a display (120) configured to show a view of an extended reality environment (see paragraph 72; Fig. 3);
a processor (110) configured to control the view of the extended reality environment (see paragraphs 71-72; Fig. 1);
a housing (101) configured to house the processor and the display (120) (see paragraph 79; Figs. 1-2, 7); and
a control mechanism configured to obtain user input effective to initiate engagement with the extended reality environment.
While disclosing the usage of a handheld device (101) (see paragraphs 56, 72), Dascola fails to disclose a lower housing as recited.
Reichelt discloses a housing comprising an upper body (12) for housing a display (18) and a lower body (10) stemming from the upper body (12), the lower body configured to act as a handle for the system (see paragraph 40; Fig. 1B); and a button (29) attached to the lower body, the button (29) configured to receive user input for manipulation of a virtual environment (see paragraph 40; Fig. 1).
Therefore it would have been obvious to one of ordinary skill in the art to use a handle similar to that which is taught by Reichelt as a portion of the handheld device similar to that which is taught by Dascola to thereby provide the user with ease of holding the device.
With reference to claim 18, Dascola and Reichelt disclose the system of claim 17, wherein Dascola further discloses that the processor is configured to generate a virtual object (7002) within the extended reality environment (see paragraph 144; Figs. 7).
With reference to claim 19, Dascola and Reichelt disclose the system of claim 17, wherein Dascola further discloses that the processor is configured to maintain a particular orientation of the extended reality environment regardless of movement of the system (see paragraph 66).
With reference to claim 20, Dascola and Reichelt disclose the system of claim 17, wherein Dascola further discloses a tracking sensor (244) configured to determine movement of the system (see paragraph 79; Fig. 1).
With reference to claim 21, Dascola and Reichelt disclose the system of claim 17, wherein Dascola further discloses wherein the processor is configured to control the view of the extended reality environment based on the determined movement of the system (see paragraphs 71-72).
With reference to claim 22, Dascola and Reichelt disclose the system of claim 17, wherein Dascola further discloses a camera configured to capture image data representative of a physical environment of the system, wherein the extended reality environment is generated based on the image data (see paragraph 72).
With reference to claim 23, Dascola and Reichelt disclose the system of claim 17, wherein Dascola further discloses a hand-tracking sensor (245) configured to track movement of a hand of a user in a physical environment of the system to permit interaction of the user with a virtual object in the extended reality environment (see paragraph 79; Fig. 4).
With reference to claim 24, Dascola discloses a device (101) for extended reality (see Figs. 1-3) comprising:
a sensor configured to generate environmental data (see paragraph 72);
a processor (110) configured to generate an extended reality environment based on the environmental data (see paragraphs 71-72; Fig. 1);
a display (120) configured to show a view of the extended reality environment (see paragraph 72; Fig. 3);
a housing (101) configured to partially house the display, the housing being sized and shaped to be held in a hand of a user (see paragraphs 56, 72); and
a control mechanism positioned on a periphery of the housing, the control mechanism configured to obtain user input effective to interact with the extended reality environment, wherein the processor is configured to control the view of the extended reality environment shown on the display based on the user input (see paragraph 132).
While disclosing the display as described above, Dascola fails to specifically disclose a circular display as intended by the claims.
Reichelt discloses a display device (18), wherein a substantially circular portion of the display is visible to the user (see paragraph 63; Figs. 1, 4-10, 12-14).
Therefore it would have been obvious to one of ordinary skill in the art to allow a circular shaped display similar to that which is taught by Reichelt to be carried out in a system similar to that which is taught by Dascola to thereby provide an alternative housing arrangement to be viewed by the user (see Reichelt; paragraph 63).
Response to Arguments
Applicant’s arguments with respect to claims 1-24 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Further, the examiner finds that the amendment to claims 1 and 24 fails to recite the invention as the applicant intends. Specifically, “a substantially circular portion of the display is visible to a user,” as recited, does not limit the display to being circular as the applicant intends. Any display which includes a substantially circular portion that is visible to the user would read on such a limitation. The applicant argues that Dascola discloses rectangular-shaped display devices and is silent regarding any limitation on a circular portion of the display being visible to the user. The examiner finds that Dascola does disclose a rectangular display; however, a substantially circular portion of that rectangular display is visible to a user, as is the case with most display devices. Nevertheless, in order to expedite prosecution, the examiner has rejected the claims as the applicant intends them to be read based on the presented arguments. Reichelt discloses a display having a variety of shapes: square or rectangular, as disclosed by Dascola, as well as circular, as recited in the instant claims. Therefore, Reichelt discloses a substantially circular portion of the display being visible to a user, and the examiner finds that the combination of Dascola in view of Reichelt discloses the recited invention.
Pertinent Prior Art
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
a. ELMIEH et al. (US 11,172,189) discloses a system configured to provide augmented reality to the user based on the captured environment (see column 8, line 2 - column 9, line 4; Figs. 1-5).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALECIA DIANE ENGLISH whose telephone number is (571)270-1595. The examiner can normally be reached Mon.-Fri. 7:00am-3:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William Boddie, can be reached at 571-272-0666. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ADE/Examiner, Art Unit 2625
/WILLIAM BODDIE/Supervisory Patent Examiner, Art Unit 2625