DETAILED ACTION
This Office Action is sent in response to Applicant's Communication received 11/19/2025 for 18410601. Claims 1-13 and 17-23 are presented.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Response to Arguments
In view of the replacement drawings filed 11/19/2025, the objection to the drawings has been withdrawn.
In view of Applicant's amendments, the objection to the specification has been withdrawn.
In view of Applicant's amendments, the objection to claims 1, 8, 13, and 20 has been withdrawn.
In view of Applicant's amendments, the 112 rejection of claims 5 and 13-20 has been withdrawn.
Applicant's arguments with respect to newly amended claim 1 have been considered but are not persuasive because the arguments do not apply to the newly cited Johnson reference being used in the current rejection.
Claims 13 and 20 recite similar limitations to those recited in claim 1 and remain rejected upon a similar basis as claim 1 as stated above.
Dependent claims 2-12, 17-19, and 21-23 remain rejected at least based on their dependence from independent claims 1 and 13.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 7-13, and 19-23 are rejected under 35 U.S.C. 103 as being unpatentable over Kapinos et al. (US 20240036794 A1) in view of Johnson et al. (US 20180046352 A1).
As to claim 1, Kapinos discloses a method comprising: at an electronic device having a processor and a display [Fig. 1, para 0037, 0040, 0042, computer system includes processor and display device]:
… a first virtual object comprising a first surface and a second virtual object comprising a second surface presented in a view of an extended reality (XR) environment via the display, wherein the first virtual object is separated from the second virtual object by a gap located between the first virtual object and the second virtual object [Fig. 4, para 0052, 0055, 0059-0060, identify virtual projections (read: virtual object) of respective displays with display surfaces in 3D space viewed at user location, where displays are non-adjacent, non-contiguous, and separated by at least different depths (read: gap); also note viewed 3D space falls under the broadest reasonable interpretation of extended reality environment including an environment based on physical and/or virtual objects as consistent with Applicant's specification (para 0017)];
displaying a movement of a cursor across the first surface of the first virtual object in the view of the XR environment via the display [Fig. 5, para 0060, 0063-0064, 0073, present cursor moving on display surface presented by display device viewed by user];
determining that the movement of the cursor approaches or intersects a boundary of the first surface at a first position [Fig. 5, para 0064-0065, 0075, device identifies cursor movement abuts bottom edge (read: boundary) of display at location]; and
in accordance with determining that the movement of the cursor approaches or intersects the boundary of the first surface:
determining a second position on the second surface of a second virtual object in the XR environment based on a path of the cursor with respect to an intersection point of a boundary of the second surface [Fig. 5, para 0054, 0059, 0065, 0076, determine location of disparate display surface in 3D space based on cursor motion vector (read: path) intersecting edge (read: boundary) of disparate display surface when cursor abuts bottom edge of display]; and
moving the cursor from the first position to the second position by discontinuing display of the cursor at the first position and initiating display of the cursor at the second position without displaying the cursor within the gap [Fig. 5, para 0065, 0076, continue cursor movement by removing presentation (read: discontinue display) of cursor from location at display and presenting (read: initiate display) cursor at display location on disparate display determined from cursor motion vector, note displaying cursor motion from display to disparate display does not display cursor between displays].
However, Kapinos does not specifically disclose generating on the display, a first virtual object comprising a first surface and a second virtual object comprising a second surface presented in a view of an extended reality (XR) environment via the display.
Johnson discloses generating on the display, a first virtual object comprising a first surface and a second virtual object comprising a second surface presented in a view of an extended reality (XR) environment via the display, wherein the first virtual object is separated from the second virtual object by a gap located between the first virtual object and the second virtual object [Figs. 4A-4B, 5A-5B, para 0017-0018, 0020, 0041-0042, 0050, 0062, computing device with display generates virtual objects with size and position in virtual reality environment at non-continuous positions in world space environment; note object surfaces as shown in Figures 4B and 5B and viewing real-world environment with virtual imagery falls under broadest reasonable interpretation of extended reality environment including an environment based on physical and/or virtual objects as consistent with Applicant's specification (para 0017)].
Kapinos and Johnson are analogous art to the claimed invention being from a similar field of endeavor of virtual environment interfaces. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the virtual objects presented in a view of an extended reality environment as disclosed by Kapinos with generated virtual objects presented in a view of an extended reality environment as disclosed by Johnson with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kapinos as described above to present any number of objects in virtual reality [Johnson, para 0018, 0020].
As to claim 7, Kapinos discloses the method of claim 1, wherein the cursor is an indicator showing positions of a user interaction within the XR environment responsive to user input [para 0021-0022, 0063, 0073, display cursor on display based on user input in 3D space].
As to claim 8, Kapinos discloses the method of claim 1, wherein one of the first virtual object or the second virtual object are flat virtual user-interface objects [para 0054, 0059-0060, identify displays with display surfaces with x-y display borders, note broadest reasonable interpretation of flat includes having a continuous horizontal surface].
As to claim 9, Kapinos discloses the method of claim 1, wherein one of the first virtual object or the second virtual object are 3D virtual user-interface objects [para 0054, 0059-0060, identify displays with display surfaces with x-y display borders and depth in 3D space].
As to claim 10, Kapinos discloses the method of claim 1, wherein the first surface and second surface are separated by the gap comprising a non-contiguous distance from one another within the XR environment [para 0054, 0059-0060, identify displays with respective display surfaces, where displays are non-adjacent and separated by non-contiguous depth (read: distance) in 3D space].
As to claim 11, Kapinos discloses the method of claim 1, wherein the first surface and second surface are flat surfaces that are separated by the gap comprising a non-contiguous distance from one another in the XR environment and oriented in different directions [para 0054, 0059-0060, identify displays with respective display surfaces with x-y display borders, where displays are non-adjacent and separated by non-contiguous depth (read: distance) in 3D space, note different depths fall under broadest reasonable interpretation of orientation including an arrangement].
As to claim 12, Kapinos discloses the method of claim 1 further comprising initially displaying the cursor at an initial position on the first surface in response to a gaze or head pose of a user [Figs. 4-5, para 0056, 0060, 0063, present cursor on display surface in projection plane established using user head location (read: pose, note broadest reasonable interpretation of pose includes any position)].
As to claim 13, with Kapinos and Johnson combined at least for the reasons above, Kapinos discloses an electronic device comprising: a non-transitory computer-readable storage medium; and one or more processors coupled to the non-transitory computer-readable storage medium, wherein the non-transitory computer-readable storage medium comprises program instructions that, when executed on the one or more processors, cause the electronic device to perform operations [para 0028-0030, 0042, computing device includes non-transitory device storing software instructions executed by device processor] comprising limitations substantially similar to those recited in claim 1, and the claim is rejected under similar rationale.
As to claim 19, Kapinos and Johnson, combined at least for the reasons above, disclose the electronic device of claim 13 comprising limitations substantially similar to those recited in claim 7, and the claim is rejected under similar rationale.
As to claim 20, Kapinos and Johnson, combined at least for the reasons above, disclose a non-transitory computer-readable storage medium storing program instructions executable by one or more processors to perform operations comprising: at an electronic device having the one or more processors and a display [para 0028-0030, 0042, non-transitory device stores software instructions executed by computing device processor including display]: performing limitations substantially similar to those recited in claim 1, and the claim is rejected under similar rationale.
As to claim 21, Kapinos discloses the method of claim 1, and the first virtual object and the second virtual object [Fig. 4, para 0059-0061, identify virtual projections of respective displays].
However, Kapinos does not specifically disclose wherein the first virtual object is a first virtual user interface and the second virtual object is a second virtual user interface.
Johnson discloses wherein the first virtual object is a first virtual user interface and the second virtual object is a second virtual user interface [para 0020, 0042, 0050, virtual objects with classes including user interface].
Kapinos and Johnson are analogous art to the claimed invention being from a similar field of endeavor of virtual environment interfaces. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the virtual objects as disclosed by Kapinos with the virtual user interfaces as disclosed by Johnson with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kapinos as described above to present any number of objects in virtual reality [Johnson, para 0018, 0020].
As to claim 22, Kapinos discloses the method of claim 21.
However, Kapinos does not specifically disclose wherein the first virtual user interface comprises at least one user interface element and the second virtual user interface comprises at least one user interface element.
Johnson discloses wherein the first virtual user interface comprises at least one user interface element and the second virtual user interface comprises at least one user interface element [para 0020, 0049-0050, virtual objects with classes including user interface controls (read: user interface element)].
Kapinos and Johnson are analogous art to the claimed invention being from a similar field of endeavor of virtual environment interfaces. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the virtual objects as disclosed by Kapinos with the virtual user interfaces including user interface elements as disclosed by Johnson with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kapinos as described above to present any number of objects in virtual reality [Johnson, para 0018, 0020].
As to claim 23, Kapinos discloses the method of claim 1, wherein the first virtual object is located with respect to a first plane and the second virtual object is located with respect to a second plane differing from the first plane [Fig. 4, para 0060-0061, map display onto projection plane (read: first plane) and other display to front-plane (read: second plane) with different depth relative to projection plane].
Claims 2-6 and 17-18 are rejected under 35 U.S.C. 103 as being unpatentable over Kapinos and Johnson as applied to claims 1 and 13 above, and further in view of Wang et al. (US 20110107270 A1).
As to claim 2, Kapinos discloses the method of claim 1, wherein the path is a line corresponding to the cursor movement on a[] projection of the first surface and the second surface [Figs. 4-5, para 0060, 0063-0065, project cursor motion as line in projection plane including display surface and virtual projection of display surface].
However, Kapinos and Johnson do not specifically disclose wherein "a[] projection" is "an orthographic projection".
Wang discloses an orthographic projection [para 0073, 0194, 0235, display image in plane using orthographic projection].
Kapinos, Johnson, and Wang are analogous art to the claimed invention being from a similar field of endeavor of virtual environment interfaces. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the projection as disclosed by Kapinos and Johnson with the orthographic projection as disclosed by Wang with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kapinos and Johnson as described above to apply known methods of presenting information on display devices [Wang, para 0082, 0194].
As to claim 3, Kapinos discloses the method of claim 2, wherein the path is a line based on extending a line segment corresponding to the cursor movement on the [] projection [Fig. 5, para 0063, 0065, project motion vector in direction (read: extending line segment) of cursor motion line in projection plane].
However, Kapinos and Johnson do not specifically disclose wherein "the [] projection" is "the orthographic projection".
Wang discloses the orthographic projection [para 0073, 0194, 0235, display image in plane using orthographic projection].
Kapinos, Johnson, and Wang are analogous art to the claimed invention being from a similar field of endeavor of virtual environment interfaces. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the projection as disclosed by Kapinos and Johnson with the orthographic projection as disclosed by Wang with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kapinos and Johnson as described above to apply known methods of presenting information on display devices [Wang, para 0082, 0194].
As to claim 4, Kapinos discloses the method of claim 3, wherein a direction of the line segment is modified within a bounded area with respect to a position of the second surface of the second virtual object [para 0065, change direction of cursor motion vector in surface (read: bounded area) of disparate display after presenting cursor at edge portion (read: position) of disparate display surface (read: bounded area)].
As to claim 5, Kapinos discloses the method of claim 2, wherein the [] projection is onto a plane that is at a fixed position defined based on an orientation enabling the plane to remain independent of a user viewpoint [para 0060-0061, establish projection plane at depth (read: fixed position) using front surface of reference display (read: orientation), where the element of a reference display is separate from the element of a user viewpoint and is consistent with the broadest reasonable interpretation of independent including being separate].
However, Kapinos and Johnson do not specifically disclose wherein "the [] projection" is "the orthographic projection".
Wang discloses the orthographic projection [para 0073, 0194, 0235, display image in plane using orthographic projection].
Kapinos, Johnson, and Wang are analogous art to the claimed invention being from a similar field of endeavor of virtual environment interfaces. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the projection as disclosed by Kapinos and Johnson with the orthographic projection as disclosed by Wang with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kapinos and Johnson as described above to apply known methods of presenting information on display devices [Wang, para 0082, 0194].
As to claim 6, Kapinos discloses the method of claim 2, wherein the [] projection is onto a plane defined based on an orientation of the first virtual object or the second virtual object [para 0060-0061, establish projection plane including display using coordinates and depth (read: orientation, note broadest reasonable interpretation of orientation includes any direction) of reference display].
However, Kapinos and Johnson do not specifically disclose wherein "the [] projection" is "the orthographic projection".
Wang discloses the orthographic projection [para 0073, 0194, 0235, display image in plane using orthographic projection].
Kapinos, Johnson, and Wang are analogous art to the claimed invention being from a similar field of endeavor of virtual environment interfaces. Thus, it would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify the projection as disclosed by Kapinos and Johnson with the orthographic projection as disclosed by Wang with a reasonable expectation of success.
One of ordinary skill in the art would be motivated to modify Kapinos and Johnson as described above to apply known methods of presenting information on display devices [Wang, para 0082, 0194].
As to claims 17 and 18, Kapinos, Johnson, and Wang, combined at least for the reasons above, disclose the electronic device of claim 13 comprising limitations substantially similar to those recited in claims 5 and 6, respectively, and are rejected under similar rationale.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Creto et al. (US 11520457 B1) generally discloses moving a cursor from a first position of a virtual object to a second position of a virtual object as generated by an augmented reality display system.
da Veiga et al. (US 20160027216 A1) discloses transitioning cursor movement from a boundary of a two-dimensional display to a three-dimensional virtual environment.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LINDA HUYNH whose telephone number is (571)272-5240 and email is linda.huynh@uspto.gov. The examiner can normally be reached M-F between 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Adam Queler can be reached at (571) 272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LINDA HUYNH/Primary Examiner, Art Unit 2172