DETAILED ACTION
This final rejection is responsive to the amendment filed 27 February 2026. Claims 1-20 are pending. Claims 1 and 14 are independent claims. Claims 1-3, 8, 11-16, and 19 are amended.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Remarks
35 U.S.C. 103
Applicant’s prior art arguments have been fully considered and are persuasive.
Applicant argues that the cited references do not teach the newly amended claims. The newly amended claims necessitate a new ground of rejection. Accordingly, a new reference, Harms (US 2017/0054569 A1), has been added to the rejection, as detailed below.
The foregoing applies to all independent claims and their dependent claims.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 2, 5, 6, 11-15, 18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Pahud (US 2019/0005724 A1), hereinafter Pahud, in view of Harms (US 2017/0054569 A1), hereinafter Harms, in view of Eronen (US 2018/0210206 A1), hereinafter Eronen.
Regarding independent claim 1, Pahud teaches:
A wearable mixed-reality system, comprising: a processor; and (Pahud: Figs. 2A-2B and ¶[0030], ¶[0039], and ¶[0045]; Pahud teaches a mixed-reality system with a processor.)
a non-transitory computer readable storage medium storing thereupon a sequence of instructions which, when executed by the processor, causes the processor to perform a set of acts, the set of acts comprising: identifying a plurality of elements from two-dimensional (2D) content at least by searching for the plurality of elements from one or more sources that include the 2D content; (Pahud: Fig. 2B and ¶[0034]-¶[0036] and ¶[0046]; Pahud teaches mapping and rendering augmented reality objects in the physical presentation environment. Fig. 1 and ¶[0034]-¶[0035] further teach retrieving augmented reality display data which includes a plurality of augmented reality objects.)
identifying, using one or more sensors in the wearable mixed-reality system, one or more surfaces in a physical environment where a user wearing the wearable mixed-reality system is located; (Pahud: Fig. 11 and ¶[0040], ¶[0046], and ¶[0070]; Pahud teaches location of one or more planar surfaces in the physical presentation environment.)
mapping an element of the plurality of elements onto a surface of the one or more surfaces; and (Pahud: Fig. 11 and ¶[0040], ¶[0046], and ¶[0070]; Pahud teaches location of one or more planar surfaces in the physical presentation environment and mapping the objects.)
displaying the element as a virtual content onto the surface for the user, (Pahud: Figs. 2A-3 and ¶[0047]-¶[0055]; Pahud teaches displaying the virtual elements on the physical surfaces.)
wherein identifying the one or more surfaces comprises: identifying a plurality of prisms rendered in the physical environment, ... ; and (Pahud: Figs. 2B-2C and ¶[0052]-¶[0053]; Pahud teaches displaying the virtual elements on the physical surfaces and further presenting alternate presentation configurations by moving to different planar surfaces. Each of the graphical representations is interpreted as a prism.)
determining, ... , one or more prisms associated with the user wearing the wearable mixed-reality system ..., ... . (Pahud: Fig. 15 and ¶[0005]-¶[0006]; Pahud teaches mapping preferences, which are received for a specific physical presentation environment, and display data characteristics, e.g., a 2D chart, timeline, image, spreadsheet, etc.)
Pahud does not explicitly teach but Harms teaches:
... wherein a prism comprises a bounded three-dimensional volume into which the element is rendered (Harms: Fig. 3B and ¶[0045]; Harms teaches a bounded 3D volume.)
Pahud and Harms are in the same field of endeavor as the present invention, as the references are directed to mapping augmented reality experiences to an environment. It would have been obvious, before the effective filing date of the claimed invention, to a person of ordinary skill in the art, to combine determining physical surfaces, identifying prisms, and mapping graphical objects onto them based on stored preferences as taught in Pahud with the prisms comprising a bounded 3D volume as taught in Harms. As such, it would have been obvious to one of ordinary skill in the art to modify the teachings of Pahud to include the teachings of Harms, because the combination would allow various types of interactive regions, as suggested by Harms: ¶[0045].
Pahud in view of Harms does not explicitly teach but Eronen teaches:
... from a database storing data pertaining to the plurality of prisms ... wherein the database stores therein at least (1) physical location data for the one or more prisms as well as (2) data indicative of the one or more prisms being associated with one or more applications executable by the wearable mixed-reality system. (Eronen: ¶[0105]-¶[0107]; Eronen teaches allocation property data that includes attributes. ¶[0109] further teaches that the attribute may identify the physical object as being controllable by way of the virtual information region, and thus an application. Figs. 5A-6B and ¶[0094] further teach that the foregoing is stored in a database.)
Eronen is in the same field of endeavor as the present invention, since it is directed to mapping augmented reality experiences to an environment. It would have been obvious, before the effective filing date of the claimed invention, to a person of ordinary skill in the art, to combine determining physical surfaces and mapping graphical objects onto them based on stored preferences as taught in Pahud with a database that stores and associates physical location data and content with applications executable by the system as taught in Eronen. As such, it would have been obvious to one of ordinary skill in the art to modify the teachings of Pahud and Harms to include the teachings of Eronen, because the combination would allow assigning specific applications to regions, as suggested by Eronen: ¶[0109].
Regarding claim 2, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 1.
Pahud further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform identifying the plurality of elements from the 2D content in the set of acts, ... , the set of acts further comprising at least one of: searching, by the wearable mixed-reality system, digital contents from one or more remote sources; accessing, using the wearable mixed-reality system, contents accessible by or stored on a server; browsing, using the wearable mixed-reality system, one or more web pages; or collecting an inventory of available elements from the 2D content. (Pahud: Figs. 2B-2C and ¶[0030]; Pahud teaches identifying display data which includes augmented reality objects.)
Harms further teaches:
... and the prism is anchored to a physical location in the physical environment and remains movable to one or more other physical locations, ... (Harms: ¶[0064]; Harms teaches a movable interactive region.)
Regarding claim 5, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 1.
Pahud further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform displaying the element as the virtual content onto the surface for the user in the set of acts, the set of acts further comprising determining whether the element to be displayed has multiple versions. (Pahud: ¶[0004]-¶[0005]; Pahud teaches the augmented reality objects containing size and scaling restrictions, which are interpreted as versions.)
Regarding claim 6, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 5.
Pahud further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform displaying the element as the virtual content onto the surface for the user in the set of acts, the set of acts further comprising determining whether the surface is compatible with displaying the multiple versions of the element. (Pahud: ¶[0040]; Pahud teaches location avoidance areas where mapping of objects is restricted.)
Regarding claim 11, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 1.
Pahud further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform identifying the one or more surfaces in a physical environment in the set of acts, the set of acts further comprising: identifying, by using at least one or more sensors and the processor in the wearable mixed-reality system, one or more physical surfaces from the one or more surfaces in the physical environment; and determining whether at least one surface of the one or more surfaces is sufficient for displaying the element as the virtual content on the at least one surface. (Pahud: ¶[0040]; Pahud teaches location avoidance areas where mapping of objects is restricted.)
Regarding claim 12, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 11.
Pahud further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform identifying the one or more surfaces in a physical environment in the set of acts, the set of acts further comprising: determining, by the wearable mixed-reality system, at least one virtual surface rendered or to be rendered from the one or more surfaces in the physical environment; and selecting the at least one virtual surface as the surface onto which the element is rendered. (Pahud: Figs. 2A-3 and ¶[0047]-¶[0055]; Pahud teaches displaying the virtual elements on the physical surfaces.)
Regarding claim 13, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 11.
Pahud further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform determining the at least one virtual surface in the set of acts, the set of acts further comprising: selecting, by the wearable mixed-reality system, the at least one virtual surface from one or more existing virtual surfaces already rendered in the physical environment; or rendering a virtual object to the user and selecting a virtual surface from one or more virtual surfaces of the virtual object as the at least one virtual surface onto which the element is rendered. (Pahud: Figs. 2A-3 and ¶[0047]-¶[0055]; Pahud teaches displaying the virtual elements on the physical surfaces and switching the virtual elements.)
Regarding claims 14, 15, 18, and 20, these claims recite methods that perform the functions of the mixed-reality system of claims 1, 2, 6, and 12, respectively; therefore, the same rationale for rejection applies.
Claims 3 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Pahud in view of Harms in view of Eronen in view of Leppanen (US 2018/0164588 A1) hereinafter known as Leppanen.
Regarding claim 3, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 1.
Pahud in view of Harms in view of Eronen does not explicitly teach but Leppanen teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform identifying the one or more surfaces in the set of acts, the set of acts further comprising: determining whether the element is displayed in a first open window in a first prism at a first physical location, wherein the first physical location is beyond a field of view provided by the wearable mixed-reality system to the user who is located at a second physical location; and upon receiving an instruction from a user interface of the wearable mixed-reality system, displaying the element on the surface in a second open window within the field of view provided by the wearable mixed-reality system to the user located at the second physical location. (Leppanen: Figs. 3A and 3G and ¶[0109]; Leppanen teaches a virtual information region having a virtual image region location beyond the field of view and being brought within the field of view by a user input.)
Leppanen is in the same field of endeavor as the present invention, since it is directed to displaying virtual reality experiences. It would have been obvious, before the effective filing date of the claimed invention, to a person of ordinary skill in the art, to combine determining physical surfaces and mapping graphical objects onto them based on presentation attributes as taught in Pahud with further bringing elements from outside the field of view to within the field of view as taught in Leppanen. As such, it would have been obvious to one of ordinary skill in the art to modify the teachings of Pahud to include teachings of Leppanen, because the combination would allow the user to change the virtual information region, as suggested by Leppanen: ¶[0109].
Regarding claim 16, this claim recites a method that performs the function of the mixed-reality system of claim 3; therefore, the same rationale for rejection applies.
Claims 4 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Pahud in view of Harms in view of Eronen in view of Poulos (US 2015/0331240 A1) hereinafter known as Poulos.
Regarding claim 4, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 1.
Pahud in view of Harms in view of Eronen further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform mapping the element identified from the 2D content onto the surface of the one or more surfaces in the set of acts, the set of acts further comprising: determining when or how the element identified from the 2D content is to be displayed in a 3D setting based at least in part upon a ... for an attribute pertaining to the element; and placing, by the wearable mixed-reality system, the element as a virtual content on the surface based at least in part upon the .... (Pahud: ¶[0070]-¶[0071]; Pahud teaches presentation attributes of the augmented reality display data that include features such as dimensions, orientations, content, etc.)
Pahud in view of Harms in view of Eronen does not explicitly teach but Poulos teaches:
... markup language tag .... (Poulos: ¶[0080] and ¶[0085]; Poulos teaches parsing a webpage by using HTML or CSS elements.)
Poulos is in the same field of endeavor as the present invention, since it is directed to mapping augmented reality experiences to an environment. It would have been obvious, before the effective filing date of the claimed invention, to a person of ordinary skill in the art, to combine determining physical surfaces and mapping graphical objects onto them based on presentation attributes as taught in Pahud with further determining how to display the elements based on markup language as taught in Poulos. As such, it would have been obvious to one of ordinary skill in the art to modify the teachings of Pahud to include teachings of Poulos, because the combination would allow efficiently identifying different virtual elements, as suggested by Poulos: ¶[0032].
Regarding claim 17, this claim recites a method that performs the function of the mixed-reality system of claim 4; therefore, the same rationale for rejection applies.
Claims 7-10 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Pahud in view of Harms in view of Eronen in view of Mildrew (US 2018/0143756 A1) hereinafter known as Mildrew.
Regarding claim 7, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 5.
Pahud in view of Harms in view of Eronen does not explicitly teach but Mildrew teaches:
wherein the multiple versions comprise a two-dimensional (2D) version of the element for display in a 2D setting and a three-dimensional (3D) version of the element for display in a 3D setting. (Mildrew: ¶[0042] and ¶[0046]; Mildrew teaches that a representation of a 3D model can include 2D image data.)
Mildrew is in the same field of endeavor as the present invention, since it is directed to displaying virtual reality experiences. It would have been obvious, before the effective filing date of the claimed invention, to a person of ordinary skill in the art, to combine determining physical surfaces and mapping graphical objects onto them based on presentation attributes as taught in Pahud with further supporting 3D and 2D versions as taught in Mildrew. As such, it would have been obvious to one of ordinary skill in the art to modify the teachings of Pahud to include teachings of Mildrew, because the combination would allow different representations of data, as suggested by Mildrew: ¶[0042].
Regarding claim 8, Pahud in view of Harms in view of Eronen further teaches the wearable mixed-reality system of claim 5.
Pahud in view of Harms in view of Eronen does not explicitly teach but Mildrew teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform displaying the element as the virtual content onto the surface for the user in the set of acts, the set of acts further comprising: determining whether a browser for displaying the element has a 3D display functionality, wherein the one or more sources further include a remote data source external to the wearable mixed-reality system. (Mildrew: ¶[0042] and ¶[0060]; Mildrew teaches determining whether a representation has 2D and/or 3D image data.)
Pahud further teaches: wherein the one or more sources further include a remote data source external to the wearable mixed-reality system (Pahud: ¶[0038]-¶[0039]; Pahud teaches a distributed system which retrieves information from multiple devices.)
Mildrew is in the same field of endeavor as the present invention, since it is directed to displaying virtual reality experiences. It would have been obvious, before the effective filing date of the claimed invention, to a person of ordinary skill in the art, to combine determining physical surfaces and mapping graphical objects onto them based on presentation attributes as taught in Pahud with further supporting 3D and 2D versions as taught in Mildrew. As such, it would have been obvious to one of ordinary skill in the art to modify the teachings of Pahud to include teachings of Mildrew, because the combination would allow different representations of data, as suggested by Mildrew: ¶[0042].
Regarding claim 9, Pahud in view of Harms in view of Eronen in view of Mildrew further teaches the wearable mixed-reality system of claim 8.
Mildrew further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform displaying the element as the virtual content onto the surface for the user in the set of acts, the set of acts further comprising: upon determining that the browser does not have the 3D display functionality, identifying a two-dimensional (2D) version for the element; and rendering, by the wearable mixed-reality system, the 2D version for the element relative to the surface. (Mildrew: ¶[0042] and ¶[0060]; Mildrew teaches determining whether a representation has 2D and/or 3D image data and rendering the image.)
Regarding claim 10, Pahud in view of Harms in view of Eronen in view of Mildrew further teaches the wearable mixed-reality system of claim 8.
Mildrew further teaches:
wherein the non-transitory computer readable storage medium further stores the sequence of instructions which, when executed by the processor, causes the processor to perform displaying the element as the virtual content onto the surface for the user in the set of acts, the set of acts further comprising: upon determining that the browser has the 3D display functionality, identifying a three-dimensional (3D) version for the element; and rendering, by the wearable mixed-reality system, the 3D version for the element relative to the surface. (Mildrew: ¶[0042] and ¶[0060]; Mildrew teaches determining whether a representation has 2D and/or 3D image data and rendering the image.)
Regarding claim 19, this claim recites a method that performs the function of the mixed-reality system of claim 9; therefore, the same rationale for rejection applies.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEKSEY OLSHANNIKOV whose telephone number is (571) 270-0667. The examiner can normally be reached M-F 9:30-6.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott Baderman, can be reached at 571-272-3644. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ALEKSEY OLSHANNIKOV/Primary Examiner, Art Unit 2118