DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims in Consideration
Claims 1-20 are pending in this application.
Claim Rejection Notes
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-8 and 11-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Korngold et al. (US 20210183158 A1, published: 6/17/2021).
Claim 1: Korngold teaches a computer system that enables an action to be performed with respect to a hologram that is displayed in a scene (displaying the AR object on the reference surface in the AR environment [Korngold, 0004]), said computer system comprising: a processor system (processor assembly 212 [Korngold, 0027, FIG. 1]); and a storage system that stores instructions (the memory 210 may store instructions and data that are usable to generate an AR environment for a user [Korngold, 0028, FIG. 1]) that are executable by the processor system to cause the computer system to:
access a hologram (AR object [Korngold, 0004]) that is included as a part of a scene, wherein the hologram is located at a first location within the scene (receiving an indication to place an AR object on a reference surface in the AR environment [Korngold, 0004]);
define a second location for the hologram, the second location being different than the first location in the scene, wherein defining the second location includes defining a triggering action (performing a first gesture [Korngold, 0004]) that, when detected, causes the hologram to progressively move from the first location to the second location; detect user input that includes the triggering action; in response to the user input, cause the hologram to progressively move from the first location to the second location, wherein, at a time in which the user input is detected (performing a first gesture on an input device of the electronic device; in response to the first gesture, elevating the AR object a distance above the reference surface in the AR environment; performing a second gesture on the input device of the electronic device; and in response to the second gesture, moving the AR object in the AR environment [Korngold, 0004]. FIG. 1 shows an example of a system 100 that can be used for generating an immersive experience by way of an AR environment [Korngold, 0017, FIG. 1]; Examiner's Note: based on a gesture input (trigger), the hologram (virtual object) is moved in the AR environment from one location to a second), the hologram is visible within a first perspective view of the scene and the second location is outside of the first perspective view of the scene (FIG. 3 is a third person view of an example physical space 300, in which a user is experiencing an AR environment 302 through the example HMD 204. The AR environment 302 can be generated by the AR application 220 of the computing device 202 and displayed to the user through the HMD 204, or other device [Korngold, 0049, FIG. 3]; Examiner's Note: FIG. 3 shows the AR environment being observed through an HMD/helmet/headset, but the HMD shows only a perspective view based on the user's gaze; content outside the user's gaze is located outside the perspective view and is thus not shown to the user through the HMD);
concurrently with the progressive movement of the hologram, automatically pan the scene to a second perspective view in which the second location becomes visible; and display the scene from the second perspective view, resulting in the hologram being visible at the second location (the AR application 220 may update the AR environment based on input received from the camera assembly 232, the IMU 234, and/or other components of the sensor system 216. Based on the detected position and orientation, the AR application 220 may update the AR environment to reflect a changed orientation and/or position of the user within the environment [Korngold, 0045]; Examiner's Note: the system updates the AR image based on detected inputs, i.e., in real time, concurrently with the movement of the object/hologram).
Claim 11, sharing similar elements with claim 1, is likewise rejected.
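For illustration of the limitation mapped above, the following is a minimal sketch (hypothetical names; illustrative only, not code from Korngold) of a trigger starting a progressive move of a virtual object toward a second location while the viewpoint pans concurrently so that the second location becomes visible:

    # Minimal sketch (hypothetical names; illustrative only, not Korngold's code):
    # a trigger starts a progressive move while the viewpoint pans concurrently.
    def lerp(a, b, t):
        # Linear interpolation between two 3-D points a and b at fraction t.
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

    def on_trigger(hologram_pos, second_loc, view_center, steps=60):
        # Each iteration is one rendered frame: the hologram advances toward
        # the second location and the viewpoint pans toward it at the same time.
        for i in range(1, steps + 1):
            t = i / steps
            yield lerp(hologram_pos, second_loc, t), lerp(view_center, second_loc, t)

Each yielded pair would be rendered as one frame, so the hologram's movement and the pan occur concurrently, as in the limitation mapped above.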
Claim 2: Korngold teaches the computer system of claim 1. Korngold further teaches wherein the computer system is a first immersive platform of a first type, the first type of the first immersive platform is a type that provides a view of the scene using a screen (displaying the AR object on the reference surface in the AR environment [Korngold, 0004]).
Claim 3: Korngold teaches the computer system of claim 1. Korngold further teaches wherein the triggering action includes one or more of: a long press cursor, a double tap cursor, or a touch action (the gesture to elevate the virtual character can be completed, as indicated at reference character 7-9, by the user releasing the virtual character (removing the two fingertips from the touchscreen of the mobile device) [Korngold, 0069]).
Claims 13 and 19, sharing similar elements with claim 3, are likewise rejected.
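For the triggering actions recited in claim 3, a minimal sketch (hypothetical thresholds; illustrative only, not from the reference) classifying a touch event as a long press, a double tap, or a plain touch action:

    # Minimal sketch (hypothetical thresholds; illustrative only): classify a
    # touch as a long press, a double tap, or a plain touch action.
    def classify_touch(press_duration_s, taps_within_300ms):
        if press_duration_s >= 0.5:      # held long enough: long press
            return "long_press"
        if taps_within_300ms >= 2:       # two quick taps: double tap
            return "double_tap"
        return "touch"                   # any other contact: plain touch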
Claim 4: Korngold teaches the computer system of claim 1. Korngold further teaches wherein the triggering action includes a movement of the hologram performed by a user (displaying an augmented reality in which a user can place and manipulate virtual (e.g., computer-generated) objects in a view of a physical space [Korngold, 0004]).
Claim 5: Korngold teaches the computer system of claim 1. Korngold further teaches wherein the hologram progressively moves from the first location to the second location throughout a pre-defined time period (the AR application 220 may update the AR environment based on input received from the camera assembly 232, the IMU 234, and/or other components of the sensor system 216. Based on the detected position and orientation, the AR application 220 may update the AR environment to reflect a changed orientation and/or position of the user within the environment [Korngold, 0045]; Examiner's Note: the system updates the AR image based on detected inputs, i.e., in real time, concurrently with the movement of the object/hologram).
Claim 6: Korngold teaches the computer system of claim 1. Korngold further teaches wherein a path is defined for the hologram to follow from the first location to the second location (the path shown in FIG. 1 is illustrative and different paths and/or locations can be viewed by the user, and can include AR objects in those locations [Korngold, 0025]).
Claim 7: Korngold teaches the computer system of claim 6. Korngold further teaches wherein the path is a pre-defined path that is defined prior to the user input being detected (as is illustrated by the dashed arrows in FIG. 1, the user can, within the AR environment 116, move between the locations of the objects 120A-120. The path shown in FIG. 1 is illustrative and different paths and/or locations can be viewed by the user, and can include AR objects in those locations (e.g., AR objects placed by the user at, and/or manipulated by the user to, those locations) [Korngold, 0025]; Examiner's Note: as illustrated in FIG. 1).
Claim 14, sharing similar elements with claim 7, is likewise rejected.
Claim 8: Korngold teaches the computer system of claim 6. Korngold further teaches wherein the path is defined at the time in which the user input is detected (at 914, in response to the second gesture at 912, the AR object can be moved in the AR environment [Korngold, 0080]).
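To contrast claims 7 and 8 as mapped above, a minimal sketch (hypothetical functions; straight-line paths are an assumption, not taken from the reference) of a path fixed before any user input versus a path computed only when the trigger is detected:

    # Minimal sketch (hypothetical; illustrative only): a pre-defined path vs.
    # a path computed at the time the user input is detected.
    def predefined_path(waypoints):
        # Claim 7: the path exists before any user input; it is simply replayed.
        return list(waypoints)

    def path_at_input_time(start, goal, n=10):
        # Claim 8: a straight-line path is computed only when the trigger fires.
        return [tuple(start[k] + (goal[k] - start[k]) * i / n for k in range(3))
                for i in range(n + 1)]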
Claim 12: Korngold teaches the method of claim 11, wherein the method is performed by a first immersive platform, and wherein the hologram is moved to the third location by a user of a different immersive platform that is concurrently accessing the scene (Examiner's Note: this claim extends the independent claim by adding additional locations; because computer programs may be run repeatedly, after moving an object to a second location the object can be moved again to further locations).
Claim 15: Korngold teaches the method of claim 11. Korngold further teaches wherein a path is defined for the hologram to follow from the third location to the second location, the path being defined in response to the user input being detected (as is illustrated by the dashed arrows in FIG. 1, the user can, within the AR environment 116, move between the locations of the objects 120A-120. The path shown in FIG. 1 is illustrative and different paths and/or locations can be viewed by the user, and can include AR objects in those locations (e.g., AR objects placed by the user at, and/or manipulated by the user to, those locations) [Korngold, 0025]. See also MPEP § 2144.04, VI. (Reversal, Duplication, or Rearrangement of Parts); Examiner's Note: this claim extends the independent claim by traversing the path in reverse).
Claim 16: Korngold teaches the method of claim 11, wherein the method is performed by a first immersive platform, and wherein the hologram is moved to the third location by a user of the first immersive platform (Examiner's Note: this claim extends the independent claim by adding additional locations; because computer programs may be run repeatedly, after moving an object to a second location the object can be moved again to further locations).
Claim 17: Korngold teaches the method of claim 11, wherein: the method is performed by a first immersive platform, a second immersive platform is concurrently accessing the scene, the hologram is moved to the third location by a user of the second immersive platform, and the first immersive platform displays a movement of the hologram from the first location to the third location (Examiner's Note: this claim extends the independent claim by adding additional locations; because computer programs may be run repeatedly, after moving an object to a second location the object can be moved again to further locations).
Claim 18: Korngold teaches a method implemented by a head mounted device (HMD) (a head-mounted display,… a VR headset 1085 [Korngold, 0017]), the HMD being a first immersive platform of a first type (displaying the AR object on the reference surface in the AR environment [Korngold, 0004]), said method comprising:
accessing a hologram that is included as a part of a scene, wherein the HMD displays the scene in a three-dimensional (3D) manner (receiving an indication to place an AR object on a reference surface in the AR environment [Korngold, 0004]. A head-mounted display,… a VR headset 1085 [Korngold, 0017]. The content may be rendered as flat images or as three-dimensional (3D) objects. The 3D objects may include one or more objects represented as polygonal meshes. The polygonal meshes may be associated with various surface textures, such as colors and images [Korngold, 0035]);
determining that a second location has been defined for the hologram (AR object [Korngold, 0004]), wherein said determining includes identifying that the hologram is associated with a triggering action (performing a first gesture [Korngold, 0004]) that, when performed by a user of a second immersive platform that displays the scene in a two-dimensional (2D) manner, causes the hologram to progressively move from whatever location the hologram is at when the triggering action is performed to the second location without further user input beyond that of the triggering action; facilitating a 3D movement of the hologram from the first location to a third location (performing a first gesture on an input device of the electronic device; in response to the first gesture, elevating the AR object a distance above the reference surface in the AR environment; performing a second gesture on the input device of the electronic device; and in response to the second gesture, moving the AR object in the AR environment [Korngold, 0004]. FIG. 1 shows an example of a system 100 that can be used for generating an immersive experience by way of an AR environment [Korngold, 0017, FIG. 1]; Examiner's Note: based on a gesture input (trigger), the hologram (virtual object) is moved in the AR environment from one location to a second);
detecting that the second immersive platform is concurrently accessing the scene; in response to detecting the triggering action being performed from the user of the second immersive platform, visualizing the hologram progressively moving from the third location to the second location; and determining that the hologram is at the second location (the AR application 220 may update the AR environment based on input received from the camera assembly 232, the IMU 234, and/or other components of the sensor system 216. Based on the detected position and orientation, the AR application 220 may update the AR environment to reflect a changed orientation and/or position of the user within the environment [Korngold, 0045]; Examiner's Note: the system updates the AR image based on detected inputs, i.e., in real time, concurrently with the movement of the object/hologram).
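For the multi-platform limitations of claim 18, a minimal sketch (hypothetical class and method names; illustrative only, not from Korngold) of a shared scene in which every concurrently attached immersive platform observes each step of the hologram's progressive movement:

    # Minimal sketch (hypothetical names; illustrative only): a shared scene in
    # which every attached immersive platform observes each movement step.
    class SharedScene:
        def __init__(self):
            self.hologram_pos = (0.0, 0.0, 0.0)
            self.platforms = []                  # concurrently attached platforms

        def attach(self, platform):
            self.platforms.append(platform)

        def move_hologram(self, target, steps=30):
            start = self.hologram_pos
            for i in range(1, steps + 1):
                t = i / steps
                self.hologram_pos = tuple(
                    start[k] + (target[k] - start[k]) * t for k in range(3))
                for p in self.platforms:         # e.g., an HMD (3D) and a 2D screen
                    p.render(self.hologram_pos)  # both visualize the same step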
Claim 20: Korngold teaches the method of claim 18, wherein the HMD displays the scene from a first perspective view, and wherein the third location and the second location are both outside of the first perspective view (FIG. 3 is a third person view of an example physical space 300, in which a user is experiencing an AR environment 302 through the example HMD 204. The AR environment 302 can be generated by the AR application 220 of the computing device 202 and displayed to the user through the HMD 204, or other device [Korngold, 0049, FIG. 3]; Examiner's Note: FIG. 3 shows the AR environment being observed through an HMD/helmet/headset, but the HMD shows only a perspective view based on the user's gaze; content outside the user's gaze is located outside the perspective view and is thus not shown to the user through the HMD).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 9 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over Korngold et al. (US 20210183158 A1, published: 6/17/2021) in view of Lyren (US 20200066058 A1, published: 2/27/2020).
Claim 9: Korngold teaches the computer system of claim 1. Korngold does not teach wherein an impeding object is positioned in a direct path from the first location to the second location, and wherein the hologram passes through the impeding object as a result of the impeding object having a non-physicality state during a time when the hologram is moving.
However, Lyren teaches wherein an impeding object is positioned in a direct path from the first location to the second location, and wherein the hologram passes through the impeding object as a result of the impeding object having a non-physicality state during a time when the hologram is moving (when these movements, however, are significant enough to impede or restrict a view of the virtual object 1710 (such as pushing the virtual object 1710 or portions thereof out of the field of view 1740), the virtual object 1710 is repositioned within the field of view 1740 to remain visible to the user 1730. Additionally, the virtual object 1710 is repositioned within the field of view 1740 when its view is obstructed. For example, while the virtual and real objects are within the field of view, another object impedes, restricts, or obstructs the user's view of the virtual object. In this instance, the virtual object is repositioned within the field of view to provide the user with a clear or unobstructed view of the virtual object. As another example, the virtual object is repositioned in the field of view when an edge or perimeter of the field of view hits or touches the virtual object [Lyren, 0214]).
Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Korngold's AR environment, in which objects/holograms are moved between virtual locations, to incorporate Lyren's handling of objects whose paths are impeded by other virtual objects.
One would have been motivated to make this modification to provide the user with a virtual environment that closely approximates real-world physics. Such an environment offers a realistic experience in which objects cannot occupy the same space and therefore may bump or displace other objects when moved into them.
Claim 10: Korngold teaches the computer system of claim 1. Korngold does not teach wherein an impeding object is positioned in a direct path from the first location to the second location, and wherein the hologram passes around the impeding object as a result of the impeding object having a physicality state.
However, Lyren teaches wherein an impeding object is positioned in a direct path from the first location to the second location, and wherein the hologram passes around the impeding object as a result of the impeding object having a physicality state (when these movements, however, are significant enough to impede or restrict a view of the virtual object 1710 (such as pushing the virtual object 1710 or portions thereof out of the field of view 1740), the virtual object 1710 is repositioned within the field of view 1740 to remain visible to the user 1730. Additionally, the virtual object 1710 is repositioned within the field of view 1740 when its view is obstructed. For example, while the virtual and real objects are within the field of view, another object impedes, restricts, or obstructs the user's view of the virtual object. In this instance, the virtual object is repositioned within the field of view to provide the user with a clear or unobstructed view of the virtual object. As another example, the virtual object is repositioned in the field of view when an edge or perimeter of the field of view hits or touches the virtual object [Lyren, 0214]).
Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify Korngold's AR environment, in which objects/holograms are moved between virtual locations, to incorporate Lyren's handling of objects whose paths are impeded by other virtual objects.
One would have been motivated to make this modification to provide the user with a virtual environment that closely approximates real-world physics. Such an environment offers a realistic experience in which objects cannot occupy the same space and therefore may bump or displace other objects when moved into them.
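For the distinction between claims 9 and 10, a minimal sketch (hypothetical "physical" flag and route planner; illustrative only, not from either reference) in which an impeding object's physicality state determines whether the moving hologram passes through it or is routed around it:

    # Minimal sketch (hypothetical flag; illustrative only): the impeding
    # object's physicality state selects pass-through vs. pass-around.
    def plan_route(start, goal, impeding):
        if not impeding["physical"]:             # non-physicality state (claim 9)
            return [start, goal]                 # direct path through the object
        # Physicality state (claim 10): detour over the object via a waypoint.
        ox, oy, oz = impeding["pos"]
        detour = (ox, oy + 2 * impeding["radius"], oz)
        return [start, detour, goal]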
Additional References
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The following references relate to virtual and augmented reality environments in which objects can be moved within the user's perspective:
Yerkes et al. (US 20180350099 A1, published: 12/6/2018)
Shen (US 20190391710 A1, published: 12/26/2019)
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SETH A SILVERMAN whose telephone number is (571)272-9783. The examiner can normally be reached Mon-Thur, 8AM-4PM MST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Queler, can be reached at (571)272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Seth A Silverman/Primary Examiner, Art Unit 2172