Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant’s claim for foreign priority based on an application filed in Korea on Oct. 6, 2021.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 4/8/24 is being considered by the examiner.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-3, 5, 10, 14-15 and 17-19 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Tosas Bautista (US 2014/0248950 A1).
As to Claim 1, Tosas Bautista teaches An electronic system comprising:
an electronic device configured to receive, as one or more touch gestures, control of a virtual reality space and/or an object within the virtual reality space, and display a first screen according to a corresponding first view frustum within the virtual reality space (Tosas Bautista discloses “If the mobile device's keypad is a touch screen, the user will be able to use traditional navigation gestures to zoom in and zoom out (pinch gesture), roll (rotation gesture), pan (pan gesture) and click (tap gesture)” in [0097]; “Embodiments of the invention estimate the pose of the mobile device within the defined world coordinate system and render a perspective view of the visual output mapped on the virtual surface onto the mobile device's display according to the estimated pose” in [0387]; see also Fig 2B); and
a head-mounted display (HMD) configured to be worn on a head of a user and to display a second screen of the virtual reality space according to a direction of the head of the user from a viewpoint corresponding to a second view frustum within the virtual reality space, wherein a virtual display of the electronic device displayed on the second screen according to the direction of the head is synchronized to an actual display of the electronic device (Tosas Bautista discloses “FIG. 4I depicts a typical configuration of an embodiment of the system showing the relative positions of a user 200, HMD 450, mobile device 100, and a representation of the virtual surface 201. The pose 451 of the HMD, the pose 203 of the mobile device, and the representation of the virtual surface 201 can all be defined within a common world coordinate system 202” in [0111], see also Fig 4I as below:
[Fig. 4I of Tosas Bautista: relative positions of user 200, HMD 450, mobile device 100, and representation of the virtual surface 201]
), and
when the user performs a motion of inputting one or more touch gestures into the virtual display displayed on the second screen, the one or more touch gestures are input into the actual display of the electronic device so that control of the virtual reality space and/or the object is performed (Tosas Bautista discloses “For example, if the mobile device has a touchscreen, the user can perform navigation gestures such as sliding (for scrolling), pinching (for zooming) and rotating in order to visualise the information more clearly” in [0077]; “If the mobile device is used as display, the perspective projection of the virtual surface will be shown on the mobile device's display according to the pose 203 of the mobile device within the common world coordinate system 202. In this scenario, the HMD can show extra information relating to the contents mapped on the virtual surface. For example, the HMD can show a high level map of all the contents on the virtual surface, indicating the region that is currently being observed on the mobile device's display” in [0112].)
As to Claim 2, Tosas Bautista teaches The electronic system of claim 1, wherein the virtual display of the electronic device displayed on the second screen displays the first screen displayed on the actual display of the electronic device (Tosas Bautista discloses “the perspective projection of the virtual surface will be shown on the mobile device's display according to the pose 203 of the mobile device within the common world coordinate system 202. In this scenario, the HMD can show extra information relating to the contents mapped on the virtual surface. For example, the HMD can show a high level map of all the contents on the virtual surface, indicating the region that is currently being observed on the mobile device's display” in [0112].)
As to Claim 3, Tosas Bautista teaches The electronic system of claim 1, wherein one or a combination of at least two of a size, a position, and an orientation of the virtual display of the electronic device displayed on the second screen is synchronized with that of the actual display of the electronic device (Tosas Bautista discloses “For example, the pose 203 of the mobile device, estimated within the common world coordinate system 202, can be used to control the level of zoom of the part of the virtual surface shown on the HMD, while the pose 451 of the HMD, estimated within the common world coordinate system 202, can be used to control the rotation of the part of the virtual surface shown on the HMD” in [0110]; “the perspective projection of the virtual surface will be shown on the mobile device's display according to the pose 203 of the mobile device within the common world coordinate system 202. In this scenario, the HMD can show extra information relating to the contents mapped on the virtual surface. For example, the HMD can show a high level map of all the contents on the virtual surface, indicating the region that is currently being observed on the mobile device's display.” in [0112].)
As to Claim 5, Tosas Bautista teaches The electronic system of claim 1, wherein the electronic device is configured to perform moving and/or rotating the first view frustum within the virtual reality space based on the one or more touch gestures input from the user, and the moving and/or rotating of the first view frustum is performed within a preset boundary within the virtual reality space
(Tosas Bautista discloses “If the mobile device's keypad is a touch screen, the user will be able to use traditional navigation gestures to zoom in and zoom out (pinch gesture), roll (rotation gesture), pan (pan gesture) and click (tap gesture). Pitch and yaw rotation could be achieved by first setting a separate mode, and then using the pan gesture instead” in [0097], see also Fig 2 & 4.)
As to Claim 10, Tosas Bautista teaches The electronic system of claim 1, wherein the electronic device is configured to, based on one or more touch gestures on a card comprising an object within the virtual reality space, perform one or a combination of at least two of moving, rotating, and resizing the object on a first reference plane in which the card is included (Tosas Bautista discloses “If the mobile device's keypad is a touch screen, the user will be able to use traditional navigation gestures to zoom in and zoom out (pinch gesture), roll (rotation gesture), pan (pan gesture) and click (tap gesture)” in [0097]; see also [0003-0004, 0077].)
As to Claim 14, Tosas Bautista teaches The electronic system of claim 1, wherein one or a combination of at least two of moving, rotating, and resizing the object according to one or more touch gestures input from the user is performed within a preset boundary within the virtual reality space (Tosas Bautista discloses “If the mobile device's keypad is a touch screen, the user will be able to use traditional navigation gestures to zoom in and zoom out (pinch gesture), roll (rotation gesture), pan (pan gesture) and click (tap gesture)” in [0097].)
As to Claim 15, Tosas Bautista teaches The electronic system of claim 1, wherein the first screen and/or the second screen displays a third view frustum corresponding to a second electronic device accessing the virtual reality space in a viewing direction of the third view frustum at a position corresponding to the second electronic device (Tosas Bautista discloses “FIG. 4I depicts a typical configuration of an embodiment of the system showing the relative positions of a user 200, HMD 450, mobile device 100, and a representation of the virtual surface 201. The pose 451 of the HMD, the pose 203 of the mobile device, and the representation of the virtual surface 201 can all be defined within a common world coordinate system 202” in [0111]. Here, the pose 451 can correspond to one or more view frustums.)
As to Claim 17, Tosas Bautista teaches The electronic system of claim 15, wherein the electronic device is configured to, based on one or more pinch gestures on an object placed within the virtual reality space, input from the user wearing the HMD, perform one or a combination of at least two of moving, rotating, and resizing the object (Tosas Bautista discloses “If the mobile device's keypad is a touch screen, the user will be able to use traditional navigation gestures to zoom in and zoom out (pinch gesture), roll (rotation gesture), pan (pan gesture) and click (tap gesture)” in [0097].)
Claim 18 recites similar limitations as claim 1 but in a method form. Therefore, the same rationale used for claim 1 is applied.
Claim 19 recites similar limitations as claim 1 but in a computer readable medium form. Therefore, the same rationale used for claim 1 is applied.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over Tosas Bautista (US 2014/0248950 A1) in view of Bedikian et al. (US 2017/0345218 A1).
As to Claim 4, Tosas Bautista teaches The electronic system of claim 1, wherein a hand of the user being tracked is displayed together with the virtual display of the electronic device on the second screen, and a touch gesture is input into the actual display with a motion of the user inputting the touch gesture into the virtual display (Tosas Bautista discloses “These, also known as "natural interfaces", may involve tracking of the users hands and body allowing them to directly "touch" the information or objects overlaid on their fields of view.” in [0005]. Bedikian further discloses “With reference to FIGS. 12 and 13, the user performs a gesture that is captured by the cameras 1202, 1204 as a series of temporally sequential images. In other implementations, cameras 1202, 1204 can capture any observable pose or portion of a user… Gesture-recognition module 1348 provides input to an electronic device, allowing a user to remotely control the electronic device and/or manipulate virtual objects, such as prototypes/models, blocks, spheres, or other shapes, buttons, levers, or other controls, in a virtual environment displayed on display 1302” in [0129]; “FIG. 23 depicts one implementation 2300 of a personal head mounted display (HMD) 2302 used to generate a virtual object 302 that is manipulated by a virtual control 104 of a smart phone device 102” in [0195].)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Tosas Bautista with the teaching of Bedikian so that the user can see her gestures and interactively control the objects in the virtual environment (Bedikian, [0196]).
Claims 6-9 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Tosas Bautista (US 2014/0248950 A1) in view of Kiemele et al. (US 2019/0065026 A1).
As to Claim 6, Tosas Bautista in view of Kiemele teaches The electronic system of claim 1, wherein the electronic device is configured to perform a sketch on a screen plane of the first view frustum within the virtual reality space based on a pen drawing input into the actual display, and display the sketched screen plane as the first screen (Kiemele discloses “User interface 400 includes one or more controls for rendering new visual content. For example, user interface 400 may include a canvas portion 402 in which a user can draw 2D or 3D shapes and images via the application of touch input. FIG. 4 shows one such possible shape 404 drawn in canvas portion 402, which is rendered in region 108 on HMD device 104” in [0027], see also Fig 4 below:
[Fig. 4 of Kiemele: user interface 400 with canvas portion 402 and shape 404, rendered in region 108 on HMD device 104]
).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Tosas Bautista with the teaching of Kiemele so as to provide a user interface to facilitate the rendering of new or editing of visual content (Kiemele, [0026]).
As to Claim 7, Tosas Bautista in view of Kiemele teaches The electronic system of claim 6, wherein the electronic device is configured to perform one or a combination of at least two of moving, rotating, and resizing the object on the screen plane based on one or more touch gestures on an object generated by the sketch (Tosas Bautista discloses finger gestures such as pinching and rotating in [0003]. Kiemele further discloses “A touch-sensitive input device 110 may facilitate interaction with the virtual reality experience. Input device 110 includes a touch sensor configured to receive user input, which may be provided to HMD device 104 and used to control the 3D virtual reality experience by varying visual content 106 rendered on the HMD display. User input received by the touch sensor may include touch, hover, and/or hand gesture input…” in [0015]; translational and rotational input in [0011].)
As to Claim 8, Tosas Bautista in view of Kiemele teaches The electronic system of claim 6, wherein the electronic device is configured to determine a projection plane of an object generated by the sketch based on a viewing direction of the first view frustum when performing the sketch (Kiemele discloses “User interface 400 includes one or more controls for rendering new visual content. For example, user interface 400 may include a canvas portion 402 in which a user can draw 2D or 3D shapes and images via the application of touch input. FIG. 4 shows one such possible shape 404 drawn in canvas portion 402, which is rendered in region 108 on HMD device 104” in [0027]; translational and rotational input in [0011].)
As to Claim 9, Tosas Bautista in view of Kiemele teaches The electronic system of claim 8, wherein the electronic device is configured to generate the object on the projection plane and place the generated object in the virtual reality space when a command to generate the object is input from the user (Kiemele discloses “As described above, shape 404 may represent a 2D or 3D input applied to canvas portion 402. Various approaches may enable the reception of 3D input at canvas portion 402. When touch-sensitive input device 110 is operable to detect a z-coordinate of touch input into/out of the input device (e.g., as a force or pressure), the user may control such force/pressure to vary the z-coordinate of input applied to canvas portion 402.” in [0029].)
As to Claim 16, Tosas Bautista in view of Kiemele teaches The electronic system of claim 15, wherein the first screen and/or the second screen, in response to a sketch being performed on a second screen plane of the third view frustum based on a pen drawing input into an actual display of the second electronic device, displays the sketched second screen plane (Tosas Bautista discloses “If the mobile device is used as display, the perspective projection of the virtual surface will be shown on the mobile device's display according to the pose 203 of the mobile device within the common world coordinate system 202. In this scenario, the HMD can show extra information relating to the contents mapped on the virtual surface. For example, the HMD can show a high level map of all the contents on the virtual surface, indicating the region that is currently being observed on the mobile device's display” in [0112]. Kiemele further discloses “User interface 400 includes one or more controls for rendering new visual content. For example, user interface 400 may include a canvas portion 402 in which a user can draw 2D or 3D shapes and images via the application of touch input. FIG. 4 shows one such possible shape 404 drawn in canvas portion 402, which is rendered in region 108 on HMD device 104” in [0027], see also Fig 4.)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Tosas Bautista with the teaching of Kiemele so as to provide a user interface to facilitate the rendering of new or editing of visual content (Kiemele, [0026]).
Claims 11-13 are rejected under 35 U.S.C. 103 as being unpatentable over Tosas Bautista (US 2014/0248950 A1) in view of Bedikian et al. (US 2014/0201666 A1, hereinafter Bedikian2).
As to Claim 11, Tosas Bautista teaches The electronic system of claim 1, wherein the electronic device is configured to, based on one or more touch gestures on a stand of a card comprising an object within the virtual reality space, perform moving and/or rotating the object on a second reference plane in which the stand is included, and the stand is generated to be parallel to a bottom of the virtual reality space at a position at which the card is projected onto the bottom of the virtual reality space (Tosas Bautista discloses “For example, if the mobile device has a touchscreen, the user can perform navigation gestures such as sliding (for scrolling), pinching (for zooming) and rotating in order to visualise the information more clearly. Pinching and rotating are examples of navigation gestures that require the use of two fingers in order to be performed. Two finger navigation gestures usually require the involvement of both hands, one to hold the mobile device and another to perform the gesture” in [0077]; “preferred embodiments of the system use an expanding plane 204, located at the origin of the world coordinate system. In these embodiments, the Photomap image can be thought of as a patch of texture anchored on this plane” in [0323]. Here, Tosas Bautista teaches a virtual object is anchored on a real plane. It is obvious that the virtual object can be a virtual image frame or card anchored on a real surface. For example, Bedikian2 discloses “In embodiments, the virtual surface construct may be fixed in space, e.g., relative to the screen” in [0105]; “As another example, the user's hand may be tracked to determine the positions and orientations of all fingers; each finger may have its own associated virtual surface construct… A joint virtual plane may serve, e.g., as a virtual drawing canvas on which multiple lines can be drawn by the fingers at once” in [0127], see also [0070].)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Tosas Bautista with the teaching of Bedikian2 so as to provide a gesture control on a drawing canvas.
As to Claim 12, Tosas Bautista teaches The electronic system of claim 1, wherein the electronic device is configured to, while a first touch gesture on a stand of a card comprising an object within the virtual reality space is input, control a height of the card within the virtual reality space based on a second touch gesture on the card (Tosas Bautista discloses “For example, if the mobile device has a touchscreen, the user can perform navigation gestures such as sliding (for scrolling), pinching (for zooming) and rotating in order to visualise the information more clearly. Pinching and rotating are examples of navigation gestures that require the use of two fingers in order to be performed. Two finger navigation gestures usually require the involvement of both hands, one to hold the mobile device and another to perform the gesture” in [0077]. Bedikian2 further discloses “In an embodiment, once engaged, further movements of the control object may serve to move graphical components across the screen (e.g., drag an icon, shift a scroll bar, etc.), change perceived "depth" of the object to the viewer (e.g., resize and/or change shape of objects displayed on the screen in connection, alone, or coupled with other visual effects) to create perception of "pulling" objects into the foreground of the display or "pushing" objects into the background of the display, create new screen content (e.g., draw a line)” in [0106]; “Two parallel virtual planes may also be used to, effectively, define a virtual control construct with a certain associated thickness (i.e., a "virtual slab")” in [0107]; “To give just one example, the user may, with her index finger stretched out, have her thumb and middle finger touch so as to pin the virtual surface construct at a certain location relative to the current position of the index-finger-tip” in [0111].)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Tosas Bautista with the teaching of Bedikian2 so as to enable the user to intuitively control and manipulate the electronic device and virtual objects by simply performing body gestures (Bedikian2, [0150]).
As to Claim 13, Tosas Bautista teaches The electronic system of claim 1, wherein the electronic device is configured to, while a first touch gesture on a stand of a card comprising an object within the virtual reality space is input, perform rotating the object based on a second touch gesture on an area other than the card or the stand (Tosas Bautista discloses “For example, if the mobile device has a touchscreen, the user can perform navigation gestures such as sliding (for scrolling), pinching (for zooming) and rotating in order to visualise the information more clearly. Pinching and rotating are examples of navigation gestures that require the use of two fingers in order to be performed.” in [0077]. Bedikian2 further discloses “A joint virtual plane may serve, e.g., as a virtual drawing canvas on which multiple lines can be drawn by the fingers at once” in [0127]; gesture-recognition module to recognize rotation and translation gesture in [0070]. Official notice has been taken of the fact that a rotation gesture is performed by one finger on the object and another finger on the area other than the object, which is well-known in the art (see MPEP 2144.03).)
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of Tosas Bautista with the teaching of Bedikian2 so as to provide a rotation gesture control on a drawing canvas.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to WEIMING HE whose telephone number is (571)270-1221. The examiner can normally be reached on Monday-Friday, 8:30am-5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Tammy Goddard can be reached on 571-272-7773. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WEIMING HE/
Primary Examiner, Art Unit 2611