DETAILED ACTION
1. This Office Action is responsive to claims filed for App. 19/175,712 on April 10, 2025. Claims 1-17 are pending.
America Invents Act
2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
3. All claims are identical to or patentably indistinct from the invention claimed in the parent application prior to the filing of this Continued Prosecution Application under 37 CFR 1.53(d) (that is, restriction would not be proper) and could have been finally rejected on the grounds and art of record in the next Office action. Accordingly, THIS ACTION IS MADE FINAL even though it is a first action after the filing under 37 CFR 1.53(d). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
“For a new application, claims may be finally rejected in the first Office action when (A) the new application is a continuing application of, or a substitute for, an earlier application, and (B) all claims of the new application (1) are identical to, patentably indistinct from, or have unity of invention with, the claims in the earlier application (in other words, restriction (including lack of unity) would not have been proper if the new or amended claims had been entered in the earlier application), and (2) would have been properly finally rejected on the grounds and art of record in the next Office action if they had been entered in the earlier application.”
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Claim Rejections - 35 USC § 103
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
6. Claims 1-5, 7-10, and 12-17 are rejected under 35 U.S.C. 103 as being unpatentable over Mikhailov et al. ( US 2016/0129346 A1 ) in view of Ullrich et al. ( US 2014/0362014 A1 ).
Mikhailov teaches in Claim 1:
A device for dexterous interaction in a virtual world ( Figure 1, [0066] discloses a controller 104 which can provide input for a video game in an interactive environment. Please note an HMD 102 which can render a virtual reality scene ), the device comprising:
a tactile feedback element ( [0082] discloses tactile feedback hardware included in the controller 104, which can output vibration feedback, etc. );
an orientation sensor configured for detecting data based on orientation of at least a part of the device; and a processor configured to receive the data from the orientation sensor, and the processor is configured to transmit the data ( [0083] discloses the controller can include circuitry (indicative of a processor) and inertial sensors that can communicate/transmit data to process positions, changes in positions, and other six-axis data elements. To clarify, an inertial sensor which can determine position and changes in position is a reasonable interpretation of an orientation sensor, and this data is used to determine motion aspects of the controller and how it impacts the virtual objects );
wherein the device comprises a hand-operated controller ( Figure 1, [0066] discloses additional details on the user holding and operating the controller 104 ), the device being configured to virtually move a virtual representation of a hand in a virtual environment depicted on a display of a display device such that the virtual representation of a hand virtually engages with a secondary virtual element ( Figures 5A/5B, [0091] disclose a scene with which the user interacts, namely a user’s virtual hands holding a steering wheel (read the virtual hands as a virtual representation of a hand) and the steering wheel as a secondary virtual element with which the virtual hands engage. Other examples are provided, such as Figures 7A/7B, and in either situation there are two virtual elements engaging with each other. This is output on the head-mounted display 102 ); but
Mikhailov does not explicitly teach “the device is configured to receive tactile feedback based on a virtually represented surface characteristic of the secondary virtual element, wherein the device is separate from the display of the display device, and wherein the virtually represented surface characteristic of the secondary virtual element comprises a virtual representation of a texture, and wherein the tactile feedback is configured to convey via the tactile feedback element to a user of the device how the texture would feel to the virtual representation of a hand”
However, in the same field of endeavor, interaction in a virtual world with haptic feedback, Ullrich teaches of a haptic effect determination module 126 ( Ullrich, Figures 1 and 3, [0045] ). Notably, this module may select a haptic effect for a virtual object, which can be based on size, color, texture, material, movement, etc. An example is given of determining a haptic effect configured to simulate the texture of sand if the virtual object comprises an associated virtual texture that is sandy or coarse (read as a surface characteristic of the secondary virtual element). Figure 3, [0055] discloses an example of a user 308 interacting with a gun, and a haptic effect is configured to simulate the texture of the gun handle or grip, e.g. a wood or rubber texture. [0108] discloses an example of interacting with a virtual object of a fruit, and a haptic effect can simulate a surface of the outside of the fruit. Respectfully, a number of other examples are provided as well. As combined with Mikhailov, who also teaches of a hand in the form of virtual hands, the same hands can engage with a secondary element (whether it is a steering wheel, a gun, etc.) and receive haptic feedback which is based on a surface characteristic of the wheel, gun, etc. Ullrich teaches in Figure 8, [0086] of determining characteristics of virtual objects and determining a haptic effect based on the characteristics. To clarify, Mikhailov teaches to use a virtual hand, i.e. the claimed virtual element, to interact in a virtual setting with a secondary virtual element, i.e. a virtual steering wheel (as well as other examples), and Ullrich teaches to use a hand to interact with a virtual element, akin to the secondary virtual element of Mikhailov. As combined, the references teach of two virtual elements, namely a virtual hand(s) and a virtual gun, which can interact with each other, and the texture (surface characteristics) of the gun can be imparted to the user as the virtual hand engages with the gun. Mikhailov teaches of two virtual elements, namely the hands and the steering wheel, but does not explicitly teach of imparting surface characteristics of the engaged steering wheel. However, Ullrich teaches of an engaged second object, i.e. the gun, imparting texture aspects to the user’s hands, and as combined, this is due to the virtual hand engaging with the virtual gun. As combined, one of ordinary skill in the art would realize that a direct touch by the user’s hand resulting in feedback, or a touch by a virtual hand with the feedback then imparted to the controller, is within the combination of these references.
To clarify, the combination teaches as follows:
“the device is configured to receive tactile feedback based on a virtually represented surface characteristic of the secondary virtual element ( To address the “represented surface characteristic”, Ullrich teaches of providing a haptic effect which comprises a simulated texture on a surface of the computing device, i.e. a touch-sensitive surface. [0055] discloses a user’s hand interacting with a gun, and the haptic effect can simulate the texture of the gun handle or grip, e.g. a wood or rubber texture. It is important to clarify here that Mikhailov teaches of two virtual aspects, a virtual hand and a virtual steering wheel, and Ullrich focuses on the gun, akin to the virtual steering wheel. The key point is that the secondary object, i.e. what the user is interacting with, has a texture aspect which can be conveyed to the user ), wherein the device is separate from the display of the display device ( As noted and shown in Figure 1, the controller 104 is separate from the head-mounted display 102 ), and wherein the virtually represented surface characteristic of the secondary virtual element comprises a virtual representation of a texture ( As noted above, Ullrich teaches of simulating a texture of the gun handle or grip, the wood or rubber texture ), and wherein the tactile feedback is configured to convey via the tactile feedback element to a user of the device how the texture would feel to the virtual representation of a hand ( Ullrich, [0099], [0045] disclose a key aspect of associating user interactions with interface elements with particular haptic effects, to allow the user to feel textures associated with objects in the user interface, [0053] )”
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the haptic feedback based on the texture/material aspects of the virtual object, with the motivation that it will provide a more realistic or immersive user experience ( Ullrich, [0108] ).
Mikhailov teaches in Claim 2:
The device of claim 1, wherein the device is configured to be connected to a computer comprising a display, and the virtual environment, the virtual representation of a hand, and the secondary virtual element are visible on the display. ( Figures 2 and 12, [0064] disclose a head-mounted display HMD 102 which can display content to the user 100, namely the virtual content )
Mikhailov teaches in Claim 3:
The device of claim 2, wherein the computer further comprises a storage device that includes a memory unit, the memory unit includes a plurality of virtual environments that are configured to be displayed on the display, and user interaction occurs with each one of the plurality of virtual environments based on movement of the device. ( Figure 12, [0103] disclose a memory 1302 for storage purposes, and given Mikhailov teaches of displaying interactive scenes, it is clear such scenes and scenarios are stored in a memory. For example, Figures 4A/4B show a steering environment and Figures 7A/7B show a different environment )
Mikhailov teaches in Claim 4:
The device of claim 2, wherein the processor is configured to wirelessly transmit and receive the data, and the computer is configured to wirelessly transmit and receive the data. ( [0110] teaches of a WiFi module which can allow for wireless networking for the various devices. [0083] discloses circuitry which can process position data to be sent to the HMD )
Mikhailov teaches in Claim 5:
The device of claim 1, wherein the device includes a component that is configured to move via a bearing. ( [0083] discloses using inertial sensors which can measure six-axis data elements. To clarify, the controller motion can be measured as it moves in these axes. As for the bearing, in light of the controller being able to move in a variety of axes, some type of bearing/socket structure to allow for such freedom of movement is well known and common. Respectfully, the examiner takes Official Notice of this structure/concept )
Mikhailov teaches in Claim 7:
The device according to claim 1, further comprising at least one additional sensor configured to detect engagement with a user's hand, and the at least one virtual element is virtually articulated based on output from the at least one additional sensor, wherein the virtual articulation is related to articulation of the user's hand. ( [0095] discloses a controller which has lights that can be tracked, in addition to sensors and other buttons for communicating information back to the computer, such as by using a camera, etc. (read as examples of at least one additional sensor) )
Mikhailov teaches in Claim 8:
A method of providing interaction in a virtual world, the virtual world displayed on a display of a display device ( Figure 1, [0066] discloses a controller 104 which can provide input for a video game in an interactive environment. Please note an HMD 102 which can render a virtual reality scene ), the method comprising:
virtually engaging a secondary virtual element with a virtual representation of a hand in a virtual environment via use of a device comprising a hand-operated controller ( Figure 1, [0066] discloses additional details on the user holding and operating the controller 104 ) configured to control virtual movement of the virtual representation of a hand, wherein the device is separate from the display of the display device ( Figures 5A/5B, [0091] disclose a scene with which the user interacts, namely a user’s virtual hands holding a steering wheel (read the virtual hands as a virtual representation of a hand) and the steering wheel as a secondary virtual element with which the virtual hands engage. Other examples are provided, such as Figures 7A/7B, and in either situation there are two virtual elements engaging with each other ); but
Mikhailov does not explicitly teach of “providing tactile feedback to the device based on at least a surface characteristic of the secondary virtual element, wherein the virtually represented surface characteristic of the secondary virtual element comprises a virtual representation of a texture, and wherein the tactile feedback is configured to convey via the tactile feedback element to a user of the device how the texture would feel to the virtual representation of a hand.”
However, in the same field of endeavor, interaction in a virtual world with haptic feedback, Ullrich teaches of a haptic effect determination module 126 ( Ullrich, Figures 1 and 3, [0045] ). Notably, this module may select a haptic effect for a virtual object, which can be based on size, color, texture, material, movement, etc. An example is given of determining a haptic effect configured to simulate the texture of sand if the virtual object comprises an associated virtual texture that is sandy or coarse (read as a surface characteristic of the secondary virtual element). Figure 3, [0055] discloses an example of a user 308 interacting with a gun, and a haptic effect is configured to simulate the texture of the gun handle or grip, e.g. a wood or rubber texture. [0108] discloses an example of interacting with a virtual object of a fruit, and a haptic effect can simulate a surface of the outside of the fruit. Respectfully, a number of other examples are provided as well. As combined with Mikhailov, who also teaches of a hand in the form of virtual hands, the same hands can engage with a secondary element (whether it is a steering wheel, a gun, etc.) and receive haptic feedback which is based on a surface characteristic of the wheel, gun, etc. Ullrich teaches in Figure 8, [0086] of determining characteristics of virtual objects and determining a haptic effect based on the characteristics. To clarify, Mikhailov teaches to use a virtual hand, i.e. the claimed virtual element, to interact in a virtual setting with a secondary virtual element, i.e. a virtual steering wheel (as well as other examples), and Ullrich teaches to use a hand to interact with a virtual element, akin to the secondary virtual element of Mikhailov. As combined, the references teach of two virtual elements, namely a virtual hand(s) and a virtual gun, which can interact with each other, and the texture (surface characteristics) of the gun can be imparted to the user as the virtual hand engages with the gun. Mikhailov teaches of two virtual elements, namely the hands and the steering wheel, but does not explicitly teach of imparting surface characteristics of the engaged steering wheel. However, Ullrich teaches of an engaged second object, i.e. the gun, imparting texture aspects to the user’s hands, and as combined, this is due to the virtual hand engaging with the virtual gun. As combined, one of ordinary skill in the art would realize that a direct touch by the user’s hand resulting in feedback, or a touch by a virtual hand with the feedback then imparted to the controller, is within the combination of these references.
To clarify, the combination teaches as follows:
“providing tactile feedback to the device based on at least a surface characteristic of the secondary virtual element ( To address the “represented surface characteristic”, Ullrich teaches of providing a haptic effect which comprises a simulated texture on a surface of the computing device, i.e. a touch-sensitive surface. [0055] discloses a user’s hand interacting with a gun, and the haptic effect can simulate the texture of the gun handle or grip, e.g. a wood or rubber texture. It is important to clarify here that Mikhailov teaches of two virtual aspects, a virtual hand and a virtual steering wheel, and Ullrich focuses on the gun, akin to the virtual steering wheel. The key point is that the secondary object, i.e. what the user is interacting with, has a texture aspect which can be conveyed to the user ), wherein the virtually represented surface characteristic of the secondary virtual element comprises a virtual representation of a texture ( As noted above, Ullrich teaches of simulating a texture of the gun handle or grip, the wood or rubber texture ), and wherein the tactile feedback is configured to convey via the tactile feedback element to a user of the device how the texture would feel to the virtual representation of a hand. ( Ullrich, [0099], [0045] disclose a key aspect of associating user interactions with interface elements with particular haptic effects, to allow the user to feel textures associated with objects in the user interface, [0053] )”
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the haptic feedback based on the texture/material aspects of the virtual object, with the motivation that it will provide a more realistic or immersive user experience ( Ullrich, [0108] ).
Mikhailov teaches in Claim 9:
The method of claim 8, wherein the device comprises:
a tactile feedback element ( [0082] discloses tactile feedback hardware included in the controller 104, which can output vibration feedback, etc. );
an orientation sensor configured for detecting data based on orientation of at least a portion of the device; and a processor configured to receive the data from the orientation sensor, and the processor is configured to transmit the data. ( [0083] discloses the controller can include circuitry (indicative of a processor) and inertial sensors that can communicate/transmit data to process positions, changes in positions, and other six-axis data elements. To clarify, an inertial sensor which can determine position and changes in position is a reasonable interpretation of an orientation sensor, and this data is used to determine motion aspects of the controller and how it impacts the virtual objects )
As per Claim 10:
Mikhailov does not explicitly teach “wherein at least a portion of the device has an ellipsoid shape and is connected to a flexible shaft.”
However, Mikhailov teaches in Figures 4A/4B of a handheld controller 104 and in Figures 7A/7B, [0095] of a different type of controller. Respectfully, in light of these teachings of different types of controllers with different layouts/shapes, and in view of ordinary skill, the shape of the housing is a design choice. This is not a patentable distinction in light of Mikhailov teaching different shapes/types of controllers. In light of [0083] teaching of being able to manipulate in up to six degrees of freedom, it is clear that there is some flexibility required in terms of structural detail.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the shape of the housing in a plurality of shapes, including an ellipsoid shape, with the motivation that it is a design choice to do so. The functionality of the device is not altered by the shape, and one of ordinary skill would be able to design it accordingly.
Mikhailov and Ullrich teach in Claim 12:
The method of claim 9, wherein the device further comprises at least one button that is pressure sensitive, and the at least one button is configured to provide tactile feedback based on a degree of pressure applied to the at least one button. ( [0087], [0095] disclose button inputs on the controller are tracked and correlated to the input provided to the game. [0082] discloses vibration feedback, pressure feedback, etc., to the controller, and respectfully, it is clear that this could be applied to the button itself. Ullrich, [0063], [0083] discloses a plurality of buttons which are touch sensitive to allow the user to communicate, and Ullrich clearly teaches of haptic feedback to the user as well. Respectfully, applying it to the button itself is well known, and the examiner takes Official Notice of this concept )
Mikhailov teaches in Claim 13:
The method of claim 9, wherein the device is configured to be wirelessly connected to a computer. ( [0110] teaches of a WiFi module which can allow for wireless networking for the various devices. [0083] discloses circuitry which can process position data to be sent to the HMD )
Mikhailov teaches in Claim 14:
The method of claim 9, wherein the virtual representation of a hand is configured to be manipulated in real-time and in a continuous feedback loop. ( Respectfully, Figures 4A/4B, 7A/7B, etc., teach of immersive experiences in different environments, and in light of the tracking/updating as the user interacts, it is clear that this is a real-time process and the user continuously receives haptic effects as part of the immersive experience. To clarify the loop aspect: as the user performs an interaction, feedback is output; the user continues to perform interactions and further feedback is output )
Mikhailov teaches in Claim 15:
The method of claim 14, wherein the continuous feedback loop provides physical feedback to the device based on manipulation of the virtual representation of a hand. ( Mikhailov teaches that as the user interacts with the virtual elements, haptic effects are output, [0072]. The combination teaches to output customized haptic effects relating to the virtual elements, as taught by Ullrich )
Mikhailov teaches in Claim 16:
A system for dexterous interaction in a virtual environment ( Figure 1, [0066] discloses a controller 104 which can provide input for a video game in an interactive environment. Please note an HMD 102 which can render a virtual reality scene ), the system comprising:
a device comprising:
a hand-operated controller ( Figure 1 shows a controller 104 which is held by the user );
a tactile feedback element ( [0082] discloses tactile feedback hardware included in the controller 104, which can output vibration feedback, etc. );
a sensor configured to detect an orientation of the device; and a processor configured to receive data from the sensor, and the processor is configured to transmit the data ( [0083] discloses the controller can include circuitry (indicative of a processor) and inertial sensors that can communicate/transmit data to process positions, changes in positions, and other six-axis data elements. To clarify, an inertial sensor which can determine position and changes in position is a reasonable interpretation of a sensor, and this data is used to determine motion aspects of the controller and how it impacts the virtual objects ); and
a computer comprising a headset display configured to display a virtual environment ( Figures 2 and 12, [0064] disclose a head-mounted display HMD 102 which can display content to the user 100, namely the virtual content ), the computer is configured to receive the data from the processor, and the computer is configured to analyze the data to virtually manipulate a virtual representation of a hand in the virtual environment ( Figures 4A/4B, 7A/7B, etc., [0072] disclose transmitting data to and from the controller, HMD and a computer, which can update the virtual scene as the user performs interactions within it. Some examples are described below ), the device being separate from the headset display ( As noted and shown in Figure 1, the controller 104 is separate from the head-mounted display 102 ), and
virtual manipulation of a secondary virtual element by the virtual representation of a hand ( Figures 5A/5B, [0091] disclose a scene with which the user interacts, namely a user’s virtual hands holding a steering wheel (read the virtual hands as a virtual representation of a hand) and the steering wheel as a secondary virtual element with which the virtual hands engage. Other examples are provided, such as Figures 7A/7B, and in either situation there are two virtual elements engaging with each other ); but
Mikhailov does not explicitly teach to “provides tactile feedback to the device, and the tactile feedback is representative of at least a surface characteristic of the secondary virtual element, wherein the virtually represented surface characteristic of the secondary virtual element comprises a virtual representation of a texture, and wherein the tactile feedback is configured to convey indirectly via the tactile feedback element to a user of the device how the texture would feel to the virtual representation of a hand.”
However, in the same field of endeavor, interaction in a virtual world with haptic feedback, Ullrich teaches of a haptic effect determination module 126 ( Ullrich, Figures 1 and 3, [0045] ). Notably, this module may select a haptic effect for a virtual object, which can be based on size, color, texture, material, movement, etc. An example is given of determining a haptic effect configured to simulate the texture of sand if the virtual object comprises an associated virtual texture that is sandy or coarse (read as a surface characteristic of the secondary virtual element). Figure 3, [0055] discloses an example of a user 308 interacting with a gun, and a haptic effect is configured to simulate the texture of the gun handle or grip, e.g. a wood or rubber texture. [0108] discloses an example of interacting with a virtual object of a fruit, and a haptic effect can simulate a surface of the outside of the fruit. Respectfully, a number of other examples are provided as well. As combined with Mikhailov, who also teaches of a hand in the form of virtual hands, the same hands can engage with a secondary element (whether it is a steering wheel, a gun, etc.) and receive haptic feedback which is based on a surface characteristic of the wheel, gun, etc. Ullrich teaches in Figure 8, [0086] of determining characteristics of virtual objects and determining a haptic effect based on the characteristics. To clarify, Mikhailov teaches to use a virtual hand, i.e. the claimed virtual element, to interact in a virtual setting with a secondary virtual element, i.e. a virtual steering wheel (as well as other examples), and Ullrich teaches to use a hand to interact with a virtual element, akin to the secondary virtual element of Mikhailov. As combined, the references teach of two virtual elements, namely a virtual hand(s) and a virtual gun, which can interact with each other, and the texture (surface characteristics) of the gun can be imparted to the user as the virtual hand engages with the gun. Mikhailov teaches of two virtual elements, namely the hands and the steering wheel, but does not explicitly teach of imparting surface characteristics of the engaged steering wheel. However, Ullrich teaches of an engaged second object, i.e. the gun, imparting texture aspects to the user’s hands, and as combined, this is due to the virtual hand engaging with the virtual gun. As combined, one of ordinary skill in the art would realize that a direct touch by the user’s hand resulting in feedback, or a touch by a virtual hand with the feedback then imparted to the controller, is within the combination of these references.
To clarify, the combination teaches as follows:
“provides tactile feedback to the device, and the tactile feedback is representative of at least a surface characteristic of the secondary virtual element ( To address the “represented surface characteristic”, Ullrich teaches of providing a haptic effect which comprises a simulated texture on a surface of the computing device, i.e. a touch-sensitive surface. [0055] discloses a user’s hand interacting with a gun, and the haptic effect can simulate the texture of the gun handle or grip, e.g. a wood or rubber texture. It is important to clarify here that Mikhailov teaches of two virtual aspects, a virtual hand and a virtual steering wheel, and Ullrich focuses on the gun, akin to the virtual steering wheel. The key point is that the secondary object, i.e. what the user is interacting with, has a texture aspect which can be conveyed to the user ), wherein the virtually represented surface characteristic of the secondary virtual element comprises a virtual representation of a texture ( As noted above, Ullrich teaches of simulating a texture of the gun handle or grip, the wood or rubber texture ), and wherein the tactile feedback is configured to convey indirectly via the tactile feedback element to a user of the device how the texture would feel to the virtual representation of a hand ( Ullrich, [0099], [0045] disclose a key aspect of associating user interactions with interface elements with particular haptic effects, to allow the user to feel textures associated with objects in the user interface, [0053] )”
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the haptic feedback based on the texture/material aspects of the virtual object, with the motivation that it will provide a more realistic or immersive user experience ( Ullrich, [0108] ).
Mikhailov teaches in Claim 17:
The system according to claim 16, wherein the device is configured to move with three degrees of freedom via a bearing. ( [0083] discloses using inertial sensors which can measure six-axis data elements. To clarify, the controller motion can be measured as it moves in these axes. As for the bearing, in light of the controller being able to move in a variety of axes, some type of bearing/socket structure to allow for such freedom of movement is well known and common. Respectfully, the examiner takes Official Notice of this structure/concept )
7. Claims 6 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over Mikhailov et al. ( US 2016/0129346 A1 ) in view of Ullrich et al. ( US 2014/0362014 A1 ), as applied to Claims 1 and 9, further in view of Olsson et al. ( US 2012/0306603 A1 ).
As per Claim 6:
Mikhailov does not explicitly teach “wherein the orientation sensor comprises an accelerometer and a gyroscope.”
However, in the same field of endeavor, handheld controllers, Olsson discloses in [0219], [0252] that a gyroscope and/or an accelerometer may be incorporated to provide additional signals to measure displacements; these are examples of inertial sensors, which Mikhailov also teaches of.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the accelerometer, gyroscope, etc., as taught by Olsson, with the motivation that Mikhailov already teaches of the use of inertial sensors to determine the position of the controller and Olsson explicitly teaches of a gyroscope and an accelerometer, which are examples of inertial sensors. Respectfully, many types of inertial sensors can determine the motion of the controller and it is well known to use these specific types of sensors ( Olsson, [0252] ).
As per Claim 11:
Mikhailov does not explicitly teach “wherein the orientation sensor comprises an accelerometer and a gyroscope.”
However, in the same field of endeavor, handheld controllers, Olsson discloses in [0219], [0252] that a gyroscope and/or an accelerometer may be incorporated to provide additional signals to measure displacements; these are examples of inertial sensors, which Mikhailov also teaches of.
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to implement the accelerometer, gyroscope, etc., as taught by Olsson, with the motivation that Mikhailov already teaches of the use of inertial sensors to determine the position of the controller and Olsson explicitly teaches of a gyroscope and an accelerometer, which are examples of inertial sensors. Respectfully, many types of inertial sensors can determine the motion of the controller and it is well known to use these specific types of sensors ( Olsson, [0252] ).
Conclusion
8. All claims are identical to or patentably indistinct from the invention claimed in the parent application prior to the filing of this Continued Prosecution Application under 37 CFR 1.53(d) (that is, restriction would not be proper) and could have been finally rejected on the grounds and art of record in the next Office action. Accordingly, THIS ACTION IS MADE FINAL even though it is a first action after the filing under 37 CFR 1.53(d). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DENNIS P JOSEPH whose telephone number is (571)270-1459. The examiner can normally be reached Monday - Friday 5:30 - 3:30 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amr Awad can be reached at 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DENNIS P JOSEPH/Primary Examiner, Art Unit 2621