Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 08/27/2024 was filed before the mailing date of the non-final rejection on 02/10/2026. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 15, & 16 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Bryant et al. (US 8226484 B2, hereinafter, "Bryant").
Regarding Claim 1, Bryant teaches an information processing apparatus communicably connected to an operation terminal (Bryant, Fig. 1, [0014], ln. 1-2, "…the game system 100 and software can respond to inputs of handheld controller 107 in a variety of ways.") including an input unit that performs an input operation for an image of a virtual space (Bryant, Fig. 1, [0014], ln. 3-4, "…the game system 100 can for example control the graphics displayed on display 101 to correspond to the inputs provided by controller 107.") and performs a specified function in response to the input operation (Bryant, Fig. 1, [0014], ln. 4-6, "…when the game player 706 yaws the controller 107 by rotating about the Y axis, the simulated graphics modeling the simulated vehicle handlebars 702 can be observed to rotate in the direction of yaw."), the information processing apparatus comprising one or more processors and/or circuitry configured to: execute a position obtaining processing that obtains position information relating to a position of the operation terminal (Bryant, Fig. 1, [0013], ln. 7-13, "…when the game player 706 'pitches' the handheld controller 107 based on up/down wrist movements, such rotation can be sensed and used to control simulated throttle. Similarly, it is possible to, for example, simulate the steering of the simulated vehicle handlebars 702 by rotating handheld controller 107 about the 'yaw' axis to thereby simulate the handlebars being pulled toward or pushed away from the vehicle operator. In certain exemplary illustrative non-limiting implementations, it is possible to simulate and detect 'lean' of a motorcycle by detecting the rotation of handheld controller 107 about the 'roll' axis."); and execute a changing processing that is capable of changing the function of the input unit to be performed by the input operation in accordance with the position information obtained in the position obtaining processing (Bryant, Fig. 1, [0013], ln. 7-13, "…when the game player 706 'pitches' the handheld controller 107 based on up/down wrist movements, such rotation can be sensed and used to control simulated throttle. Similarly, it is possible to, for example, simulate the steering of the simulated vehicle handlebars 702 by rotating handheld controller 107 about the 'yaw' axis to thereby simulate the handlebars being pulled toward or pushed away from the vehicle operator. In certain exemplary illustrative non-limiting implementations, it is possible to simulate and detect 'lean' of a motorcycle by detecting the rotation of handheld controller 107 about the 'roll' axis.").
Regarding Claim 15, Bryant teaches a control method for controlling an information processing apparatus communicably connected to an operation terminal (Bryant, Fig. 1, [0014], ln. 1-2, "…the game system 100 and software can respond to inputs of handheld controller 107 in a variety of ways.") including an input unit that performs an input operation for an image of a virtual space (Bryant, Fig. 1, [0014], ln. 3-4, "…the game system 100 can for example control the graphics displayed on display 101 to correspond to the inputs provided by controller 107.") and performs a specified function in response to the input operation (Bryant, Fig. 1, [0014], ln. 4-6, "…when the game player 706 yaws the controller 107 by rotating about the Y axis, the simulated graphics modeling the simulated vehicle handlebars 702 can be observed to rotate in the direction of yaw."), the control method comprising: a position obtaining step of obtaining position information relating to a position of the operation terminal (Bryant, Fig. 1, [0013], ln. 7-13, "…when the game player 706 'pitches' the handheld controller 107 based on up/down wrist movements, such rotation can be sensed and used to control simulated throttle. Similarly, it is possible to, for example, simulate the steering of the simulated vehicle handlebars 702 by rotating handheld controller 107 about the 'yaw' axis to thereby simulate the handlebars being pulled toward or pushed away from the vehicle operator. In certain exemplary illustrative non-limiting implementations, it is possible to simulate and detect 'lean' of a motorcycle by detecting the rotation of handheld controller 107 about the 'roll' axis."); and a changing step of being capable of changing the function of the input unit to be performed by the input operation in accordance with the position information obtained in the position obtaining step (Bryant, Fig. 1, [0013], ln. 7-13, "…when the game player 706 'pitches' the handheld controller 107 based on up/down wrist movements, such rotation can be sensed and used to control simulated throttle. Similarly, it is possible to, for example, simulate the steering of the simulated vehicle handlebars 702 by rotating handheld controller 107 about the 'yaw' axis to thereby simulate the handlebars being pulled toward or pushed away from the vehicle operator. In certain exemplary illustrative non-limiting implementations, it is possible to simulate and detect 'lean' of a motorcycle by detecting the rotation of handheld controller 107 about the 'roll' axis.").
Regarding Claim 16, Bryant teaches a non-transitory computer-readable storage medium storing a program for causing a computer to execute a control method for controlling an information processing apparatus communicably connected to an operation terminal (Bryant, Fig. 1, [0014], ln. 1-2, "…the game system 100 and software can respond to inputs of handheld controller 107 in a variety of ways.") including an input unit that performs an input operation for an image of a virtual space (Bryant, Fig. 1, [0014], ln. 3-4, "…the game system 100 can for example control the graphics displayed on display 101 to correspond to the inputs provided by controller 107.") and performs a specified function in response to the input operation (Bryant, Fig. 1, [0014], ln. 4-6, "…when the game player 706 yaws the controller 107 by rotating about the Y axis, the simulated graphics modeling the simulated vehicle handlebars 702 can be observed to rotate in the direction of yaw."), the control method comprising: a position obtaining step of obtaining position information relating to a position of the operation terminal (Bryant, Fig. 1, [0013], ln. 7-13, "…when the game player 706 'pitches' the handheld controller 107 based on up/down wrist movements, such rotation can be sensed and used to control simulated throttle. Similarly, it is possible to, for example, simulate the steering of the simulated vehicle handlebars 702 by rotating handheld controller 107 about the 'yaw' axis to thereby simulate the handlebars being pulled toward or pushed away from the vehicle operator. In certain exemplary illustrative non-limiting implementations, it is possible to simulate and detect 'lean' of a motorcycle by detecting the rotation of handheld controller 107 about the 'roll' axis."); and a changing step of being capable of changing the function of the input unit to be performed by the input operation in accordance with the position information obtained in the position obtaining step (Bryant, Fig. 1, [0013], ln. 7-13, "…when the game player 706 'pitches' the handheld controller 107 based on up/down wrist movements, such rotation can be sensed and used to control simulated throttle. Similarly, it is possible to, for example, simulate the steering of the simulated vehicle handlebars 702 by rotating handheld controller 107 about the 'yaw' axis to thereby simulate the handlebars being pulled toward or pushed away from the vehicle operator. In certain exemplary illustrative non-limiting implementations, it is possible to simulate and detect 'lean' of a motorcycle by detecting the rotation of handheld controller 107 about the 'roll' axis.").
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Bryant in view of Lee et al. (EP 3651441 A1, hereinafter, "Lee").
Regarding Claim 2, Bryant teaches the limitations of independent Claim 1 as noted above. Lee teaches in the position obtaining processing, a distance between the operation terminal and the information processing apparatus is obtained as the position information, and in the changing processing, the function of the input unit is capable of being changed in accordance with a length of the distance (Lee, pg. 3, para. 5, ln. 1-2, "The processor may be further configured to select the first electronic apparatus based on a distance between a remote controller for controlling the display apparatus and the first electronic apparatus."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Lee with those of Bryant because it is well known in the art to change the function of an input unit based on a distance between an operation terminal and a processing apparatus.
Claims 6 & 8 are rejected under 35 U.S.C. 103 as being unpatentable over Bryant in view of Abou (US 20220125299 A1, hereinafter, "Abou").
Regarding Claim 6, Bryant teaches the limitations of independent Claim 1 as noted above. Abou teaches the one or more processors and/or circuitry is further configured to execute a line-of-sight obtaining processing that obtains line-of-sight information relating to a line of sight of a user operating the operation terminal, and in the changing processing, the function of the input unit is capable of being changed in accordance with the position information, and the line-of-sight information obtained in the line-of-sight obtaining processing (Abou, [0197], ln. 1-3, "In another use case, the wearable device may detect…line of sight, or other eye accommodation state (e.g., including changes to the foregoing) and may change a modification profile based on the detected states."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Abou with those of Bryant because it is well known in the art to change the function of an input unit based on position and line-of-sight information.
Regarding Claim 8, Bryant teaches the limitations of independent Claim 1 as noted above. Abou teaches a display unit configured to display the image of the virtual space (Abou, Fig. 1A, [0088], ln. 13-15, "As an example, a wearable device of system 100 may obtain a live video stream from one or more cameras of the wearable device and cause the enhanced image to be displayed on one or more displays of the wearable device."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Abou with those of Bryant because it is well known in the art to use a display unit configured to display an image of a virtual space.
Claims 9 & 11 are rejected under 35 U.S.C. 103 as being unpatentable over Bryant in view of Abou and Sakai et al. (CN 112166602 A, hereinafter, "Sakai").
Regarding Claim 9, Bryant and Abou teach the limitations of dependent Claim 8 as noted above. Sakai teaches the one or more processors and/or circuitry is further configured to execute a switching processing that is capable of switching between a plurality of operation modes of the information processing apparatus within the image of the virtual space, and the display unit changes the image of the virtual space to an image corresponding to each of the operation modes (Sakai, pg. 25, para. 5, ln. 1-4, "…under the condition of triggering at least one of the first user or the second user, the information processing device switches the operation mode, and switching the operation mode and the first space 100A or the second space 100B set in at least one device [e.g., table display 131; window display 133, loudspeaker 135, tactile presentation device 137 and so on] interlocking."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Sakai with those of Bryant and Abou because it is well known in the art to switch between a plurality of operation modes and to display operation mode changes in the virtual space.
Regarding Claim 11, Bryant and Abou teach the limitations of dependent Claim 8 as noted above. Sakai teaches the display unit is capable of displaying the function of the input unit that has been changed in the changing processing on the image of the virtual space (Sakai, pg. 25, para. 5, ln. 1-4, "…under the condition of triggering at least one of the first user or the second user, the information processing device switches the operation mode, and switching the operation mode and the first space 100A or the second space 100B set in at least one device [e.g., table display 131; window display 133, loudspeaker 135, tactile presentation device 137 and so on] interlocking."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Sakai with those of Bryant and Abou because it is well known in the art for a display unit to display the function of the input unit that has been changed in the changing processing on the image of the virtual space.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Bryant in view of Abou and Ramani et al. (US 20210406528 A1, hereinafter, "Ramani").
Regarding Claim 10, Bryant and Abou teach the limitations of dependent Claim 8 as noted above. Ramani teaches an image pickup unit configured to pick up an image of the operation terminal, and wherein the display unit is capable of displaying a virtual object of the operation terminal picked up by the image pickup unit on the image of the virtual space (Ramani, Fig. 2, [0045], ln. 2-4, "In many cases, the AR graphical user interfaces include graphical elements that are superimposed onto the user's view of the outside world or, in the case of a non-transparent display screen 28, superimposed on real-time images/video captured by the camera 29."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Ramani with those of Bryant and Abou because it is well known in the art to display a virtual object picked up by a camera on the image of a virtual space.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Bryant in view of Abou and Ho (US 11761762 B1, hereinafter, "Ho").
Regarding Claim 12, Bryant and Abou teach the limitations of dependent Claim 8 as noted above. Ho teaches the information processing apparatus is a head mounted display (Ho, Fig. 2, [0037], ln. 2, "…a head-mounted device [HMD]…"). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Ho with those of Bryant and Abou because it is well known in the art to use a head mounted display as an information processing apparatus.
Claims 13 & 14 are rejected under 35 U.S.C. 103 as being unpatentable over Bryant in view of Ho.
Regarding Claim 13, Bryant teaches the limitations of independent Claim 1 as noted above. Ho teaches one of a camera, a smartphone, and a mobile type controller is used as the operation terminal (Ho, Fig. 2, [0037], ln. 1-4, "In one embodiment controller 350 includes a computing device as illustrated in FIG. 2 housed in a smartphone, a head-mounted device (HMD), a glass frame, a goggle, a binocular, a magnifying glass, a helmet, a pen, a dress pin, a headset, a wearable device, a mobile computing device, a portable device, a car, a vehicle, a bus, a flying device, a digital camera..."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Ho with those of Bryant because it is well known in the art to use cameras, smartphones, and mobile controllers as operation terminals.
Regarding Claim 14, Bryant teaches the limitations of independent Claim 1 as noted above. Ho teaches the operation terminal includes a plurality of the input units, and at least one of the plurality of the input units is configured with a mechanical button, a mechanical dial, a touch panel, and a touch pad (Ho, Fig. 2, [0037], ln. 1-4, "In one embodiment controller 350 includes a computing device as illustrated in FIG. 2 housed in a smartphone, a head-mounted device (HMD), a glass frame, a goggle, a binocular, a magnifying glass, a helmet, a pen, a dress pin, a headset, a wearable device, a mobile computing device, a portable device, a car, a vehicle, a bus, a flying device, a digital camera..."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Ho with those of Bryant because it is well known in the art to use operation terminals that have mechanical buttons, dials, touch panels, and touch pads.
Allowable Subject Matter
Claims 3-5 & 7 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Regarding Claim 3, the prior art of record – taken alone or in combination – fails to teach or render obvious a camera including a finder photographing function and a live view photographing function is used as the operation terminal, and in the changing processing, in a case that the distance is equal to or smaller than a threshold value, the function of the input unit is changed so as to be used for the finder photographing function, and in a case that the distance is not equal to or smaller than the threshold value, the function of the input unit is changed so as to be used for the live view photographing function.
Regarding Claim 4, the prior art of record – taken alone or in combination – fails to teach or render obvious an image pickup unit configured to pick up an image of the operation terminal, and wherein in the position obtaining processing, a position of the operation terminal within an angle of view of the image pickup unit is obtained as the position information, and in the changing processing, the function of the input unit is capable of being changed in accordance with the position of the operation terminal within the angle of view of the image pickup unit.
Regarding Claim 5, the prior art of record – taken alone or in combination – fails to teach or render obvious a camera including a finder photographing function and a live view photographing function is used as the operation terminal, and in the changing processing, in a case that the position of the operation terminal is within the angle of view of the image pickup unit, the function of the input unit is changed so as to be used for the finder photographing function or the live view photographing function, and in a case that the position of the operation terminal is not within the angle of view of the image pickup unit, the function of the input unit is changed so as to be used for moving within the virtual space.
Regarding Claim 7, the prior art of record – taken alone or in combination – fails to teach or render obvious a camera including a finder photographing function and a live view photographing function is used as the operation terminal, and in the changing processing, in a case that the line-of-sight information indicating that the operation terminal is located ahead of the line of sight of the user has been obtained, the function of the input unit is changed so as to be used for the finder photographing function, and in a case that the line-of-sight information indicating that the operation terminal is not located ahead of the line of sight of the user has been obtained, the function of the input unit is changed so as to be used for moving within the virtual space.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVEN DANIEL BARRY whose telephone number is (571)270-0432. The examiner can normally be reached M-Th 0730-1630.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Lin Ye, can be reached on 571-272-7372. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/STEVEN DANIEL BARRY/Examiner, Art Unit 2638
/LIN YE/Supervisory Patent Examiner, Art Unit 2638