DETAILED ACTION
Response to Amendment
Applicant’s amendments filed on 18 February 2026 have been entered. Claims 1-7 have been amended and claims 8-13 have been added. Claims 1-13 remain pending in this application, with claims 1, 2, 6, and 7 being independent.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on October 13, 2025 has been entered.
Claim Objections
Claims 1 and 2 are objected to because of the following informalities:
Claim 1 recites the limitation “aligning the virtual object with the virtual coordinate in the virtual coordinate system”. It should read “aligning the virtual object with the virtual coordinate in the global coordinate system”.
Claim 2 recites the limitation “corresponds to the coordinate in the global coordinate syste”. It should read “corresponds to the coordinate in the global coordinate system”.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-13 are rejected under 35 U.S.C. 103 as being unpatentable over Rurin (US 20110109628 A1), referred to herein as Rurin, in view of Iijima et al. (US 20120056802 A1), referred to herein as Iijima.
Regarding Claim 1, Rurin in view of Iijima teaches an information processing system comprising (Rurin Abst: A method of affecting the virtual objects; [0032] On FIG. 1 an example is shown of interaction of the user 1 with the system; [0071] Thus, the proposed method is industrially applicable in training devices, simulators, computer games based on reality modeling, and in other applications):
a display device (Rurin FIG. 1:150);
an information processing device (Rurin FIG. 1:140); and
a sensor device which is configured to detect relative positions of a user and the display device (Rurin FIG. 1:160; [0030] FIG. 4 shows an example of determination of a spatial position of the object (indicator) using the images from two video- or photo-cameras; [0024] This gives the user a chance to control those objects of the virtual world that are located both immediately in front of the "window", and to the right, or to the left of the "window", as well as the objects located at various distances from the "window"), wherein the information processing device is configured to:
acquire information from the sensor device, wherein the information represents a coordinate in a global coordinate system (Rurin [0061] Selecting a point with the global coordinates (X, Y, Z), coordinates of its projection in the image plane of the first (left) camera will be designated as (x', y') and in the image plane of the second (right) camera--as (x'', y''). (Projections of the same point on the planes of images of different cameras are known as conjugate points.) It is easy to see that x'=f(X+b/2)/Z, x''=f(X-b/2)/Z, y'=y''=fY/Z). A worked depth-from-disparity derivation based on these equations follows this claim mapping.
Iijima further teaches
a global coordinate system associated with a distance between a first horizontal plane associated with the display device and a second horizontal plane associated with the user (Iijima [0052] In the real space, it is favorable that the Y-axis be correctly defined in the vertical direction, and the X-axis and Z-axis be correctly defined in the horizontal plane; [0053] Referring to FIG. 5, the orientation information deriving unit 94 derives the pitch angle of the input device 20 as an angle with respect to the XZ-plane);
determine a virtual coordinate in a virtual coordinate system that corresponds to the coordinate in the global coordinate system (Rurin [0066] As noted earlier, the indicator (according to the proposed method) is positioned a real physical base system of coordinates, and all objects of the three-dimensional virtual space, together with the display, have the specific coordinates set in this real physical base system of coordinates. For example, the user has intention to perform an operation over the object of a virtual three-dimensional space, for example, the person shown on FIG. 1);
determine a virtual position of a virtual object in a virtual space by aligning the virtual object with the virtual coordinate in the virtual coordinate system (Rurin [0069] FIG. 1 shows that, though presentation 220 of the virtual pointer on the display is carried out in a two-dimensional plane of the display, all operations by the user will be carried out in a real physical coordinate system (optionally, taking into account the selected scale), that is presented on FIG. 1 image of the virtual pointer 200, which illustrates the user intention to control the selected virtual object 190);
generate a video, depicting the virtual object at the virtual position in the virtual space (Rurin [0032] the image 220 on the display of the virtual pointer 200, the image 230 on the display of objects of the three-dimensional virtual space, the point 240 of interaction between the virtual pointer 200 with the objects 190 of the three-dimensional virtual space in the real three-dimensional space regarding the base system of coordinates, and the image 250 on the display of the point 240 of interaction of the virtual pointer 200 with the objects 190 of the three-dimensional virtual space); and
transmit the video to the display device for display on the display device (Rurin [0050] 314--the array of the displayed data is modified by insertion of a displayed collection of the geometrical location of the virtual pointer points, and the current array of the displayed three-dimensional virtual objects is generated; [0051] 315--the generated array of the displayed data is shown on the display(s)).
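For context, the projection equations quoted above from Rurin [0061] yield the global coordinates directly from the stereo disparity. The following derivation is an editorial illustration consistent with those equations, not text from either reference:

$$x' - x'' = \frac{f(X + b/2)}{Z} - \frac{f(X - b/2)}{Z} = \frac{fb}{Z} \quad\Longrightarrow\quad Z = \frac{fb}{x' - x''}, \qquad X = \frac{Z(x' + x'')}{2f}, \qquad Y = \frac{Z\,y'}{f}$$

where f and b are, presumably, the focal length and the inter-camera baseline in Rurin's notation.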
Iijima discloses a game control technology, and particularly a game device that allows a user to control an object such as a game character by using an input device; Iijima is therefore analogous art to the present application.
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to have modified Rurin to incorporate the teachings of Iijima and to apply the pitch angle with respect to the horizontal XZ-plane of an input device to the position of a person in three-dimensional coordinates.
Doing so would allow the input device/person to be moved upward/downward in the vertical direction, thereby causing an object to jump in a game space.
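As a minimal illustrative sketch (not from either reference) of the pitch-angle computation Iijima describes in [0052]-[0053], with the Y-axis vertical and the XZ-plane horizontal; the function name and the axis-vector input are assumptions:

```python
import math

def pitch_angle_deg(axis):
    """Pitch of a device axis vector (x, y, z) relative to the horizontal
    XZ-plane, with the Y-axis vertical as in Iijima [0052]-[0053]."""
    x, y, z = axis
    horizontal = math.hypot(x, z)         # length of the XZ-plane projection
    return math.degrees(math.atan2(y, horizontal))

# Example: an axis tilted upward at 45 degrees reports a 45-degree pitch.
print(pitch_angle_deg((1.0, 1.0, 0.0)))   # 45.0
```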
Regarding Claim 8, Rurin in view of Iijima teaches the information processing system according to claim 1, and further teaches wherein the information further includes angle information indicating a direction where the user is located with respect to a reference direction, wherein the reference direction is a predetermined direction within the first horizontal plane associated with the display device and is determined according to a relation with an arrangement position of the display device, and wherein the information processing device is further configured to determine the virtual position of the virtual object in the virtual space using the angle information (Rurin [0067] For example, at usage of the scale set by the user and equal 1:20, the user wishing to touch a tree, should move the hand equipped with sensors of the second system of identification blocks, to the distance equal to 50 cm exactly. [0068] If the scale is set as 1:1, all operations of the user with virtual objects will require performance of precisely same operations, as in case of handling the objects of the real physical space (or at producing an effect on such objects). [0069] FIG. 1 shows that, though presentation 220 of the virtual pointer on the display is carried out in a two-dimensional plane of the display, all operations by the user will be carried out in a real physical coordinate system (optionally, taking into account the selected scale), that is presented on FIG. 1 image of the virtual pointer 200, which illustrates the user intention to control the selected virtual object 190.).
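Rurin [0067]-[0068] ties real hand displacement to virtual displacement through a user-selected scale. A hedged sketch of that mapping follows; the function is illustrative only, and reading the 1:20 example as a 20x multiplier (0.5 m of hand travel spanning 10 m of virtual space) is an inference from, not a statement in, the quoted text:

```python
def real_to_virtual(displacement_m, scale=1.0):
    """Map a real-world displacement in meters to a virtual-space
    displacement under a user-selected scale (Rurin [0067]-[0068]).
    scale=1.0 is the 1:1 case; scale=20.0 models the 1:20 example."""
    return displacement_m * scale

print(real_to_virtual(0.5, scale=20.0))  # 10.0 m of virtual travel (inferred)
print(real_to_virtual(0.5, scale=1.0))   # 0.5 m, identical to the real motion
```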
Regarding Claim 9, Rurin in view of Iijima teaches the information processing system according to claim 1, and further teaches wherein the information further includes a horizontal distance to a position where the user is located, the horizontal distance to the position where the user is located being determined according to a relation with an arrangement position of the display device, and wherein the information processing device is further configured to determine the virtual position of the virtual object in the virtual space using the horizontal distance to the position where the user is located (Iijima [0075] in accordance with the direction of movement of the game object 200 and the action performed that are determined by a user input command; [0084] FIG. 10 shows the range of pitch angle .beta. and yaw angle .alpha. of the input device 20 defined by the first and second conditions. The range is represented by a virtual square pyramid. If the user holds the input device 20 such that the axial line of the input device 20 is within the range defined by the square pyramid, if the user sticks the input device 20 in the negative direction aligned with the Z-axis within one second for a distance longer than 150 mm, the condition determination unit 112 determines that all of the first, second, and third conditions are met. To summarize the first through third conditions, the condition that triggers "dash" is that the user directs the light-emitting body 22 toward the imaging device 14, holds the input device 20 in a direction proximate to the Z-axis, and sticks the input device 20 quickly toward the imaging device 14 for a distance longer than 150 mm.).
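The “dash” trigger quoted from Iijima [0084] combines an orientation gate (pitch and yaw inside a square-pyramid range) with a quick push (more than 150 mm in the negative Z direction within one second). A minimal sketch under stated assumptions: the sample structure, the 30-degree half-angle, and the function names are illustrative; only the 150 mm distance, one-second window, and negative-Z direction come from the quoted text:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    t: float      # timestamp, seconds
    z_mm: float   # device position along the Z-axis, millimeters
    pitch: float  # pitch angle (beta), degrees
    yaw: float    # yaw angle (alpha), degrees

def dash_triggered(samples, max_angle=30.0):
    """True if the device, while held within the angular range, moved more
    than 150 mm toward negative Z within any one-second window."""
    for i in range(len(samples)):
        for j in range(i + 1, len(samples)):
            if samples[j].t - samples[i].t > 1.0:
                break
            window = samples[i:j + 1]
            aimed = all(abs(s.pitch) <= max_angle and abs(s.yaw) <= max_angle
                        for s in window)
            # Negative-Z motion: the Z coordinate decreases by more than 150 mm.
            if aimed and samples[i].z_mm - samples[j].z_mm > 150.0:
                return True
    return False
```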
Regarding Claims 2-5, Rurin in view of Iijima teaches an information processing device (Rurin Abst: A method of affecting the virtual objects; [0032] On FIG. 1 an example is shown of interaction of the user 1 with the system; [0071] the proposed method is industrially applicable in training devices, simulators, computer games based on reality modeling, and in other applications; FIG. 1):
The metes and bounds of the remaining limitations of the claims substantially correspond to the features set forth in claims 1, 8, and 9; thus, they are rejected on similar grounds and rationale as their corresponding limitations.
Regarding Claims 6, 10 and 11, Rurin in view of Iijima teaches an information processing method (Rurin Abst: A method of affecting the virtual objects; [0032] On FIG. 1 an example is shown of interaction of the user 1 with the system; [0071] the proposed method is industrially applicable in training devices, simulators, computer games based on reality modeling, and in other applications; FIG. 1):
The metes and bounds of the remaining limitations of the claims substantially correspond to the features set forth in claims 1, 8, and 9; thus, they are rejected on similar grounds and rationale as their corresponding limitations.
Regarding Claims 7, 12 and 13, Rurin in view of Iijima teaches a non-transitory, computer-readable storage medium containing a computer program, which when executed by an information processing device (Rurin Abst: A method of affecting the virtual objects; [0032] On FIG. 1 an example is shown of interaction of the user 1 with the system; [0071] the proposed method is industrially applicable in training devices, simulators, computer games based on reality modeling, and in other applications; FIG. 1; Iijima Claim 1. A computer program embedded in a non-transitory computer-readable recording medium):
The metes and bounds of the remaining limitations of the claims substantially correspond to the features set forth in claims 1, 8, and 9; thus, they are rejected on similar grounds and rationale as their corresponding limitations.
Response to Arguments
Applicant’s arguments with respect to the 35 U.S.C. 103 rejection of claims 1, 2, 6, and 7, see page 8 of the remarks filed on 18 February 2026, have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Samantha (Yuehan) Wang whose telephone number is (571)270-5011. The examiner can normally be reached Monday-Friday, 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, King Poon can be reached on (571)272-7440. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Samantha (YUEHAN) WANG/
Primary Examiner
Art Unit 2617