Prosecution Insights
Last updated: April 19, 2026
Application No. 19/116,369

INFORMATION PROCESSING APPARATUS AND INFORMATION PROCESSING METHOD

Non-Final OA (§102, §103)
Filed
Mar 28, 2025
Examiner
LEE, NICHOLAS J
Art Unit
2624
Tech Center
2600 — Communications
Assignee
Sony Group Corporation
OA Round
1 (Non-Final)
82%
Grant Probability
Favorable
1-2
OA Rounds
2y 3m
To Grant
93%
With Interview

Examiner Intelligence

Grants 82% — above average
82%
Career Allow Rate
779 granted / 951 resolved
+19.9% vs TC avg
Moderate +11% lift
+10.9%
Interview Lift
resolved cases with vs. without interview
Typical timeline
2y 3m
Avg Prosecution
19 currently pending
Career history
970
Total Applications
across all art units

Statute-Specific Performance

§101
3.0%
-37.0% vs TC avg
§103
55.8%
+15.8% vs TC avg
§102
28.7%
-11.3% vs TC avg
§112
6.6%
-33.4% vs TC avg
Black line = Tech Center average estimate • Based on career data from 951 resolved cases
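Assuming each "vs TC avg" delta is a percentage-point difference against the black-line Tech Center estimate, the implied TC average can be back-computed from the pairs of figures above. A quick sketch (statute labels and values transcribed from this report):

```python
# Back out the Tech Center (TC) average per statute:
# "+15.8% vs TC avg" means the examiner's rate sits 15.8 points above
# the TC average, so TC avg = examiner rate - delta (in points).
stats = {
    "101": (3.0, -37.0),
    "103": (55.8, +15.8),
    "102": (28.7, -11.3),
    "112": (6.6, -33.4),
}
for statute, (rate, delta) in stats.items():
    print(f"Section {statute}: TC avg ~ {rate - delta:.1f}%")
```

Every statute implies the same ~40.0% figure, consistent with a single Tech Center average estimate being drawn as the black line.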

Office Action

§102 §103
DETAILED ACTION

Allowable Subject Matter

Claims 7 and 11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claim(s) 1, 8-9, and 12-15 is/are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US Patent Pub. 2018/0299963 A1 to Fukazawa et al. ("Fukazawa").

As to claim 1, Fukazawa discloses an information processing apparatus (See Fig. 2) comprising: a space control unit (150) that controls display of a virtual object in an XR (cross reality) space (¶ 0079); and a recognition unit (120) that recognizes a designated object that is the virtual object designated by a user on a basis of a position, a posture, and a degree of opening of a virtual tool or a real input device capable of adjusting a degree of opening of tips in the XR space (See Fig. 4A; ¶ 0089-0094; Fukazawa discloses a virtual tool U14 to be controlled by a user by performing pinching or moving operations for selecting an object. Further, Fukazawa discloses the recognition unit 120 may recognize the opening degree of the user's hand, and the control unit 150 may cause the opening degree of the user's hand to be displayed as the opening degree of the finger U142 and the finger U144.).
As to claim 8, Fukazawa discloses wherein the space control unit adjusts a position, a posture, and a degree of opening of the virtual tool on a basis of a position, a posture, and an interval between fingertips of two fingers of a user (See Fig. 4B; ¶ 0092, "the recognition unit 120 may recognize the opening degree of the user's hand, and the control unit 150 may cause the opening degree of the user's hand to be displayed as the opening degree of the finger U142 and the finger U144").

As to claim 9, Fukazawa discloses wherein the space control unit adjusts a position, a posture, and a degree of opening of the virtual tool on a basis of a position, a posture, and an operation content of the input device (See Fig. 4B; ¶ 0092, "the recognition unit 120 may recognize the opening degree of the user's hand, and the control unit 150 may cause the opening degree of the user's hand to be displayed as the opening degree of the finger U142 and the finger U144").

As to claim 12, Fukazawa discloses wherein the space control unit adjusts a degree of opening of the virtual tool on a basis of a distance between tips of the input devices (See Fig. 4B; ¶ 0092, "the recognition unit 120 may recognize the opening degree of the user's hand, and the control unit 150 may cause the opening degree of the user's hand to be displayed as the opening degree of the finger U142 and the finger U144").

As to claim 13, Fukazawa discloses wherein the space control unit controls display of the virtual tool in the XR space (¶ 0092-0093).

As to claim 14, Fukazawa discloses wherein the input device is a tweezer type (See Fig. 4B, U14).

As to claim 15, the same rejection or discussion is used as in the rejection of claim 1.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 2-6 is/are rejected under 35 U.S.C. 103 as being unpatentable over US Patent Pub. 2018/0299963 A1 to Fukazawa et al. ("Fukazawa") in view of US Patent Pub. 2017/0287225 A1 to Powderly et al. ("Powderly").

As to claim 2, Fukazawa fails to disclose wherein the space control unit performs control to present a plurality of candidates in the XR space in a case where the plurality of candidates for the designated object is recognized by the recognition unit. Powderly discloses wherein the space control unit performs control to present a plurality of candidates in the XR space in a case where the plurality of candidates for the designated object is recognized by the recognition unit (See Fig. 12B, 1230). Before the effective filing date, it would have been obvious to one of ordinary skill in the art to have modified Fukazawa with the teachings of Powderly wherein the space control unit performs control to present a plurality of candidates in the XR space in a case where the plurality of candidates for the designated object is recognized by the recognition unit, as suggested by Powderly, thereby similarly using known configurations for providing multiple candidates for selection in a virtual/cross reality environment.
As to claim 3, Powderly discloses wherein the space control unit performs control to display the plurality of candidates in a display mode different from a display mode of another virtual object (See Fig. 12B, 1230a; Powderly discloses providing a focus indicator that is a red highlight around all or part of a selected object.).

As to claim 4, Powderly discloses wherein the recognition unit recognizes the candidate selected using the virtual tool or the input device as the designated object (See Fig. 12B; Powderly discloses the cone 1220 intersects with object 1230a, indicating a designated object.).

As to claim 5, Powderly discloses wherein the space control unit performs control to display a menu for selecting the designated object from the plurality of candidates in the XR space (¶ 0153, "the user can select an object, open a menu associated with the object, or move an object, etc").

As to claim 6, Powderly discloses wherein the recognition unit recognizes the candidate selected from the menu as the designated object (¶ 0103, "the wearable system may add a virtual menu associated with a television in the room, where the virtual menu may give the user the option to turn on or change the channels of the television using the wearable system").

Claim(s) 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over US Patent Pub. 2018/0299963 A1 to Fukazawa et al. ("Fukazawa") in view of US Patent Pub. 2024/0103636 A1 to Lindmeier et al. ("Lindmeier").

As to claim 10, Fukazawa fails to disclose wherein the space control unit adjusts a degree of opening of the virtual tool on a basis of pressure applied to the input device.
Lindmeier discloses wherein the space control unit manipulates an object on a basis of pressure applied to the input device (¶ 0156, "For example, a selection input that is described as being performed with an air tap or air pinch input could be alternatively detected with a button press, a tap on a touch-sensitive surface, a press on a pressure-sensitive surface, or other hardware input"). Before the effective filing date, it would have been obvious to one of ordinary skill in the art to have modified Fukazawa with the teachings of Lindmeier wherein the space control unit manipulates an object on a basis of pressure applied to the input device, as suggested by Lindmeier, thereby similarly using known configurations using pressure-based input devices in the operation of virtual/cross reality environments.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to NICHOLAS J LEE whose telephone number is (571) 270-7354. The examiner can normally be reached Mon-Fri, 10AM-6PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Matthew Eason, can be reached at 571-270-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NICHOLAS J LEE/
Primary Examiner, Art Unit 2624

Prosecution Timeline

Mar 28, 2025
Application Filed
Feb 21, 2026
Non-Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12600300
FULL DISPLAY MIRROR ASSEMBLY WITH THROUGH BEZEL INFRARED ILLUMINATION
2y 5m to grant Granted Apr 14, 2026
Patent 12603041
DISPLAY PANEL AND ELECTRONIC DEVICE
2y 5m to grant Granted Apr 14, 2026
Patent 12596255
POLARIZATION MECHANISM TO REDUCE WAVEGUIDE REFLECTIONS IN A HEAD-WORN DISPLAY
2y 5m to grant Granted Apr 07, 2026
Patent 12597378
DISPLAY SCREEN
2y 5m to grant Granted Apr 07, 2026
Patent 12597286
ELECTRONIC STRUCTURE
2y 5m to grant Granted Apr 07, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
82%
Grant Probability
93%
With Interview (+10.9%)
2y 3m
Median Time to Grant
Low
PTA Risk
Based on 951 resolved cases by this examiner. Grant probability derived from career allow rate.
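The projection arithmetic above can be reproduced from the examiner's career figures. A minimal sketch, assuming the grant probability is simply the career allow rate and the interview lift is additive (as the 82% → 93% figures suggest):

```python
# Reproduce the headline projections from the examiner's career record.
granted, resolved = 779, 951   # career: 779 granted of 951 resolved
interview_lift = 0.109         # observed +10.9% lift with interview

grant_probability = granted / resolved
with_interview = grant_probability + interview_lift

print(f"Grant probability: {grant_probability:.0%}")  # 82%
print(f"With interview:    {with_interview:.0%}")     # 93%
```

Rounded to whole percentages, this matches the dashboard's 82% baseline and 93% with-interview projection.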
