Prosecution Insights
Last updated: April 19, 2026
Application No. 18/280,420

INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND PROGRAM

Non-Final OA: §103, §112
Filed
Sep 05, 2023
Examiner
MCGRATH, ERIN E
Art Unit
3771
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
The University of Tokyo
OA Round
3 (Non-Final)
59% Grant Probability (Moderate)
3-4 OA Rounds
3y 11m To Grant
90% With Interview

Examiner Intelligence

Grants 59% of resolved cases
59% Career Allow Rate (250 granted / 423 resolved; -10.9% vs TC avg)
Strong interview effect: +31.3% Interview Lift (resolved cases with interview)
Typical timeline: 3y 11m Avg Prosecution; 45 currently pending
Career history: 468 Total Applications across all art units
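As a sanity check, the headline allow-rate figures above can be reproduced from the raw counts. A minimal sketch (the implied Tech Center average is an inference from the displayed -10.9% delta, not a number this dashboard states directly):

```python
# Reproduce the examiner's career allow rate from the raw counts
# shown above (250 granted out of 423 resolved cases).
granted = 250
resolved = 423

allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.0%}")   # 59%

# The dashboard reports -10.9% vs the TC average, which implies a
# TC-average allow rate of roughly 70% (assumed additive in points).
tc_avg = allow_rate + 0.109
print(f"Implied TC average: {tc_avg:.1%}")
```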

Statute-Specific Performance

§101: 0.5% (-39.5% vs TC avg)
§103: 46.0% (+6.0% vs TC avg)
§102: 19.1% (-20.9% vs TC avg)
§112: 31.6% (-8.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 423 resolved cases

Office Action

§103, §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 1/23/26 has been entered.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claim 6 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.

Claim 6 recites “the augmented reality reproduces the reference maneuver representing a time-series future step showing the augmented reality medical instrument as at least one of the time-series images.” While the reference maneuver was recited in claim 1, it appears that this refers to a different reference maneuver (specifically one which represents a time-series future step) and lacks sufficient antecedent basis.
The intended meaning is unclear, so further clarification is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-18 are rejected under 35 U.S.C. 103 as being unpatentable over Lang [US 2017/0258526 A1] in view of Joskowicz et al. [US 2009/0177081 A1, hereinafter “Joskowicz”].

Re. claim 1, Lang discloses: An information processing system comprising a controller including at least one processor [Par. 0072] configured to execute each of the following steps including:

a reading step of reading reference information on a reference maneuver [“virtual surgical plan,” Par. 0147] that is a medical maneuver to be referred to by a user, the reference information including a three-dimensional position of a track point in the reference maneuver [“The multiple OHMD's can be registered in a common coordinate system 15 using anatomic structures, anatomic landmarks, calibration phantoms, reference phantoms, optical markers, navigation markers, and/or spatial anchors, for example like the spatial anchors used by the Microsoft Hololens. Pre-operative data 16 of the patient can also be registered in the common coordinate system 15….The live data 18 of the patient can be registered in the common coordinate system 15. Intra-operative imaging studies 20 can be registered in the common coordinate system 15. OR references, e.g. an OR table or room fixtures can be registered in the common coordinate system 15 using, for example, optical markers IMU's, navigation markers or spatial mapping 22,” Par. 0147. Thus, any of the above objects can be a track point] and time-series images [“the surgeon wearing OHMD…can execute commands…to display the next predetermined bone cut, e.g. from a virtual surgical plan…which can…project digital holograms of the next surgical step 34 superposed onto and aligned with the surgical site in a predetermined position and/or orientation,” Par. 0147. Because the system has digital holograms of the next surgical step, the virtual surgical plan necessarily includes time-series images of the planned procedure which are read by the above processor in order to project said images], each of the time-series images including an augmented reality medical instrument used in the reference maneuver [the OHMD displays the “virtual surgical instrument,” Pars. 0073-0074];

a first reception step of receiving a captured image acquired by capturing an image of an external world as viewed by the user [“Live data 18 of the patient, for example from the surgical site…can be measured…using…image or video capture systems,” Par. 0147]; and

a generation step of generating, based on the three-dimensional position of the track point, the time-series images, and the captured image, display information for presenting augmented reality regarding the reference maneuver to the user [“The pre-operative data 16 or live data 18 including intra-operative measurements or combinations thereof can be used to … modify a virtual surgical plan 24…[and the system can] display the next predetermined bone cut, e.g. from a virtual surgical plan or an imaging study or intra-operative measurements, which can trigger the OHMD's 11, 12, 13, 14 to project digital holograms of the next surgical step 34 superimposed onto and aligned with the surgical site in a predetermined position and/or orientation,” Par. 0147], the augmented reality three-dimensionally reproducing an aspect of the augmented reality medical instrument [“the OHMD can display…[a] virtual surgical instrument,” Par. 0074, at a variety of orientations for a plurality of users, i.e. in three dimensions; Par. 0147] used in the reference maneuver.

Lang teaches that “another medical instrument” is the actual physical medical instrument [Par. 0073] but fails to disclose generating display information for presenting augmented reality reproducing a hand manipulating the (physical) medical instrument. However, Joskowicz teaches, in a surgical system having a processor and display, that the system’s processor is configured to generate, based on a position of a surgeon’s hand, display information for presenting augmented reality reproducing a hand manipulating a medical instrument [Par. 0039: “the video camera system could be arranged to provide an augmented virtuality display of the patient's … and a real-time image of the surgeon's hand holding the robot base or a targeting jig is imposed on this virtual image of the patient's head”]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system of Lang to configure the processor to generate an image of a hand manipulating the physical medical instrument, as taught by Joskowicz, in order to “guid[e] the surgeon in positioning” the instrument [Joskowicz, Par. 0039].

Re. Claim 2, Lang discloses the track point is set on the another (physical) medical instrument [“optical markers…can…be attached to the instrument,” Par. 1548], and the three-dimensional position is a relative position with respect to a surgical field of the reference maneuver [Par. 0384]. Lang fails to teach the track point being set on the hand manipulating the another medical instrument.
However, given the above teachings, in which the hand is displayed, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to modify the system such that the track point is placed on the hand manipulating said medical instrument: in order for a real-time image of the hand to be displayed, it must be tracked in some manner, so setting this to be the track point is an obvious modification readily performed by one of ordinary skill in the art, given the above-modified system.

Re. Claim 3, Lang discloses the processor is configured to further execute: a first specification step of identifying a location of a surgical field visible to the user in the captured image [see e.g. Par. 0259, which describes “field of view” tracking, and registration, Par. 0260]; and the generation step where the augmented reality medical instrument is superimposed on the surgical field [Par. 1548]. Regarding the hand being superimposed on the surgical field, Joskowicz teaches this as set forth above.

Re. Claim 4, Lang discloses the processor is configured to further execute: a second specification step of identifying a user maneuver from the captured image, the user maneuver being a medical maneuver performed by the user [Par. 1548 describes how the surgeon’s movements are tracked and registered live, with the virtual display including augmented reality aspects which track this maneuver]; and in the generation step, the display information is modified according to an aspect of the user maneuver [Par. 1548 discloses how the physical and virtual tools are aligned in the OHMD, so that e.g. the hidden portions of a rod can be virtually displayed].

Re. Claim 5, Lang discloses the information processing system according to claim 4, wherein, in the augmented reality, progression of the reference maneuver is controlled according to the aspect of the user maneuver [Par. 1549: “pedicle screws and related instruments…or trocars…can be placed reliably following a trajectory or desired position of the…instruments…”].

Re. Claim 6, as best understood, Lang discloses the information processing system according to claim 5, wherein the augmented reality reproduces the reference maneuver representing a time-series future step showing the augmented reality medical instrument as at least one of the time-series images (the AR instrument is displayed as time-series images; see claim 1 above) compared with a current aspect of the user maneuver [Par. 1549: “pedicle screws and related instruments…or trocars…can be placed reliably following a …desired position of the…instruments…”].

Re. Claim 7, Lang discloses the augmented reality is configured to be such that a progression speed of the reference maneuver is variable [Par. 1420. Because the surgeon is able to toggle between steps, the surgeon can control/vary the progression speed of the reference maneuver.].

Re. Claim 8, Lang discloses the augmented reality is configured to be such that transparency is variable of the augmented reality medical instrument and the hand manipulating the another medical instrument [“The surgeon can optionally adjust the transparency or opacity of the virtual data displayed in the OHMD,” Par. 0210]. Regarding the hand being superimposed on the surgical field, Joskowicz teaches this as set forth above.

Re. Claim 9, Lang discloses the augmented reality presents, to the user, time-series changes in the aspect of the augmented reality medical instrument, with time-series changed aspects superimposed on each other [Par. 1421]. Regarding the hand being superimposed on the surgical field, Joskowicz teaches this as set forth above, and the above teachings would reasonably convey, to one of ordinary skill, the benefits of time-series changes in the aspect of the hand being superimposed as well.

Re. Claim 10, Lang discloses each of the time-series changed aspects is based on a discretely selected frame [Par. 1420].

Re. Claim 11, Lang discloses that, in the reading step, the reference information, which is stored in a memory in advance, is read from the memory [Par. 0450. The coordinates are read from a memory and a “virtual surgical plan” may be adjusted, updated, or modified. Thus, the reading happens in advance of the actual procedure.].

Re. Claim 12, Lang discloses the processor is configured to further execute: a second reception step of receiving the reference information from outside via a network [Par. 0909]; and in the reading step the reference information received from outside via the network is read [in order to develop the surgical plan using data stored on an outside device via a network, the outside reference information must be read].

Re. Claim 13, Lang discloses the information processing system according to claim 12, wherein: the second reception step continuously receives another reference information on the reference maneuver in progress [Pars. 0249-0250 and Par. 0254] [“by a person different from the user”: this relates to the intended use of the system. Whoever is performing the reference maneuver, it may be tracked continuously as cited above]; the reading step continuously reads the another reference information [Par. 0254]; and the generation step presents augmented reality regarding the reference maneuver, relating to the another reference information, in progress to the user [Par. 1548].

Re. Claim 14, Lang discloses the generation step further generates information to be perceived by the user with a sense other than sight [audible alarm, Par. 1360].

Re. Claim 15, Lang discloses a display, the display being configured to: transmit light from the external world in a direction toward the user [Par. 1547]; and display a screen based on the display information so that a state where the reference maneuver is superimposed on the external world can be viewed by the user [Par. 1548].

Re. Claim 16, Lang discloses the display is included in a wearable device wearable for the user [a head-mounted display, Par. 0002].

Re. Claim 17, the modified Lang teaches the method steps of claim 1.

Re. Claim 18, the modified Lang teaches a non-transitory computer-readable memory medium storing the programming of claim 1 [inherently; see storage referred to in e.g. Par. 1259; furthermore, because the steps are executed electronically there must be some storage medium which stores the steps].

Response to Arguments

Applicant's arguments filed 1/23/26 have been fully considered but they are not persuasive. Applicant argues that Lang fails to teach time-series images. The examiner respectfully disagrees. Because Lang teaches a virtual surgical plan with a virtual surgical instrument and the projection of holograms of the next surgical step (see Par. 0147), Lang necessarily teaches that a time series of images of the virtual surgical plan is produced and read by the processor.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIN MCGRATH whose telephone number is (571) 270-0674. The examiner can normally be reached M-F 9 am to 5 pm ET. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JACKIE HO, can be reached at (571) 272-4696. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ERIN MCGRATH/
Primary Examiner, Art Unit 3771

Prosecution Timeline

Sep 05, 2023
Application Filed
May 13, 2025
Non-Final Rejection — §103, §112
Aug 08, 2025
Response Filed
Oct 22, 2025
Final Rejection — §103, §112
Jan 23, 2026
Request for Continued Examination
Feb 18, 2026
Response after Non-Final Action
Mar 12, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594068
MEDICAL SYSTEMS, DEVICES AND METHODS ADAPTED FOR TISSUE FIXATION AND APPROXIMATING TISSUE DEFECTS
2y 5m to grant • Granted Apr 07, 2026
Patent 12591986
METHOD AND NAVIGATION SYSTEM FOR REGISTERING TWO-DIMENSIONAL IMAGE DATA SET WITH THREE-DIMENSIONAL IMAGE DATA SET OF BODY OF INTEREST
2y 5m to grant • Granted Mar 31, 2026
Patent 12575821
A DEVICE FOR THE STITCHING OF LAPAROSCOPIC INCISIONS AND METHODS THEREOF
2y 5m to grant • Granted Mar 17, 2026
Patent 12569246
METHODS AND DEVICES FOR REPAIRING AND ANCHORING DAMAGED TISSUE
2y 5m to grant • Granted Mar 10, 2026
Patent 12569232
SURGICAL INSTRUMENT AND STEERING GEAR THEREOF
2y 5m to grant • Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4 Expected OA Rounds
59% Grant Probability
90% With Interview (+31.3%)
3y 11m Median Time to Grant
High PTA Risk
Based on 423 resolved cases by this examiner. Grant probability derived from career allow rate.
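The projection arithmetic lines up as a simple additive adjustment: the 90% with-interview figure is the 59% career allow rate plus the +31.3 percentage-point interview lift. A minimal sketch (the additive model is an assumption inferred from how the displayed numbers fit together, not a documented methodology):

```python
# Base grant probability is the career allow rate (250 / 423 resolved
# cases); the with-interview figure adds the interview lift, assumed
# additive in percentage points.
base_grant_prob = 250 / 423          # 59% career allow rate
interview_lift = 0.313               # +31.3% interview lift

with_interview = base_grant_prob + interview_lift
print(f"With interview: {with_interview:.0%}")   # 90%
```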
