Prosecution Insights
Last updated: April 19, 2026
Application No. 17/702,309

REGISTRATION METHOD AND SETUP

Non-Final OA §103
Filed
Mar 23, 2022
Examiner
ANDERSON II, JAMES M
Art Unit
2425
Tech Center
2400 — Computer Networks
Assignee
Intersect Ent International GmbH
OA Round
5 (Non-Final)
75%
Grant Probability
Favorable
5-6
OA Rounds
2y 11m
To Grant
85%
With Interview

Examiner Intelligence

Grants 75% — above average
75%
Career Allow Rate
513 granted / 684 resolved
+17.0% vs TC avg
Moderate +10% lift
+10.4%
Interview Lift (resolved cases with interview)
Typical timeline
2y 11m
Avg Prosecution
31 currently pending
Career history
715
Total Applications
across all art units

Statute-Specific Performance

§101
7.8%
-32.2% vs TC avg
§103
49.8%
+9.8% vs TC avg
§102
15.5%
-24.5% vs TC avg
§112
17.0%
-23.0% vs TC avg
Tech Center averages are estimates • Based on career data from 684 resolved cases

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 10/08/2025 has been entered.

Claim Status

Claims 34-52 are currently pending in the application, with claims 34 and 44 being amended.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C.
103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 34, 38-40 and 43 are rejected under 35 U.S.C. 103 as being unpatentable over Kruger et al. (US 9,208,561 B2) in view of Kempinski (US 20140253737 A1).

Concerning claim 34, Kruger et al. (hereinafter Kruger) teaches a method for non-tactile patient registration of an object with a position detection system (col. 3, ll. 19-34), the method comprising: photogrammetrically generating a surface model of the object based on one or more images captured with an image sensor unit positioned at a capturing position (col. 4, ll. 30-39; col. 5, ll. 35-36, l. 50: performing a plurality of image recordings of a surface of the object or body part; col. 5, ll. 38-40: photogrammetrically generating a surface model of the object from the captured image), wherein the position detection system is an electromagnetic position detection system that has a working space having a predetermined electromagnetic field strength (col. 4, ll. 40-54), and wherein the image sensor unit comprises a position sensor positioned separate from the object (col. 2, ll. 65-67; fig. 2: position sensor 14'; col. 5, ll. 53-60); determining the capturing position in a coordinate system of the position detection system, based on a sensor signal from the position sensor (col. 3, ll. 9-12; col. 5, ll. 56-60); and relating the photogrammetrically generated surface model of the object to the position detection system based at least in part on the capturing position (col. 2, ll. 47-52).

Not explicitly taught is the image sensor unit comprising a motion sensor.
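The registration step attributed to Kruger above — relating a photogrammetrically generated surface model to the position detection system via the tracked capturing position — amounts to a rigid-body coordinate transform. The sketch below is illustrative only; the function names, the 4x4 homogeneous-matrix convention, and the frame labels are assumptions, not drawn from Kruger's disclosure:

```python
import numpy as np

def pose_to_matrix(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def register_surface_model(points_camera: np.ndarray, capture_pose: np.ndarray) -> np.ndarray:
    """Map Nx3 surface-model points from the camera frame into the position
    detection system's frame, given the capturing pose (camera-to-detection-system
    transform) derived from the position sensor on the image sensor unit."""
    homogeneous = np.hstack([points_camera, np.ones((len(points_camera), 1))])
    return (capture_pose @ homogeneous.T).T[:, :3]

# With an identity capturing pose, the points are unchanged.
pts = np.array([[0.1, 0.2, 0.3]])
assert np.allclose(register_surface_model(pts, np.eye(4)), pts)
```

In this framing, the claimed "relating … based at least in part on the capturing position" is simply applying `capture_pose` to every model point.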
Kempinski teaches a method of detecting movement of a body part in an image captured by an operator of the imager which is capturing the image, using a motion sensor that is associated with an imager (e.g., an imaging device) (fig. 4: steps 400-406, ¶0072). Further, the motion sensor may be incorporated into the imaging device (¶0034).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the method of tracking a position and orientation of an imaging device of Kruger in the manner described by Kempinski because it allows the tracking of an object when the tracking device is portable, moving or unstable (Kempinski, ¶0001). Thus a position or orientation, or a change in position or orientation, of the imaging device may be determined concurrently with tracking of an object by the imaging device (Kempinski, ¶0018).

Concerning claim 38, Kruger further teaches the method of claim 34, wherein the image sensor unit comprises one or more stereographic cameras, one or more cameras in a multi-camera arrangement, one or more infrared image sensors (col. 4, ll. 19-24), or a combination thereof.

Concerning claim 39, Kruger further teaches the method of claim 38, wherein the method comprises capturing the one or more images with the image sensor unit positioned at a capturing position outside of the working space (col. 4, ll. 40-54).

Concerning claim 40, Kruger in view of Kempinski teaches the method of claim 34, wherein the motion sensor is configured to track position of the image sensor unit along a path on which the image sensor unit is moved (Kruger, now incorporating the teachings of Kempinski: fig. 2: position sensor 14' (now a motion sensor); col. 5, ll. 53-60; Kempinski ¶0020).

Concerning claim 43, Kruger further teaches the method of claim 34, wherein the object is a face of a subject depicted in a captured image (fig.
3: pattern 22 is imaged on the surface of a subject's head (i.e., a face in fig. 3)).

Claims 35-37, 44-49 and 52 are rejected under 35 U.S.C. 103 as being unpatentable over Kruger et al. (US 9,208,561 B2) in view of Kempinski (US 20140253737 A1), further in view of Pflugi et al. ("Augmented Marker Tracking for Peri-Acetabular Osteotomy Surgery", 11 July 2017).

Concerning claim 35, Kruger in view of Kempinski teaches the method of claim 34. Not explicitly taught is the method, wherein determining the capturing position comprises: detecting motion of the image sensor unit along a path relative to the position detection system, based on the sensor signal from the motion sensor; defining a spatial relationship that relates at least one point on the path in a coordinate system of the motion sensor to a known position in a coordinate system of the position detection system; and determining the capturing position in the coordinate system of the position detection system based on the detected motion of the image sensor along the path and the defined spatial relationship.

Pflugi teaches an augmented marker tracking system, wherein an inertial measurement unit (IMU) is used to measure the system's orientation (Abstract); and detecting motion of the image sensor unit along a path relative to the position detection system, based on the sensor signal from the motion sensor (section II-F: motion signals from the IMU are recorded (i.e., recording a path) and fused with the optically derived positions by a Kalman filter); defining a spatial relationship that relates at least one point on the path in a coordinate system of the motion sensor to a known position in a coordinate system of the position detection system (section II-F; fig.
4: The Kalman filter is initialized using the first pose estimate from marker tracking); and determining the capturing position in the coordinate system of the position detection system based on the detected motion of the image sensor along the path and the defined spatial relationship (section II-F: The subsequent positions of the augmented marker are then determined by the prediction/update steps of the Kalman filter based on this initial pose estimate and the recorded path. When marker detection fails, the filter does not perform an update step and instead, the final orientation output is the one after the prediction step.).

Taking the teachings of Kruger, Kempinski and Pflugi together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Kruger in the manner taught by Pflugi by using a motion sensor in order to determine the capturing position in a coordinate system of the position detection system. Such a modification is merely a simple substitution of one known element for another to obtain predictable results.

Concerning claim 36, Pflugi further teaches the method of claim 35, wherein the path starts at the known position in the coordinate system of the position detection system and ends at the capturing position (fig. 4: The Kalman filter is initialized using the first pose estimate from marker tracking (i.e., the start position is known in the coordinate system of the position detection system)).

Concerning claim 37, Pflugi further teaches the method of claim 35, wherein the path starts at the capturing position and ends at the known position in the coordinate system of the position detection system (fig. 4: The Kalman filter being bidirectional allows for data to be run forward and backward (i.e., the start position being the capturing position and ending at the known position in the coordinate system)).

Concerning claim 44, Kruger et al.
(hereinafter Kruger) teaches a method for non-tactile patient registration of an object with a position detection system (col. 3, ll. 19-34), the method comprising: photogrammetrically generating a surface model of the object based on one or more images captured with an image sensor unit positioned at a capturing position (col. 4, ll. 30-39; col. 5, ll. 35-36, l. 50: performing a plurality of image recordings of a surface of the object or body part; col. 5, ll. 38-40: photogrammetrically generating a surface model of the object from the captured image), wherein the image sensor unit comprises a position sensor positioned separate from the object (col. 2, ll. 65-67; fig. 2: position sensor 14'; col. 5, ll. 53-60); determining the capturing position in a coordinate system of the position detection system, based on a sensor signal from the position sensor (col. 3, ll. 9-12; col. 5, ll. 56-60); and relating the photogrammetrically generated surface model of the object to the position detection system based at least in part on the capturing position (col. 2, ll. 47-52).

Not explicitly taught is the image sensor unit comprising a motion sensor.

Kempinski teaches a method of detecting movement of a body part in an image captured by an operator of the imager which is capturing the image, using a motion sensor that is associated with an imager (e.g., an imaging device) (fig. 4: steps 400-406, ¶0072). Further, the motion sensor may be incorporated into the imaging device (¶0034).

It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the method of tracking a position and orientation of an imaging device of Kruger in the manner described by Kempinski because it allows the tracking of an object when the tracking device is portable, moving or unstable (Kempinski, ¶0001).
Thus a position or orientation, or a change in position or orientation, of the imaging device may be determined concurrently with tracking of an object by the imaging device (Kempinski, ¶0018).

Not explicitly taught by Kruger or Kempinski is the method, wherein determining the capturing position comprises: detecting motion of the image sensor unit along a path relative to the position detection system, based on the sensor signal from the motion sensor; defining a spatial relationship that relates at least one point on the path in a coordinate system of the motion sensor to a known position in a coordinate system of the position detection system; and determining the capturing position in the coordinate system of the position detection system based on the detected motion of the image sensor along the path and the defined spatial relationship.

Pflugi teaches an augmented marker tracking system, wherein an inertial measurement unit (IMU) is used to measure the system's orientation (Abstract); and detecting motion of the image sensor unit along a path relative to the position detection system, based on the sensor signal from the motion sensor (section II-F: motion signals from the IMU are recorded (i.e., recording a path) and fused with the optically derived positions by a Kalman filter); defining a spatial relationship that relates at least one point on the path in a coordinate system of the motion sensor to a known position in a coordinate system of the position detection system (section II-F; fig.
4: The Kalman filter is initialized using the first pose estimate from marker tracking); and determining the capturing position in the coordinate system of the position detection system based on the detected motion of the image sensor along the path and the defined spatial relationship (section II-F: The subsequent positions of the augmented marker are then determined by the prediction/update steps of the Kalman filter based on this initial pose estimate and the recorded path. When marker detection fails, the filter does not perform an update step and instead, the final orientation output is the one after the prediction step.).

Taking the teachings of Kruger, Kempinski and Pflugi together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Kruger in the manner taught by Pflugi by using a motion sensor in order to determine the capturing position in a coordinate system of the position detection system. Such a modification is merely a simple substitution of one known element for another to obtain predictable results.

Concerning claim 45, Pflugi further teaches the method of claim 44, wherein the path starts at the known position in the coordinate system of the position detection system and ends at the capturing position (fig. 4: The Kalman filter is initialized using the first pose estimate from marker tracking (i.e., the start position is known in the coordinate system of the position detection system)).

Concerning claim 46, Pflugi further teaches the method of claim 44, wherein the path starts at the capturing position and ends at the known position in the coordinate system of the position detection system (fig. 4: The Kalman filter being bidirectional allows for data to be run forward and backward (i.e., the start position being the capturing position and ending at the known position in the coordinate system)).
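The Kalman-filter behavior the rejection attributes to Pflugi — IMU motion increments drive the prediction step, optical marker poses drive the update step, and prediction alone is used when marker detection fails — can be illustrated with a minimal one-dimensional sketch. The class name, noise values, and scalar state below are assumptions for illustration, not Pflugi's actual implementation:

```python
class SimpleKalman1D:
    """Minimal 1-D Kalman filter: IMU-style motion increments drive the
    prediction step; optical pose measurements drive the update step."""

    def __init__(self, initial_pose, process_var=0.01, measurement_var=0.05):
        self.x = initial_pose      # state: position along one axis
        self.p = 1.0               # state variance
        self.q = process_var       # process noise (IMU drift)
        self.r = measurement_var   # measurement noise (optical tracking)

    def predict(self, imu_delta):
        """Propagate the state using the recorded IMU motion increment."""
        self.x += imu_delta
        self.p += self.q
        return self.x

    def update(self, optical_pose=None):
        """Fuse an optical measurement; when marker detection fails
        (optical_pose is None), skip the update and output the prediction."""
        if optical_pose is None:
            return self.x
        k = self.p / (self.p + self.r)      # Kalman gain
        self.x += k * (optical_pose - self.x)
        self.p *= (1.0 - k)
        return self.x

# Initialized from the first optical pose estimate, then driven by IMU deltas.
kf = SimpleKalman1D(initial_pose=0.0)
kf.predict(0.10)
kf.update(0.12)         # marker visible: blend prediction with measurement
kf.predict(0.10)
pose = kf.update(None)  # marker lost: output is the prediction alone
```

The `update(None)` branch mirrors the cited fallback: when the marker is not detected, the final output is the state after the prediction step.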
Concerning claim 47, Kruger further teaches the method of claim 44, wherein the image sensor unit comprises one or more stereographic cameras, one or more cameras in a multi-camera arrangement, one or more infrared image sensors (col. 4, ll. 19-24), or a combination thereof.

Concerning claim 48, Kruger further teaches the method of claim 44, wherein the position detection system is an electromagnetic position detection system (col. 4, ll. 40-54), and wherein the position detection system has a working space having a predetermined electromagnetic field strength (col. 4, ll. 40-54).

Concerning claim 49, Kruger further teaches the method of claim 48, wherein the method comprises capturing the one or more images with the image sensor unit positioned at a capturing position outside of the working space (col. 4, ll. 40-54).

Concerning claim 52, Kruger further teaches the method of claim 44, wherein the object is a face of a subject depicted in a captured image (fig. 3: pattern 22 is imaged on the surface of a subject's head (i.e., a face in fig. 3)).

Claim 41 is rejected under 35 U.S.C. 103 as being unpatentable over Kruger et al. (US 9,208,561 B2) in view of Kempinski (US 20140253737 A1), further in view of Lang (US 2017/0258526 A1) and in further view of Drako (US 20180225523 A1).

Concerning claim 41, Kruger in view of Kempinski teaches the method of claim 34. Not explicitly taught is the method, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit; and automatically triggering image capture based upon certain conditions.
Lang teaches methods for computer assisted surgery, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit (¶1220: live stream of data); and automatically triggering image capture based upon certain conditions (¶¶1270-1271: recognizing an anatomic landmark, optical marker, etc.).

Taking the teachings of Kruger, Kempinski and Lang together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon certain conditions in order to automatically activate the system.

It is noted that Kruger, Kempinski and Lang fail to explicitly teach automatically triggering image capture based upon recognition of the object in a certain position relative to the image sensor unit. Drako teaches a system, wherein image capture is triggered when certain postures or orientation of a skeleton are detected (¶0041).

Taking the teachings of Kruger, Kempinski, Lang and Drako together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon recognition of the object in a certain position relative to the image sensor unit in order to automatically activate the system.

Claim 42 is rejected under 35 U.S.C. 103 as being unpatentable over Kruger et al. (US 9,208,561 B2) in view of Kempinski (US 20140253737 A1), further in view of Lang (US 2017/0258526 A1) and in further view of Zhang et al. (US 20170356824 A1).

Concerning claim 42, Kruger in view of Kempinski teaches the method of claim 34.
Not explicitly taught is the method, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit; and automatically triggering image capture based upon certain conditions.

Lang teaches methods for computer assisted surgery, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit (¶1220: live stream of data); and automatically triggering image capture based upon certain conditions (¶¶1270-1271: recognizing an anatomic landmark, optical marker, etc.).

Taking the teachings of Kruger, Kempinski and Lang together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon certain conditions in order to automatically activate the system.

It is noted that Kruger, Kempinski and Lang fail to explicitly teach automatically triggering image capture based upon a positional relation between a reference position sensor attached to the object and the image sensor unit. Zhang et al. (hereinafter Zhang) teaches a system, wherein image capture is triggered based upon a positional relation between a reference position sensor attached to the object and the image sensor unit (¶0027: image capture begins by way of at least one position sensor).
Taking the teachings of Kruger, Kempinski, Lang and Zhang together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon a positional relation between a reference position sensor attached to the object and the image sensor unit in order to automatically activate the system.

Claim 50 is rejected under 35 U.S.C. 103 as being unpatentable over Kruger et al. (US 9,208,561 B2) in view of Kempinski (US 20140253737 A1), further in view of Pflugi et al. ("Augmented Marker Tracking for Peri-Acetabular Osteotomy Surgery", 11 July 2017), further in view of Lang (US 2017/0258526 A1) and in further view of Drako (US 20180225523 A1).

Concerning claim 50, Kruger in view of Kempinski, further in view of Pflugi teaches the method of claim 44. Not explicitly taught is the method, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit; and automatically triggering image capture based upon certain conditions.

Lang teaches methods for computer assisted surgery, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit (¶1220: live stream of data); and automatically triggering image capture based upon certain conditions (¶¶1270-1271: recognizing an anatomic landmark, optical marker, etc.).
Taking the teachings of Kruger, Kempinski, Pflugi and Lang together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon certain conditions in order to automatically activate the system.

It is noted that Kruger, Kempinski, Pflugi and Lang fail to explicitly teach automatically triggering image capture based upon recognition of the object in a certain position relative to the image sensor unit. Drako teaches a system, wherein image capture is triggered when certain postures or orientation of a skeleton are detected (¶0041).

Taking the teachings of Kruger, Kempinski, Pflugi, Lang and Drako together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon recognition of the object in a certain position relative to the image sensor unit in order to automatically activate the system.

Claim 51 is rejected under 35 U.S.C. 103 as being unpatentable over Kruger et al. (US 9,208,561 B2) in view of Kempinski (US 20140253737 A1), further in view of Pflugi et al. ("Augmented Marker Tracking for Peri-Acetabular Osteotomy Surgery", 11 July 2017), further in view of Lang (US 2017/0258526 A1) and in further view of Zhang et al. (US 20170356824 A1).

Concerning claim 51, Kruger in view of Kempinski, further in view of Pflugi teaches the method of claim 44. Not explicitly taught is the method, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit; and automatically triggering image capture based upon certain conditions.
Lang teaches methods for computer assisted surgery, wherein capturing the one or more images captured with an image sensor unit positioned at a capturing position further comprises: analyzing streaming image data of the image sensor unit (¶1220: live stream of data); and automatically triggering image capture based upon certain conditions (¶¶1270-1271: recognizing an anatomic landmark, optical marker, etc.).

Taking the teachings of Kruger, Kempinski, Pflugi and Lang together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon certain conditions in order to automatically activate the system.

It is noted that Kruger, Kempinski, Pflugi and Lang fail to explicitly teach automatically triggering image capture based upon a positional relation between a reference position sensor attached to the object and the image sensor unit. Zhang et al. (hereinafter Zhang) teaches a system, wherein image capture is triggered based upon a positional relation between a reference position sensor attached to the object and the image sensor unit (¶0027: image capture begins by way of at least one position sensor).

Taking the teachings of Kruger, Kempinski, Pflugi, Lang and Zhang together as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the Kruger invention to capture and analyze streaming data and automatically trigger image capture based upon a positional relation between a reference position sensor attached to the object and the image sensor unit in order to automatically activate the system.
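The auto-trigger condition cited from Zhang — begin image capture based on a positional relation between a reference position sensor on the object and the image sensor unit — reduces to a distance-threshold check over streaming data. The sketch below is an illustrative assumption; the function names and the 5 cm threshold are not taken from any of the cited references:

```python
import math

def should_trigger_capture(reference_pos, sensor_pos, max_distance_m=0.05):
    """Return True when the image sensor unit is within max_distance_m of the
    reference position sensor attached to the object -- the positional
    condition that automatically triggers image capture."""
    return math.dist(reference_pos, sensor_pos) <= max_distance_m

def process_stream(frames, reference_pos, capture):
    """Scan streaming (frame, sensor_pos) pairs and capture the first frame
    for which the positional trigger condition holds."""
    for frame, sensor_pos in frames:
        if should_trigger_capture(reference_pos, sensor_pos):
            capture(frame)
            return frame
    return None
```

Lang's landmark/marker-recognition trigger would slot into the same loop, with the distance check replaced by an image-analysis predicate.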
Response to Arguments

Applicant's arguments, see Section I (pages 6-12) of the remarks, filed 10/08/2025, with respect to the rejection of claims 34-52 have been fully considered, but upon further consideration they are moot in view of the new grounds of rejection made in view of Kempinski, because the new grounds of rejection do not rely on Cajigas for any teaching or matter specifically challenged in the arguments.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JAMES M ANDERSON II, whose telephone number is (571) 270-1444. The examiner can normally be reached Monday - Friday, 10AM-6PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, BRIAN PENDLETON, can be reached at 571-272-7527. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/James M Anderson II/
Primary Examiner, Art Unit 2425

Prosecution Timeline

Mar 23, 2022
Application Filed
Feb 11, 2023
Non-Final Rejection — §103
Aug 11, 2023
Response Filed
Nov 29, 2023
Final Rejection — §103
May 17, 2024
Request for Continued Examination
May 24, 2024
Response after Non-Final Action
Jun 15, 2024
Non-Final Rejection — §103
Dec 18, 2024
Response Filed
Apr 05, 2025
Final Rejection — §103
Oct 08, 2025
Request for Continued Examination
Oct 17, 2025
Response after Non-Final Action
Oct 18, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12561976
COMMENT GENERATION DEVICE AND COMMENT GENERATION METHOD
2y 5m to grant Granted Feb 24, 2026
Patent 12548437
SYSTEMS AND METHODS FOR POLICY CENTRIC DATA RETENTION IN TRAFFIC MONITORING
2y 5m to grant Granted Feb 10, 2026
Patent 12537949
METHODS AND APPARATUS FOR KERNEL TENSOR AND TREE PARTITION BASED NEURAL NETWORK COMPRESSION FRAMEWORK
2y 5m to grant Granted Jan 27, 2026
Patent 12534313
CAMERA-ENABLED LOADER SYSTEM AND METHOD
2y 5m to grant Granted Jan 27, 2026
Patent 12525019
INTELLIGENT AI SYSTEM FOR RAPID WEAPON THREAT ASSESSMENT IN VIDEO STREAMS
2y 5m to grant Granted Jan 13, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
75%
Grant Probability
85%
With Interview (+10.4%)
2y 11m
Median Time to Grant
High
PTA Risk
Based on 684 resolved cases by this examiner. Grant probability derived from career allow rate.
