Prosecution Insights
Last updated: April 19, 2026
Application No. 18/529,366

Self-Navigating Overhead Support System and Method for Imaging System

Status: Non-Final OA (§103)
Filed: Dec 05, 2023
Examiner: SONG, HOON K
Art Unit: 2884
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: GE Precision Healthcare LLC
OA Round: 2 (Non-Final)

Grant Probability: 86% (Favorable)
Expected OA Rounds: 2-3
Time to Grant: 2y 6m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 86% (1294 granted / 1505 resolved), +18.0% vs TC avg (above average)
Interview Lift: +8.5% (moderate), measured across resolved cases with interview
Typical Timeline: 2y 6m average prosecution; 36 applications currently pending
Career History: 1541 total applications across all art units
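The headline figures in this panel follow directly from the raw counts shown on the page. A minimal sketch of the arithmetic, using only numbers stated above (1294 granted, 1505 resolved, 1541 total filings):

```python
# Sketch: reproduce the examiner statistics from the raw counts on this page.
granted, resolved, total = 1294, 1505, 1541

allow_rate = granted / resolved        # career allow rate
pending = total - resolved             # applications still open

print(f"Career allow rate: {allow_rate:.0%}")
print(f"Currently pending: {pending}")
```

This matches the 86% allow rate and the 36-pending figure displayed in the panel.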

Statute-Specific Performance

Statute    Rate     vs TC Avg
§101        2.0%     -38.0%
§103       39.1%      -0.9%
§102       39.9%      -0.1%
§112       13.2%     -26.8%

Based on career data from 1505 resolved cases; "vs TC avg" is measured against a Tech Center average estimate.
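The "vs TC avg" deltas let one back out the implied Tech Center average for each statute (implied TC avg = examiner rate - delta). A short sketch, assuming the deltas are simple percentage-point differences:

```python
# Sketch: back out the implied Tech Center average per statute from the table
# above. Treating each delta as a percentage-point difference is an assumption.
rates  = {"101": 2.0,   "103": 39.1, "102": 39.9, "112": 13.2}   # examiner, %
deltas = {"101": -38.0, "103": -0.9, "102": -0.1, "112": -26.8}  # vs TC avg

for statute, rate in rates.items():
    tc_avg = rate - deltas[statute]
    print(f"§{statute}: examiner {rate:.1f}% vs implied TC avg {tc_avg:.1f}%")
```

Run as written, all four statutes imply the same 40.0% baseline, which may indicate the deltas were computed against a single Tech Center-wide figure rather than per-statute averages.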

Office Action

§103

DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

    A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-5 and 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Deinlein et al. (US 20220096027) in view of Kennedy et al. (US 20160008988).

Regarding claim 1, Deinlein teaches an imaging system comprising: a. a multiple degree of freedom overhead support system (20) adapted to be mounted to a surface within an environment for the imaging system; b. an imaging device (30) mounted to the overhead support system; c. sensors (501, 502, 50a-50d) disposed on the imaging device; e. a motion controller (23) operably connected to the overhead support system; f. a processor (60) operably connected to the motion controller, the visual sensor, and the non-visual sensor to send control signals to and receive data signals from the overhead support system, the visual sensor, and the non-visual sensor; and g. a memory (60) operably connected to the processor, the memory storing processor-executable instructions therein for operation of a self-navigating and positioning system configured to generate a three-dimensional (3D) map of the environment of the imaging system with visual data from the visual sensor and non-visual data from the non-visual sensor, wherein the processor-executable instructions when executed by the processor to operate the self-navigating and positioning system cause: i. generation of the 3D map of the environment (para. 105); and ii. navigation of the overhead support system within the environment from a start position to a finish position to avoid collisions with one or more objects identified on the 3D map within the environment (paras. 121-126).

However, Deinlein fails to teach that the sensors are a visual sensor and a non-visual sensor and that the self-navigating and positioning system is configured to generate the three-dimensional (3D) map of the environment of the imaging system with visual data from the visual sensor, non-visual data from the non-visual sensor, and position data from the motion controller. Kennedy teaches sensors comprising a visual sensor and a non-visual sensor, with a self-navigating and positioning system configured to generate a three-dimensional (3D) map of the environment of the imaging system with visual data from the visual sensor, non-visual data from the non-visual sensor, and position data from the motion controller (para. 92). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the sensors of Deinlein with the sensors as taught by Kennedy, since doing so would provide better environmental data.

Regarding claim 2, Deinlein teaches the visual sensor is a camera (501, 502, 50a-50d).

Regarding claim 3, Kennedy teaches the non-visual sensor is a 3D spatial sensor (para. 92).

Regarding claim 5, Deinlein teaches the self-navigating and positioning system includes a simultaneous localization and mapping algorithm, wherein the processor-executable instructions when executed by the processor to operate the simultaneous localization and mapping algorithm cause: a. performing a semantic mapping operation to generate a semantic 3D map; and b. performing a map refinement operation to generate the 3D map from the semantic 3D map (para. 87+).

Regarding claim 12, Deinlein teaches the imaging device is an X-ray tube (30).

Regarding claim 13, Deinlein teaches a method for navigating an overhead support system of an imaging system through an environment, the method comprising the steps of: a. providing an imaging system comprising: i. a multiple degree of freedom overhead support system adapted to be mounted to a surface within an environment for the imaging system; ii. an imaging device mounted to the overhead support system; iii. sensors (501, 502, 50a-50d) disposed on the imaging device; v. a motion controller operably connected to the overhead support system; vi. a processor operably connected to the motion controller, the visual sensor, and the non-visual sensor to send control signals to and receive data signals from the overhead support system, the visual sensor, and the non-visual sensor; and vii. a memory operably connected to the processor, the memory storing processor-executable instructions therein for operation of a self-navigating and positioning system configured to generate, and c. navigating the overhead support system within the environment from a start position to a finish position to avoid collisions with one or more objects identified on the 3D map within the environment (paras. 121-126).

However, Deinlein fails to teach that the sensors are a visual sensor and a non-visual sensor, a three-dimensional (3D) map of the environment of the imaging system with visual data from the visual sensor, non-visual data from the non-visual sensor, and position data from the motion controller, and b. generating the 3D map of the environment. Kennedy teaches sensors comprising a visual sensor and a non-visual sensor, a three-dimensional (3D) map of the environment of the imaging system with visual data from the visual sensor, non-visual data from the non-visual sensor, and position data from the motion controller, and b. generating the 3D map of the environment (para. 92). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to adapt the sensors of Deinlein with the sensors as taught by Kennedy, since doing so would provide better environmental data.

Regarding claim 14, Deinlein as modified by Kennedy teaches the self-navigating and positioning system includes a simultaneous localization and mapping algorithm, wherein the method includes the steps of: a. performing a semantic mapping operation using the simultaneous localization and mapping algorithm to generate a semantic 3D map; and b. performing a map refinement operation to generate the 3D map from the semantic 3D map (paras. 87+ and 92).

Allowable Subject Matter

Claims 15-20 are allowed. Claims 6-11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

The following is a statement of reasons for the indication of allowable subject matter. Regarding claims 6-11, the prior art fails to teach processor-executable instructions that, when executed by the processor to perform the semantic mapping operation, cause: a. synchronizing the visual data with the non-visual data; b. converting the non-visual data into homogeneous coordinates; c. aligning the non-visual data with the visual data; d. converting the homogeneous coordinates of the non-visual data to Euclidean coordinates; and e. generating the semantic 3D map from the non-visual data, as claimed in claim 6. Regarding claims 15-20, the prior art fails to teach the step of performing the semantic mapping operation causing: a. synchronizing the visual data with the non-visual data; b. converting the non-visual data into homogeneous coordinates; c. aligning the non-visual data with the visual data; d. converting the homogeneous coordinates of the non-visual data to Euclidean coordinates; and e. generating the semantic 3D map from the non-visual data, as claimed in independent claim 15.

Response to Arguments

Applicant's arguments with respect to claims 1-3, 5 and 12-14 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HOON K SONG, whose telephone number is (571) 272-2494. The examiner can normally be reached M to Th, 10am to 7pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, David Makiya, can be reached at 571-272-2273. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/HOON K SONG/
Primary Examiner, Art Unit 2884
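For readers unfamiliar with the coordinate handling recited in the allowable claims 6-11, converting between homogeneous and Euclidean coordinates is a standard computer-vision step for aligning range data with camera data. A generic illustration in Python/NumPy follows; the point values and transform are made up for the example and do not come from the application:

```python
import numpy as np

# Generic illustration of the coordinate steps recited in claims 6-11:
# lift non-visual 3D points to homogeneous coordinates, align them to the
# visual sensor's frame with a rigid transform, then convert back to
# Euclidean coordinates. All values here are illustrative only.
points = np.array([[1.0, 2.0, 3.0],
                   [0.5, 0.0, 2.5]])            # non-visual 3D points

ones = np.ones((points.shape[0], 1))
homogeneous = np.hstack([points, ones])          # (x, y, z, 1)

T = np.eye(4)                                    # sensor-to-camera transform
T[:3, 3] = [0.1, -0.2, 0.0]                      # example translation offset

aligned = homogeneous @ T.T                      # align with the visual frame
euclidean = aligned[:, :3] / aligned[:, 3:4]     # back to Euclidean (x, y, z)
print(euclidean)                                 # each point shifted by the offset
```

This is only a sketch of the general technique; the claims describe the applicant's specific pipeline, which the prior art was found not to teach.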

Prosecution Timeline

Dec 05, 2023: Application Filed
Jul 25, 2025: Non-Final Rejection (§103)
Oct 21, 2025: Response Filed
Mar 07, 2026: Non-Final Rejection (§103, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599781: ASSESSING TREATMENT PARAMETERS FOR RADIATION TREATMENT PLANNING (2y 5m to grant; granted Apr 14, 2026)
Patent 12603191: ELECTROMAGNETIC RADIATION FOCUSING DEVICE AND APPLICATIONS THEREOF (2y 5m to grant; granted Apr 14, 2026)
Patent 12599346: PHOTON COUNTING COMPUTED TOMOGRAPHY APPARATUS AND PHOTON-COUNTING CT-SCANNING CONDITION SETTING METHOD (2y 5m to grant; granted Apr 14, 2026)
Patent 12599344: X-RAY CT APPARATUS (2y 5m to grant; granted Apr 14, 2026)
Patent 12589260: TIMING-BASED METHODS, SYSTEMS, AND COMPUTER READABLE MEDIUMS FOR A GATED LINEAR ACCELERATOR (2y 5m to grant; granted Mar 31, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 2-3
Grant Probability: 86%
With Interview: 94% (+8.5%)
Median Time to Grant: 2y 6m
PTA Risk: Moderate

Based on 1505 resolved cases by this examiner. Grant probability is derived from the career allow rate.
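The interview-adjusted figure is consistent with simply adding the 8.5-point lift to the base allow rate. A sketch of that arithmetic, assuming an additive lift reported as a whole percent (the additive model is an assumption, not stated on the page):

```python
# Sketch: one way the interview-adjusted grant probability could be derived.
# Assumes the +8.5-point interview lift is added to the base allow rate.
base = 1294 / 1505        # career allow rate, about 86.0%
interview_lift = 0.085    # +8.5 percentage points with interview

with_interview = base + interview_lift
print(f"Base:           {base:.1%}")
print(f"With interview: {with_interview:.1%}")
```

The resulting 94.5% matches the 94% shown above once truncated to a whole percent.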
