Prosecution Insights
Last updated: April 19, 2026
Application No. 18/133,824

AUTOMATIC PILOT EYE POINT POSITIONING SYSTEM

Final Rejection — §101, §103, §112
Filed
Apr 12, 2023
Examiner
ORANGE, DAVID BENJAMIN
Art Unit
2663
Tech Center
2600 — Communications
Assignee
Ami Industries Inc.
OA Round
2 (Final)
34%
Grant Probability
At Risk
3-4
OA Rounds
3y 7m
To Grant
63%
With Interview

Examiner Intelligence

Grants only 34% of cases
34%
Career Allow Rate
51 granted / 151 resolved
-28.2% vs TC avg
Strong +29% interview lift
Without
With
+29.4%
Interview Lift
resolved cases with interview
Typical timeline
3y 7m
Avg Prosecution
51 currently pending
Career history
202
Total Applications
across all art units

Statute-Specific Performance

§101
13.1%
-26.9% vs TC avg
§103
29.0%
-11.0% vs TC avg
§102
20.2%
-19.8% vs TC avg
§112
32.0%
-8.0% vs TC avg
Black line = Tech Center average estimate • Based on career data from 151 resolved cases
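The headline percentages above are simple ratios over the examiner's resolved cases. A minimal sketch of the arithmetic, assuming the dashboard's counts (51 granted of 151 resolved) and treating the interview lift and TC-average gap as additive offsets (the variable names are illustrative, not part of the tool):

```python
# Reproduce the dashboard's headline examiner metrics from the raw counts.
granted, resolved = 51, 151

allow_rate = granted / resolved                 # career allow rate
interview_lift = 0.294                          # observed lift for interviewed cases
with_interview = allow_rate + interview_lift    # projected rate after an interview
tc_average = allow_rate + 0.282                 # dashboard reports -28.2% vs TC avg

print(f"{allow_rate:.0%}")       # 34%
print(f"{with_interview:.0%}")   # 63%
print(f"{tc_average:.0%}")       # 62%
```

The statute-specific rates below are the same ratio computed per rejection ground, with the black line marking the corresponding Tech Center estimate.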

Office Action

§101 §103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments and amendment have persuasively overcome the 112(a) rejection and certain 112(b) rejections. However, in each of the instances where Applicant has amended the claims to overcome a rejection, Applicant's arguments that the rejection was incorrect are unpersuasive but are not addressed, as they are moot. The remaining issues are addressed below.

112(b)

Applicant argues: The Applicant respectfully submits that the controller of claim 1 is not recited as comprising the eye, actuators, and pilot seat.

Examiner responds: This assertion is insufficient to resolve the issue. For example, given a controller that outputs a signal, how can one assess whether the signal is an actuation signal if the pilot seat is not specified or not present? For example, if the signal actuates one brand of pilot seat but not another, is the claim met?

Applicant argues: The Applicant also respectfully submits those skilled in the art would understand what is claimed when the claim is read in light of the specification. See MPEP 2173.02.

Examiner responds: The statement from 2173.02 is just part of the analysis; it is not the extent of the analysis.

Applicant argues: For example, "wherein the memory comprises a distance from the image sensor to the design eye position" and/or "wherein the design eye position is a fixed position relative to the image sensor."

Examiner responds: Reciting that the design eye position is a value stored in memory is expected to overcome this rejection (because the value in memory is an objective standard).

Applicant argues: Applicant respectfully submits that the ordinary meaning of "distance" is clearly redefined in the written description as being defined by distance values (d, dX, dY, and dZ) in three-dimensional space. See paragraphs [0065] and [0074]-[0077].

Examiner responds: This is unpersuasive because it does not meet the standard from MPEP 2173.05(a). Further, the specification describes a distance being made up of other distances (e.g., specification [0065]: "The distances … each include a longitudinal distance, a vertical distance, and a lateral distance.").

101

Applicant argues: The Applicant respectfully requests the Examiner consider each step which the program instructions cause the one or more processors to perform when analyzing the claims under the Broadest Reasonable Interpretation.

Examiner responds: Each of the steps that Applicant highlighted corresponds to how a person looks at a pilot.

Applicant argues: For example, the human mind is not equipped to process depth data in an image and would not practically perform at least the step of "determine a distance from the image sensor to the eye based on the depth data associated with the position of the eye within the image" according to as-filed claim 1.

Examiner responds: The human mind processes depth data all the time, such as when catching a ball.

Applicant argues: By way of another example, the human mind is not equipped to "cause the one or more processors to continually generate the actuation signal in a feedback loop using the video" according to as-filed claim 6.

Examiner responds: MPEP 2106.04(a)(2)(III)(C) explains that use of a generic computer or a computer environment is still a mental process.

Applicant argues: For example, this additional element is not part of the mental process of "a person can look at a pilot and decide if they are sitting in the correct position," as recited by the examiner.

Examiner responds: The use of an actuation signal is merely use of a generic computer (as per the above response).

Applicant argues: The additional elements of generating the "actuation signal based on the distance from the eye to the design eye position" improve the functioning of the pilot seat by moving the eye towards the design eye position, reducing the distance from the eye to the design eye position.

Examiner responds: This is both a mental process and well known in the art, as shown in Applicant's "safety first" publication.

Applicant argues: As to the system claims, the Applicant respectfully submits that the system applies the judicial exception with, or by use of, a particular machine. … The Applicant respectfully submits that the pilot seat comprising one or more actuators are particular elements of the system which may be specifically identified.

Examiner responds: Applicant is reminded of their earlier statement: "The Applicant respectfully submits that the controller of claim 1 is not recited as comprising the eye, actuators, and pilot seat."

Applicant argues: Claim 1 is amended to recite "wherein the depth data is a depth map". Support for this amendment is found throughout the as-filed application, including as-filed paragraph [0039]. Applicant respectfully submits amended Claim 1 recites elements which have not been disclosed, taught or suggested by Monfraix in view of Zeng.

Examiner responds: Specification [0039] states: "The depth data 212 may also be referred to as a depth map or a depth channel. … In this regard, the depth data 212 may indicate a distance from the image sensors 202 to the eye." The term "depth map" appears to be just another word for the depth data, meaning that the phrase "depth map" does not limit the "depth data," and thus this amendment fails to overcome the prior rejection.
Claim Interpretation

Applicant argues: Accordingly, Applicant respectfully submits that any cancellations and/or amendments during the course of prosecution should be held to be tangential to and/or unrelated to patentability in the event that such cancellations and/or amendments are viewed in a post-issuance context under post-issuance claim interpretation rules.

Examiner responds: Applicant is incorrect regarding claim interpretation.

Claim Objections

Applicant is advised that should claim 12 be found allowable, claim 16 will be objected to under 37 CFR 1.75 as being a substantial duplicate thereof (and vice versa). When two claims in an application are duplicates, or else are so close in content that they both cover the same thing despite a slight difference in wording, it is proper after allowing one claim to object to the other as being a substantial duplicate of the allowed claim. See MPEP § 608.01(m).

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-16 (all claims) are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 is directed to a controller, but also recites an eye, actuators and a pilot seat. It is not clear which of the eye, actuators and pilot seat need to be present for the claim to be met.
Similarly, claim 9 is directed to a system, but also recites an eye (e.g., "causes the eye to move towards"). It is not clear which of the eye, actuators and pilot seat need to be present for the claim to be met.

Claims 1 and 9 recite a "depth map," but this is new terminology. MPEP 2173.05(a). In particular, the usage in the claims suggests that it is more specific than "depth data," but specification [0039] suggests that this is not the case.

Claims 1 and 9 recite a "design eye position," but this is subjective. MPEP 2173.05(b)(IV). In other words, the design eye position can be arbitrarily chosen, and thus different people can have different opinions of where it is.

Claims 1 and 9 recite "distance," but the usage in the claims suggests that something different is meant. "Distance" is understood as a scalar amount (e.g., 9 inches), but the claim recites "determine a distance from the eye to a design eye position based on the distance from the image sensor to the eye." See also Fig. 5B, showing a vector of changes. The claim language appears to be directed to Fig. 5B, but that would mean that "distance" is a vector rather than a scalar. A similar issue arises, for example, with the calculation of claim 2.

Dependent claims are likewise rejected.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-16 (all claims) are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea (mental process) without significantly more.

Step 1: All of the claims recite a system or a controller, and machines are eligible subject matter.
Step 2A, prong one: All of the elements of claims 1-15 are a mental process because a person can look at a pilot and decide if they are sitting in the correct position. MPEP 2106.04(a)(2)(III)(C) explains that use of a generic computer or a computer environment is still a mental process. In particular, this section begins by citing Gottschalk v. Benson, 409 US 63 (1972): "The Supreme Court recognized this in Benson, determining that a mathematical algorithm for converting binary coded decimal to pure binary within a computer's shift register was an abstract idea." In Benson the Supreme Court did not separately analyze the computer hardware at issue; the specifics of what hardware was claimed are only included in an appendix to the decision.

Because there are no additional elements, no further analysis is required for Step 2A, prong two or Step 2B.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-6, 8-11 and 13-15 (all claims except claims 7, 12 and 16) are rejected under 35 U.S.C. 103 as being unpatentable over US20180009533A1 ("Monfraix") and CN111301689A ("Zeng"). The citations to Zeng refer to the attached translation.

1. A controller comprising:

a memory maintaining program instructions; and (Monfraix, Fig. 5, RAM 1051, ROM 1052 and SD 1053)

one or more processors configured to execute the program instructions causing the one or more processors to: (Monfraix, Fig. 5, CPU 1050)

receive an image from an image sensor; (Monfraix, claim 1, "acquiring an image of a user seated on the seat")

wherein the image comprises depth data, wherein the depth data is a depth map; (Monfraix, claim 1, "obtaining a relative position between each eye detected and the predefined zone". Monfraix's image includes depth data, such as the relative positioning.)

detect a position of an eye within the image; (Monfraix, claim 1, "detecting at least one eye of said seated user in the image acquired")

Monfraix is not relied on for the below claim language. However, Zeng teaches:

determine a distance from the image sensor to the eye based on the depth data and the position of the eye within the image; (Zeng, abstract, "(1) three-dimensional positioning is carried out on the eye position of a pilot by using an innovative eye position positioning device")

determine a distance from the eye to a design eye position based on the distance from the image sensor to the eye; and (Zeng, p. 3, "Step 3: The seat adjustment control unit receives the three-dimensional real-time eye position coordinate information (x, y, z) and compares it with the three-dimensional reference point of the eye position (x0, y0, z0) to determine the seat movement Direction and position (Δx, Δy, Δz);")

generate an actuation signal based on the distance from the eye to the design eye position; (Zeng, p. 3, "Step 4: The seat adjustment control unit issues an adjustment command to adjust the seat to a designated position;")

wherein the actuation signal causes one or more actuators to actuate a pilot seat; (Zeng, p. 3, "Step 4: The seat adjustment control unit issues an adjustment command to adjust the seat to a designated position;")

wherein the actuation of the pilot seat causes the eye to move towards the design eye position, reducing the distance from the eye to the design eye position. (Zeng, p. 3, "Step 4: The seat adjustment control unit issues an adjustment command to adjust the seat to a designated position;")

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply the teachings of Zeng to the teachings of Monfraix such that Zeng's three-dimensional techniques are used with Monfraix's system for the purpose of providing implementation details (e.g., Monfraix discusses using motors for each axis at [0034], but does not provide an algorithm). Based on the above, this is an example of "combining prior art elements according to known methods to yield predictable results." MPEP 2143.

2. The controller of claim 1, wherein the memory comprises a distance from the image sensor to the design eye position; (Zeng, p. 3, "Step 3: The seat adjustment control unit receives the three-dimensional real-time eye position coordinate information (x, y, z) and compares it with the three-dimensional reference point of the eye position (x0, y0, z0) to determine the seat movement Direction and position (Δx, Δy, Δz);")

wherein the program instructions cause the one or more processors to determine the distance from the eye to the design eye position based on the distance from the image sensor to the eye and based on the distance from the image sensor to the design eye position. (Zeng, p. 3, "Step 3: The seat adjustment control unit receives the three-dimensional real-time eye position coordinate information (x, y, z) and compares it with the three-dimensional reference point of the eye position (x0, y0, z0) to determine the seat movement Direction and position (Δx, Δy, Δz);")

3. The controller of claim 2, wherein the program instructions cause the one or more processors to determine a difference in distance between the distance from the image sensor to the design eye position and the distance from the image sensor to the eye, the difference in distance indicating the distance from the eye to the design eye position. (Zeng, p. 3, "Step 3: The seat adjustment control unit receives the three-dimensional real-time eye position coordinate information (x, y, z) and compares it with the three-dimensional reference point of the eye position (x0, y0, z0) to determine the seat movement Direction and position (Δx, Δy, Δz);")

4. The controller of claim 1, wherein the distance from the eye to the design eye position comprises at least a longitudinal distance value and a vertical distance value. (Zeng, p. 3, "Step 3: The seat adjustment control unit receives the three-dimensional real-time eye position coordinate information (x, y, z) and compares it with the three-dimensional reference point of the eye position (x0, y0, z0) to determine the seat movement Direction and position (Δx, Δy, Δz);" Zeng's x, y and z coordinates teach the claimed longitudinal and vertical values.)

5. The controller of claim 1, wherein the program instructions cause the one or more processors to continually generate the actuation signal until at least a portion of the eye is disposed at the design eye position. (Zeng, p. 3, "Step 5: After the adjustment is completed, re-compare the current 3D real-time eye position coordinate information (x, y, z) with the 3D eye position reference point (x0, y0, z0) to ensure that the seat has reached the reference eye Bit.")

6. The controller of claim 1, wherein the image is one of a plurality of images in a video; wherein the program instructions cause the one or more processors to continually generate the actuation signal in a feedback loop using the video. (Zeng, p. 3, "Preferably, in the system for intelligently adjusting the eye position of a pilot of the present invention, the eye position positioning device performs real-time three-dimensional positioning of the eye position of the pilot through a camera." Zeng's real-time use of a camera teaches the claimed video.)

8. The controller of claim 1, the program instructions causing the one or more processors to:

receive a seat adjust signal; and (Zeng, p. 3, "Step 3: The seat adjustment control unit receives the three-dimensional real-time eye position coordinate information (x, y, z) and compares it with the three-dimensional reference point of the eye position (x0, y0, z0) to determine the seat movement Direction and position (Δx, Δy, Δz);" Zeng's seat movement determination teaches the claimed seat adjust signal.)

generate the actuation signal in response to receiving the seat adjust signal. (Zeng, p. 3, "Step 4: The seat adjustment control unit issues an adjustment command to adjust the seat to a designated position;")

Claim 9 is rejected as per claim 1. See also Monfraix, Figs. 4 and 5 for the seat, sensor and controller.

10. The system of claim 9, wherein the one or more actuators comprise a vertical actuator and a longitudinal actuator; (Monfraix, [0034] "each seat 103A and 103B has a plurality of motors, each one able to modify the position of that seat respectively along the x-, y- and z-axes. Each motor is controlled by the processing module according to the method described in connection with FIG. 6.")

wherein the vertical actuator is configured to vertically actuate the pilot seat; (Monfraix, [0034] "each seat 103A and 103B has a plurality of motors, each one able to modify the position of that seat respectively along the x-, y- and z-axes. Each motor is controlled by the processing module according to the method described in connection with FIG. 6.")

wherein the longitudinal actuator is configured to longitudinally actuate the pilot seat. (Monfraix, [0034] "each seat 103A and 103B has a plurality of motors, each one able to modify the position of that seat respectively along the x-, y- and z-axes. Each motor is controlled by the processing module according to the method described in connection with FIG. 6.")

11. The system of claim 10, wherein the actuation signal is configured to cause the vertical actuator and the longitudinal actuator to simultaneously actuate the pilot seat. (Monfraix, [0034] "each seat 103A and 103B has a plurality of motors, each one able to modify the position of that seat respectively along the x-, y- and z-axes. Each motor is controlled by the processing module according to the method described in connection with FIG. 6." Monfraix's independent motors disclose the claimed simultaneous actuation because they are not required to operate sequentially; rather, each can respond to the signal in real time.)

13. The system of claim 9, wherein the design eye position is a fixed position relative to the image sensor. (Zeng, p. 3, "three-dimensional reference point of the eye position (x0, y0, z0)")

14. The system of claim 9, wherein the program instructions cause the one or more processors to continually generate the actuation signal until at least a portion of the eye is disposed at the design eye position. (Zeng, p. 3, "Step 5: After the adjustment is completed, re-compare the current 3D real-time eye position coordinate information (x, y, z) with the 3D eye position reference point (x0, y0, z0) to ensure that the seat has reached the reference eye Bit.")

Claim 15 is rejected as per claim 8. Zeng's "eye position adjustment control switch" teaches the claimed button (in the interest of compact prosecution, various types of buttons and switches are known substitutes; MPEP 2144.06(II)). See also Monfraix, Figs. 4 and 5 for the seat, sensor and controller of claim 9.

Claims 7, 12, and 16 (the remaining claims) are rejected under 35 U.S.C. 103 as being unpatentable over Monfraix and Zeng, as per the parent claims, and further in view of US20220174261A1 ("Hornstein").

7. The combination of Monfraix and Zeng teaches the controller of claim 1, but is not relied on for the below claim language. However, Hornstein teaches: wherein detecting the position of the eye within the image comprises generating a bounding box around the position of the eye. (Hornstein, Fig. 10, bounding box 215)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply the teachings of Hornstein to the teachings of Monfraix and Zeng such that Hornstein's three-dimensional techniques are used with Monfraix and Zeng's system for the purpose of improving alignment. Hornstein, [0011]-[0013]. Based on the above, this is an example of "combining prior art elements according to known methods to yield predictable results." MPEP 2143.

12. The combination of Monfraix and Zeng teaches the system of claim 9, but is not relied on for the below claim language. However, Hornstein teaches: wherein the image sensor comprises a light detection and ranging (LIDAR) system configured to generate the depth data.
(Hornstein, [0058] "The depth map can be determined using sensor data (e.g., distance sensor, depth sensor, depth camera, LIDAR, SONAR, etc.)" See also Fig. 8A.)

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply the teachings of Hornstein to the teachings of Monfraix and Zeng such that Hornstein's three-dimensional techniques are used with Monfraix and Zeng's system for the purpose of improving alignment. Hornstein, [0011]-[0013]. Based on the above, this is an example of "combining prior art elements according to known methods to yield predictable results." MPEP 2143.

Claim 16 is rejected as per claim 12.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:

US9645414B2
USRE47637E1
US10169846B2

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID ORANGE, whose telephone number is (571) 270-1799. The examiner can normally be reached Mon-Fri, 9-5.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Gregory Morse, can be reached at 571-272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DAVID ORANGE/
Primary Examiner, Art Unit 2663
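The claim 1 limitations mapped above describe a depth-based seat-positioning loop: read the depth map at the detected eye pixel, convert that to a 3-D eye position, compare against the design eye position, and actuate the seat until the gap closes. A minimal sketch of that loop, useful for seeing the scalar-versus-vector "distance" dispute in the §112(b) rejection; the pinhole back-projection, focal/center parameters, gain, and tolerance are illustrative assumptions, not taken from the application or the cited art:

```python
import numpy as np

def eye_offset(depth_map, eye_px, design_eye_xyz,
               fx=600.0, fy=600.0, cx=320.0, cy=240.0):
    """Depth data at the detected eye pixel gives the sensor-to-eye distance;
    back-projecting gives a 3-D eye position in the sensor frame; the gap to
    the design eye position is returned as a vector, matching the
    (dX, dY, dZ) reading the applicant argued for under 112(b)."""
    r, c = eye_px
    z = depth_map[r, c]                        # depth map value at the eye
    eye_xyz = np.array([(c - cx) * z / fx,     # pinhole back-projection
                        (r - cy) * z / fy,
                        z])
    return design_eye_xyz - eye_xyz            # eye -> design eye position

def actuate(seat_pos, offset, gain=0.5, tol=0.01):
    """One feedback-loop step (cf. claim 6): command the seat a fraction of
    the offset until the eye is within tol of the design eye position."""
    if np.linalg.norm(offset) <= tol:
        return seat_pos, True                  # eye at design eye position
    return seat_pos + gain * offset, False     # actuation reduces the offset
```

Note the return value of `eye_offset` is a 3-vector; whether such a quantity is a "distance" in the ordinary scalar sense is exactly the indefiniteness issue the rejection raises.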

Prosecution Timeline

Apr 12, 2023
Application Filed
Sep 08, 2025
Non-Final Rejection — §101, §103, §112
Dec 09, 2025
Response Filed
Feb 02, 2026
Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12567126
INFRASTRUCTURE-SUPPORTED PERCEPTION SYSTEM FOR CONNECTED VEHICLE APPLICATIONS
2y 5m to grant Granted Mar 03, 2026
Patent 11300964
METHOD AND SYSTEM FOR UPDATING OCCUPANCY MAP FOR A ROBOTIC SYSTEM
2y 5m to grant Granted Apr 12, 2022
Patent 10816794
METHOD FOR DESIGNING ILLUMINATION SYSTEM WITH FREEFORM SURFACE
2y 5m to grant Granted Oct 27, 2020
Patent 10433126
METHOD AND APPARATUS FOR SUPPORTING PUBLIC TRANSPORTATION BY USING V2X SERVICES IN A WIRELESS ACCESS SYSTEM
2y 5m to grant Granted Oct 01, 2019
Patent 10285010
ADAPTIVE TRIGGERING OF RTT RANGING FOR ENHANCED POSITION ACCURACY
2y 5m to grant Granted May 07, 2019
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
34%
Grant Probability
63%
With Interview (+29.4%)
3y 7m
Median Time to Grant
Moderate
PTA Risk
Based on 151 resolved cases by this examiner. Grant probability derived from career allow rate.
