Prosecution Insights
Last updated: April 19, 2026
Application No. 18/790,444

LIGHT REFRACTION OR DISPERSION AND LANDMARK BASED NAVIGATION

Non-Final OA — §103
Filed: Jul 31, 2024
Examiner: OVALLE JR., DAVID MESQUITI
Art Unit: 3669
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: The Charles Stark Draper Laboratory Inc.
OA Round: 1 (Non-Final)
Grant Probability: 100% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 0m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 100% — above average (4 granted / 4 resolved; +48.0% vs TC avg)
Interview Lift: +0.0% — minimal (based on resolved cases with interview)
Typical Timeline: 3y 0m avg prosecution; 31 currently pending
Career History: 35 total applications across all art units

Statute-Specific Performance

§101: 7.5% (-32.5% vs TC avg)
§103: 58.1% (+18.1% vs TC avg)
§102: 16.9% (-23.1% vs TC avg)
§112: 16.9% (-23.1% vs TC avg)

TC averages are estimates; based on career data from 4 resolved cases.
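A quick consistency check on the figures above (an editorial aside, not part of the report's data): subtracting each statute's "vs TC avg" delta from its rate recovers the Tech Center baseline the comparison was made against, and it comes out to the same 40.0% for all four statutes.

```python
# Figures restated from the Statute-Specific Performance list above.
rates  = {"101": 7.5, "103": 58.1, "102": 16.9, "112": 16.9}      # examiner rate, %
deltas = {"101": -32.5, "103": 18.1, "102": -23.1, "112": -23.1}  # vs TC avg, %

# Implied Tech Center average for each statute: rate minus delta
tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
# tc_avg == {"101": 40.0, "103": 40.0, "102": 40.0, "112": 40.0}
```

That all four deltas point back to a single 40.0% baseline suggests the dashboard compares each statute against one common Tech Center estimate rather than per-statute averages.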

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Status of Claims

This Office Action is in response to the application filed on 12/11/2025. Claims 1, 3-5, 8-12, and 14-19 are presently pending and are presented for examination.

Election/Restrictions

Claims 2, 6-7, 13, and 20 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to a nonelected invention, there being no allowable generic or linking claim. Applicant timely traversed the restriction (election) requirement in the reply filed on 12/11/2025.

4. Applicant's election with traverse of claims 2, 6-7, 13, and 20 in the reply filed on 12/11/2025 is acknowledged. The traversal is on the ground(s) that these claims will not burden the examiner's search. This is not found persuasive because, even though these claim species pertain to determining the position of a satellite, searching these four distinct and separate species would add a burden to the Examiner, requiring separate search strategies, separate database queries, and separate analysis; therefore, the election of a single species is required. The requirement is still deemed proper and is therefore made FINAL.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 10/29/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Specification

The disclosure is objected to because of the following informality: Paragraph [0070] recites "…correspond to a different cones 600," which should read "…correspond to a different cone 600." Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claim(s) 1 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, "Lane"), and further in view of US20090048780A1 (hereinafter, "Caballero").

9. Regarding claim 1, Lane teaches a system, comprising [Col. 8 Lines 64-67] – [Col. 9 Lines 1-5]: Lane is a navigation system.

a data processing system comprising one or more processors, coupled with memory, to [Col. 9 Lines 33-57], [Col. 16 Lines 32-61]: A processor (206) is implemented [Col. 9 Lines 33-57], as is a memory [Col. 16 Lines 32-61].

11. Lane does not explicitly teach receive a first image of a surface of a planet from a first camera of a vehicle; generate a first position dataset based on the first image and data representing landmarks of the surface of the planet.

However, Caballero teaches receive a first image of a surface of a planet from a first camera of a vehicle [0018], [0025], [0032]: The vehicle (24) takes images of a planet surface (44). Once an image is taken by the imaging sensor (40), that sensor receives the image of the planet surface. This imaging sensor (40) constitutes a camera; for examining purposes, it will be interpreted as a first camera.

generate a first position dataset based on the first image and data representing landmarks of the surface of the planet ([0021] Fig. 2): Caballero teaches receiving an image of the planet surface (44), processing the image to identify edge pixels and angle data, identifying planetary features, and comparing those features to a predefined library of planet surface descriptions (landmarks) to determine a location of the vehicle (24) relative to the planet. This process involves datasets that are determined and compared based on the received image (first image).
Therefore, the entirety of this process, which uses edge pixels and angle data to determine a location of the vehicle (24) relative to the planet, produces a first position dataset.

One of ordinary skill in the art, before the effective filing date of the instant application and with a reasonable expectation of success, would have been motivated to modify the disclosure of Lane with the teachings of Caballero to more accurately determine a position of a satellite in an alternate way, without the use of GPS.

10. Lane teaches receive, by a second camera of the vehicle oriented towards an atmosphere of the planet, a second image that includes light of a celestial body refracted or dispersed by the atmosphere of the planet [Col. 8 Lines 27-60]: Lane teaches a satellite system configured to observe the Earth's atmospheric limb and to capture images of celestial bodies whose light has been refracted by the Earth's atmosphere. The star tracker, a camera, would have to be oriented towards the Earth's limb to capture a star's light through images in which the positions of stars (celestial body) are altered due to the atmospheric refraction of light. For examining purposes, these star field images will be considered a second image. Such imaging includes light originating from celestial bodies that has passed through and been refracted by the planet's atmosphere.

generate, via a celestial body catalog, a second position dataset based at least in part on an amount of refraction or dispersion of the light of the celestial body in the second image; and [Col. 7 Lines 61-67] – [Col. 8 Lines 1-2], [Col. 8 Lines 7-60]: Lane teaches a database (142) that stores a star catalog (celestial body catalog) of known star locations [Col. 7 Lines 61-67] – [Col. 8 Lines 1-2]. The star tracker can calculate a position of the vehicle in space. As discussed above, the star tracker captures a star's light through images in which the positions of stars (celestial body) are altered due to the atmospheric refraction of light [Col. 8 Lines 7-60]. Therefore, a second position dataset can be extracted from the positional information gained as the star tracker calculates a position of the vehicle in space.

determine, based on a filter applied to the first position dataset and the second position dataset, a position and attitude of the vehicle [Col. 9 Lines 33-57], [Col. 10 Lines 37-58]: Lane teaches a navigation filter (240) which uses image data from celestial objects captured by the star tracker component (second position dataset) and data from the inertial measurement unit (IMU) to produce a navigation estimate. The comparison of data is what is important: the navigation filter (240) clearly compares two different types of data. It would have been obvious to input a first position dataset relating to captured landmarks of a surface of a planet (Caballero) to be compared with the star tracker's information to estimate a position and attitude of the vehicle instead of using data from an IMU [Col. 9 Lines 33-57], [Col. 10 Lines 37-58]. A position and attitude of the vehicle is determined based on this filter [Col. 10 Lines 59-67] – [Col. 11 Lines 1-15].

Claim(s) 3 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, "Lane"), and further in view of US20090048780A1 (hereinafter, "Caballero"), and further in view of US20140320341A1 (hereinafter, "Muraki").

12. Regarding claim 3, Lane teaches the system of claim 1, comprising: the data processing system to: determine, based on the filter applied to the first position dataset and the second position dataset,… [Col. 9 Lines 33-57], [Col. 10 Lines 37-58]: As set forth for claim 1, Lane teaches a navigation filter (240) which uses image data from celestial objects captured by the star tracker component (second position dataset) and data from the inertial measurement unit (IMU) to produce a navigation estimate, and it would have been obvious to input a first position dataset relating to captured landmarks of a surface of a planet (Caballero) to be compared with the star tracker's information instead of using data from an IMU. A position and attitude of the vehicle is determined based on this filter [Col. 10 Lines 59-67] – [Col. 11 Lines 1-15].

The modified Lane reference does not explicitly teach …the position of the vehicle in an earth centered inertial coordinate system and an earth centered earth fixed coordinate system.

However, Muraki teaches …the position of the vehicle in an earth centered inertial coordinate system and an earth centered earth fixed coordinate system [0038]-[0039], [0055]-[0057]: Muraki teaches capturing a position of a satellite using earth-centered earth-fixed (ECEF) and earth-centered inertial (ECI) coordinate systems.

Lane and Muraki are analogous art because Lane teaches a navigation filter which compares two sets of data while Muraki teaches using an earth centered inertial coordinate system and an earth centered earth fixed coordinate system. One of ordinary skill would have the motivation to combine Lane and Muraki because both are directed at improving the accuracy of satellite-based estimation and positioning. Muraki's satellite positioning data, expressed in ECEF and ECI coordinate systems, would be an additional input to the filter that Lane uses. The combination would yield a more stable satellite positioning estimate.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Muraki, to modify the teachings of the combination of Lane and Caballero to include the teachings of Muraki, because doing so would allow for more refined positional data of the satellite.

13. Claim(s) 4 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, "Lane"), and further in view of US20090048780A1 (hereinafter, "Caballero"), and further in view of US20110196550A1 (hereinafter, "Carrico").

14. Regarding claim 4, Lane teaches the system of claim 1, comprising: the data processing system to: correct, based on the filter applied to the first position dataset and the second position dataset,…

Lane does not explicitly teach …for a wobble of the planet and for an atmospheric drag of the vehicle without receiving an update parameter from a base station.

However, Carrico teaches …for a wobble of the planet and for an atmospheric drag of the vehicle without receiving an update parameter from a base station ([0006], [0008], [0029], [0032] Fig. 3): Carrico teaches including atmospheric drag in its models [0006], [0008]. Carrico may not explicitly state wobble of the planet, but it does incorporate orbit determination and orbit propagation, which include planetary wobble [0029], [0032]. Both account for planetary wobble because they use Earth orientation values in the models. These values represent variations in the Earth's rotational axis and rotation rate over time, which correspond to the physical wobble of the planet; the satellite needs to remain aligned with the true orientation of the Earth as it changes. Therefore, systems that use orbit determination and orbit propagation incorporate planetary wobble to prevent positional errors that would otherwise occur.

This type of modeling allows the satellite state to be estimated and predicted over time without requiring continuous update parameters from a base station.

Lane and Carrico are analogous art because Lane teaches a navigation filter which compares two sets of data while Carrico teaches generating models that take into account atmospheric drag and planetary wobble. One of ordinary skill would have the motivation to combine Lane and Carrico because both are directed to improving the accuracy of a navigation state of the satellite by incorporating modeled errors into a filter. It would have been obvious to incorporate the orbit propagation outputs as an additional modeled input to Lane's navigation filter. The combination would therefore obtain more precise and stable satellite positioning data.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Carrico, to modify the teachings of the combination of Lane and Caballero to include the teachings of Carrico to obtain more precise positional data.

15. Claim(s) 5 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, "Lane"), and further in view of US20090048780A1 (hereinafter, "Caballero"), and further in view of US20250382072A1 (hereinafter, "Rubel"), and further in view of US20200025571A1 (hereinafter, "Skilton").

16. Regarding claim 5, the modified Lane reference does not explicitly teach the system of claim 1, comprising: the data processing system to: receive a third image captured at a first point in time and a fourth image captured at a second point in time from the second camera or a third camera, the third image including a first star and a first planet, the fourth image including a second star and a second planet; determine a third position dataset based on the third image and the fourth image; and determine, based on the filter applied to the first position dataset, the second position dataset, and the third position dataset, the position and the attitude of the vehicle.

However, Rubel teaches receive a third image captured at a first point in time and a fourth image captured at a second point in time from the second camera or a third camera, the third image including a first star and a first planet, the fourth image including a second star and a second planet [0026], [0055], [0058], [0077]-[0078]: Rubel teaches capturing a plurality of images of planets [0055] or stars [0058] over time [0026] using a payload sensor (405) and secondary imaging systems such as sensors (410 & 415). These multiple sensors indicate that this spacecraft (400) has multiple cameras. For examining purposes, the payload sensor (405) will be interpreted as a second camera and the secondary imaging sensors as a third (410) and fourth camera (415). This includes capturing a first image at a first time (third image) and a subsequent image (fourth image) at a later time. Celestial objects such as stars and planets are identified in both images [0055], [0077]-[0078], and their relative positions are compared across the multiple image frames. Whatever star was captured first will be considered the first star, and whatever planet was captured first will be considered the first planet.
Because the images are acquired sequentially, the first image corresponds to a first point in time and the second image corresponds to a second point in time.

determine a third position dataset based on the third image and the fourth image; and [0058], [0077]-[0078]: Because the first (third image) and second image (fourth image) are compared, the comparison data of these images, and the use of that data to determine a position of the satellite, can be considered a third position dataset.

The modified Lane reference does not explicitly teach determine, based on the filter applied to the first position dataset, the second position dataset, and the third position dataset, the position and the attitude of the vehicle.

However, Skilton teaches determine, based on the filter applied to the first position dataset, the second position dataset, and the third position dataset, the position and the attitude of the vehicle ([0026], [0032], [0056], [0059] Fig. 1 & 3): Skilton teaches a navigation filter (110) that intakes three different types of datasets (Fig. 1); this intake of three different dataset types is what is important. The navigation filter (110) already includes a terrain unit (130), which uses the terrain of the planet to estimate a position based on a correlation between measured profile data and stored terrain profile data, i.e., using landmarks in these terrain profiles [0032], and a star tracker unit (140) that uses the Earth's atmosphere to determine light refraction [0026]. Therefore, it would have been obvious to make these three inputs the first position dataset that Caballero teaches, the second position dataset that Lane teaches, and the third position dataset that Rubel teaches to determine a position and attitude of the satellite ([0056], [0059] Fig. 3).

Rubel and Skilton are analogous art to Lane because Rubel teaches comparing a plurality of images of stars and planets taken at different points in time while Skilton teaches a filter that intakes three different types of datasets. One of ordinary skill would have had the motivation to combine Rubel and Skilton because both references are directed to improving satellite navigation accuracy. A person of ordinary skill would recognize that Rubel constitutes an additional measurement that could be input into Skilton's navigation filter. Skilton teaches that navigation accuracy is improved by comparing and fusing multiple different data types within a filter framework, and Rubel provides an independent source of position-relevant information derived from sequential imaging of celestial bodies.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Rubel and Skilton, to modify the teachings of the combination of Lane and Caballero to include the teachings of Rubel and Skilton to improve the accuracy of determining a satellite's position in space.

17. Claim(s) 8 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, "Lane"), and further in view of US20090048780A1 (hereinafter, "Caballero"), and further in view of US20250382072A1 (hereinafter, "Rubel"), and further in view of US20170076137A1 (hereinafter, "Gordley").

18. Regarding claim 8, Lane teaches generate the second position dataset…and the second position dataset… [Col. 8 Lines 7-60]: The star tracker can calculate a position of the vehicle in space. As discussed above, the star tracker captures a star's light through images in which the positions of stars (celestial body) are altered due to the atmospheric refraction of light.
Therefore, a second position dataset can be extracted from the positional information gained as the star tracker calculates a position of the vehicle in space.

However, Lane does not explicitly teach the system of claim 1, comprising: the data processing system to: receive an altitude of the vehicle; select between a co-sighting technique or at least one of a stellar horizon atmospheric dispersion or stellar horizon atmospheric refraction technique based on the altitude; …based on the stellar horizon atmospheric dispersion or the stellar horizon atmospheric refraction technique and apply the filter to the first position dataset…responsive to a selection of the stellar horizon atmospheric dispersion or the stellar horizon atmospheric refraction technique; and generate a third position dataset based on the co-sighting technique and apply the filter to the first position dataset and the third position dataset responsive to a selection of the co-sighting technique.

However, Caballero teaches …to the first position dataset…to the first position dataset… ([0021] Fig. 2): Caballero teaches receiving an image of the planet surface (44), processing the image to identify edge pixels and angle data, identifying planetary features, and comparing those features to a predefined library of planet surface descriptions (landmarks) to determine a location of the vehicle (24) relative to the planet. This process involves datasets that are determined and compared based on the received image (first image). Therefore, the entirety of this process can be considered a first position dataset.

The modified Lane reference still does not explicitly teach the remaining limitations recited above.

However, Rubel teaches generate a third position dataset…and the third position dataset… [0058], [0077]-[0078]: Because the first (third image) and second image (fourth image) are compared, the comparison data of these images, and the use of that data to determine a position of the satellite, can be considered a third position dataset.

The modified Lane reference still does not explicitly teach receiving an altitude of the vehicle or selecting between the co-sighting technique and the stellar horizon atmospheric dispersion or refraction technique based on the altitude.

However, Gordley teaches the system of claim 1, comprising: the data processing system to: receive an altitude of the vehicle ([0019] Fig. 2): A system (10) includes a platform (12) located about Earth to carry out refraction angle mapping; to carry out this mapping, the platform (12) must receive an altitude.

select between a co-sighting technique or at least one of a stellar horizon atmospheric dispersion or stellar horizon atmospheric refraction technique based on the altitude; …based on the stellar horizon atmospheric dispersion or the stellar horizon atmospheric refraction technique and apply the filter…responsive to a selection of the stellar horizon atmospheric dispersion or the stellar horizon atmospheric refraction technique; and …based on the co-sighting technique and apply the filter…responsive to a selection of the co-sighting technique [0018], [0021]-[0022], [0026]-[0027]: Gordley teaches using atmospheric horizon refraction tied to altitude-dependent atmospheric properties. Gordley discloses a star tracker that takes images of celestial bodies through the Earth's atmospheric limb; a star passing behind the upper atmosphere appears shifted from its true position due to refraction of the starlight as it traverses atmospheric layers. The magnitude of this refraction depends on atmospheric density, which varies with altitude, and the system can directly measure the refractive displacement of a known star near the horizon [0017], [0021]-[0022]. This type of atmospheric horizon refraction measurement can be applied to the navigation filter mentioned by Skilton above.
Lane, Caballero, Rubel, and Gordley are analogous art. Lane teaches a star tracker that measures light refracted by the Earth's atmosphere to determine a position of a satellite (a second position dataset). Caballero teaches using images of a planet's surface, compared against a predefined repository of landmarks on that surface, to determine a position of the satellite (a first position dataset). Rubel teaches comparing sequentially captured images of celestial bodies to determine a position of a space vehicle (a third position dataset). Gordley teaches using altitude-dependent atmospheric horizon refraction to measure the displacement of a star.

Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Rubel and Gordley, to modify the teachings of the combination of Lane and Caballero to include the teachings of Rubel and Gordley to achieve further accuracy in determining a satellite's position using light refraction from the Earth's atmosphere.

19. Claim(s) 9 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, "Lane"), and further in view of US20090048780A1 (hereinafter, "Caballero"), and further in view of US20250382072A1 (hereinafter, "Rubel"), and further in view of US6478260B1 (hereinafter, "Rice").

20. Regarding claim 9, Lane teaches determine the second position dataset…the second position dataset… [Col. 8 Lines 7-60]: The star tracker can calculate a position of the vehicle in space. As discussed above, the star tracker captures a star's light through images in which the positions of stars (celestial body) are altered due to the atmospheric refraction of light. Therefore, a second position dataset can be extracted from the positional information gained as the star tracker calculates a position of the vehicle in space.

…to determine the position and the attitude of the vehicle [Col. 7 Lines 61-65] – [Col. 8 Lines 1-2], [Col. 8 Lines 7-14]: Lane teaches acquiring the position [Col. 8 Lines 7-14] and attitude [Col. 7 Lines 61-65] – [Col. 8 Lines 1-2] of the vehicle in space.

Lane does not explicitly teach the system of claim 1, comprising: the data processing system to: identify a constellation of celestial bodies in the second image based on the celestial body catalog, the constellation of celestial bodies including the celestial body and a second celestial body; determine an error between the constellation of celestial bodies in the second image and data the celestial body catalog; …based at least in part on the error; and apply the filter…determined based on the error…

However, Caballero teaches …to the first position dataset and… ([0021] Fig. 2): Caballero teaches receiving an image of the planet surface (44), processing the image to identify edge pixels and angle data, identifying planetary features, and comparing those features to a predefined library of planet surface descriptions (landmarks) to determine a location of the vehicle (24) relative to the planet. This process involves datasets that are determined and compared based on the received image (first image). Therefore, the entirety of this process can be considered a first position dataset.
The modified Lane reference does not explicitly teach the system of claim 1, comprising: the data processing system to: identify a constellation of celestial bodies in the second image based on the celestial body catalog, the constellation of celestial bodies including the celestial body and a second celestial body; determine an error between the constellation of celestial bodies in the second image and data the celestial body catalog; …based at least in part on the error; and apply the filter…determined based on the error… However, Rubel teaches the system of claim 1, comprising: the data processing system to: identify a constellation of celestial bodies in the second image based on the celestial body catalog, the constellation of celestial bodies including the celestial body and a second celestial body [0077], [0092]; Rubel teaches comparing a captured image to a database (catalog) of known star constellations (constellation of celestial bodies). The image captured could have more than one celestial body depending on how full of stars and celestial bodies are within the image. Therefore, a second celestial body can be identified. This can be applied to the second image that Lane captures. The modified Lane reference does not explicitly teach determine an error between the constellation of celestial bodies in the second image and data the celestial body catalog; …based at least in part on the error; and apply the filter…determined based on the error to determine the position and the attitude of the vehicle. However, Rice teaches determine an error between the constellation of celestial bodies in the second image and data the celestial body catalog; …based at least in part on the error; and apply the filter…determined based on the error… (Fig. 2 – 3). Rice teaches determining an error between an observed constellation in an image and a stored celestial catalog through its star tracker-based attitude estimation process. 
The star tracker (22), which resides in the satellite (12), captures a field of view (FOV) (28) containing multiple stars and identifies the imaged stars by matching their observed positions to entries in the star catalog (26) [Col. 3 Lines 45 – 59]. Using the star catalog (26), the system generates predicted star direction vectors based on an estimated vehicle attitude and compares those predicted directions with the measured star directions derived from the captured image. The differences between the measured star vectors and the catalog-derived predicted star vectors form measurement residuals that are used to correct and refine the attitude of the satellite (12). Therefore, Rice teaches comparing an imaged constellation to a star catalog (26) and determining an error between the two in the form of differences between observed and expected star positions. The filter from Skilton can be applied in order to intake this data. Lane, Caballero, Rubel, and Rice are analogous art: Lane teaches a star tracker that measures light from the Earth’s atmosphere to determine a position of a satellite, which is a second position dataset; Caballero teaches using images of a planet’s surface and comparing those images to a predefined repository of landmarks on that planet’s surface to determine a position of the satellite, which can be considered a first position dataset; Rubel teaches comparing sequentially captured images of celestial bodies to determine a position of a space vehicle, which can be considered a third position dataset; and Rice teaches capturing images with a boresight to compare the observed star positions with the predicted star directions.
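The residual-based attitude refinement attributed to Rice above can be illustrated with a minimal sketch. The one-axis rotation model, the small-angle update rule, and all variable names are assumptions made for illustration, not Rice's disclosed algorithm:

```python
# Illustrative sketch: catalog star vectors are rotated by the estimated
# attitude to form predictions; differences from measured vectors are the
# measurement residuals used to refine the attitude (here, a single z-axis
# rotation angle, a simplifying assumption).
import math

def rotate_z(v, theta):
    """Rotate a 3-vector about the z-axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    x, y, z = v
    return (c * x - s * y, s * x + c * y, z)

def attitude_residuals(catalog_vecs, measured_vecs, theta_est):
    """Measured star vectors minus catalog-derived predictions under the
    current attitude estimate."""
    predicted = [rotate_z(v, theta_est) for v in catalog_vecs]
    return [tuple(m - p for m, p in zip(mv, pv))
            for mv, pv in zip(measured_vecs, predicted)]

# True attitude is 0.010 rad; the estimate starts at 0.008 rad.
catalog = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
theta_true, theta_est = 0.010, 0.008
measured = [rotate_z(v, theta_true) for v in catalog]

# One small-angle correction step: for a z-rotation, the residual of a star
# near the x-axis is approximately (-dtheta*y, dtheta*x, 0), so
# dtheta ~= residual_y / predicted_x.
res = attitude_residuals(catalog, measured, theta_est)
d_theta = res[0][1] / rotate_z(catalog[0], theta_est)[0]
theta_est += d_theta
```

After one update the estimate lands essentially on the true attitude, which is the corrective role the rejection assigns to Rice's residuals.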
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Lane and Caballero to include the teachings of Rubel and Rice in order to further minimize error when determining a satellite’s position in space using celestial bodies as an anchor. 21. Claim(s) 10 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, “Lane”), and further in view of US20090048780A1 (hereinafter, “Caballero”), and further in view of US8767072B1 (hereinafter, “Rosenwinkel”), and further in view of US20250382072A1 (hereinafter, “Rubel”), and further in view of US20210108922A1 (hereinafter, “Dawson”). 22. Regarding claim 10, Lane teaches …the second position dataset… [Col. 8 Lines 7 – 60] The star tracker can calculate a position of the vehicle in space. As discussed above, the star tracker captures a star’s light through images in which the positions of stars (celestial body) are altered due to the atmospheric refraction of light. Therefore, a second position dataset can be extracted from the positional information that is gained as the star tracker calculates a position of the vehicle in space.
Lane does not explicitly teach the system of claim 1, comprising: the data processing system to: compare a first spectrum of the light of the celestial body of the second image captured by the second camera at a first point in time to a plurality of spectrums of the light of the celestial body at a plurality of altitudes; determine a first cone associated with the position of the vehicle based on the comparison of the first spectrum of the light to the plurality of spectrums; compare a second spectrum of the light of the celestial body of the second image captured by the second camera at a second point in time to the plurality of spectrums of the light of the celestial body at the plurality of altitudes; determine a second cone associated with a second position of the vehicle based on the comparison of the second spectrum of the light to the plurality of spectrums; determine the first position dataset based on an intersection of the first cone and the second cone; and determine, based on the filter applied to the first position dataset and…determined based on the intersection of the first cone and the second cone, the position and the attitude of the vehicle. However, Caballero teaches determine the first position dataset…to the first position dataset and… ([0021] Fig. 2) Caballero teaches receiving an image of the planet surface (44), processing the image to identify edge pixels and angle data, identifying planetary features from that data, and comparing those features to a predefined library of planet surface descriptions (landmarks) to determine a location of the vehicle (24) relative to the planet. This process involves datasets that are determined and compared based on the received image (first image). Therefore, the entirety of this process can be considered a first position dataset.
The modified Lane reference does not explicitly teach the system of claim 1, comprising: the data processing system to: compare a first spectrum of the light of the celestial body of the second image captured by the second camera at a first point in time to a plurality of spectrums of the light of the celestial body at a plurality of altitudes; determine a first cone associated with the position of the vehicle based on the comparison of the first spectrum of the light to the plurality of spectrums; compare a second spectrum of the light of the celestial body of the second image captured by the second camera at a second point in time to the plurality of spectrums of the light of the celestial body at the plurality of altitudes; determine a second cone associated with a second position of the vehicle based on the comparison of the second spectrum of the light to the plurality of spectrums; …based on an intersection of the first cone and the second cone; and determine, based on the filter applied…determined based on the intersection of the first cone and the second cone, the position and the attitude of the vehicle. However, Rosenwinkel teaches the system of claim 1, comprising: the data processing system to: compare a first spectrum of the light of the celestial body…to a plurality of spectrums of the light of the celestial body at a plurality of altitudes; …based on the comparison of the first spectrum of the light to the plurality of spectrums; compare a second spectrum of the light of the celestial body…to the plurality of spectrums of the light of the celestial body at the plurality of altitudes; …based on the comparison of the second spectrum of the light to the plurality of spectrums [Col. 2 Lines 46 – 67] – [Col. 3 Lines 1 – 9], [Col. 5 Lines 8 – 53], [Col. 7 Lines 47 – 64]; Rosenwinkel teaches obtaining light from stars (celestial body) in multiple frequency bands and comparing those frequency bands together. 
Rosenwinkel captures two images of each star using sensors configured to detect different light frequencies, such as visible and infrared, which are parts of the spectrum of light, and determines the displacement between the first and second frequencies for each star [Col. 5 Lines 8 – 53], [Col. 7 Lines 47 – 64]. Because each frequency band represents a different portion of the star’s spectrum and each is refracted by the atmosphere differently [Col. 2 Lines 46 – 67] – [Col. 3 Lines 1 – 9], the system effectively compares a first spectrum of light of the celestial body to additional spectra obtained at other frequencies. One of ordinary skill would recognize that the atmospheric refraction models used in Rosenwinkel incorporate atmospheric height. Refraction varies as altitude increases or decreases due to density, temperature, and pressure. Therefore, the predicted displacement patterns correspond to changes in altitude due to different atmospheric heights. The modified Lane reference does not explicitly teach …of the second image captured by the second camera at a first point in time… determine a first cone associated with the position of the vehicle based on the comparison of the first spectrum of the light to the plurality of spectrums; …of the second image captured by the second camera at a second point in time… determine a second cone associated with a second position of the vehicle based on the comparison of the second spectrum of the light to the plurality of spectrums; …based on an intersection of the first cone and the second cone; and determine, based on the filter applied…determined based on the intersection of the first cone and the second cone, the position and the attitude of the vehicle.
However, Rubel teaches …of the second image captured by the second camera at a first point in time… …of the second image captured by the second camera at a second point in time… ([0055], [0077] – [0078]). Rubel teaches capturing a first image at a first time (third image) and a subsequent image (fourth image) at a later time. Celestial objects such as stars and planets are identified in both images [0055], [0077] – [0078] and their relative positions are compared across the multiple image frames. The star captured first is considered the first star, and the planet captured first is considered the first planet. Because the images are acquired sequentially, the first image corresponds to a first point in time and the second image corresponds to a second point in time. The modified Lane reference does not explicitly teach determine a first cone associated with the position of the vehicle… determine a second cone associated with a second position of the vehicle… …based on an intersection of the first cone and the second cone; and determine, based on the filter applied…determined based on the intersection of the first cone and the second cone, the position and the attitude of the vehicle. However, Dawson teaches determine a first cone associated with the position of the vehicle… determine a second cone associated with a second position of the vehicle… …based on an intersection of the first cone and the second cone; and determine, based on the filter applied…determined based on the intersection of the first cone and the second cone, the position and the attitude of the vehicle ([0140], [0240] – [0241] Fig. 21). Dawson teaches observing refraction from a vehicle to indicate where that vehicle is located on the cone (2100). The vehicle can determine its position by repeating this same type of observation on stars in different directions and solving for intersections of the various cones ([0240] – [0241] Fig. 21).
Therefore, a second cone is determined in order to find the vehicle’s position and attitude [0140]. This type of data can be input into the navigation filter that Lane teaches, as mentioned in claim 1. Lane, Caballero, Rosenwinkel, Rubel, and Dawson are analogous art: Lane teaches a star tracker that measures light from the Earth’s atmosphere to determine a position of a satellite, which is a second position dataset; Caballero teaches using images of a planet’s surface and comparing those images to a predefined repository of landmarks on that planet’s surface to determine a position of the satellite, which can be considered a first position dataset; Rosenwinkel teaches obtaining light from stars in multiple frequency bands and comparing those frequency bands; Rubel teaches comparing sequentially captured images of celestial bodies to determine a position of a space vehicle; and Dawson teaches generating multiple cones and using those cone intersections to determine a vehicle’s position and attitude in space. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Lane and Caballero to include the teachings of Rosenwinkel, Rubel, and Dawson in order to more accurately determine the position of the space vehicle using the captured images, incorporating spectrums of light and cones. 23. Claim(s) 11 is rejected under 35 U.S.C. 103 as being unpatentable over US9702702B1 (hereinafter, “Lane”), and further in view of US20090048780A1 (hereinafter, “Caballero”), and further in view of US20150298827A1 (hereinafter, “Nguyen”). 24.
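The cone-intersection geometry attributed to Dawson above reduces, at its core, to a standard spherical problem. The sketch below solves a direction-cone analogue of it, finding the unit vectors common to two cones, and is offered as an illustrative assumption about the underlying math, not as Dawson's disclosed method:

```python
# Illustrative sketch: each observation constrains the solution to a cone
# about an axis a_i with half-angle alpha_i (dot(u, a_i) = cos(alpha_i));
# two cones generally intersect in two candidate directions.
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cone_intersection(a1, alpha1, a2, alpha2):
    """Unit vectors lying on both cones, written as u = p*a1 + q*a2 + r*(a1 x a2).
    The axes a1, a2 are assumed to be unit vectors and non-parallel."""
    c1, c2 = math.cos(alpha1), math.cos(alpha2)
    g = dot(a1, a2)
    denom = 1.0 - g * g
    p = (c1 - g * c2) / denom          # enforces dot(u, a1) = c1
    q = (c2 - g * c1) / denom          # enforces dot(u, a2) = c2
    n = cross(a1, a2)
    r2 = (1.0 - (p * p + q * q + 2 * p * q * g)) / dot(n, n)  # enforces |u| = 1
    r = math.sqrt(max(r2, 0.0))
    base = tuple(p * x + q * y for x, y in zip(a1, a2))
    return (tuple(b + r * nn for b, nn in zip(base, n)),
            tuple(b - r * nn for b, nn in zip(base, n)))

# Hypothetical check: two cones about the x- and y-axes whose common
# direction is (1, 1, 1) / sqrt(3).
s = 1.0 / math.sqrt(3.0)
u_plus, u_minus = cone_intersection((1.0, 0.0, 0.0), math.acos(s),
                                    (0.0, 1.0, 0.0), math.acos(s))
```

A third observation (or prior knowledge) would pick between the two candidates, which is consistent with the rejection's reliance on repeating the observation on stars in different directions.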
Regarding claim 11, the modified Lane reference does not explicitly teach the system of claim 1, comprising: the first camera, the first camera coupled with the vehicle and oriented towards the planet below the vehicle; and the second camera, the second camera coupled with the vehicle and oriented towards a horizon of the planet. However, Nguyen, in the same field of endeavor, teaches the system of claim 1, comprising: the first camera, the first camera coupled with the vehicle and oriented towards the planet below the vehicle; and the second camera, the second camera coupled with the vehicle and oriented towards a horizon of the planet ([0034] Fig. 1). Nguyen teaches having a sensor on each mount reserved for looking at Earth’s horizon and Earth itself. Such horizon sensors include optical collection elements configured to capture Earth’s electromagnetic radiation information across a field of view (FOV). One of ordinary skill would consider these sensors to be optical cameras. Therefore, Nguyen has two optical cameras: one facing towards Earth and another facing towards Earth’s horizon. One of ordinary skill in the art, before the effective filing date of the instant application and with a reasonable expectation of success, would have been motivated to modify the disclosure of the modified Lane reference with the teachings of Nguyen to improve time efficiency by coupling two cameras to the space vehicle rather than repeatedly adjusting a single camera. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID MESQUITI OVALLE JR. whose telephone number is (571)272-6229. The examiner can normally be reached Monday - Friday 7:30am - 5pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Erin Piateski can be reached on (571) 270-7429. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /DAVID MESQUITI OVALLE/Examiner, Art Unit 3669 /Erin M Piateski/Supervisory Patent Examiner, Art Unit 3669
Prosecution Timeline

Jul 31, 2024: Application Filed
Feb 23, 2026: Non-Final Rejection — §103 (current)

Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 100%
With Interview (+0.0%): 99%
Median Time to Grant: 3y 0m
PTA Risk: Low

Based on 4 resolved cases by this examiner. Grant probability derived from career allow rate.