Prosecution Insights
Last updated: April 19, 2026
Application No. 18/461,625

CAMERA POSE ESTIMATION TECHNIQUES

Non-Final OA — §101, §103, §DP

Filed: Sep 06, 2023
Examiner: LEMIEUX, IAN L
Art Unit: 2669
Tech Center: 2600 — Communications
Assignee: TuSimple, Inc.
OA Round: 1 (Non-Final)

Grant Probability: 87% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 4m
With Interview: 97%

Examiner Intelligence

Career Allow Rate: 87% — above average (496 granted / 569 resolved; +25.2% vs TC avg)
Interview Lift: +9.6% across resolved cases with interview (moderate, roughly +10%)
Typical Timeline: 2y 4m average prosecution; 34 applications currently pending
Career History: 603 total applications across all art units

Statute-Specific Performance

§101: 11.2% (-28.8% vs TC avg)
§103: 39.6% (-0.4% vs TC avg)
§102: 19.1% (-20.9% vs TC avg)
§112: 19.4% (-20.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 569 resolved cases.

Office Action

§101, §103, §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-20 are currently pending in U.S. Patent Application No. 18/461,625 and an Office action on the merits follows.

Information Disclosure Statement

The information disclosure statement(s) (IDS), including that recently submitted on 03/27/2025, comply with the provisions of 37 CFR 1.97 and 1.98. Accordingly, the information disclosure statements have been considered. Examiner additionally acknowledges Applicant's compliant written assertion identifying that no IDS size fee under 37 CFR 1.17(v) is due. While the cumulative number of citations exceeds that count (200) associated with 37 CFR 1.17(v)(3), this threshold was exceeded prior to the submission of the most recent IDS, and only the most recent submission falls after the effective date of the rule change (1/19/2025). Reference may be made to question 9 of the associated Quick Reference Guide: the IDS must cause the cumulative count to exceed one or more of the specified thresholds in order to incur an IDS size fee. If the cumulative count already exceeds one of the thresholds before a particular IDS is filed, then that particular IDS will not incur an IDS size fee unless it causes the cumulative count to exceed a higher threshold.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim(s) 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception, in particular an Abstract Idea falling under the (a) mathematical concepts category (mathematical relationships, formulas or equations, and/or calculations), not "integrated into a practical application" at Prong Two of Step 2A and without "significantly more" at Step 2B.

Step 1: The claim(s) in question are directed primarily to a computer-implemented method/process for determining a camera pose (following the "Yes" path at Step 1). Corresponding system and non-transitory CRM claim(s) are congruent in scope, and while featuring generic computer hardware considered under the "apply it" considerations of MPEP 2106.05(f), these claims are also understood to be directed to a machine, manufacture and/or composition of matter for the purposes of analysis at Step 1. (Step 1: Yes).

Step 2A, Prong One: This part of the eligibility analysis evaluates whether the claim recites a judicial exception. As explained in MPEP 2106.04, subsection II, a claim "recites" a judicial exception when the judicial exception is "set forth" or "described" in the claim. Independent/representative claim(s) 1/9/17 rest with "determining a pose of the camera by minimizing a distance…", which may be drawn to the mathematical concepts Abstract Idea grouping, at least because the instant claims explicitly recite (i.e. more than "involving" or being "based on") one or more calculations in view of that explicitly recited "by minimizing a distance…", unlike the claim(s)/findings described in Thales Visionix, Inc. v. United States, 850 F.3d 1343, 1348-49, 121 USPQ2d 1898, 1902-03 (Fed. Cir. 2017). Concerning the mathematical concepts Abstract Idea grouping, Applicant may see MPEP 2106.04(a)(2), and (C) Mathematical Calculations more specifically.
Also, e.g., SAP America, Inc. v. InvestPic, LLC, 898 F.3d 1161, 1163, 127 USPQ2d 1597, 1599 (Fed. Cir. 2018) (holding that claims to a "series of mathematical calculations based on selected information" are directed to abstract ideas). Per MPEP 2106.04(a)(2)(C): a mathematical calculation is a mathematical operation (such as multiplication) or an act of calculating using mathematical methods to determine a variable or number, e.g., performing an arithmetic operation such as exponentiation. There is no particular word or set of words that indicates a claim recites a mathematical calculation; that is, a claim does not have to recite the word "calculating" in order to be considered a mathematical calculation. For example, a step of "determining" a variable or number using mathematical methods or "performing" a mathematical operation may also be considered a mathematical calculation when the broadest reasonable interpretation of the claim in light of the specification encompasses a mathematical calculation. (Step 2A, Prong One: Yes).

Step 2A, Prong Two: This part of the eligibility analysis evaluates whether the claim as a whole integrates the recited judicial exception into a practical application of the exception. This evaluation is performed by (1) identifying whether there are any additional elements recited in the claim beyond the judicial exception, and (2) evaluating those additional elements individually and in combination to determine whether the claim as a whole integrates the exception into a practical application. See MPEP 2106.04(d). Examiner notes for consideration at Prong Two of 2A that MPEP 2106.05(a), (b), (c), and (e) generally concern limitations that are indicative of integration, whereas 2106.05(f), (g), and (h) generally concern limitations that are not indicative of integration. As an additional note, "additional elements" are generally limitations excluded from interpretation under the Abstract Idea groupings, and may comprise portions of limitations otherwise identified as falling under those abstract idea groupings of the 2019 PEG (e.g. the use of a neural network under the "apply it" considerations of 2106.05(f), as further described in the most recent PEG). In view of MPEP 2106.05(f), the words "apply it" or equivalent, serving as a mere implementation of the abstract idea on a generic computer or use of a computer as a tool to perform an abstract idea, fail to integrate at Prong Two of 2A.

While it may be argued that the recited "receiving", "determining" and "obtaining" are "additional elements" precluded from interpretation under any of the Abstract Idea groupings at Prong One of 2A, these limitations appear insufficient for integration when considered at Prong Two of 2A, in view of 2106.05(g) for that receiving, and 2106.05(h) for that determining and obtaining. Recited at a high level of generality, these limitations at best generally link the recited exception to a field of use involving computer vision techniques for detecting image key-points/features and a 3D map database. While not recited in the claim(s), the use of e.g. any neural network/AI model broadly would similarly not serve for integration at Prong Two in view of MPEP 2106.05(f) and (h), as it would at best serve to generally link the claims to a field of use involving the broad use of one or more AI models. Subsection (h), unlike (e), concerns limitations that merely serve to "generally link" the abstract idea to e.g. a particular technological environment or field of use, as distinguished from applying or using the judicial exception in "some other meaningful way beyond generally linking the use of the JE to a particular environment" (see MPEP 2106.05(e)). 2106.05(b) does not apply for the claims in question since there is no explicitly claimed "particular machine" but instead a generic computer, for which MPEP 2106.05(f) is more applicable. Dependent claims are similarly analyzed, and also fail to present limitations successfully integrating at Prong Two of 2A, even when considered in combination, in view of those same considerations identified above. A purportedly novel pose calculation (e.g. one that involves filtering constraints determining which 3D map coordinates are considered) is still a calculation per se, and the claim(s) fall silent regarding any way the calculated pose is subsequently utilized in a manner integrating the claimed exception into a practical application. As the claims rest with calculating said pose, and do not apply or use the JE in some other meaningful/specific way beyond generally linking the use to a particular environment, they appear to tie up/monopolize the exception that is such a calculation. (Step 2A, Prong Two: No; Revised Step 2A: Yes → Step 2B).

Step 2B: This part of the eligibility analysis evaluates whether the claim as a whole amounts to "significantly more" than the recited exception, i.e., whether any "additional element", or combination of additional elements, adds an inventive concept to the claim. The considerations of Step 2A, Prong Two and Step 2B overlap, but differ in that 2B also requires considering whether the claims feature any "specific limitation(s) other than what is well-understood, routine, conventional activity in the field" (WURC) (MPEP 2106.05(d)). Step 2B further requires a re-evaluation of any additional elements drawn to extra-solution activity in Step 2A (i.e. that initial image capture/receipt). Reference may be made to the Step 2B analysis at page 9 of the recent PEG, where additional elements (a) and (f) of Example 47, claim 2 are re-evaluated, as the instant analysis produces similar findings. Limitations not indicative of an inventive concept/"significantly more" include those that are not specifically recited (instead recited at a high level of generality) and/or those that are established as WURC. As identified above with reference to page 33 of the 2024 PEG, an improvement to the Abstract Idea/JE itself does not constitute an "inventive concept"/"significantly more". Similar to the analysis presented above for the case of Prong Two at Step 2A, even when considered in combination, additional elements for the case of the instant claims represent mere instructions to apply an exception (MPEP 2106.05(f)), insignificant extra-solution activity (MPEP 2106.05(g)), and/or are broadly recited at a high level of generality, at best generally linking the use of the judicial exception to one or more environment(s)/field of use (MPEP 2106.05(h)), and therefore do not provide an inventive concept that is clearly distinct from the exception itself. (Step 2B: No).
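[Editor's note: the "minimizing a distance" step the rejection characterizes as a mathematical concept is, in substance, a reprojection-error objective. A generic formulation in the editor's own notation — not quoted from the application or the Office action — is:

$$\hat{T} = \arg\min_{T=(R,\,t)} \sum_{i} \left\lVert \pi\big(K\,(R\,X_i + t)\big) - x_i \right\rVert^2$$

where $X_i$ are the 3D world coordinates of lane-marker corners from the map, $x_i$ the corresponding detected pixel locations, $K$ the camera intrinsics, $T$ the camera pose (rotation $R$, translation $t$), and $\pi$ the perspective projection.]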
Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over corresponding claims of U.S. Patent No. 11,810,322 and/or obvious modifications thereto, similar to the grounds presented in the prior-art based rejections below. The instant claims closely correspond to the original claim set of parent Application 17/225,396, upon which the claims of reference are based. Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of reference anticipate the claims of the instant application and/or require only minimal/obvious modification to teach/suggest all elements recited in the instant claims.
The following additional considerations similarly apply:

• Instant claims and claim(s) of reference recite common subject matter.

• Instant claim(s) recite the open-ended transitional phrase "comprising", and do not preclude those additional elements recited by the claims of reference.

• Language/terminology of the instant claim(s) constituting minor variations from the claims of reference (e.g. "obtaining" vs. "receiving", claim(s) 5/13 vs. claim 10 of reference), if/where present, corresponds to interpretations under plain meaning definitions and/or explicitly disclosed obvious variants/alternatives/equivalents thereto as identified in the corresponding Specification of the reference application, and accordingly serves in identifying permissible interpretation of the claims of reference serving as grounds for nonstatutory double patenting rejection(s). While the disclosure of reference may not be used as prior art (Double Patenting concerns the claims of reference), portions of the specification which provide support for reference claims may also be examined and considered when addressing the scope of the claim(s) of reference and the issue of whether an instant claim defines an obvious variation or falls within the scope of an invention claimed in the claim(s) of reference. See MPEP 804 with reference to In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970).

• Language/terminology of the instant claim(s) otherwise not explicitly recited in the claim(s) of reference (e.g. instant claim 2) constitutes limitations met in view of obvious modification to the claims of reference, for reasons the same as or similar to those presented in the prior art based rejections below – namely that filtering of Zhang et al. (US 2019/0271549 A1) and/or Efland et al. (US 2020/0089973 A1) in view of that motivation presented in the rejection of claim 2. It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the claims of reference to further include such a filtering, as taught/suggested by references of record/disclosure identified in the prior art based rejections below, the motivation(s) being that/those similarly identified below, in further view of the manner in which such a modification to the claims of reference would serve to increase the marketability and/or adoption of that/those systems/methods in the claim(s) of reference, in a manner characterized by a reasonable expectation of success and without undue experimentation.

Examiner recognizes a common preference to hold such Double Patenting rejections (obviousness-type and/or otherwise) in abeyance until an identification of Allowable Subject Matter, however presents such ground(s) in the interest of compact prosecution.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

1. Claims 1, 5-9, and 13-17 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang et al. (US 2019/0271549 A1) in view of Fujii et al. (US 2021/0183099 A1).
As to claim 1, Zhang discloses a method of estimating camera pose (Abs "Camera based localization performed to determine a current pose of an autonomous vehicle", [0007]), comprising:

receiving, by a computer located in a vehicle, an image from a camera located on the vehicle, wherein the image comprises a lane marker on a road (Fig. 11 captured image(s) of 1108, comprising those lane markers 1204/1304, Figures 12-13, Fig. 19 900 captured image);

determining pixel locations of a plurality of corners of the lane marker (Figures 12-13, lane markers 1204/1304, [0115] "each cluster may correspond to a portion of a feature captured in the image 1202 (e.g., a portion of a lane line captured in the image)", "corresponding edge pixels" of e.g. [0012]);

obtaining, from a database (Fig. 5, HD Map 510), three-dimensional (3D) coordinates of the plurality of corners of the lane marker (Fig. 20A 1000, [0008] "each edgel corresponding to a three-dimensional location and a gradient direction", [0072] "The HD map 510 of a geographical region comprises a landmark map (LMap) 520 and an occupancy map (OMap) 530. The landmark map comprises information describing lanes including spatial location of lanes and semantic information about each lane. The spatial location of a lane comprises the geometric location in latitude, longitude and elevation at high precision, for example, at or below 10 cm precision… The landmark map may further comprise information describing stop lines, yield lines, spatial location of cross walks, safely navigable space, spatial location of speed bumps, curb, and road signs comprising spatial location and type of all signage that is relevant to driving restrictions", [0073] "The occupancy map 530 comprises spatial 3-dimensional (3D) representation of the road and all physical objects around the road. The data stored in an occupancy map 530 is also referred to herein as occupancy grid data", [0115] "Edgels within a certain distance from each other (as determined based upon their respective 3D location)", [0111] "where each edgel corresponds to a 3D location and is associated with a gradient direction", [0080] "The lanes represented by the HD map system 100 include lanes that are explicitly marked, for example, white and yellow striped lanes, lanes that are implicit, for example, on a country", [0081], [0088], [0102], [0119] "line geometry is computed for certain groups of edgels, such as line segments in 3D space connecting groups of edgels, as stored as part of the map", etc.); and

determining a pose of the camera (Fig. 20A 1060 optimize pose) by minimizing a distance (Fig. 11 1110 and 1112, Fig. 20A 1050 correspondence between edgels and image pixels, [0007] "identifying a transformation that minimizes a distance between the edgels and their corresponding edge pixels", [0008] "based upon the determined correspondences by determining a transformation that if applied to the subset of edgels minimizes an aggregate distance between the subset of edgels and their corresponding edge pixels", [0126] "For example, the localization system may attempt to find a transformation that minimizes an aggregate distance between the set of projected edgels on the image and their corresponding edge pixels", [0126-0127] "Upon determination a transformation, the localization system determines 1114 if the transformed set of edgels and their corresponding edge pixels have reached a threshold level of convergence. Convergence may refer to a measure of how well the transformed edgels correspond with their corresponding edge pixels. In some embodiments, the convergence may be based upon a value of the energy function in Equation (1) described above", [0134], [0144], etc.) from 3D world coordinates of at least one corner of the lane marker (Fig. 5, HD Map 510, spatial location information from e.g. the OMap as identified above for those lane marking(s) of e.g. [0073], [0080] and corresponding edgels ([0115], [0119]), [0007] "A pose of the vehicle is optimized based upon the determined correspondences by identifying a transformation that minimizes a distance between the edgels and their corresponding edge pixels. The determined transformation can be applied to the initial pose to determine an updated pose of the vehicle") and at least one pixel location of the at least one corner of the lane marker ([0007] "and their corresponding edge pixels", [0013] "analyzing the image frame to identify a plurality of edge pixels within the image frame comprises identifying a portion of the image frame corresponding to ground, and identifying the plurality of edge pixels within the identified portion", etc.).

Zhang fails to explicitly disclose the use of "corner" pixels for the lane marker; however, the lane line features of Zhang are understood to comprise such a corner generally, particularly for those instances of pixels associated with "lane line segments" (broken/dashed centerline markings – see Fig. 9) as disclosed in Zhang. Zhang also arguably discloses such corner points in [0116], despite failing to use the language "corner" and instead referring to "endpoints of the lane line segment": "edgels at the endpoints of the lane line segment may be sufficient for performing localization, since they provide constraints on both dimensions (e.g., x and y directions)". Zhang also identifies the manner in which pixels/image features corresponding to salient, permanent and stationary real-world features make for ideal/suitable comparison/matching markers (e.g. [0102]). Stated differently, PHOSITA would recognize such a corner serves as a salient point of the lane marking (similar in nature to the cited NPL "Monocular Vehicle Self localization method based on Compact Semantic Map", p1 and p2 of Fig. 2; see page 3, left column, top, identifying line landmarks determined with the same method as pole-like landmarks). Zhang also discloses in e.g. [0072-0073] spatial location information corresponding to e.g. those edgels of 1204 of Fig. 12.

Fujii further evidences the obvious nature of a road feature comprising corners of a lane marking ([0136] "In addition, feature points such as corners of lane markings, branching/merging points with other lane markings, and ends of guardrails may also be adopted as reference marks. Points where lanes increase or decrease may also be used as reference marks"). Fujii similarly teaches/suggests the manner in which corners of lane markings serve as salient/high contrast features particularly suited for use as reference/comparison marks.

It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the system and method of Zhang such that the pixel locations and corresponding 3D spatial locations concern one or more corners of lane markers, as compared to lane marker 1204 more broadly, as taught/suggested by Zhang ([0116]) and Fujii, the motivation similarly taught/suggested therein and also obvious to one of ordinary skill in the art: that such a use of a corner point/feature of a lane marking may benefit from known geometrical constraints between such corner points (serving as efficient points since they provide constraints in both the x and y directions, as disclosed in Zhang), while serving as a simple substitution of known/obvious to try salient lane feature alternatives in a manner yielding predictable results with a reasonable expectation of success.
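[Editor's note: the claimed "minimizing a distance" step matches a standard perspective-n-point (PnP) formulation. Below is a minimal sketch — not code from the application or from Zhang/Fujii; the corner values, intrinsics, and variable names are hypothetical — using OpenCV's PnP solver, which estimates the pose minimizing the reprojection distance between projected 3D points and their observed pixels.]

```python
import numpy as np
import cv2

# Hypothetical 3D world coordinates of lane-marker corners from an HD map (meters).
corners_3d = np.array([[10.0, -1.8, 0.0],
                       [13.0, -1.8, 0.0],
                       [13.0, -1.6, 0.0],
                       [10.0, -1.6, 0.0]])

# Hypothetical pixel locations of the same corners detected in the camera image.
corners_px = np.array([[612.0, 401.0],
                       [604.0, 377.0],
                       [611.0, 376.0],
                       [620.0, 400.0]])

# Assumed pre-calibrated camera intrinsics (fx, fy, cx, cy).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

# solvePnP finds the rotation/translation minimizing the reprojection error,
# i.e. the distance between projected 3D corners and their observed pixels.
ok, rvec, tvec = cv2.solvePnP(corners_3d, corners_px, K, None,
                              flags=cv2.SOLVEPNP_ITERATIVE)
R, _ = cv2.Rodrigues(rvec)        # 3x3 rotation, world -> camera
camera_position = -R.T @ tvec     # camera position in world coordinates
print("camera position:", camera_position.ravel())
```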
As to claim 5, Zhang in view of Fujii teaches/suggests the method of claim 1. Zhang in view of Fujii teaches/suggests the method further comprising generating, from the image, a second image that includes a plurality of pixels (Zhang Fig. 14 binary edge map 1402, [0122], [0133] "the edge map corresponds to a binary image corresponding to at least a portion of a captured image, in which a value of 1 indicates the corresponding pixel of the captured image is on an identified edge, and a value of 0 indicates that the corresponding pixel is not on an identified edge", Figures 17 and 18, [0133]), wherein each of the plurality of pixels has a value that is directly related to another distance between a pixel and a corner of the lane marker (Zhang [0122] "Each edge pixel is associated with a gradient indicating an intensity and direction", [0124], [0133] "a distance transform is applied on the edge map", [0133-0134] "Edgels loaded from the OMap (e.g., based upon the initial pose) are projected on the generated distance transform of the binary image. For example, as illustrated in FIG. 18, the edgels 1802 are projected onto each of the distance transforms of the binary maps. The localization system optimizes the pose by determining a transformation that minimizes a value of the distance transform at the pixels corresponding to the projected edgel, where the values indicate, for each edgel, a distance of the pixel corresponding to the edgel to a nearest edge as indicated by the binary map", wherein that "nearest edge" of Zhang comprises one or more corners of the lane marker (Zhang [0116] endpoints of a lane line segment, and as modified above for the case of claim 1)). Interpretation for the limitations of claim(s) 5/13 may be drawn from that support in e.g. [0037] – while the terminology "directly related" and "another distance" does not appear explicitly recited in Applicant's Specification, "directly related to another distance between a pixel and a corner" is understood to be supported by that language "a value of each pixel is a function of a distance between a pixel location in the gray-scale image and the … corner of the lane marker".
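[Editor's note: the distance-transform reading applied to claim 5 above is a standard image-processing operation. A minimal sketch, with assumed corner coordinates and image size (not the applicant's or Zhang's code), of a second image whose pixel values grow with the distance to the nearest lane-marker corner:]

```python
import numpy as np
import cv2

h, w = 720, 1280
# Start from an all-255 mask; zero out the (hypothetical) corner pixels.
mask = np.full((h, w), 255, dtype=np.uint8)
for u, v in [(612, 401), (604, 377), (611, 376), (620, 400)]:
    mask[v, u] = 0

# distanceTransform assigns each pixel its Euclidean distance to the nearest
# zero pixel -- here, a value directly related to the distance to the nearest
# lane-marker corner, analogous to the second image discussed for claim 5.
dist_img = cv2.distanceTransform(mask, cv2.DIST_L2, maskSize=5)
```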
As to claim 6, Zhang in view of Fujii teaches/suggests the method of claim 1. Zhang in view of Fujii further teaches/suggests the method wherein the lane marker has a rectangular shape (Zhang Figures 13-17).

As to claim 7, Zhang in view of Fujii teaches/suggests the method of claim 1. Zhang in view of Fujii further teaches/suggests the method wherein the pose includes values for an orientation and a position of the camera (Zhang [0085] "a location and orientation of the vehicle can be determined", [0112], etc., in view of the fixed relationship between the camera and vehicle of Zhang, and the shared pose accordingly (see Figures); as an additional consideration, a plain meaning definition of pose is understood to inherently/necessarily require both a position and an orientation, and a vehicle pose (given the fixed relationship) would read even if the camera were not oriented in the direction of travel of the vehicle – reference may also be made to the prosecution history associated with parent application 17/225,396, Final mailed 03/16/2023, at page 4, addressing Applicant's "Third" argument seeking to distinguish a camera pose from a vehicle pose without reference to any explicitly recited claim language to this effect).

As to claim 8, Zhang in view of Fujii teaches/suggests the method of claim 1. Zhang in view of Fujii further teaches/suggests the method wherein the pose of the camera is determined as the vehicle is driven on the road ([0050-0051] "The vehicle computing system 120 continuously provides control signals to the vehicle controls 130, thereby causing an autonomous vehicle to drive along a selected route. The vehicle computing system 120 performs various tasks including processing data collected by the sensors as well as map data received from the online HD map system 110. The vehicle computing system 120 also processes data for sending to the online HD map system 110", [0056], [0080] "the HD map system 100 stores a representation of a network of lanes to allow a vehicle to plan a legal path between a source and a destination and to add a frame of reference for real time sensing and control of the vehicle", etc.).

As to claim 9, this claim is the system claim corresponding to the method of claim 1 and is rejected accordingly. As to claims 13-16, these claims are the system claims corresponding to method claims 5-8, respectively, and are rejected accordingly. As to claim 17, this claim is the non-transitory CRM claim corresponding to the method of claim 1 and is rejected accordingly.

2. Claims 2-4, 10-12 and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Zhang et al. (US 2019/0271549 A1) in view of Fujii et al. (US 2021/0183099 A1) and Efland et al. (US 2020/0089973 A1).

As to claim 2, Zhang in view of Fujii teaches/suggests the method of claim 1. Zhang in view of Fujii further teaches/suggests the method wherein the 3D coordinates of the plurality of corners of the lane marker are obtained (Zhang Fig. 11 1102 in conjunction with 1104 and 1106 – loading, clustering and filtering, and downsampling to arrive at that set provided for correspondence determination in 1110, Fig. 20A 1000-1040) by: obtaining, from the database and based on a location of the vehicle, 3D coordinates of corners of each lane marker in a first set of lane markers (Zhang Fig. 11 1102, Fig. 20A 1000 load edgels from within a certain radius around an estimated car position from the OMap, [0113] "Once an initial pose is determined, the localization system loads 1102 a plurality of edgels from a stored map (e.g., an OMap, such as that generated as part of the map creation phase as described above.
In some embodiments, the plurality of edgels may correspond to edgels of the map that are within a certain radius around the location of the initial pose", [0143] "The system loads 2000 edgels from the OMap that are within a certain radius around the position of the initial pose estimate");

Zhang further suggests obtaining, based on a direction in which the vehicle is driven (Zhang Fig. 11 1104 filtering of edgels in view of [0041] "Examples of physical constraints include physical obstacles, such as walls, and examples of legal constraints include legally allowed direction of travel for a lane, speed limits, yields, stops", [0047] "For example, if an autonomous vehicle needs to drive along a route, the vehicle computing system 120 of the autonomous vehicle provides information describing the route being travelled to the online HD map system 110. In response, the online HD map system 110 provides the required HD maps for driving along the route", [0057]), a second set of lane markers that are located within a first pre-determined distance from the vehicle (see "certain radius around" the initial pose for that first set above; as an additional interpretation note, one or more of the sets may overlap). Zhang further discloses the manner in which system 110 is generally motivated by a desire to minimize the amount of map data that is communicated with the vehicle where possible, without a cost to accuracy (see disclosure for receiving updated portions) – however, the loaded edgels of Zhang are suggested as being based on a direction of travel at and/or prior to generating that "first" set equivalent;

obtaining, from the first set of lane markers, a third set of lane markers that are located within a pre-determined field of view of the camera (Zhang [0114] "The localization system clusters and filters 1104 the plurality of loaded edgels. In some cases, not all loaded edgels will be useful for pose optimization. For example, edgels that are invisible to cameras are useless for localization. To identify useful edgels, the loaded edgels may be projected onto the captured camera images (e.g., based on the initial pose estimate). Edgels that are projected to be outside the view of the cameras may be discarded and are not utilized as part of the subsequent steps of the localization process");

obtaining a fourth set of lane markers by removing one or more lane markers from the third set of lane markers located past a second pre-determined distance ([0115] "The localization system may cluster the remaining edgels into one or more clusters, based upon distances between the 3D positions of the edgels. FIG. 12 shows an image illustrating edge clustering results according to an embodiment…") from the location of the vehicle ([0117] "In some embodiments, edgels of a cluster are organized into buckets based on the orientation of their projected gradient vector. For example, edgels having projected gradient vectors with orientations within a certain range (e.g., a particular range of angles) may be grouped into the same bucket. Within each bucket, edgels that are closest to and furthest from the vehicle (based upon the initial pose) are automatically selected and kept. The remaining edgels of each bucket may then be sampled evenly based on their respective distances to the vehicle"); and

Zhang fails to explicitly disclose obtaining a fifth set of lane markers by removing one or more lane markers obstructed by one or more objects on the road from the fourth set of lane markers, wherein the fifth set of lane markers includes the lane marker.
Zhang does, however, disclose in [0120] relying only upon ground edgel pixels for processing, which would exclude those edgels from the map not located on the ground. In terms of the recited order and the subsets named accordingly, Zhang at a minimum suggests an FOV-based filtering, a driving-direction based filtering (in terms of which maps are retrieved/provided for loading edgels therefrom, and semantic map information), an initial pose/location and associated radius based filtering, and a distance-based clustering that accounts for respective distances to the vehicle, even if not necessarily in the order recited (such that each subsequently determined subset is derived from the previous). Examiner notes, however, that such an ordering is not understood to be crucial to the filtering at large, and that filtering in view of those various constraints, in a slightly varied order, would constitute an obvious to try variation not requiring undue experimentation. It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to further modify the system and method of Zhang in view of Fujii accordingly, such that the various forms of filtering disclosed occur in the order recited, as a minor variation from that ordering disclosed.

Efland evidences the obvious nature of filtering a set of one or more lane markers/interaction points by excluding those determined to be obstructed by one or more objects on a road (Fig. 5 506 filter the set of candidate interaction points, [0006] "determining that a first interaction point in the set of candidate interaction points is partially or fully obstructed by the one or more objects; and removing the first interaction point from the set of candidate interaction points", [0024-0025], [0043], etc.). It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to further modify the system and method of Zhang in view of Fujii with additional filtering modifying that of Zhang 1104, to further comprise obtaining that fifth set of lane markers as recited, based on removing one or more lane markers obstructed by one or more objects on the road, as taught/suggested by Efland, the motivation as taught/suggested therein and similarly taught/suggested by the filtering of Zhang more broadly (in selecting those edgels associated with the ground and not otherwise ([0120]), in view of that occlusion disclosed in e.g. [0003]): that such a removal ensures the remaining points for subsequent correspondence analysis may accurately be matched.
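[Editor's note: the staged filtering walked through for claim 2 — radius around the vehicle, camera field of view, maximum range, then occlusion — is straightforward to express in code. A minimal sketch with hypothetical thresholds and a caller-supplied occlusion test; the direction-of-travel step is elided. This illustrates the shape of the claimed pipeline, not code from any cited reference:]

```python
import numpy as np

def filter_lane_markers(markers, vehicle_pos, R_wc, cam_pos, is_occluded,
                        radius=100.0, max_range=60.0, fov_deg=90.0):
    """markers: (N, 3) array of lane-marker corner coordinates from the map.
    R_wc: 3x3 rotation taking world coordinates into the camera frame."""
    # First set: markers within a pre-determined radius of the vehicle.
    d = np.linalg.norm(markers - vehicle_pos, axis=1)
    s1 = markers[d < radius]

    # Third set: markers inside the camera's horizontal field of view.
    rel = (s1 - cam_pos) @ R_wc.T                       # world -> camera frame
    ang = np.degrees(np.arctan2(rel[:, 0], rel[:, 2]))  # bearing off optical axis
    s3 = s1[(rel[:, 2] > 0) & (np.abs(ang) < fov_deg / 2)]

    # Fourth set: drop markers past a second pre-determined distance.
    s4 = s3[np.linalg.norm(s3 - vehicle_pos, axis=1) < max_range]

    # Fifth set: drop markers obstructed by objects on the road.
    return np.array([m for m in s4 if not is_occluded(m)])
```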
As to claim 3, Zhang in view of Fujii and Efland teaches/suggests the method of claim 2. Zhang fails to explicitly disclose the method wherein the third set of lane markers are obtained by removing at least one lane marker that is occluded by landscapes. While Efland discloses filtering accounting for obstacle embodiments that include e.g. vehicles, pedestrians, debris, fire hydrants, etc. (see e.g. [0024]), as potentially distinct from the terrain itself, the teaching of Efland for such a filtering broadly applies to scenarios involving occlusion independent of the cause of such occlusion. Furthermore, Zhang at a minimum suggests reliance upon 1204 to account for occlusion resultant from landscapes in particular, when disclosing e.g. "rolling hills" in [0003]. It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to further modify the system and method of Zhang in view of Fujii and Efland to further comprise additional filtering that accounts for occlusion resultant from landscapes/terrain, as taught/suggested by Efland and Zhang, the motivation being that same motivation presented above for the case of claim 2 – since occluded features, regardless of the source of occlusion (wherein such an occlusion would be common to/characteristic of terrain comprising rolling hills as disclosed in Zhang), would not be available for accurate subsequent correspondence/distance-minimization determinations/processing.

As to claim 4, Zhang in view of Fujii and Efland teaches/suggests the method of claim 2. Zhang in view of Fujii and Efland further teaches/suggests the method wherein the direction in which the vehicle is driven is obtained from an inertial measurement unit (IMU) sensor located on the vehicle (Zhang [0049] "An IMU is an electronic device that measures and reports motion data of the vehicle such as velocity, acceleration, direction of movement, speed, angular rate, and so on using a combination of accelerometers and gyroscopes or other measuring instruments", [0054], etc.).

As to claims 10-12, these claims are the system claims corresponding to method claims 2-4, respectively, and are rejected accordingly. As to claims 18-20, these claims are the non-transitory CRM claims corresponding to method claims 2-4, respectively, and are rejected accordingly.

Additional References

Prior art made of record and not relied upon that is considered pertinent to applicant's disclosure: additionally cited references (see attached PTO-892) otherwise not relied upon above have been made of record in view of the manner in which they evidence the general state of the art.

Inquiry

Any inquiry concerning this communication or earlier communications from the examiner should be directed to IAN L LEMIEUX, whose telephone number is (571) 270-5796. The examiner can normally be reached Mon - Fri 9:00 - 6:00 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chan Park, can be reached at 571-272-7409. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/IAN L LEMIEUX/
Primary Examiner, Art Unit 2669

Prosecution Timeline

Sep 06, 2023 — Application Filed
Apr 17, 2025 — Non-Final Rejection (§101, §103, §DP)
Oct 30, 2025 — Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602825 — Human body positioning method based on multi-perspectives and lighting system
Granted Apr 14, 2026 (2y 5m to grant)

Patent 12592086 — POSE DETERMINING METHOD AND RELATED DEVICE
Granted Mar 31, 2026 (2y 5m to grant)

Patent 12586397 — METHOD AND APPARATUS EMPLOYING FONT SIZE DETERMINATION FOR RESOLUTION-INDEPENDENT RENDERED TEXT FOR ELECTRONIC DOCUMENTS
Granted Mar 24, 2026 (2y 5m to grant)

Patent 12579840 — BEHAVIOR ESTIMATION DEVICE, BEHAVIOR ESTIMATION METHOD, AND RECORDING MEDIUM
Granted Mar 17, 2026 (2y 5m to grant)

Patent 12573086 — CONTROL METHOD, RECORDING MEDIUM, METHOD FOR MANUFACTURING PRODUCT, AND SYSTEM
Granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 87%
With Interview: 97% (+9.6%)
Median Time to Grant: 2y 4m
PTA Risk: Low

Based on 569 resolved cases by this examiner. Grant probability derived from career allow rate.
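[Editor's note: the headline figures are consistent with simple arithmetic on the examiner's career counts. A quick check — an inference from the displayed numbers, not the product's actual formula:]

```python
granted, resolved = 496, 569
allow_rate = granted / resolved          # 0.8717... -> displayed as 87%
with_interview = allow_rate + 0.096      # +9.6% interview lift -> ~97%
print(f"{allow_rate:.1%} base, {with_interview:.1%} with interview")
```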
