Prosecution Insights
Last updated: April 19, 2026
Application No. 18/148,703

METHOD AND APPARATUS FOR DETECTING LANE LINE

Non-Final OA (§103, §112)
Filed
Dec 30, 2022
Examiner
MORFORD, ALEXANDRA ROBYN
Art Unit
3658
Tech Center
3600 — Transportation & Electronic Commerce
Assignee
Huawei Technologies Co., Ltd.
OA Round
3 (Non-Final)
Grant Probability: 57% (Moderate)
OA Rounds: 3-4
To Grant: 2y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 57% — grants 57% of resolved cases (4 granted / 7 resolved; +5.1% vs TC avg)
Interview Lift: strong, +60.0% for resolved cases with an interview
Typical timeline: 2y 4m avg prosecution
Career history: 48 total applications across all art units (41 currently pending)

Statute-Specific Performance

§101: 16.8% (-23.2% vs TC avg)
§103: 40.5% (+0.5% vs TC avg)
§102: 14.3% (-25.7% vs TC avg)
§112: 27.4% (-12.6% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 7 resolved cases
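The statute-specific figures above are internally consistent: each rate minus its delta implies the same Tech Center average, and the 57% career allow rate is 4 grants out of 7 resolved cases. A quick sanity check (figures copied from the panel; the 40% TC average is inferred from the deltas, not stated on the page):

```python
# Examiner allow rates by statute and deltas vs the Tech Center average,
# copied from the Statute-Specific Performance panel above.
rates  = {"101": 16.8, "103": 40.5, "102": 14.3, "112": 27.4}
deltas = {"101": -23.2, "103": 0.5, "102": -25.7, "112": -12.6}

# rate - delta should recover the same implied TC average for every statute.
implied = {s: round(rates[s] - deltas[s], 1) for s in rates}
assert set(implied.values()) == {40.0}  # every statute implies a 40.0% TC average

# Career allow rate: 4 granted out of 7 resolved cases.
assert round(4 / 7 * 100) == 57
```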

Office Action

§103 §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Status of Claims

Claims 1-14 and 21-26 are currently pending and are being examined herein. Claims 1, 8, and 21 are amended. Claims 15-20 are cancelled.

Joint Inventors

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Information Disclosure Statements

The information disclosure statement (IDS) submitted on 5 January 2026 has been considered by the Examiner.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114.
Applicant's submissions filed on 29 December 2025 and 20 January 2026 have been entered.

Response to Amendment / Remarks

Any reference to the prior office action refers to the final rejection dated 20 October 2025. All objections from the prior office action are withdrawn in view of the amendments. All rejections under 35 U.S.C. 101 from the prior office action are withdrawn because the amended claims recite significantly more / a practical application (“automatically adjusting a driving operation of the vehicle based on the fitted lane line”). All other applicant arguments under 35 U.S.C. 101 have been fully considered, but are moot in view of the withdrawal of the rejections under 35 U.S.C. 101. Applicant’s arguments with respect to the rejections under 35 U.S.C. 103 from the prior office action have been fully considered; however, all arguments are moot because the new ground of rejection under 35 U.S.C. 103 (see below) does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Objections

The claims are objected to because of the following informalities: Claims 1, 8, and 21: “the lane line reflection points based on the road edge coordinate system that uses road edge line as reference line” should be “the lane line reflection points based on the road edge coordinate system that uses the road edge line as the reference line”. Claims 6, 13, and 26: “a reference line” should be “[[a]] the reference line”. Appropriate corrections are required.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C.
112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

Claims 1-14 and 21-26 are rejected under 35 U.S.C. 112(a) as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, at the time the application was filed, had possession of the claimed invention. Claims 1, 8, and 21 have been amended to recite “automatically adjusting a driving operation of the vehicle based on the fitted lane line”; there is no support in the original disclosure for this limitation; therefore this is new matter. The original disclosure does support adjusting certain specific driving operations; to overcome this rejection applicant could amend to claim one or more of the specific driving operations supported in the original disclosure; for example, adjusting speed or adjusting steering angle (see at least paragraphs [0072]-[0074] of the specification). Claims 2-7, 9-14, and 22-26 are rejected for being dependent on a rejected claim. Appropriate corrections are required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-14 and 21-26 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Pub. No. 2017/0247029 (hereinafter, Watanabe) in view of U.S. Pub. No. 2007/0276599 (hereinafter, Ogawa).
Regarding Claim 1, Watanabe discloses A method for improving a vehicle operation based on detecting a lane line (see at least [0037] and [0043]: “The travel control apparatus 1 has a function of performing a coordinate transformation such that a position of an object and a position of lane boundary line indicated in the usual plane coordinate system are indicated on a coordinate system (lane coordinate system) different from the usual plane coordinate system, and a function of generating the target travel trajectory”; “The LIDAR detects the object around the vehicle 2 and the environment on the road such as the lane boundary lines 100 to 103 using light”), comprising: scanning, by a LIDAR, a surrounding environment of a vehicle, to obtain lane line candidate reflection points and road edge information (see at least [0040], [0043], and [0057]: “The LIDAR detects the object around the vehicle 2 and the environment on the road such as the lane boundary lines 100 to 103 using light”); establishing, by a processor, a road edge coordinate system based on the road edge information (see at least [0037]-[0038], [0046]-[0047], [0058], and Fig. 3: “The lane coordinate system is a coordinate system associated with a shape of the lane. Specifically, the lane coordinate system is a coordinate system in which a center line of the lane is a first coordinate axis and an axis orthogonal to the first coordinate axis is a second coordinate axis”; “The lane position recognition unit 11 recognizes the positions of the lane boundary lines 101 and 103 based on the result of detection using the external sensor 31. As a specific example, the lane position recognition unit 11 recognizes the positions of the lane boundary lines 101 and 103 on the XY plane using a function (so-called a sensor fusion function) of combining the image information from the camera, the object information from the radar, and the object information from the LIDAR. 
The lane position recognition unit 11 may acquire the coordinate positions of the entire of the detection points detected as the lane boundary lines 101 and 103, or may sample a representative point.”); extracting, by the processor, a calculated result from scanned data in the road edge coordinate system, wherein the road edge coordinate system is based on road edge information using road edge line as reference line (see at least [0047], [0050]-[0052], and Fig. 4: “The lane position recognition unit 11 also recognizes a position of a lane center line K, a lane width, a curvature (a shape of the lane) by recognizing the positions of the lane boundary lines 101 and 103”; “The transformation unit 14 projects the lane boundary lines 101 and 103 and the object 200 recognized in the XY plane coordinate system on the lane coordinate system by performing the coordinate transformation”; “The first area calculation unit 15 calculates the first travelable area based on the lane boundary lines 101 and 103 and the object 200 in the lane coordinate system”); transforming, by the processor, the calculated result based on the road edge coordinate system that uses road edge line as reference line into an ego-vehicle coordinate system that is based on a current position and direction of the vehicle (see at least [0037] and [0053]: “The plane coordinate system may be a coordinate system that is fixed to the vehicle 2 while the travelling direction of the vehicle 2 is Y axis and the lateral direction of the vehicle 2 is X axis”; “the travel trajectory generation unit 16 inversely transforms the first travelable area R1 in the lane coordinate system to that in the XY plane coordinate system, and then, generates a second travelable area”), and fitting the lane line in the ego-vehicle coordinate system (see at least [0006]: “a lane position recognition unit configured to recognize positions of lane boundary lines of a lane in which a vehicle travels in front of a vehicle in a plane 
coordinate system”); and automatically adjusting a driving operation of the vehicle based on the fitted lane line (see at least [0006] and Fig. 4: “a travel trajectory generation unit configured to generate a travel trajectory of the vehicle in the plane coordinate system based on the travelable area and the travelling position; and a control unit configured to perform a steering control on the vehicle based on the travel trajectory”; the lane boundary lines 101 and 103 are part of the travelable area calculation which is used to control the steering). Watanabe does not explicitly disclose extracting, by the processor, a calculated result from scanned data in the road edge coordinate system, wherein the road edge coordinate system is based on road edge information using road edge line as reference line is extracting, by the processor, lane line reflection points from the lane line candidate reflection points in the road edge coordinate system, wherein the road edge coordinate system is based on road edge information using road edge line as reference line. Ogawa, in the same field of vehicles controls, and therefore analogous art, teaches extracting, by the processor, lane line reflection points from the lane line candidate reflection points…based on road edge information using road edge line as reference line (see at least [0023]-[0024], [0033]-[0035], FIG. 5, and FIG. 
6: lidar instrument 10 scans to find possible lane markers; “the data on a lane marker are extracted from a combination of the measurement values of the present cycle and the past cycle”; “The extraction unit 21 calculates a projected position and projected range of the lane marker relative to the present cycle based on a projected result, which is projected by the parameter follow-up unit 25 at the previous cycle for use in the present cycle”; points that are not in the expected range based on the road edge information from the last cycle projected to the current cycle are considered disturbances and not extracted as can be seen in FIG. 6). It would have been obvious, before the effective filing date of the invention, with a reasonable expectation of success, to one having ordinary skill in the art, to combine the teachings of Watanabe and Ogawa by completing the lane marker analysis of Ogawa in the lane coordinate system of Watanabe (e.g., creating the lane coordinate system of Watanabe off of the projected centerline of Ogawa and then completing at least some of the processing for the next time step of Ogawa in the straight-line lane coordinate system of Watanabe) with the motivation of accurately extracting lane markers on curved roads while being able to transform the complicated road shape into straight lines to improve processing (see at least Ogawa [0005] and Watanabe [0051]). Furthermore, before the effective filing date of the invention, with a reasonable expectation of success, one having ordinary skill in the art would have found it obvious to try different known calculations, determinations, etc. in any of the known / already used coordinate systems with the motivation of determining coordinate systems that use the least processing power and/or complete calculations the quickest to make a cost effective and responsive system for an autonomous vehicle. Regarding Claim 2, the Watanabe and Ogawa combination teaches the limitations of Claim 1. 
As previously discussed, Watanabe discloses the road edge coordinate system is based on road edge information using road edge line as reference line. Furthermore, Ogawa further teaches (with the same motivation to combine as Claim 1) wherein the extracting lane line reflection points from the lane line candidate reflection points based on coordinates of the lane line candidate reflection points based on the road edge information using the road edge line as the reference line comprises: determining features of the lane line candidate reflection points based on the coordinates of the lane line candidate reflection points based on the road edge information using the road edge line as the reference line (see at least FIG. 6: points are compared to the projected position); and extracting the lane line reflection points from the lane line candidate reflection points based on the features of the lane line candidate reflection points (see at least FIG. 6: some points are extracted as lane marker and some are considered disturbances). Regarding Claim 3, the Watanabe and Ogawa combination teaches the limitations of Claim 2. As previously discussed, Watanabe discloses the road edge coordinate system is based on road edge information using road edge line as reference line and that the road edge coordinate system is the road edge information converted to obtain a straight line. 
Furthermore, Ogawa further teaches (with the same motivation to combine as Claim 1) wherein the determining features of the lane line candidate reflection points based on the coordinates of the lane line candidate reflection points based on the road edge information using the road edge line as the reference line comprises: performing feature detection through Hough transform based on the coordinates of the lane line candidate reflection points based on road edge information using road edge line as reference line to obtain a shape for future calculations based on the road edge information (see at least [0043]-[0045] and [0060]: “the generated data on the lane centerline are subjected to the Hough's transformation to calculate a centerline position and centerline shape of the lane”); and the extracting the lane line reflection points from the lane line candidate reflection points based on the features of the lane line candidate reflection points comprises: extracting, from the lane line candidate reflection points, reflection points whose distances from the shape for future calculations based on the road edge information are less than a first threshold as the lane line reflection points (see at least FIG. 6: in the next cycle, data outside the projected error range is considered a disturbance and not used).

Regarding Claim 4, the Watanabe and Ogawa combination teaches the limitations of Claim 1. As previously discussed, Watanabe discloses the road edge coordinate system is based on road edge information using road edge line as reference line.
Furthermore, Ogawa further teaches (with the same motivation to combine as Claim 1) wherein the extracting lane line reflection points from the lane line candidate reflection points based on coordinates of the lane line candidate reflection points based on the road edge information using the road edge line as the reference line comprises: clustering the lane line candidate reflection points based on the coordinates of the lane line candidate reflection points based on the road edge information using the road edge line as the reference line; and determining the lane line reflection points based on a clustering result (see at least FIG. 6: points within the projected error cluster are extracted as a lane marker and points outside are considered disturbances).

Regarding Claim 5, the Watanabe and Ogawa combination teaches the limitations of Claim 1. As previously discussed, Watanabe discloses the road edge coordinate system is based on road edge information using road edge line as reference line. Furthermore, Ogawa further teaches (with the same motivation to combine as Claim 1) wherein before the extracting lane line reflection points from the lane line candidate reflection points based on coordinates of the lane line candidate reflection points based on the road edge information using the road edge line as the reference line, the method further comprises: filtering out, from the lane line candidate reflection points, reflection points whose distances from a road edge are greater than a threshold based on the road edge information (see at least FIG. 6: points outside of the projected error zone are considered disturbances and not extracted as lane marker).

Regarding Claim 6, the Watanabe and Ogawa combination teaches the limitations of Claim 1. Furthermore, Watanabe further discloses wherein the road edge information comprises information about two road edge lines (see at least Fig.
2: lane boundary line 100, lane boundary line 101), and the establishing a road edge coordinate system based on the road edge information comprises: establishing the road edge coordinate system by using a longer edge line in the two road edge lines or a central line of the two road edge lines as a reference line (see at least Fig. 2 and Fig. 4: lane boundary line 103 is a central line between lane boundary line 100 and lane boundary line 101; the lane coordinate system uses lane boundary line 103 as a reference line). Regarding Claim 7, the Watanabe and Ogawa combination teaches the limitations of Claim 1. Furthermore, Watanabe further discloses wherein the road edge information comprises information about two road edge lines (see at least Fig. 2: lane boundary line 100, lane boundary line 101, lane boundary line 103), there is a difference between the two road edge lines (see at least Fig. 2: lane boundary line 100, lane boundary line 101, and lane boundary line 103 are different from one another), and the road edge coordinate system comprises two coordinate systems that are established respectively by using the two road edge lines as reference lines (see at least [0058]-[0059] and FIG. 6: “transformation processing (S110), the transformation unit 14 of the travel control apparatus 1 projects the lane boundary lines 101 and 103 and the object 200 recognized in the XY plane coordinate system on the lane coordinate system by the coordinate transformation”; “the travel control processing illustrated in FIG. 6 is repeatedly executed until the signal to end the operation of the travel control apparatus 1 is acquired”; there are multiple coordinate systems due to the multiple time steps). Regarding Claim 8, most limitations are similar to Claim 1, and rejected for the same reasons as Claim 1. 
Furthermore, Watanabe discloses An apparatus for detecting a lane line (see at least [0037] and [0039]: “The travel control apparatus 1 has a function of performing a coordinate transformation such that a position of an object and a position of lane boundary line indicated in the usual plane coordinate system are indicated on a coordinate system (lane coordinate system) different from the usual plane coordinate system”), comprising: at least one processor (see at least [0039] and Fig. 1: “The ECU 3 is an electronic control unit including a central processing unit (CPU), read only memory (ROM), random access memory (RAM), and the like”); and one or more memories coupled to the at least one processor and storing programming instructions for execution by the at least one processor to cause the apparatus to perform operations (see at least [0039] and Fig. 1: “The ECU 3 is an electronic control unit including a central processing unit (CPU), read only memory (ROM), random access memory (RAM), and the like”) comprising: obtaining lane line candidate reflection points and road edge information based on scanning information of a surrounding environment of a vehicle from a LIDAR (see at least [0039]-[0040], [0043], and Fig. 1: “The external sensor 31 is a detection device that detects a situation around the vehicle 2. The external sensor 31 includes a camera, radar, and a laser imaging detection and ranging (LIDAR)”; “The LIDAR detects the object around the vehicle 2 and the environment on the road such as the lane boundary lines 100 to 103 using light”). Regarding Claim 9, this claim is substantially similar to Claim 2, and rejected for the same reasons as Claim 2. Regarding Claim 10, this claim is substantially similar to Claim 3, and rejected for the same reasons as Claim 3. Regarding Claim 11, this claim is substantially similar to Claim 4, and rejected for the same reasons as Claim 4. 
Regarding Claim 12, this claim is substantially similar to Claim 5, and rejected for the same reasons as Claim 5. Regarding Claim 13, this claim is substantially similar to Claim 6, and rejected for the same reasons as Claim 6. Regarding Claim 14, this claim is substantially similar to Claim 7, and rejected for the same reasons as Claim 7.

Regarding Claim 21, most limitations are similar to Claims 1 and 8, and rejected for the same reasons as Claims 1 and 8. Furthermore, Watanabe discloses One or more non-transitory computer-readable media storing computer instructions, that when executed by one or more processors, cause a computing device to perform operations (see at least [0039]). Regarding Claim 22, this claim is substantially similar to Claim 2, and rejected for the same reasons as Claim 2. Regarding Claim 23, this claim is substantially similar to Claim 3, and rejected for the same reasons as Claim 3. Regarding Claim 24, this claim is substantially similar to Claim 4, and rejected for the same reasons as Claim 4. Regarding Claim 25, this claim is substantially similar to Claim 5, and rejected for the same reasons as Claim 5. Regarding Claim 26, this claim is substantially similar to Claim 6, and rejected for the same reasons as Claim 6.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALEXANDRA ROBYN MORFORD whose telephone number is (571)272-6109. The examiner can normally be reached Monday - Friday 8:00 AM - 4:00 PM ET. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thomas Worden can be reached at (571) 272-4876.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /A.R.M./Examiner, Art Unit 3658 /JASON HOLLOWAY/Primary Examiner, Art Unit 3658
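As characterized in the rejection, the claimed method maps LIDAR candidate reflection points into a road-edge coordinate system (the road edge line as the reference line), extracts the lane line points there by a distance criterion, and fits the lane line back in the ego-vehicle frame. The sketch below illustrates that general pipeline only; it is not code from the application, Watanabe, or Ogawa, and every function name, threshold, and the choice of a 2nd-order polynomial fit is a hypothetical assumption for illustration.

```python
import numpy as np

def road_edge_coords(edge_pts, pt):
    """Project a candidate point onto the road-edge polyline.
    Returns (s, d): arc length along the edge and signed lateral offset,
    i.e. the point's coordinates in a road-edge reference frame."""
    best = None  # (perpendicular distance, s, signed d)
    s_acc = 0.0
    for a, b in zip(edge_pts[:-1], edge_pts[1:]):
        seg = b - a
        seg_len = float(np.linalg.norm(seg))
        t = float(np.clip(np.dot(pt - a, seg) / (seg_len ** 2), 0.0, 1.0))
        proj = a + t * seg
        dist = float(np.linalg.norm(pt - proj))
        # 2-D cross-product sign: left of the edge direction is positive
        side = np.sign(seg[0] * (pt - a)[1] - seg[1] * (pt - a)[0])
        if best is None or dist < best[0]:
            best = (dist, s_acc + t * seg_len, side * dist)
        s_acc += seg_len
    return best[1], best[2]

def fit_lane_line(edge_pts, candidates, lane_offset, threshold):
    """Keep candidate reflection points whose lateral offset from the road
    edge is within `threshold` of a nominal lane position, then fit the
    lane line in the ego-vehicle frame as y = f(x)."""
    edge_pts = np.asarray(edge_pts, dtype=float)
    kept = [p for p in candidates
            if abs(road_edge_coords(edge_pts, np.asarray(p, dtype=float))[1]
                   - lane_offset) < threshold]
    kept = np.asarray(kept)
    coeffs = np.polyfit(kept[:, 0], kept[:, 1], deg=2)
    return kept, coeffs
```

For example, with a straight road edge along the x-axis and candidates clustered near a 3.5 m lateral offset, a stray reflection at 10 m offset falls outside the threshold and is discarded before fitting, which mirrors the "points outside the projected range are disturbances" reading of Ogawa's FIG. 6 in the rejection.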

Prosecution Timeline

Dec 30, 2022
Application Filed
May 05, 2025
Non-Final Rejection — §103, §112
Aug 11, 2025
Response Filed
Oct 08, 2025
Final Rejection — §103, §112
Dec 29, 2025
Response after Non-Final Action
Jan 20, 2026
Request for Continued Examination
Feb 17, 2026
Response after Non-Final Action
Mar 12, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594669
ROBOT CONTROL METHOD, ROBOT CONTROL SYSTEM, AND COMPUTER READABLE MEDIUM
2y 5m to grant • Granted Apr 07, 2026
Patent 12576952
SENSOR CALIBRATION SYSTEM FOR WATERCRAFT AND WATERCRAFT
2y 5m to grant • Granted Mar 17, 2026
Patent 12472632
OPERATION SYSTEM, OPERATION METHOD, AND STORAGE MEDIUM
2y 5m to grant • Granted Nov 18, 2025
Patent 12358646
METHOD AND APPARATUS FOR CAPTURING NON-COOPERATIVE TARGET USING SPACE ROBOTIC ARM, AND NON-TRANSITORY STORAGE MEDIUM
2y 5m to grant • Granted Jul 15, 2025
Study what changed to get past this examiner. Based on 4 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 57%
With Interview: 99% (+60.0%)
Median Time to Grant: 2y 4m
PTA Risk: High
Based on 7 resolved cases by this examiner. Grant probability derived from career allow rate.
