Prosecution Insights
Last updated: April 19, 2026
Application No. 18/494,547

TERMINAL DEVICE LOCALIZATION METHOD AND RELATED DEVICE THEREFOR

Status: Non-Final OA (§102)
Filed: Oct 25, 2023
Examiner: MEHMOOD, JENNIFER
Art Unit: 2664
Tech Center: 2600 — Communications
Assignee: Huawei Technologies Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 65% (Moderate)
OA Rounds: 1-2
To Grant: 3y 1m
With Interview: 95%

Examiner Intelligence

Career Allow Rate: 65% (grants 160 of 247 resolved cases; +2.8% vs TC avg)
Interview Lift: +30.6% (strong), based on resolved cases with interview
Typical Timeline: 3y 1m avg prosecution; 21 currently pending
Career History: 268 total applications across all art units

Statute-Specific Performance

§101: 2.8% (-37.2% vs TC avg)
§103: 45.0% (+5.0% vs TC avg)
§102: 31.9% (-8.1% vs TC avg)
§112: 17.6% (-22.4% vs TC avg)
Tech Center averages are estimates • Based on career data from 247 resolved cases

Office Action

§102
Notice of Pre-AIA or AIA Status: The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 102: The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claim(s) 1-17 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by CN111750864 (cited in the IDS of 8-23-24).

With respect to claim 1, the 864 reference teaches a terminal device (mobile robot) having a vision system, which is described in the Description (background), wherein a localization method is illustrated by the flow charts of figure 6. The 864 reference teaches obtaining a visual map as a vector map for obtaining first map points to match with first feature points. In the middle of page 6, the 864 reference teaches matching the feature points in the current frame with map points in the candidate map range to obtain matched feature points. This suggests that at least first and second map points will be matched with corresponding feature points. For example, in the middle of page 7, the first matching feature points are obtained by matching the feature points in the current frame with map points in a map to obtain successfully matched feature points. The first matching map points are map points successfully matched by the first matched feature points. At page 23 of the 864 reference, it teaches back propagation of error X of the second matched map point m at the current frame i and the previous key frame.
The reference further states that a second matching map point m is matched to a second matching feature point in the current frame i and the previous key frame j. Hence, the first map points are matched with first feature points and second map points are matched with second feature points. The 864 reference makes an adjustment of the localization result of the mobile robot (see the bottom of page 5). In the middle of page 8, the 864 reference states in part: "taking a map matching error obtained according to the pose of the current frame, the spatial position information of the first matching map point, camera internal parameter and the pixel coordinates of the first matching feature point matched with the first matching map point in the current frame as an initial value of the map matching error, iterative solution is carried out to enable the target function". Clearly, the 864 reference addresses the matching error between the first feature point and the first map point and the error between the second feature point and the second map point. This is taught at page 8, where it states in part: "… taking a matched error … of the first matching map point … and the pixel coordinates of the first matching feature point matched with the first matching map point in the current frame." The 864 reference teaches wherein the visual map, as the vector map, obtains an image frame of a historic or reference frame (see the ABSTRACT). Regarding the error of the second feature points and map points, the 864 reference teaches using an interframe matching error obtained according to the pose of the current frame, based on a spatial position of the second matching map point, and determining a current frame having a proper pose when the number of second matching feature points of the current frame is less than a second number threshold.
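Read as a computer-vision operation, the "map matching error" quoted above is in substance a reprojection error: a 3D map point is projected into the current frame using the camera pose and the camera internal parameters, and compared against the pixel coordinates of its matched feature point. The following is a minimal sketch under that reading; the intrinsic matrix, function names, and values are illustrative assumptions, not taken from the 864 reference.

```python
import numpy as np

# Assumed pinhole intrinsics (focal lengths and principal point are illustrative).
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def reprojection_error(R, t, map_point_3d, feature_px):
    """Pixel-space distance between a projected map point and its
    matched feature point in the current frame."""
    p_cam = R @ map_point_3d + t   # world -> camera coordinates (pose)
    uvw = K @ p_cam                # camera -> homogeneous pixel coordinates
    uv = uvw[:2] / uvw[2]          # perspective division
    return np.linalg.norm(uv - feature_px)

# Toy check: identity pose, map point 2 m straight ahead of the camera
# projects exactly onto the principal point, so the error is zero.
R, t = np.eye(3), np.zeros(3)
err = reprojection_error(R, t, np.array([0.0, 0.0, 2.0]),
                         np.array([320.0, 240.0]))
assert err < 1e-9
```

Minimizing the sum of such errors over all first matching point pairs (the "target function") is what yields the optimized current-frame pose.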
With respect to claim 2, the 864 reference teaches obtaining a first feature point in a current frame and a second feature point in another frame, which is a "reference frame"; see the ABSTRACT, where it teaches predicting the pose of the current frame according to the relative pose between the reference frame and the current frame. The 864 reference teaches taking a pose of a current frame and a pose of a historic or reference frame (as the other frame). Furthermore, the 864 reference teaches obtaining the first and second feature points as claimed. Particularly, the 864 reference teaches a prediction pose module, taking the image frame closest to the current frame among the location-success image frames as a reference frame (other frame) and, according to the relative pose from the reference frame to the current frame, predicting the pose of the current frame. Moreover, at step 901, according to the history frames of the location success, the nearest frame in the history frames is traced as the reference frame; according to the inter-frame motion information from the reference frame to the current frame (after being adjusted by minimizing the error), the reference teaches "… judging whether a predicted pose range exceeds a first threshold, if so, judging that the repositioning is unsuccessful." Note the third paragraph regarding the current frame pose prediction, obtaining the predicted pose. Regarding the third limitation of the claim, the 864 reference teaches obtaining a visual map, as a vector map, with second map points matching second feature points (see the middle of page 8), wherein the visual map corresponds to the applicant's vector map. The 864 reference further teaches using a historic or reference frame. Regarding the wherein clause as it relates to a vector map, for a second map point and the second feature point, at page 23 of the 864 reference, it teaches back propagation of error X of the second matched map point m at the current frame i and the previous key frame.
The reference further states that a second matching map point m is matched to a second matching feature point in the current frame i and the previous key frame j. Hence, this limitation is suggested by the reference: the first map points are matched with first feature points and second map points are matched with second feature points. At the bottom of page 15, the 864 reference teaches tracing the latest frame in the history frames as a reference frame according to those frames successfully positioned, and predicting the pose of the current frame in accordance with the inter-frame motion information from the reference frame to the current frame to obtain a predicted pose. The 864 reference teaches a terminal device (mobile robot) that obtains other frames, such as the historic or reference frames. It appears that, at step 201, the reference or historic frames are the basis for prior historic frames or immediately preceding frames. For example, the 864 reference states at step 201: according to the history frames of the location success, the latest frame in the history frames is traced as the reference frame; according to the inter-frame motion information from the reference frame to the current frame, the current frame pose is predicted to obtain the prediction pose, so as to utilize the prediction pose to determine the third candidate map range and thereby improve the efficiency of the matching. The historic or reference frame is obtained after a previous adjustment (a target adjustment such as iterative least square error methods, or the objective function utilizing weights for minimizing the error as set forth at page 8).
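The step-201 pose prediction described above amounts to composing the pose of the last successfully localized (reference) frame with the odometry-derived inter-frame motion. A minimal sketch of that composition, using 3x3 SE(2) homogeneous matrices; all function names and numeric values are illustrative, not from the 864 reference:

```python
import numpy as np

def se2(x, y, theta):
    """Planar pose (x, y, heading) as a 3x3 homogeneous transform."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1.0]])

def predict_pose(T_world_ref, T_ref_cur):
    """Compose the reference frame's pose with the relative
    inter-frame motion to predict the current frame's pose."""
    return T_world_ref @ T_ref_cur

T_world_ref = se2(1.0, 2.0, 0.0)   # pose of last successful localization
T_ref_cur = se2(0.5, 0.0, 0.0)     # odometry: moved 0.5 m forward since then
T_world_cur = predict_pose(T_world_ref, T_ref_cur)
assert np.allclose(T_world_cur[:2, 2], [1.5, 2.0])  # predicted position
```

The predicted pose is then only a starting point; as the reference describes, it is used to restrict the candidate map range before matching and refinement.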
With respect to claim 3, the 864 reference teaches adjusting a target function (see the middle of page 8), wherein the target function obtains the error value between the first feature point and the first map point and minimizes or optimizes the error computation by using an iterative least square method to obtain an optimized current frame pose as a positioning result. See the first half of page 7. The Examiner contends that this claimed feature is taught at step 2064 of the 864 reference. The 864 reference teaches that, for each first matching characteristic point in the first matching characteristic point set, it is judged whether the distance between the projection point of the first matching characteristic point in the current frame and the first matching map point matched with the first matching characteristic point in the map is less than the set second distance threshold value; if it is, the first matching characteristic point is judged to be an inner point. The 864 reference also teaches performing a calculation of the distance between the second feature points and the second map points and defining an error. The second error is optimized using a least square error with weighted values, which appears to be an iterative step that performs iterations until the error value is less than a predetermined threshold. See page 8 and the middle of page 7 regarding the minimization of both error values, taken between the first map points and first feature points and between the second map points and the second feature points.

With respect to claim 4, the Examiner contends that at least the first condition of the claim has been met by the 864 reference.
The 864 reference teaches at least the portion of the claim where it states, "… a distance between a location of the first feature point in the current image frame and a location of the first map point in the current frame." The Examiner contends that this limitation is taught by the 864 reference at the bottom of page 4, where it states in part: "… for each matching feature point in the matching feature point set, judging whether the distance between the projection point of the matching feature point in the current frame and matching feature point in the map is small than a set second distance threshold value or not …"

With respect to claim 5, the Examiner contends that at least the first condition of the claim has been met by the 864 reference. The limitation of claim 5 is recited as claim 10 of the 864 reference, and claim 10 is part of the disclosure of the 864 reference. At page 4, where the recitation of claim 10 of the reference begins, it recites a distance between the second feature point and the second map point. According to the projection point space position of the matching feature point, the 864 reference teaches judging whether the distance between the projection point of the matching feature point in the current frame and the matching feature point in the map is less than the set second distance threshold value; if so, judging that the matching feature point is the inner point; and repeatedly executing the step of judging whether the distance between the projection point of the matching feature point in the current frame and the matching feature point in the map is less than the set second distance threshold value.

With respect to claim 6, the Examiner contends that the last limitation of the claim is taught by the 864 reference, wherein the claim limitation states: if the difference is greater than or equal to the preset threshold, performing the next iteration.
In the middle of page 8, the 864 reference teaches using the inter-frame matching error obtained according to the pose of the current frame, the spatial position information of the second matching map point, the pose of the previous key frame and the camera internal reference matrix as the initial value of the inter-frame matching error. Therefore, the distance between the second map point and the second feature point constitutes a distance error. This error is minimized by the teaching of the 864 reference. For example, in the middle of page 8, the 864 reference states: "… iterative solution is carried out to enable the target function to obtain the pose of the current frame when the minimum value is obtained."

With respect to claim 7, the claim recites that a quantity of other image frames is based on a speed of the terminal device. The Examiner contends that the 864 reference teaches this limitation as well. In the Description, the 864 reference teaches that the mobile device is often lost in positioning due to errors in the odometer reading (see the Background at lines 1-5). As a mobile device, it is presumed to operate substantially in motion. The 864 reference gives an example of error readings due to slippage of the odometer of the mobile device (page 7, last 11 lines from the bottom). This slippage of the odometer reading means that the mobile device does not travel at a smooth rate of speed, which may have resulted in the mobile device slipping due to acts of nature, where the wheels do not rotate in suitable fashion, resulting in lost positioning of the mobile device. See also page 6, last paragraph. The 864 reference teaches that, as a result of the slippage of the odometer or a change in inertial measurements, the pose of the current frame must be recalculated (see the top of page 8). Other processing steps are subsequently required, as set forth at page 8.
The 864 reference (page 8, 10th line from the bottom) states: "If the current frame is a key frame, judging whether the number of frames in the current sliding window reaches a set first frame threshold value, if so, deleting the key frame added earliest in the sliding window, otherwise, not deleting the key frame added earliest in the sliding window." Hence, the determination of the number of frames in the current sliding window may be determined by a computation resulting from a change in the odometer or inertia measuring device (page 9, lines 1-2).

With respect to claim 8, the 864 reference teaches a terminal device (mobile robot with camera) for shooting a current image frame and making a calculation based on the pose of a historic or reference frame obtained after a prior adjustment (minimization of error so that the error is below a predetermined threshold). As to the limitation of claim 8 identified as (1), the 864 reference teaches calculating based on a camera taking shots from a historic or reference frame (as the "other image frame"). For example, at step 201: according to the history frames of the location success, the latest frame in the history frames is traced as the reference frame; according to the inter-frame motion information from the reference frame to the current frame, the current frame pose is predicted to obtain the prediction pose, so as to utilize the prediction pose. As to the limitation of claim 8 enumerated as (2), the 864 reference also teaches calculations between inter-frame poses in which a camera shoots a current frame. The 864 reference teaches the generation or calculation of the difference in the inter-frame pose: the inter-frame matching error obtained by the camera internal reference matrix is taken as the inter-frame matching error initial value, iteratively solving for the pose of the current frame when the target function obtains the minimum value.
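The sliding-window key-frame bookkeeping quoted from page 8 of the 864 reference (drop the earliest key frame once the window reaches the first frame threshold) can be sketched as follows; the threshold value and names are illustrative assumptions:

```python
from collections import deque

FIRST_FRAME_THRESHOLD = 5  # assumed value; the reference leaves it configurable

window = deque()  # key frames currently in the sliding window, oldest first

def add_keyframe(frame_id):
    """Add a key frame; if the window is full, delete the one added earliest."""
    if len(window) >= FIRST_FRAME_THRESHOLD:
        window.popleft()          # "deleting the key frame added earliest"
    window.append(frame_id)

for f in range(8):
    add_keyframe(f)
assert list(window) == [3, 4, 5, 6, 7]  # only the 5 most recent key frames remain
```

A `deque(maxlen=FIRST_FRAME_THRESHOLD)` would behave identically; the explicit `popleft` mirrors the quoted judge-then-delete logic.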
As to the limitation for performing hierarchical sampling, the Examiner contends that the 864 reference performs hierarchical sampling by means of a random sample consensus (RANSAC) algorithm, as described at step 206 on page 18. The reference teaches the claimed feature where it states: "… so as to improve the accuracy of the matching feature point, thereby improving the accuracy of the pose of the current frame … a RANSAC algorithm is used to determine a best matching feature point …"

With respect to claim 9, the 864 reference teaches a terminal device (mobile robot) localization apparatus for localizing the robot with respect to a visual map. The 864 reference teaches that the apparatus comprises a memory storing a computer program and a processor configured to execute the computer program … to implement the steps of the visual map-based repositioning method. The aforementioned section reads on the preamble of claim 9, including the limitation of the processor. The 864 reference teaches obtaining a visual map as a vector map for obtaining first map points to match with first feature points. In the middle of page 6, the 864 reference teaches matching the feature points in the current frame with map points in the candidate map range to obtain matched feature points. This suggests that at least first and second map points will be matched with corresponding feature points. For example, in the middle of page 7, the first matching feature points are obtained by matching the feature points in the current frame with map points in a map to obtain successfully matched feature points. The first matching map points are map points successfully matched by the first matched feature points. At page 23 of the 864 reference, it teaches back propagation of error X of the second matched map point m at the current frame i and the previous key frame.
The reference further states that a second matching map point m is matched to a second matching feature point in the current frame i and the previous key frame j. Hence, the first map points are matched with first feature points and second map points are matched with second feature points. The 864 reference makes an adjustment of the localization result of the mobile robot (see the bottom of page 5). In the middle of page 8, the 864 reference states in part: "taking a map matching error obtained according to the pose of the current frame, the spatial position information of the first matching map point, camera internal parameter and the pixel coordinates of the first matching feature point matched with the first matching map point in the current frame as an initial value of the map matching error, iterative solution is carried out to enable the target function". Clearly, the 864 reference addresses the matching error between the first feature point and the first map point and the error between the second feature point and the second map point. This is taught at page 8, where it states in part: "… taking a matched error … of the first matching map point … and the pixel coordinates of the first matching feature point matched with the first matching map point in the current frame." The 864 reference teaches wherein the visual map, as the vector map, obtains an image frame of a historic or reference frame (see the ABSTRACT). Regarding the error of the second feature points and map points, the 864 reference teaches using an interframe matching error obtained according to the pose of the current frame, based on a spatial position of the second matching map point, and determining a current frame having a proper pose when the number of second matching feature points of the current frame is less than a second number threshold.
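The RANSAC-based selection of "inner points" that the Office Action attributes to step 206 of the 864 reference follows the standard pattern: hypothesize a motion from a random match, then count how many matches fall within a distance threshold of where that motion predicts them. A minimal sketch with the motion model simplified to a 2D translation; the model, names, and values are illustrative assumptions, not the reference's implementation:

```python
import random
import numpy as np

random.seed(0)  # deterministic for the toy example

def ransac_translation(src, dst, threshold=1.0, iters=100):
    """Estimate a 2D translation from point matches, keeping only
    matches ("inner points") within `threshold` of the hypothesis."""
    best_t, best_inliers = None, []
    for _ in range(iters):
        i = random.randrange(len(src))
        t = dst[i] - src[i]                      # hypothesis from one match
        d = np.linalg.norm(src + t - dst, axis=1)
        inliers = np.nonzero(d < threshold)[0]   # the inner-point distance test
        if len(inliers) > len(best_inliers):
            best_t, best_inliers = t, inliers
    return best_t, best_inliers

src = np.array([[0.0, 0], [1, 0], [2, 0], [3, 0]])
dst = src + np.array([5.0, 0])    # true motion: +5 in x
dst[3] = [99.0, 99]               # one bad match (outlier)
t, inliers = ransac_translation(src, dst)
assert np.allclose(t, [5.0, 0.0]) and len(inliers) == 3
```

The surviving inner points would then feed the least-squares refinement of the pose, which is how the reference combines RANSAC matching with target-function minimization.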
With respect to claim 10, the 864 reference teaches the mobile robot as a terminal device localization apparatus, wherein the further operations comprise the following. The 864 reference teaches obtaining a first feature point in a current frame and a second feature point in another frame, which is a "reference frame"; see the ABSTRACT, where it teaches predicting the pose of the current frame according to the relative pose between the reference frame and the current frame. The 864 reference teaches taking a pose of a current frame and a pose of a historic or reference frame (as the other frame). Furthermore, the 864 reference teaches obtaining the first and second feature points as claimed. Particularly, the 864 reference teaches a prediction pose module, taking the image frame closest to the current frame among the location-success image frames as a reference frame (other frame) and, according to the relative pose from the reference frame to the current frame, predicting the pose of the current frame. Moreover, at step 901, according to the history frames of the location success, the nearest frame in the history frames is traced as the reference frame; according to the inter-frame motion information from the reference frame to the current frame (after being adjusted by minimizing the error), the reference teaches "… judging whether a predicted pose range exceeds a first threshold, if so, judging that the repositioning is unsuccessful." Note the third paragraph regarding the current frame pose prediction, obtaining the predicted pose. Regarding the third limitation of the claim, the 864 reference teaches obtaining a visual map, as a vector map, with second map points matching second feature points (see the middle of page 8), wherein the visual map corresponds to the applicant's vector map. The 864 reference further teaches using a historic or reference frame.
Regarding the wherein clause as it relates to a vector map, for a second map point and the second feature point, at page 23 of the 864 reference, it teaches back propagation of error X of the second matched map point m at the current frame i and the previous key frame. The reference further states that a second matching map point m is matched to a second matching feature point in the current frame i and the previous key frame j. Hence, this limitation is suggested by the reference: the first map points are matched with first feature points and second map points are matched with second feature points. At the bottom of page 15, the 864 reference teaches tracing the latest frame in the history frames as a reference frame according to those frames successfully positioned, and predicting the pose of the current frame in accordance with the inter-frame motion information from the reference frame to the current frame to obtain a predicted pose. The 864 reference teaches a terminal device (mobile robot) that obtains other frames, such as the historic or reference frames. It appears that, at step 201, the reference or historic frames are the basis for prior historic frames or immediately preceding frames. For example, the 864 reference states at step 201: according to the history frames of the location success, the latest frame in the history frames is traced as the reference frame; according to the inter-frame motion information from the reference frame to the current frame, the current frame pose is predicted to obtain the prediction pose, so as to utilize the prediction pose to determine the third candidate map range and thereby improve the efficiency of the matching. The historic or reference frame is obtained after a previous adjustment (a target adjustment such as iterative least square error methods, or the objective function utilizing weights for minimizing the error as set forth at page 8).
With respect to claim 11, the 864 reference teaches adjusting a target function (see the middle of page 8), wherein the target function obtains the error value between the first feature point and the first map point and minimizes or optimizes the error computation by using an iterative least square method to obtain an optimized current frame pose as a positioning result. See the first half of page 7. The Examiner contends that this claimed feature is taught at step 2064 of the 864 reference. The 864 reference teaches that, for each first matching characteristic point in the first matching characteristic point set, it is judged whether the distance between the projection point of the first matching characteristic point in the current frame and the first matching map point matched with the first matching characteristic point in the map is less than the set second distance threshold value. The Examiner contends that the 864 reference teaches the at least one limitation as set forth below. The 864 reference also teaches performing a calculation of the distance between the second feature points and the second map points and defining an error. The second error is optimized using a least square error with weighted values, which appears to be an iterative step that performs iterations until the error value is less than a predetermined threshold. See page 8 and the middle of page 7 regarding the minimization of both error values, taken between the first map points and first feature points and between the second map points and the second feature points.

With respect to claim 12, the Examiner contends that at least the first condition of the claim has been met by the 864 reference.
The 864 reference teaches at least the portion of the claim where it states, "… a distance between a location of the first feature point in the current image frame and a location of the first map point in the current frame." The Examiner contends that this limitation is taught by the 864 reference at the bottom of page 4, where it states in part: "… for each matching feature point in the matching feature point set, judging whether the distance between the projection point of the matching feature point in the current frame and matching feature point in the map is small than a set second distance threshold value or not …"

With respect to claim 13, the Examiner contends that at least the second limitation is suggested by the 864 reference, and hence the claim is contemplated by the reference. The particular limitation recites the distance between a location of the second feature point and the second map point. At the top of page 23 of the 864 reference, it states that X is the back projection error of a second matching map point m on the current frame i and the last key frame j, and that Xm is the three-dimensional coordinate of the second matching feature point in the current frame i and the last key frame j matched with the second matching map point m. The distance between the locations is the error that is minimized using the target functions described at page 8.

With respect to claim 14, the Examiner contends that the alternative limitation, delineated as the last limitation, is met by the 864 reference. The Examiner contends that the last limitation of the claim is taught by the 864 reference, wherein the claim limitation states: if the difference is greater than or equal to the preset threshold, performing the next iteration.
In the middle of page 8, the 864 reference teaches using the inter-frame matching error obtained according to the pose of the current frame, the spatial position information of the second matching map point, the pose of the previous key frame and the camera internal reference matrix as the initial value of the inter-frame matching error. Therefore, the distance between the second map point and the second feature point constitutes a distance error. This error is minimized by the teaching of the 864 reference, with iterations occurring until the error is below a predetermined threshold. For example, in the middle of page 8, the 864 reference states: "… iterative solution is carried out to enable the target function to obtain the pose of the current frame when the minimum value is obtained."

With respect to claim 15, the claim recites that a quantity of other image frames is based on a speed of the terminal device. The Examiner contends that the 864 reference teaches this limitation as well. At step 2603, the 864 reference makes clear the significance of the speed of the camera in reference to the "point rate". At page 19, beginning at the middle of the page, the 864 reference teaches that matching feature points should conform to the currently calculated fitting pose estimation. For this to occur, the interior point rate needs to be calculated. In this step, the spatial positions of all the feature points in the current frame are obtained according to the fitting pose estimation and the camera internal parameters. The camera internal parameters include the camera's speed rate. The camera is mounted on the mobile robot, which corresponds to the claimed terminal device.
With respect to claim 16, the 864 reference teaches a terminal device (mobile robot with camera) for shooting a current image frame and making a calculation based on the pose of a historic or reference frame obtained after a prior adjustment (minimization of error so that the error is below a predetermined threshold). As to the limitation of claim 16 identified as (1), the 864 reference teaches calculating based on a camera taking shots from a historic or reference frame (as the "other image frame"). For example, at step 201: according to the history frames of the location success, the latest frame in the history frames is traced as the reference frame; according to the inter-frame motion information from the reference frame to the current frame, the current frame pose is predicted to obtain the prediction pose, so as to utilize the prediction pose. As to the limitation of claim 16 enumerated as (2), the 864 reference also teaches calculations between inter-frame poses in which a camera shoots a current frame. The 864 reference teaches the generation or calculation of the difference in the inter-frame pose: the inter-frame matching error obtained by the camera internal reference matrix is taken as the inter-frame matching error initial value, iteratively solving for the pose of the current frame when the target function obtains the minimum value. As to the limitation for performing hierarchical sampling, the Examiner contends that the 864 reference performs hierarchical sampling by means of a random sample consensus (RANSAC) algorithm, as described at step 206 on page 18.
The reference teaches the claimed feature where it states: "… so as to improve the accuracy of the matching feature point, thereby improving the accuracy of the pose of the current frame … a RANSAC algorithm is used to determine a best matching feature point …"

With respect to claim 17, the 864 reference teaches a terminal device (mobile robot) localization apparatus for localizing the robot with respect to a visual map. The 864 reference teaches that the apparatus comprises a memory storing a computer program and a processor configured to execute the computer program … to implement the steps of the visual map-based repositioning method. The aforementioned section reads on the preamble of claim 17, including the limitation of the processor. The 864 reference teaches obtaining a visual map as a vector map for obtaining first map points to match with first feature points. In the middle of page 6, the 864 reference teaches matching the feature points in the current frame with map points in the candidate map range to obtain matched feature points. This suggests that at least first and second map points will be matched with corresponding feature points. For example, in the middle of page 7, the first matching feature points are obtained by matching the feature points in the current frame with map points in a map to obtain successfully matched feature points. The first matching map points are map points successfully matched by the first matched feature points. At page 23 of the 864 reference, it teaches back propagation of error X of the second matched map point m at the current frame i and the previous key frame. The reference further states that a second matching map point m is matched to a second matching feature point in the current frame i and the previous key frame j. Hence, the first map points are matched with first feature points and second map points are matched with second feature points.
The 864 reference makes an adjustment of the localization result of the mobile robot (see the bottom of page 5). In the middle of page 8, the 864 reference states in part: "taking a map matching error obtained according to the pose of the current frame, the spatial position information of the first matching map point, camera internal parameter and the pixel coordinates of the first matching feature point matched with the first matching map point in the current frame as an initial value of the map matching error, iterative solution is carried out to enable the target function". Clearly, the 864 reference addresses the matching error between the first feature point and the first map point and the error between the second feature point and the second map point. This is taught at page 8, where it states in part: "… taking a matched error … of the first matching map point … and the pixel coordinates of the first matching feature point matched with the first matching map point in the current frame." The 864 reference teaches wherein the visual map, as the vector map, obtains an image frame of a historic or reference frame (see the ABSTRACT). Regarding the error of the second feature points and map points, the 864 reference teaches using an interframe matching error obtained according to the pose of the current frame, based on a spatial position of the second matching map point, and determining a current frame having a proper pose when the number of second matching feature points of the current frame is less than a second number threshold.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEROME GRANT II, whose telephone number is (571) 272-7463. The examiner can normally be reached M-F 9:00 a.m. - 5:00 p.m. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jennifer Mehmood, can be reached at 571-272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/JEROME GRANT II/
Primary Examiner, Art Unit 2664
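The "map matching error" quoted in the rejection above — an error computed from the pose of the current frame, the spatial position of a matching map point, the camera internal parameters, and the pixel coordinates of the matching feature point — is, in standard visual-localization terms, a reprojection error. A minimal sketch, assuming a simple pinhole model (function names and the (fx, fy, cx, cy) intrinsics layout are my own, not the 864 reference's notation):

```python
def project(point_w, pose, K):
    """Project a 3-D map point (world frame) into the image.
    pose = (R, t): R is a 3x3 rotation as row lists, t a length-3 list.
    K = (fx, fy, cx, cy) pinhole intrinsics. Returns pixel (u, v)."""
    R, t = pose
    # world -> camera transform
    pc = [sum(R[r][c] * point_w[c] for c in range(3)) + t[r] for r in range(3)]
    fx, fy, cx, cy = K
    return (fx * pc[0] / pc[2] + cx, fy * pc[1] / pc[2] + cy)

def map_matching_error(pose, K, map_points, pixel_points):
    """Sum of squared reprojection errors between matching map points
    and the observed pixel coordinates of their matching feature points.
    An iterative solver would adjust `pose` to drive this down."""
    err = 0.0
    for pw, (u_obs, v_obs) in zip(map_points, pixel_points):
        u, v = project(pw, pose, K)
        err += (u - u_obs) ** 2 + (v - v_obs) ** 2
    return err
```

The "iterative solution" in the quoted passage would then be an optimizer (e.g. Gauss-Newton) minimizing this quantity over the current-frame pose; the interframe error for the second matching points has the same reprojection form but spans the current frame and the previous key frame.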

Prosecution Timeline

Oct 25, 2023
Application Filed
Dec 23, 2025
Non-Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12572774
NEURAL NETWORK PROCESSOR AND METHOD OF NEURAL NETWORK PROCESSING
2y 5m to grant Granted Mar 10, 2026
Patent 10269295
ORGANIC LIGHT EMITTING DISPLAY DEVICE AND DRIVING METHOD THEREOF
2y 5m to grant Granted Apr 23, 2019
Patent 9245189
OBJECT APPEARANCE FREQUENCY ESTIMATING APPARATUS
2y 5m to grant Granted Jan 26, 2016
Patent 8344909
METHOD AND SYSTEM FOR COLLECTING TRAFFIC DATA, MONITORING TRAFFIC, AND AUTOMATED ENFORCEMENT AT A CENTRALIZED STATION
2y 5m to grant Granted Jan 01, 2013
Patent 8294567
METHOD AND SYSTEM FOR FIRE DETECTION
2y 5m to grant Granted Oct 23, 2012
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
65%
Grant Probability
95%
With Interview (+30.6%)
3y 1m
Median Time to Grant
Low
PTA Risk
Based on 247 resolved cases by this examiner. Grant probability derived from career allow rate.
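The headline numbers above are internally consistent under a simple additive reading: the 65% grant probability is the career allow rate (160 granted of 247 resolved), and the 95% with-interview figure is that baseline plus the +30.6% interview lift. A sketch of that arithmetic, assuming the lift is purely additive (the page does not say how it actually combines these):

```python
def projected_grant_probability(base_rate, interview_lift):
    """Illustrative additive model: career allow rate plus interview
    lift, capped at 100%. Assumption only; not the tool's actual model."""
    return min(base_rate + interview_lift, 100.0)

baseline = 160 / 247 * 100          # 160 granted of 247 resolved
with_interview = projected_grant_probability(baseline, 30.6)
```

Rounding `baseline` gives the displayed 65%, and `with_interview` rounds to the displayed 95%.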
