Prosecution Insights
Last updated: April 19, 2026
Application No. 18/326,233

AUTOMATIC ADJUSTMENT OF PARAMETERS BASED ON PART SURFACE REFLECTIVE INDEX FOR POINT CLOUD ACQUISITION USING A BLUE LIGHT SCANNER

Status: Final Rejection (§103)
Filed: May 31, 2023
Examiner: MAHROUKA, WASSIM
Art Unit: 2665
Tech Center: 2600 — Communications
Assignee: Pratt & Whitney Canada Corp.
OA Round: 2 (Final)

Grant Probability: 86% (Favorable); 93% with interview
Expected OA Rounds: 3-4
Time to Grant: 2y 5m

Examiner Intelligence

Career Allow Rate: 86% (above average; 210 granted / 243 resolved; +24.4% vs TC avg)
Interview Lift: +6.4% (moderate), measured across resolved cases with interview
Typical Timeline: 2y 5m average prosecution; 29 applications currently pending
Career History: 272 total applications across all art units
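The headline numbers in this panel are simple ratios over the examiner's 243 resolved cases. A minimal sketch of the arithmetic, assuming the interview lift is simply additive to the baseline allow rate (variable names are illustrative, not from any real data source):

```python
# Recomputing the examiner stats shown above from the card's raw counts.
granted, resolved = 210, 243                  # career totals from the card
allow_rate = granted / resolved               # ~0.864, displayed as 86%
interview_lift = 0.064                        # +6.4% observed with an interview
with_interview = allow_rate + interview_lift  # ~0.928, displayed as 93%
print(f"{allow_rate:.0%} baseline, {with_interview:.0%} with interview")
# -> 86% baseline, 93% with interview
```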

Statute-Specific Performance

§101: 16.5% (-23.5% vs TC avg)
§103: 42.8% (+2.8% vs TC avg)
§102: 17.9% (-22.1% vs TC avg)
§112: 12.5% (-27.5% vs TC avg)

Tech Center averages are estimates. Based on career data from 243 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed on 1/22/2026 has been entered. Claims 1, 8, and 15 were amended. Claims 1-20 remain pending in the application.

Claim Rejections - 35 USC § 103

The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Claims 1, 7-8, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Yamanashi (US 20160098614) in view of Satat (US 20220168898) and Thieret (US 20020149787).

Regarding claim 8: Yamanashi discloses: a scanning system (¶ [0029], ¶ [0038], and FIGS. 1 and 2, gloss determination device 100), comprising:

a light source for illuminating a part with a light (¶ [0029] – ¶ [0032], FIGS. 1 and 2, light 120 and illumination unit 310; ¶ [0040] "…illuminating unit 310 illuminates face 210");

a gloss meter for measuring a surface reflectivity of the part based on a reflection of the light from the part (¶ [0033] – ¶ [0036]; ¶ [0046] "Gloss determining unit 350 determines a gloss condition of face 210 based on the input differential image, and outputs the determination result to skin condition determining unit 360"; ¶ [0035] "…gloss determination device 100 uses camera 130 with a polarization filter to extract, from reflected light, a brightness value of a component which is polarized in the same direction as the polarization direction of the polarized light and a brightness value of a component which is polarized in a perpendicular direction to the polarization direction of the polarized light.
Then, gloss determination device 100 determines a part where a difference between the extracted brightness values is large as a part where the degree of specular reflection is high, or the degree of gloss on the surface of the substance is high");

an imaging device (¶ [0029] – ¶ [0032], ¶ [0039], FIGS. 1 and 2, camera 130, photographing unit 320, image acquiring unit 330); and

a processor (¶ [0029] – ¶ [0032], ¶ [0039], FIGS. 1 and 2; ¶ [0051] "…gloss determination device 100 may have, for example, a CPU (central processing unit)") configured to:

determine a gloss unit for the part based on the surface reflectivity (¶ [0035] "…gloss determination device 100 uses camera 130 with a polarization filter to extract, from reflected light, a brightness value of a component which is polarized in the same direction as the polarization direction of the polarized light and a brightness value of a component which is polarized in a perpendicular direction to the polarization direction of the polarized light. Then, gloss determination device 100 determines a part where a difference between the extracted brightness values is large as a part where the degree of specular reflection is high, or the degree of gloss on the surface of the substance is high");

adjust an acquisition parameter of the scanning system based on the gloss unit (¶¶ [0078] – [0084]; ¶ [0078] "Further, the gloss determination device may not present the skin condition, but may have a illumination controller that controls illumination based on a determined gloss condition so that the gloss condition becomes a desired condition. For example, the gloss determination device may have a plurality of light sources which can be individually dimmed, and may change the illumination pattern of the light sources while acquiring a gloss region under each illumination pattern. Then, the gloss determination device may perform illumination in an illumination pattern under which the area of the gloss region becomes minimum, and again photograph the face without using the polarizing filter."); and

scan the part with the imaging device using the acquisition parameter (¶¶ [0078] – [0084]; ¶ [0078], same passage as quoted above).

Yamanashi does not specifically teach: a device for moving the light source, the gloss meter, and the imaging device to a location in space with respect to the part.

However, in the same field of endeavor, Satat teaches: a device for moving the light source (abstract, "…controlling a moveable component of the robotic device to move along a motion path relative to the target surface, wherein the moveable component comprises a light source and a camera… determining bidirectional reflectance distribution function (BRDF) image data, wherein the BRDF image data comprises the plurality of images converted to angular space with respect to the target surface.").
Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi to incorporate the teachings of Satat by including a device for moving the light source and the imaging device to a location in space with respect to the part, in order to use the system to determine a material property of a surface.

Yamanashi in view of Satat does not specifically teach that the gloss meter is also movable by the device. However, in a related field of endeavor, Thieret teaches: a device for moving the gloss meter (¶ [0042] "…the gloss-meter may be mounted on a movable platform in order to sample a variety of readings in the final image"). Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi and Satat to incorporate the teachings of Thieret by including a device for moving the gloss meter, in order to sample a variety of readings to optimize a subsequent operation.

Regarding claim 14: Yamanashi in view of Satat and Thieret discloses the limitations of claim 8 as applied above. Yamanashi further teaches: wherein the processor is further configured to create a scanning program based on the acquisition parameter (¶¶ [0078] – [0084]; ¶ [0078] "Further, the gloss determination device may not present the skin condition, but may have a illumination controller that controls illumination based on a determined gloss condition so that the gloss condition becomes a desired condition. For example, the gloss determination device may have a plurality of light sources which can be individually dimmed, and may change the illumination pattern of the light sources while acquiring a gloss region under each illumination pattern. Then, the gloss determination device may perform illumination in an illumination pattern under which the area of the gloss region becomes minimum, and again photograph the face without using the polarizing filter.").

Regarding claims 1 and 7: the claim limitations are similar to those of claims 8 and 14, respectively, and are therefore rejected in the same manner as applied to those claims.

Claims 2-4, 6, 9-11, and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Yamanashi (US 20160098614) in view of Satat (US 20220168898), Thieret (US 20020149787), and Sirotin (US 20210279499).

Regarding claim 9: Yamanashi in view of Satat and Thieret discloses the limitations of claim 8 as applied above. Yamanashi in view of Satat and Thieret does not specifically teach: wherein the processor is further configured to determine a gloss range that includes the gloss unit and adjust the acquisition parameter based on the gloss range. However, in a related field, Sirotin teaches: wherein the processor is further configured to determine a gloss range that includes the gloss unit (¶ [0096] "Operations in process 1400 can include threshold RSR qualification 1402, which can apply RSR against, for example, a ceiling threshold T(C) and a floor threshold T(F). Threshold RSR qualification 1402 can indicate suitability 1404A of the captured image if RSR is within the suitable range, e.g., meets T(C) and T(F), and can indicate non-suitability 1404B of the captured image if RSR value is not within the suitable range"), and adjust the acquisition parameter based on the gloss range (¶ [0098] "In response to the threshold RSR qualification 1402 indicating non-suitability 1404B of the captured image, the process 1400 can proceed to flagging 1410 the current captured image for a follow on action… a routing to either an action 1418 or the image capture remediation 1412. Selection logic for selecting between the action 1418 and the image capture remediation 1412 can be provided.
The process 1400 can include with the image capture remediation 1412 a generation of updated or adjustive image capture parameters, and a retaking 1420 of the image of the subject to obtain a second captured image").

Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi in view of Satat and Thieret to incorporate the teachings of Sirotin by including: wherein the processor is further configured to determine a gloss range that includes the gloss unit and adjust the acquisition parameter based on the gloss range, in order to achieve an improved image-acquisition process.

Regarding claim 10: Yamanashi in view of Satat and Thieret discloses the limitations of claim 8 as applied above. Yamanashi in view of Satat and Thieret does not specifically teach: wherein the acquisition parameter is at least one of: (i) a part template; (ii) an exposure mode; (iii) an exposure time; (iv) a resolution of the imaging device; (v) a coverage factor; (vi) a coverage detail; (vii) a minimum coverage; (viii) a sensor tilt angle; (ix) a minimum number of line shadows; (x) a scanning area; (xi) a camera viewing angle; (xii) a maximum number of residuals; and (xiii) a scanning mode. However, Sirotin teaches this limitation (¶ [0078] "…The image capture parameters can define one or more parameters used in the capture of the image that was determined by the reflectance qualification 1306 as being non-conforming. Specific examples can include, but are not necessarily limited to, one or more parameters of the lighting, one or more camera gain parameters, or one or more parameters of the light diffusion, or any combination or sub-combination thereof."; ¶ [0106] "…'image capture parameter values 1614,' include camera gain 1614-1, camera height 1614-2, camera spectral sensitivity (abbreviated 'STY-CMD' in the figure) 1614-3, lighting intensity 1614-4, light diffusivity 1614-5, lighting color-tint 1614-6, light source arrangement (abbreviated 'Light Source AGM' in the figure) 1614-7, and background configuration (abbreviated 'CFG' in the figure) 1614-8. It will be understood that the FIG. 16 population and configuration of image capture parameter values 1614 is only an example.").

Regarding claim 11: Yamanashi in view of Satat and Thieret discloses the limitations of claim 8 as applied above. Yamanashi in view of Satat and Thieret does not specifically teach: wherein the processor is further configured to adjust a processing parameter for an image obtained by the imaging device based on the gloss unit. However, Sirotin teaches this limitation (¶ [0098] "In response to the threshold RSR qualification 1402 indicating non-suitability 1404B of the captured image, the process 1400 can proceed to flagging 1410 the current captured image for a follow on action, for example, a selection 1411 between image capture remediation 1412 applying a post processing correction 1414 to the captured image. The post processing correction 1414 can include image processing configured to generate, from the captured image, a corrected reflectance image, having an RSR suitable for biometric identifying 1406. The process 1400 can include a determining 1416 of whether the post processing correction 1414 is successful.").
Regarding claim 13: Yamanashi in view of Satat and Thieret discloses the limitations of claim 8 as applied above. Yamanashi further discloses: wherein the processor is further configured to display the gloss unit at an interface (FIG. 1, display 140; ¶ [0032] "Display 140 with a touch panel has, for example, a liquid crystal display that is slightly larger in size than the human face. Display 140 with a touch panel receives operations from user 200, and displays information for user 200."). Yamanashi in view of Satat and Thieret does not specifically teach: receive input at the interface for adjusting the acquisition parameter. However, Sirotin teaches: receive input at the interface for adjusting the acquisition parameter (¶ [0168] "A computer system may include a user interface controller under control of the processing system that displays a user interface in accordance with a user interface module"; ¶ [0169] "The user interface may facilitate the collection of inputs from a user"; ¶ [0170] "…A user activatable object may allow the user to take some action. A display object and a user activatable object may be separate, collocated, overlapping, or nested one within another.").

Regarding claims 2-4 and 6: the claim limitations are similar to those of claims 9-11 and 13, respectively, and are therefore rejected in the same manner as applied to those claims.

Claims 5, 12, and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Yamanashi (US 20160098614) in view of Satat (US 20220168898), Thieret (US 20020149787), Sirotin (US 20210279499), and Hu (US 20210192841).

Regarding claim 12: Yamanashi in view of Satat, Thieret, and Sirotin teaches the limitations of claim 11 as applied above.
Yamanashi in view of Satat, Thieret, and Sirotin does not specifically teach: wherein the processing parameter is at least one of: (i) a mesh smoothing parameter; (ii) a thinning parameter; (iii) a triangulation criterion; (iv) an edge optimization; (v) a maximum noise threshold; (vi) a minimum point distance; and (vii) a maximum allowable residual. However, in a related field, Hu teaches this limitation (¶ [0007] "identifying and discarding the one or more outliers in the 3D point cloud to generate the filtered point cloud… identifying the one or more outliers as points in the 3D point cloud that have a standard deviation from the mean Gaussian surface that is greater than a threshold standard deviation"; ¶ [0008] "identifying the one or more outliers as points in the 3D point cloud that are located at a physical distance from the mean Gaussian surface that is greater than a threshold physical distance"; ¶ [0009] "…determine whether at least one point of the point cloud exists within a threshold distance from a sampled point, and identifying a hole proximate to the sampled point upon determining that at least one point of the point cloud does not exist within the threshold distance from the sampled point. The system may also add the sampled point to the reconstruction dataset upon determining that least one point of the point cloud exists within the threshold distance from the sampled point").

Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi in view of Satat, Thieret, and Sirotin to incorporate the teachings of Hu by including (i) a mesh smoothing parameter; (ii) a thinning parameter; (iii) a triangulation criterion; (iv) an edge optimization; (v) a maximum noise threshold; (vi) a minimum point distance; and (vii) a maximum allowable residual, in order to capture images that have enough detail and are adequately textured.

Regarding claim 15: Yamanashi discloses: a method of scanning a part, comprising:

illuminating the part with a light from a light source of a scanning system, wherein the scanning system includes an imaging device (¶ [0029] – ¶ [0032], FIGS. 1 and 2, gloss determination device 100, light 120, illumination unit 310, camera 130, photographing unit 320, and image acquiring unit 330; ¶ [0040] "…illuminating unit 310 illuminates face 210");

obtaining a reflection of the light reflected from the part (¶ [0033] – ¶ [0036]; ¶ [0046] "Gloss determining unit 350 determines a gloss condition of face 210 based on the input differential image, and outputs the determination result to skin condition determining unit 360"; ¶ [0035] "…gloss determination device 100 uses camera 130 with a polarization filter to extract, from reflected light, a brightness value of a component which is polarized in the same direction as the polarization direction of the polarized light and a brightness value of a component which is polarized in a perpendicular direction to the polarization direction of the polarized light.
Then, gloss determination device 100 determines a part where a difference between the extracted brightness values is large as a part where the degree of specular reflection is high, or the degree of gloss on the surface of the substance is high");

determining, at a processor, a gloss unit of the light reflected from the part (¶ [0035], same passage as quoted above; ¶ [0051] "…gloss determination device 100 may have, for example, a CPU (central processing unit)");

adjusting, at the processor, an acquisition parameter (¶¶ [0078] – [0084]; ¶ [0078] "Further, the gloss determination device may not present the skin condition, but may have a illumination controller that controls illumination based on a determined gloss condition so that the gloss condition becomes a desired condition. For example, the gloss determination device may have a plurality of light sources which can be individually dimmed, and may change the illumination pattern of the light sources while acquiring a gloss region under each illumination pattern. Then, the gloss determination device may perform illumination in an illumination pattern under which the area of the gloss region becomes minimum, and again photograph the face without using the polarizing filter.").

Yamanashi does not specifically teach: moving the light source, the gloss meter, and the imaging device to a location in space with respect to the part. However, in the same field of endeavor, Satat teaches: moving the light source (abstract, "…controlling a moveable component of the robotic device to move along a motion path relative to the target surface, wherein the moveable component comprises a light source and a camera… determining bidirectional reflectance distribution function (BRDF) image data, wherein the BRDF image data comprises the plurality of images converted to angular space with respect to the target surface."). Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi to incorporate the teachings of Satat by including moving the light source and the imaging device to a location in space with respect to the part, in order to use the system to determine a material property of a surface.

Yamanashi in view of Satat does not specifically teach that the gloss meter is also movable by the device. However, in a related field of endeavor, Thieret teaches: a device for moving the gloss meter (¶ [0042] "…the gloss-meter may be mounted on a movable platform in order to sample a variety of readings in the final image"). Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi and Satat to incorporate the teachings of Thieret by including a device for moving the gloss meter, in order to sample a variety of readings to optimize a subsequent operation.

Yamanashi does not specifically teach: adjusting, at the processor, a processing parameter of the scanning system based on the gloss unit.
However, Sirotin teaches: adjusting, at the processor, a processing parameter of the scanning system based on the gloss unit (¶ [0098] "In response to the threshold RSR qualification 1402 indicating non-suitability 1404B of the captured image, the process 1400 can proceed to flagging 1410 the current captured image for a follow on action, for example, a selection 1411 between image capture remediation 1412 applying a post processing correction 1414 to the captured image. The post processing correction 1414 can include image processing configured to generate, from the captured image, a corrected reflectance image, having an RSR suitable for biometric identifying 1406. The process 1400 can include a determining 1416 of whether the post processing correction 1414 is successful."). Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi to incorporate the teachings of Sirotin by including adjusting, at the processor, a processing parameter of the scanning system based on the gloss unit, in order to achieve an improved image by applying a post-processing technique.

While Yamanashi teaches obtaining, at the processor, an image of the part with the imaging device using the acquisition parameter (¶¶ [0078] – [0084]; ¶ [0078] "Further, the gloss determination device may not present the skin condition, but may have a illumination controller that controls illumination based on a determined gloss condition so that the gloss condition becomes a desired condition. For example, the gloss determination device may have a plurality of light sources which can be individually dimmed, and may change the illumination pattern of the light sources while acquiring a gloss region under each illumination pattern. Then, the gloss determination device may perform illumination in an illumination pattern under which the area of the gloss region becomes minimum, and again photograph the face without using the polarizing filter."), and while Sirotin teaches obtaining, at the processor, an image of the part with the imaging device using the processing parameter (¶ [0098], same passage as quoted above), Yamanashi in view of Satat, Thieret, and Sirotin does not specifically teach: obtaining, at the processor, a point cloud from an image of the part obtained using the light source and the imaging device operating with the acquisition parameter and the processor operating with the processing parameter.

However, Hu teaches: obtaining, at the processor, a point cloud from an image of the part obtained using the light source and the imaging device operating with the acquisition parameter and the processor operating with the processing parameter (¶ [0005] "…receive a three-dimensional (3D) point cloud representing the surface, identify and discard one or more outliers in the 3D point cloud to generate a filtered point cloud using a Gaussian process"; ¶ [0007] "…identifying the one or more outliers as points in the 3D point cloud that have a standard deviation from the mean Gaussian surface that is greater than a threshold standard deviation"; ¶ [0009] "…The system may also add the sampled point to the reconstruction dataset upon determining that least one point of the point cloud exists within the threshold distance from the sampled point"). Therefore, it would have been obvious to a person of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified Yamanashi in view of Satat, Thieret, and Sirotin to incorporate the teachings of Hu by including obtaining, at the processor, a point cloud from an image of the part obtained using the light source and the imaging device operating with the acquisition parameter and the processor operating with the processing parameter; obtaining a point cloud from an image is standard in the context of 3D imaging and would have been a matter of obviousness to a person of ordinary skill in the art, yielding a predictable result.

Regarding claim 5: the claim limitations are similar to those of claim 12 and are therefore rejected in the same manner as applied to that claim.

Regarding claims 16-20: the claim limitations are similar to those of claims 9, 10, 12, 13, and 14, respectively, and are therefore rejected in the same manner as applied to those claims.
Response to Arguments

Applicant's arguments with respect to claim(s) 8 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WASSIM MAHROUKA whose telephone number is (571)272-2945. The examiner can normally be reached Monday-Thursday 8:00-5:00 EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Stephen Koziol, can be reached at (408) 918-7630. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/WASSIM MAHROUKA/
Primary Examiner, Art Unit 2665
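The reply-period rules recited in the Conclusion of the action reduce to date arithmetic. A minimal sketch of that logic, assuming a simplified model (the function names are illustrative; extensions of time and fees under 37 CFR 1.136(a) are not modeled, and actual docketing should follow the rule text and MPEP):

```python
from datetime import date
from calendar import monthrange
from typing import Optional

def add_months(d: date, n: int) -> date:
    """Same day-of-month n months later, clamped to the target month's length."""
    y, m = divmod(d.month - 1 + n, 12)
    day = min(d.day, monthrange(d.year + y, m + 1)[1])
    return date(d.year + y, m + 1, day)

def reply_deadline(mailed: date,
                   first_reply: Optional[date] = None,
                   advisory_mailed: Optional[date] = None) -> date:
    """Shortened statutory period (SSP) for reply to a final action,
    per the rules quoted in the Conclusion above."""
    ssp = add_months(mailed, 3)   # THREE MONTHS from the mailing date
    cap = add_months(mailed, 6)   # statutory SIX-MONTH outer limit
    if (first_reply is not None and advisory_mailed is not None
            and first_reply <= add_months(mailed, 2)  # first reply within TWO MONTHS
            and advisory_mailed > ssp):               # advisory mailed after the SSP
        ssp = advisory_mailed     # SSP runs to the advisory action mailing date
    return min(ssp, cap)

# Final action mailed Mar 11, 2026 (per the Prosecution Timeline):
print(reply_deadline(date(2026, 3, 11)))  # -> 2026-06-11
```

With a first reply filed within two months and a later advisory action, the same function returns the advisory action's mailing date instead, still capped at six months from mailing.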

Prosecution Timeline

May 31, 2023 — Application Filed
Oct 17, 2025 — Non-Final Rejection — §103
Jan 22, 2026 — Response Filed
Mar 11, 2026 — Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602739: JOINT DENOISING AND DEMOSAICKING METHOD FOR COLOR RAW IMAGES GUIDED BY MONOCHROME IMAGES (2y 5m to grant; granted Apr 14, 2026)
Patent 12602950: SYSTEM AND METHODS FOR IDENTIFYING A VERIFIED SEARCHER (2y 5m to grant; granted Apr 14, 2026)
Patent 12597248: TARGET OBJECT DETECTION DEVICE (2y 5m to grant; granted Apr 07, 2026)
Patent 12586421: ELECTRONIC DEVICE AND BIOMETRIC AUTHENTICATION METHOD USING SAME (2y 5m to grant; granted Mar 24, 2026)
Patent 12579814: COMPUTER VISION-BASED ENERGY USAGE MANAGEMENT SYSTEM (2y 5m to grant; granted Mar 17, 2026)
Based on this examiner's 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 86% (93% with interview, +6.4%)
Median Time to Grant: 2y 5m
PTA Risk: Moderate

Based on 243 resolved cases by this examiner. Grant probability is derived from the career allow rate.
