Prosecution Insights
Last updated: April 19, 2026
Application No. 18/040,413

IMAGE PROCESSING DEVICE AND ROBOT CONTROL DEVICE

Status: Final Rejection (§103)
Filed: Feb 02, 2023
Examiner: HE, WEIMING
Art Unit: 2611
Tech Center: 2600 — Communications
Assignee: Fanuc Corporation
OA Round: 4 (Final)

Grant Probability: 46% (Moderate)
Expected OA Rounds: 5-6
Projected Time to Grant: 3y 4m
With Interview: 60%

Examiner Intelligence

Career Allow Rate: 46% (190 granted / 410 resolved; -15.7% vs TC avg)
Interview Lift: +13.8% (moderate), based on resolved cases with interview
Typical Timeline: 3y 4m average prosecution
Career History: 450 total applications across all art units; 40 currently pending

Statute-Specific Performance

§101: 7.4% (-32.6% vs TC avg)
§102: 12.4% (-27.6% vs TC avg)
§103: 59.2% (+19.2% vs TC avg)
§112: 15.0% (-25.0% vs TC avg)

Tech Center averages are estimates. Based on career data from 410 resolved cases.

Office Action — Final Rejection (§103)
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The amendment filed on 2/2/26 has been entered and made of record. Claim 1 is amended. Claims 7 and 9-11 are cancelled. Claims 1-6 and 8 are pending.

Response to Arguments

Applicant's arguments with respect to claim 1 have been considered but they are not persuasive. Applicant contends that Hilbert merely uses a histogram of the captured image, and does not describe calculating a reference histogram of a brightness of an object image captured with a reference exposure time, as recited in amended claim 1 (p. 7 of Remarks). Examiner notes that reference Suzuki discloses "The image processing unit 200 executes processing of determining the minimum exposure amount (minimum exposure time) Ts by evaluating the brightness image data Ys at S304, S306, S330, and S332… FIGS. 4A and 5A illustrate histograms of the brightness image data Ys obtained from input images obtained in the photographing before execution of the processing at these steps" in [0071]; "As a result, the histogram of the brightness image data Ys derived from the input image obtained by the photographing performed in the subsequent cycle becomes as illustrated in FIG. 4B" in [0075]. Here, a histogram of brightness is calculated dynamically for any captured image. The claimed "reference histogram of a brightness of an object image" can be obtained from the stored object image or a previous object image corresponding to a current object image. Hilbert also discloses "the control unit is configured to detect the scene change in the capturing region based on a histogram of the last-captured image" in [0040]. Here, the last-captured image refers to a previous captured image corresponding to a current captured image. Therefore, either Suzuki or Hilbert teaches calculating a reference histogram of a brightness of an object image.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-6 are rejected under 35 U.S.C. 103 as being unpatentable over SUZUKI (US 2012/0257077 A1) in view of Toyoda (JP2013246149A) and Shen et al. (CN110378216A), further in view of Hilbert et al. (US 2020/0103728 A1).
As to Claim 1, SUZUKI teaches a robot system comprising: the image processing device comprising: a first exposure time determination unit that determines a minimum value of an exposure time for image-capture of the object; a second exposure time determination unit that determines a maximum value of the exposure time for image-capture of the object (SUZUKI discloses "Since the exposure amount is adjusted by adjusting the exposure time in this embodiment, the maximum/minimum exposure amount determining unit 212 determines the maximum exposure time TL, and the minimum exposure time Ts" in [0065]); an image-capture condition determination unit that determines the exposure time for image-capture of the object and the number of times of image-capture of the object based on an exposure time range including the determined minimum value of the exposure time and the determined maximum value of the exposure time (SUZUKI discloses "The image quantity determining unit 214 determines the number n of the input images with different exposures obtained during the live-view display operation on the basis of the maximum exposure time TL and the minimum exposure time Ts" in [0066]); a synthetic image generation unit that combines a plurality of object images, which have been captured with the determined exposure time and the determined number of times of image-capture, into a synthetic image (SUZUKI discloses "a live-view image output unit which outputs a live-view image on the basis of the synthesized image created by synthesizing a set of n (n is an integer not less than 2) input images with different exposure amounts obtained from the image pickup unit in the image synthesizing unit during a live-view display operation" in [0007]; see also [0034, 0128]), wherein the synthetic image generation unit saves all the plurality of captured images, or saves the image before tone mapping, thereby generating a synthetic image by a different synthesis method (SUZUKI discloses "The image holding unit 202 is a buffer composed of an SDRAM (Synchronous Dynamic Random Access Memory) or the like and temporarily stores two input images obtained from the image pickup unit 220. The image synthesizing unit 204 creates a synthesized image by synthesizing two input images with different exposures obtained from the image pickup unit 220." in [0057]; see also [0112]).

SUZUKI is silent on the following limitations: a robot that performs a process of a workpiece; a robot control device that controls the robot; a visual sensor that images the workpiece as an object; and an image processing device that processes a captured image by the visual sensor, wherein the robot control device corrects, using the captured image by the visual sensor, movement of the robot such that the robot performs a predetermined process at a position of the workpiece.

Toyoda further discloses "A workpiece position detection device capable of detecting the position of a workpiece accurately in a short time is provided. The workpiece position detection device (1) includes a camera (5) having an exposure adjustment function, a memory unit (103), and a control unit (102). The camera 5 captures images of a plurality of workpieces 8 including a reference workpiece to obtain image data, and the image data is stored in the memory unit 103. The control unit 102 selects one workpiece 8 from the multiple workpieces 8 and sets a selection area corresponding to the selected workpiece 8" in Abstract; "For this reason, the position and orientation of each workpiece is recognized from image data captured (photographed) by an imaging device, and the results are used to have a work robot sequentially remove the workpieces from the returnable box" in [0002]; "If parts of the image data are too bright or too dark, accurate contour information for the workpiece in that area cannot be obtained" in [0003]; "The robot control unit 20 controls the drive of the robot arm 7 based on the received position data of the workpiece 8, and grasps and removes the workpiece 8" in [0015]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of SUZUKI with the teaching of Toyoda so as to adjust exposure conditions for the selected area and detect the position of a workpiece accurately in a short period of time (Toyoda, [0008]).

SUZUKI and Toyoda do not explicitly teach wherein the image processing device adjusts a parameter regarding image synthesis by means of the saved images when object detection or inspection with the synthetic image has failed, or automatically attempts another parameter adjustment method, thereby preventing system stop. Shen further discloses "In one embodiment, if the target detection algorithm fails to detect a target object in a detection block where biological signals are detected, the camera parameters are adjusted, and the target detection algorithm continues to perform target detection in the detection block where biological signals are detected, in order to avoid the target detection algorithm failing to detect the target object due to ambient light issues. Among them, the camera's video parameters can be the camera's exposure parameters." in [0106]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of SUZUKI and Toyoda with the teaching of Shen so as to adjust a camera parameter so that the system continuously performs target detection.

In response to the argument "a third exposure time determination unit that calculates a reference histogram of a brightness of an object image captured with a reference exposure time between the minimum value of the exposure time and the maximum value of the exposure time, and stores the reference histogram in a storage unit, and calculates a third histogram of the brightness of the object image captured with the reference exposure time, and calculates an exposure time coefficient such that the third histogram is coincident with the reference histogram, wherein the image-capture condition determination unit calculates the exposure time range based on the minimum value of the exposure time, the maximum value of the exposure time, and the exposure time coefficient", SUZUKI discloses "The image processing unit 200 executes processing of determining the minimum exposure amount (minimum exposure time) Ts by evaluating the brightness image data Ys at S304, S306, S330, and S332… FIGS. 4A and 5A illustrate histograms of the brightness image data Ys obtained from input images obtained in the photographing before execution of the processing at these steps" in [0071]; "As a result, the histogram of the brightness image data Ys derived from the input image obtained by the photographing performed in the subsequent cycle becomes as illustrated in FIG. 4B" in [0075].

Hilbert further discloses "A further advantageous embodiment of the system is characterized in that the control unit is configured to detect the scene change in the capturing region based on a histogram of the last-captured image. Thus, the control unit may for example be configured to determine the distribution of the brightness values and to detect a scene change in the capturing region if the histogram of the last-captured image represents a very dark image, particularly an underexposed image, or a very bright image, particularly an overexposed image. The histogram such may in this case represent the statistical accumulation of the grey values or the colour values of the image" in [0040]; "the last-determined measured brightness value is larger than the predetermined first brightness limit value and smaller than a predetermined second brightness limit value" in [0027]; "the control unit is configured to determine the exposure time target value and/or the sensor amplification target value in such a manner that a product value is in a predetermined target range, particularly between 0.1 and 0.9, or a predetermined value, particularly 0.5, wherein the product value is the product of the exposure time target value and the sensor amplification target value standardized to a predetermined product maximum value" in [0035]; see also [0081]. Here, the last-captured image is a reference image with a reference exposure time between the minimum exposure time and the maximum exposure time, while the product value between 0.1 and 0.9 is coincident with the reference histogram. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of SUZUKI, Toyoda and Shen with the teaching of Hilbert so as to determine the exposure time target value and the sensor amplification target value to achieve the histogram of brightness of the image within a target range (Hilbert, [0035]).
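For orientation, the claim 1 limitation at issue here (an exposure time coefficient calculated so that a newly measured brightness histogram coincides with a stored reference histogram) can be sketched in a few lines of Python. This is a minimal illustration, not the applicant's or any cited reference's actual algorithm: matching mean brightness stands in for full histogram coincidence, and every name below is hypothetical.

    import numpy as np

    def brightness_histogram(gray: np.ndarray, bins: int = 256) -> np.ndarray:
        # Normalized brightness histogram of an 8-bit grayscale object image.
        hist, _ = np.histogram(gray, bins=bins, range=(0, 256))
        return hist / max(hist.sum(), 1)

    def exposure_time_coefficient(ref_hist: np.ndarray, cur_hist: np.ndarray) -> float:
        # Ratio of mean brightness under the stored reference histogram to mean
        # brightness under the current histogram; scaling the exposure time by
        # this factor pushes the current histogram toward the reference.
        levels = np.arange(ref_hist.size)
        ref_mean = float((ref_hist * levels).sum())
        cur_mean = float((cur_hist * levels).sum())
        return ref_mean / max(cur_mean, 1e-6)

On this reading, the image-capture condition determination unit would then scale the bracketing range, e.g. to [k·Ts, k·TL] with k the returned coefficient, before choosing the number of shots.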
As to Claim 2, SUZUKI in view of Toyoda, Shen and Hilbert teaches The robot system according to claim 1, wherein the first exposure time determination unit sets the minimum value of the exposure time for image-capture of the object in advance, calculates a brightness of an object image captured with the minimum value of the exposure time, and changes the minimum value of the exposure time in a case where a value based on the calculated brightness is smaller than a first threshold, and repeats image-capture of the object, brightness calculation, and exposure time minimum value change until the value based on the brightness reaches the first threshold or more, and determines the minimum value of the exposure time (SUZUKI discloses "As described above, the exposure time is changed in this embodiment in order to obtain input images with different exposures, thus these two input images with different exposures are discriminated and called a short exposure input image and a long exposure input image, respectively… and creates the brightness image data Ys from the short exposure input image and the brightness image data YL from the long exposure input image data, respectively, and outputs them to the maximum/minimum exposure amount determining unit 212. This operation is repeatedly performed by the brightness distribution deriving unit 210…" in [0064]; "The image processing unit 200 executes processing of determining the minimum exposure amount (minimum exposure time) Ts by evaluating the brightness image data Ys at S304, S306, S330, and S332" in [0071]; judging whether the brightness is less than a threshold value and adjusting the minimum exposure time in [0076-0077]; see also Fig. 3. Hilbert provides such an example to explain updating the exposure time. Hilbert discloses "According to step f), the control unit is configured for determining an in particular average brightness of the control image last captured by means of the image sensor as measured brightness value. According to step g), the control unit for updating the exposure time by increasing the exposure time is configured in such a manner that the update is executed by the control unit in the event that the last-determined measured brightness value is smaller than a predetermined first brightness limit value. The steps e) to g) may form a group, wherein the control unit is configured to execute this group of steps repeatedly in groups" in [0026]; see also [0076].)
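The claim 2 loop (capture, calculate brightness, lengthen the minimum exposure time while the brightness statistic stays below the first threshold) reduces to a short iteration. A sketch under stated assumptions: capture is a hypothetical camera callback returning a NumPy image, the mean pixel value stands in for "a value based on the calculated brightness", and the 1.25x step size is arbitrary.

    def determine_min_exposure(capture, t_min: float, first_threshold: float,
                               max_iters: int = 16) -> float:
        # Repeat image-capture and brightness calculation, lengthening the
        # minimum exposure time until the brightness statistic reaches the
        # first threshold or more (the loop recited in claim 2, simplified).
        for _ in range(max_iters):
            value = capture(t_min).mean()   # brightness of the object image
            if value >= first_threshold:
                break                       # threshold met: t_min is determined
            t_min *= 1.25                   # too dark: lengthen the exposure
        return t_min

The claim 3 loop addressed next is the mirror image, adjusting the maximum exposure time downward while the brightness statistic exceeds the second threshold.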
As to Claim 3, SUZUKI in view of Toyoda, Shen and Hilbert teaches The robot system according to claim 1, wherein the second exposure time determination unit sets the maximum value of the exposure time for image-capture of the object in advance, calculates a brightness of an object image captured with the maximum value of the exposure time, and changes the maximum value of the exposure time in a case where a value based on the calculated brightness is greater than a second threshold, and repeats image-capture of the object, brightness acquisition, and exposure time maximum value change until the value based on the brightness reaches the second threshold or less, and determines the maximum value of the exposure time (SUZUKI discloses "As described above, the exposure time is changed in this embodiment in order to obtain input images with different exposures, thus these two input images with different exposures are discriminated and called a short exposure input image and a long exposure input image, respectively… and creates the brightness image data Ys from the short exposure input image and the brightness image data YL from the long exposure input image data, respectively, and outputs them to the maximum/minimum exposure amount determining unit 212. This operation is repeatedly performed by the brightness distribution deriving unit 210…" in [0064]; "The image processing unit 200 executes the processing of determining the maximum exposure amount (maximum exposure time) TL by evaluating the brightness image data YL at S312, S314, S340, and S342" in [0083]; judging whether the brightness is larger than a threshold value and adjusting the maximum exposure time in [0084-0085]; see also Fig. 3. Hilbert provides such an example to explain updating the exposure time. Hilbert discloses "According to step m), an update of the exposure time is provided by means of the control unit by an increase of the exposure time in the event that the last-determined measured brightness value is smaller than the predetermined first brightness limit value or an update of the exposure time is provided by means of the control unit by a reduction of the exposure time in the event that the last-determined measured brightness value is larger than the predetermined second brightness limit value" in [0031]; see also [0076].)

As to Claim 4, SUZUKI in view of Toyoda, Shen and Hilbert teaches The image processing device according to claim 1, wherein the captured image used for determining the exposure time is a reduced image (SUZUKI discloses "In the first embodiment described above, the image pickup unit 220 is operating in the monitoring mode during the live-view operation and pixel addition or pixel thinning is performed in the image pickup element 102 at that time, and the number of pixels of the input image is decreased" in [0129].)

As to Claim 5, SUZUKI in view of Toyoda, Shen and Hilbert teaches The robot system according to claim 1. The combination of Hilbert further teaches wherein the first exposure time determination unit uses a minimum value of an exposure time for an image captured in advance when setting the minimum value of the exposure time in advance, and the second exposure time determination unit uses a maximum value of an exposure time for an image captured in advance when setting the maximum value of the exposure time in advance (Hilbert discloses "A further advantageous embodiment of the system is characterized in that the exposure time reference value is a target value between a predetermined minimum exposure time for the image sensor and 400-times the minimum exposure time. The minimum exposure time for the image sensor is for example the time value for the exposure time of the sensor, which may be provided minimally for the image sensor. Therefore, it may be the smallest possible exposure time for the image sensor" in [0023]; see also the predetermined maximum exposure time in [0035].)

As to Claim 6, SUZUKI in view of Toyoda, Shen and Hilbert teaches The robot system according to claim 1. The combination of Hilbert further teaches wherein the first exposure time determination unit uses a minimum value of an exposure time for a captured image specified from an external source when setting the minimum value of the exposure time in advance, and the second exposure time determination unit uses a maximum value of an exposure time for a captured image from an external source when setting the maximum value of the exposure time in advance (Hilbert discloses "The iteration value may for example be between a tenth and a twentieth, particularly a sixteenth, of a difference between a predetermined maximum exposure time for the image sensor and the exposure time reference value, particularly the readout time" in [0028]. Here, the predetermined maximum exposure time is set in advance for the specific image sensor.)

Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over SUZUKI in view of Toyoda and Shen, further in view of Hilbert and Zhang et al. (US 2014/0192227 A1).

As to Claim 8, SUZUKI in view of Toyoda, Shen and Hilbert teaches The robot system according to claim 1, wherein the synthetic image generation unit is able to specify at least one of a blown-out highlight percentage or a blocked-up shadow percentage for the plurality of captured images, and performs tone mapping for the synthetic image in a state in which a pixel of the synthetic image corresponding to the blown-out highlight percentage is in white and a pixel of the synthetic image corresponding to the blocked-up shadow percentage is in black (SUZUKI discloses "That is, if the number of pixels in a considerably bright section or a blown-out section is large in the brightness image data Ys corresponding to the somewhat underexposed input images photographed in the minimum exposure time Ts, the determination at S304 is positive" in [0074]; "That is, though it is the brightness image data YL corresponding to the somewhat overexposed input image photographed in the maximum exposure time TL, if the number of pixels in a considerably dark section or a blocked-up section is large, the determination at S312 becomes positive. In such a case, the image processing unit 200 prolongs the maximum exposure time TL at S314" in [0086]; "In this regard, in this embodiment, the processing at S304, S306, S330, and S332 suppresses blown-out highlights in the input image obtained by the shortest exposure time Ts, while the processing at S312, S314, S340, and S342 suppresses blocked-up shadows in the input image obtained by the longest exposure time TL" in [0094]; see also [0002-0003, 0075, 0079, 0090]. Zhang further teaches tone mapping, such as "For instance, long-exposure times can be utilized to preserve details of dark regions and short-exposure times can be utilized to preserve details of bright regions (e.g., dynamic range of bright) from saturation. The long-exposure time and the short-exposure time can be combined to generate the resulting HDR image with extended HDR to show the details of both dark and bright regions that are not indicative of light source regions. HDR processing of block 612 may include at least one of applied tone mapping, adjusted exposure time, gamma correction and pixel bit-depth conversion to obtain optimal image quality within these other regions" in [0035].) It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the invention of SUZUKI, Toyoda, Shen and Hilbert with the teaching of Zhang so as to apply tone mapping to generate an HDR image showing the details of both dark and bright regions (Zhang, [0035]).

Conclusion

THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to WEIMING HE whose telephone number is (571) 270-1221. The examiner can normally be reached Monday-Friday, 8:30am-5:00pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Tammy Goddard, can be reached on 571-272-7773. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Weiming He/
Primary Examiner, Art Unit 2611
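As a side note on the claim 8 limitation above (tone mapping a synthetic image so that a specified blown-out-highlight percentage of pixels renders white and a specified blocked-up-shadow percentage renders black), a minimal percentile-based sketch follows. It is an illustration only, not Suzuki's or Zhang's actual method, and the function name is invented.

    import numpy as np

    def tone_map(hdr: np.ndarray, highlight_pct: float, shadow_pct: float) -> np.ndarray:
        # Map an HDR synthetic image to 8 bits so that the brightest
        # highlight_pct of pixels saturate to white and the darkest
        # shadow_pct of pixels are crushed to black.
        lo = np.percentile(hdr, shadow_pct)
        hi = np.percentile(hdr, 100.0 - highlight_pct)
        mapped = np.clip((hdr - lo) / max(hi - lo, 1e-9), 0.0, 1.0)
        return (mapped * 255.0).astype(np.uint8)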

Prosecution Timeline

Feb 02, 2023: Application Filed
Feb 10, 2025: Non-Final Rejection — §103
May 13, 2025: Response Filed
Jun 10, 2025: Final Rejection — §103
Sep 12, 2025: Request for Continued Examination
Sep 15, 2025: Response after Non-Final Action
Oct 30, 2025: Non-Final Rejection — §103
Feb 02, 2026: Response Filed
Feb 25, 2026: Examiner Interview (Telephonic)
Feb 25, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12567135: MULTIMEDIA PLAYBACK MONITORING SYSTEM AND METHOD, AND ELECTRONIC APPARATUS (granted Mar 03, 2026; 2y 5m to grant)
Patent 12561876: System and method for an audio-visual avatar creation (granted Feb 24, 2026; 2y 5m to grant)
Patent 12514672: System, Method And Software Program For Aiding In Positioning Of Objects In A Surgical Environment (granted Jan 06, 2026; 2y 5m to grant)
Patent 12494003: AUTOMATIC LAYER FLATTENING WITH REAL-TIME VISUAL DEPICTION (granted Dec 09, 2025; 2y 5m to grant)
Patent 12468949: SYSTEMS AND METHODS FOR FEW-SHOT TRANSFER LEARNING (granted Nov 11, 2025; 2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 46%
With Interview: 60% (+13.8%)
Median Time to Grant: 3y 4m
PTA Risk: High

Based on 410 resolved cases by this examiner. Grant probability derived from career allow rate.
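These projections follow from simple arithmetic on the career counts shown above. A minimal sketch of that derivation, assuming (since the methodology is not stated) that the dashboard adds the interview lift in percentage points to the base allow rate:

    # Reproducing the headline figures from the Examiner Intelligence card.
    granted, resolved, pending = 190, 410, 40

    allow_rate = granted / resolved              # 0.463 -> shown as 46%
    total_apps = resolved + pending              # 450 total applications
    with_interview = allow_rate * 100 + 13.8     # 60.1 -> shown as 60%

    print(f"{allow_rate:.0%} base, {with_interview:.0f}% with interview, {total_apps} total")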

Free tier: 3 strategy analyses per month