DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. It is responsive to the submission dated 10/13/2023. Claims 1-20 are presented for examination.
Information Disclosure Statement
2. The information disclosure statements (IDSs) submitted on 05/15/2024 are in compliance with the provisions of 37 CFR 1.97 and are being considered by the Examiner.
Claim Rejections - 35 USC § 103
3. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
4. Claims 1-2 and 4-20 are rejected under 35 U.S.C. 103 as being unpatentable over Scherzer (US 20240168465) in view of Vico et al. (US 20160023603).
Regarding claim 1, Scherzer discloses a system for comparing updated content on a [vehicle] display to control content (e.g., an automated line clearance inspection system … that includes a set of image-capturing devices that are controlled via a central processing unit whereby end-run images are compared with control images to determine if a line is cleared. See abstract and paras. 8-9 and 41), comprising:
a display (124a, fig. 1b); a processor (122); and memory (112) coupled to the processor and including one or more programs (see fig. 1 and paras. 30-32 and 35-41) that:
generate multiple generated images for presentation on a [vehicle] display (e.g., a set of image-collecting devices 104, including cameras, for obtaining a set of end-run images from each of a set of image-collecting devices (see paras. 11-13). In operation, the display module 124a may generate the images and layouts that are displayed on either the user devices 116, 118 and 120 and/or the computing device 102. See para. 41);
compare generated pixel data (e.g., areas or end-run of collected images, including the corresponding bit color channels from each photo sensor (pixel) that is part of each collected image) of each of the generated images to control pixel data (e.g., bit data or size of control images) of a corresponding control image that is provided for each generated image (e.g., the collected images including a set of control images and a set of end run images; and the system includes a processor for comparing each of the set of end run images with control images from each of the set of image collecting devices (see paras. 11-13). In one embodiment, the image capture module 124f may obtain calibration images and control images and the comparison/verification module 124d compares these images to verify that the image collecting devices 104 are still calibrated. See paras. 41 and 50. Scherzer further teaches: After each series of image capturing events, the control images and their corresponding end run, or end of run, images are stored … as 24-bit hexadecimal data from each photo sensor (pixel) that is part of the image collecting device. Each photo sensor has three 8-bit color channels, with each channel having a value between 0 and 255, with 0 being totally black and 255 totally white, and all of the colors in between represented by the standard ANSI values. The images are then re-sized to a predetermined size (502). In one embodiment, the width may be selected as 200 pixels while the height is based on a height/width ratio of the control image. This re-sizing reduces the noise of the image during image processing. The images can then be converted to grayscale images (504) to reduce the effects of lighting conditions and to make the colour of the packaging and/or product irrelevant in the image comparison. The images can then be blurred (506).
In one embodiment, the blurring may be performed by using a two-dimensional (2D) convolution averaging filter, for example a 5×5 convolution averaging filter. Blurring may also be performed by using a Gaussian blur filter or other blur filters. A difference percentage between the control image and the end run image may then be calculated (508). In one embodiment, the difference in the images (or a difference image) may be calculated or generated using a structural similarity index measure (SSIM) or using a mean square error (MSE) method. SSIM is preferred since, unlike MSE, SSIM uses more advanced statistical parameters in order to find a better difference image based on the actual shapes and structures in the image rather than just the pixels. See paras. 57-58);
determine if the generated pixel data is different than the control pixel data (e.g., the comparison/verification module 124d compares these images to verify that the image collecting devices 104 are still calibrated. See para. 41. The line operator or the user of the computing device can request that the image collecting devices obtain the control images (446), compare the calibration images with the control images (448), and display the control images and the percentage difference between the calibration image and control image to the user (450). … The control images are stored in the database and may be organized by batch number, image description, date and/or time of day. These control images are then compared with the calibration images. Based on the comparison, the line operator can input the second verification, or the system can determine that a second verification has been completed if the percentage difference between the calibration image or images and the control image or images is lower than a predetermined threshold. See paras. 50-54. Scherzer further teaches: After each series of image capturing events, the control images and their corresponding end run, or end of run, images are stored … as 24-bit hexadecimal data from each photo sensor (pixel) that is part of the image collecting device. Each photo sensor has three 8-bit color channels, with each channel having a value between 0 and 255, with 0 being totally black and 255 totally white, and all of the colors in between represented by the standard ANSI values. The images are then re-sized to a predetermined size (502). In one embodiment, the width may be selected as 200 pixels while the height is based on a height/width ratio of the control image. This re-sizing reduces the noise of the image during image processing.
The images can then be converted to grayscale images (504) to reduce the effects of lighting conditions and to make the colour of the packaging and/or product irrelevant in the image comparison. The images can then be blurred (506). In one embodiment, the blurring may be performed by using a two-dimensional (2D) convolution averaging filter, for example a 5×5 convolution averaging filter. Blurring may also be performed by using a Gaussian blur filter or other blur filters. A difference percentage between the control image and the end run image may then be calculated (508). See paras. 57-58); and
generate an output image including pixels having generated pixel data that is different than the control pixel data (e.g., The report generating module 124e may generate reports for display to users, or line operators, based on determinations from the comparison/verification module. See para. 41. A black and white image with only outlines of difference is then obtained. See para. 59. The final produced difference image is then saved (520). The difference image may then be transmitted as an inspection report. Each image collecting device will have its own difference image and any differences are highlighted. The image differences are summarized in the inspection report for that product batch and stored in the data storage. See para. 60).
The Scherzer reference differs from the instant claim 1 in that it fails to explicitly disclose comparing the content images on a vehicle display, which is disclosed by Vico. See abstract and paras. 4-9 of Vico.
Accordingly, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Scherzer to include the display of content images for comparison on a vehicle display, in the same conventional manner as taught by Vico, so as to provide information regarding the width of the vehicle 10 and rough distance information behind the vehicle, as an overlay that shows the projected path of the vehicle 10 based on the current steering wheel angle of the vehicle, as guidance to assist the driver in effectively detecting objects around the vehicle while driving to prevent accidents. See para. 3 of Vico.
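Illustratively, the preprocessing and comparison pipeline that Scherzer describes in paras. 57-58 (re-size to a 200-pixel width, convert to grayscale, apply a 5×5 convolution averaging filter, then compute a difference percentage) could be sketched as follows. This is an illustrative reconstruction only, not Scherzer's actual implementation: the function names, the nearest-neighbor resize, and the MSE-based difference metric (Scherzer states a preference for SSIM) are assumptions made for demonstration.

```python
import numpy as np

def to_grayscale(img):
    # Weighted sum of the three 8-bit color channels (ITU-R BT.601 luma weights).
    return img[..., 0] * 0.299 + img[..., 1] * 0.587 + img[..., 2] * 0.114

def resize_nearest(img, width=200):
    # Nearest-neighbor resize to a fixed width, preserving the
    # height/width ratio of the input image (per Scherzer para. 57).
    h, w = img.shape[:2]
    height = max(1, round(h * width / w))
    rows = np.arange(height) * h // height
    cols = np.arange(width) * w // width
    return img[rows][:, cols]

def box_blur(img, k=5):
    # k x k two-dimensional convolution averaging filter, implemented
    # as a shifted sum over an edge-padded copy of the image.
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def difference_percentage(control, end_run):
    # Difference percentage between a control image and an end-run image,
    # here via mean square error normalized to the 0-255 channel range.
    # (Scherzer prefers SSIM; MSE is used purely for brevity.)
    c = box_blur(to_grayscale(resize_nearest(control)))
    e = box_blur(to_grayscale(resize_nearest(end_run)))
    mse = np.mean((c - e) ** 2)
    return 100.0 * mse / (255.0 ** 2)
```

As a sanity check, two identical images yield a difference of zero, while any pixel-level deviation produces a positive percentage.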
As per claim 2, Scherzer, as modified by Vico, discloses that the output image does not include pixels of a generated image in which the generated pixel data is not different than the control pixel data (e.g., generating images that are cleared of unwanted products and confirming that the devices are viewing the specified inspection location to reduce or prevent the likelihood of incorrect packaging). See paras. 41-47 and 51-53 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
As per claim 4, Scherzer, as modified by Vico, discloses the output image is generated when a difference between the generated pixel data varies from the control pixel data by more than a threshold amount in a region of the generated image. See paras. 51 and 57-59 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
As per claim 5, Scherzer, as modified by Vico, discloses the threshold is exceeded when at least three percent of pixels within one region in the generated image are different than the corresponding pixels in the control image. See paras. 57-58 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
As per claims 6-7, Scherzer, as modified by Vico, discloses the output image is generated when a difference between the generated pixel data varies from the control pixel data by more than a threshold amount in a region of the generated image, wherein the threshold is exceeded when at least three percent of pixels within one region in the generated image are different than the corresponding pixels in the control image. See paras. 57-59 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
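The regional threshold recited in claims 5-7 (at least three percent of pixels within one region of the generated image differing from the corresponding pixels of the control image) could be illustrated by the following sketch. The rectangular region representation, the exact per-pixel inequality test, and the assumption of single-channel images are illustrative choices, not details drawn from either reference.

```python
import numpy as np

def region_exceeds_threshold(generated, control, region, pct_threshold=3.0):
    """Return True when at least pct_threshold percent of the pixels inside
    `region` (top, left, height, width) of the generated image differ from
    the corresponding pixels of the control image (single-channel assumed)."""
    top, left, h, w = region
    gen = generated[top:top + h, left:left + w]
    ctl = control[top:top + h, left:left + w]
    differing = np.count_nonzero(gen != ctl)
    return 100.0 * differing / gen.size >= pct_threshold
```

Under this sketch, 3 differing pixels in a 10×10 region meet the "at least three percent" condition, while 2 do not.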
As per claim 8, Scherzer, as modified by Vico, discloses the difference is determined if a color value for a pixel in the generated image differs from the color value for a corresponding pixel in the corresponding control image. See paras. 57-58 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
As per claim 9, Scherzer, as modified by Vico, discloses each pixel is assigned a color code (e.g., specified bit color channel) and wherein a color code threshold is set for a difference between the color code of a pixel in a generated image from the color code (e.g., data bit) of a corresponding pixel in the corresponding control image. See para. 57 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
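The per-pixel color-code threshold of claim 9 (each pixel assigned a color code, with a threshold set on the difference between the code of a generated pixel and the code of the corresponding control pixel) could be sketched as follows. The channel-wise comparison and the particular threshold value are illustrative assumptions; the claim and the cited paragraph of Scherzer do not specify either.

```python
import numpy as np

def pixels_over_color_threshold(generated, control, code_threshold=16):
    # Each pixel carries three 8-bit color channels (values 0-255, per
    # Scherzer para. 57). A pixel is flagged as different when any channel
    # deviates from the corresponding control pixel's channel by more
    # than code_threshold.
    diff = np.abs(generated.astype(int) - control.astype(int))
    return np.any(diff > code_threshold, axis=-1)
```

The cast to `int` before subtraction avoids 8-bit unsigned wraparound, so the channel difference is computed symmetrically in either direction.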
As per claim 10, Scherzer, as modified by Vico, discloses the output image is generated when at least a first threshold number of pixels in a region of one of the generated images have a color code that differs from the color code of the corresponding pixels in the corresponding control image. See paras. 57-58 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
As per claim 11, Scherzer, as modified by Vico, discloses the generated images convey information in one or more predetermined review areas and wherein the review areas are compared against corresponding areas in corresponding control images, and wherein other areas not within the review areas in one of the generated images are not compared to corresponding other areas of the corresponding control image so that only portions of the generated image are compared to the corresponding control image. See paras. 61-62, in view of paras. 57-60, of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
As per claim 12, Scherzer, as modified by Vico, discloses for a pixel in which the generated pixel data is different than the pixel data of a corresponding area of the corresponding control image, the generated pixel data is compared to an intended pixel data indicative of an intended change, and the pixel is removed from the output image if the generated pixel data for that pixel is the same as or within a threshold of the intended pixel data. See paras. 46-53 and 57-60 of Scherzer and the rationale above with respect to claim 1 for reasons of obviousness.
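The intended-change handling of claim 12 (a pixel that differs from the control image is removed from the output image when its generated pixel data matches, or falls within a threshold of, the intended pixel data) could be sketched as follows. The boolean-mask representation of the output image and the tolerance parameter are assumptions made for illustration.

```python
import numpy as np

def output_mask(generated, control, intended, tolerance=0):
    # Pixels whose generated pixel data differs from the control image...
    differs = generated != control
    # ...but matches the intended (expected) update within the tolerance
    # are removed from the output image; all remaining differing pixels
    # are retained for reporting.
    matches_intended = (
        np.abs(generated.astype(int) - intended.astype(int)) <= tolerance
    )
    return differs & ~matches_intended
```

In this sketch, a pixel reflecting an intended update drops out of the mask, while an unexplained deviation remains flagged.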
The invention of claim 13 contains features that correspond in scope with the limitations recited in claim 1. As the limitations of claim 1 were found obvious over the combined teachings of Scherzer and Vico, it is readily apparent that the applied prior art performs the underlying elements. As such, the limitations of claim 13 are, therefore, subject to rejection under the same rationale as claim 1.
Regarding claim 14, said claim has similar limitations to those of claim 2; therefore, it is rejected under the same rationale as claim 2.
Regarding claim 15, said claim has similar limitations to those of claim 4; therefore, it is rejected under the same rationale as claim 4.
Regarding claim 16, said claim has similar limitations to those of claim 5; therefore, it is rejected under the same rationale as claim 5.
Regarding claims 17-18, said claims have similar limitations to those of claims 6-7; therefore, they are rejected under the same rationale as claims 6-7.
Regarding claim 19, said claim has similar limitations to those of claim 11; therefore, it is rejected under the same rationale as claim 11.
Regarding claim 20, said claim has similar limitations to those of claim 12; therefore, it is rejected under the same rationale as claim 12.
Allowable Subject Matter
5. Claim 3 is objected to as being dependent upon a rejected base claim but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims, because the prior art of record fails to teach the system of claim 1 wherein the output image includes adjacent pixels having generated pixel data that is not different from the control pixel data, wherein the adjacent pixels are within an area that includes at least one pixel having generated pixel data that is different from the control pixel data.
Conclusion
6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Bechtel et al. (US 9230183) discloses an automatic vehicle equipment control system and methods thereof. The system includes at least one imager configured to acquire a continuous sequence of high dynamic range single frame images; a processor; a color spectral filter array including a plurality of color filters, at least a portion of which are different colors, with pixels of an imager pixel array being in optical communication with substantially one spectral color filter; and a lens. The imager is configured to capture a non-saturated image of nearby oncoming headlamps and at least one of a diffuse lane marking and a distant tail lamp in one image frame of the continuous sequence of high dynamic range single frame images, and the system is configured to detect at least one of said highway markings and said tail lamps and to quantify light from the oncoming headlamp from data in the one image frame.
7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to WESNER SAJOUS whose telephone number is (571) 272-7791. The examiner can normally be reached on M-F 10:00 TO 7:30 (ET).
Examiner interviews are available via telephone and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice or email the Examiner directly at wesner.sajous@uspto.gov.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Said Broome, can be reached at 571-272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/WESNER SAJOUS/Primary Examiner, Art Unit 2612
WS
01/23/2026