Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Detailed Action
1. This Office Action is in response to the amendment filed on 2/6/2026.
Claim Status
2. Claims 21, 24, 26-27, 33, 35-37, and 39-40 are currently amended.
Response to Arguments
3. The applicant’s arguments filed 2/6/2026 have been fully considered, but are moot in view of the new grounds of rejection.
A. In response to the applicant’s argument that the cited prior art fails to teach or suggest a processor coupled to the image capturing device and configured to: receive the image authenticity detection configuration from a server, wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file:
Newly cited prior art reference Felt et al (US 2015/0113441) discloses (fig. 5, par [0013], lines 5-15 and 7-17, and par [0038]) an image storage server providing information to a user device that the user is authorized to modify an image (e.g., the processor configured to receive the image authenticity detection configuration from a server), the information specifying what modifications the user may perform on the image, including inserting an animation in the image, inserting a sub-image within the image, inserting a sound in the image, etc. (e.g., wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file).
B. In response to the applicant’s argument that the cited prior art fails to teach or suggest generating a second image file based on the set of changes to the first image file:
Also see fig. 5 & par [0038] of Felt et al, which disclose generating a second image file based on the set of changes (e.g., publishing the modified, updated image).
Claim Rejections – 35 USC 103
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office Action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. Claims 21, 23, 26-34, 36-38, and 40-41 are rejected under 35 U.S.C. 103 as being unpatentable over Prasad et al (US 10,380,562) in view of Harrington (US 2004/0086196), further in view of Felt et al (US 2015/0113441).
Regarding claim 21, Prasad et al teaches a device (fig. 2, ‘106), comprising:
an image capturing device configured to generate a first image file (fig. 2, ‘207) including a set of pixels (fig. 2, ‘315), wherein the first image file is generated based on one or more parameters indicated by an image authenticity detection configuration (col. 7, lines 2-13, which discloses that the check image file is created according to required image verification-related specifics), wherein the image authenticity detection configuration is determined based on at least one of device data related to the device, user data about a user (col. 7, lines 20-36, which discloses the user’s signature verification on the check to validate the captured image of the check), or environment data about an environment of the user or the device (col. 9, lines 10-28, which discloses the check image verification being based on the background of the captured check image); and
a processor coupled to the image capturing device (fig. 4, ‘454).
Prasad et al does not explicitly teach wherein the set of changes comprises at least one of changes to the set of pixels included in the first image file or changes to one or more other components of the first image file.
Harrington further teaches wherein the set of changes comprises at least one of changes to the set of pixels included in the first image file (par [0091], lines 1-5, which discloses digital image alteration causing isolated pixel changes in a reconstructed image) or changes to one or more other components of the first image file (par [0091], lines 3-11, which discloses applying filters, such as a noise-removing filter and a median filter, to a digital image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Harrington with the embodiment disclosed by Prasad et al because implementing the hidden embedded digital image within an original image (as disclosed in par [0006-0008] of Harrington) within Prasad et al would enable Prasad et al to automatically reconstruct an original digital image after an unauthorized set of changes, a great improvement over embodiments that only detect unauthorized image alterations instead of taking corrective action.
Prasad et al and Harrington do not explicitly teach the processor configured to receive the image authenticity detection configuration from a server, wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file, and generate a second image file based on the set of changes.
However, Felt et al teaches the processor configured to receive the image authenticity detection configuration from a server (par [0013], lines 5-15, which discloses an image storage server providing information to a user device that the user is authorized to modify an image), wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file (par [0013], lines 7-17, which discloses the information specifying what modifications the user may perform on the image, including inserting an animation in the image, inserting a sub-image within the image, inserting a sound in the image, etc.), and generate a second image file based on the set of changes (fig. 5 & par [0038], which disclose publishing the modified, updated image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Felt et al with the teachings of Prasad et al and Harrington in order to improve image modification in collaborative environments by permitting authenticated users to add/superimpose layers, thereby ensuring that collaborating users may only modify images to the extent and level to which they are authorized (as disclosed in par [0011] of Felt et al).
Regarding claim 23, Prasad et al and Harrington teach the limitations of claim 21.
Prasad et al further teaches wherein the first image file indicates a check with a payee (fig. 7 & col. 11, lines 31-35), an amount of payment (col. 11, lines 60-65), a date, a check number, a routing number (col. 11, lines 60-65), or a bank account number (col. 11, lines 60-65).
Regarding claim 26, Prasad et al does not explicitly teach wherein the set of changes include a checksum of information about the image capturing device used to generate the first image file.
Harrington further teaches wherein the set of changes include a checksum of information about the image capturing device used to generate the first image file (par [0094-0095], which discloses detecting alterations corresponding to the updated digital image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Harrington with the embodiment disclosed by Prasad et al according to the motivation disclosed regarding claim 21.
Regarding claim 27, Prasad et al and Harrington teach the limitations of claim 21.
Prasad et al further teaches wherein the set of changes include information about the device or an operating system running on the device (fig. 6, ‘620, col. 6, lines 1-5, col. 6, lines 55-67, and col. 11, lines 3-8 which disclose performing conversion of check image files based on software operating on the device and mobile device OS).
Regarding claim 28, Prasad et al and Harrington teach the limitations of claim 21.
Prasad et al further teaches wherein the user data includes information about a fingerprint of the user, a user biometric data, a user date of birth, or an identification number associated with the user (col. 11, lines 65-67, “account number and/or name on the account”).
Regarding claim 29, Prasad et al and Harrington teach the limitations of claim 21.
Prasad et al further teaches wherein the device data includes information about a camera installed on the device, resolutions of the camera (col. 8, lines 63-67, “suitable resolution”), a device model number, or a device operating system information.
Regarding claim 30, Prasad et al and Harrington teach the limitations of claim 21.
Prasad et al further teaches wherein the environment data includes a time when the device sends to the server application the user data, the device data, or the environment data, or a location where the user is located (col. 9, lines 8-13).
Regarding claim 31, Prasad et al and Harrington teach the limitations of claim 21.
Prasad et al further teaches wherein the one or more parameters used to generate the first image file containing the set of pixels include a resolution (col. 1, lines 61-62), an aspect ratio, a color depth of a pixel, or an image format for the first image file (col. 6, lines 55-63).
Regarding claim 32, Prasad et al and Harrington teach the limitations of claim 21.
Prasad et al further teaches wherein the device includes a smart phone (col. 3, lines 45-50), and the image capturing device includes a camera (col. 3, lines 45-50).
Regarding claim 33, Prasad et al teaches a non-transitory computer-readable medium (fig. 2, ‘106) having instructions stored thereon that, when executed by a device, cause the device to perform operations comprising:
generating, by an image capturing device of the device, a first image file (fig. 2, ‘207) including a set of pixels (fig. 2, ‘315), wherein the first image file is generated based on one or more parameters indicated by the image authenticity detection configuration (col. 7, lines 2-13, which discloses that the check image file is created according to required image verification-related specifics), wherein the image authenticity detection configuration is determined based on at least one of device data related to the device, user data about a user (col. 7, lines 20-36, which discloses the user’s signature verification on the check to validate the captured image of the check), or environment data about an environment of the user or the device (col. 9, lines 10-28, which discloses the check image verification being based on the background of the captured check image).
Prasad et al does not explicitly teach wherein the set of changes comprises at least one of changes to the set of pixels included in the first image file or changes to one or more other components of the first image file.
Harrington further teaches wherein the set of changes comprises at least one of changes to the set of pixels included in the first image file (par [0091], lines 1-5, which discloses digital image alteration causing isolated pixel changes in a reconstructed image) or changes to one or more other components of the first image file (par [0091], lines 3-11, which discloses applying filters, such as a noise-removing filter and a median filter, to a digital image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Harrington with the embodiment disclosed by Prasad et al because implementing the hidden embedded digital image within an original image (as disclosed in par [0006-0008] of Harrington) within Prasad et al would enable Prasad et al to automatically reconstruct an original digital image after an unauthorized set of changes, a great improvement over embodiments that only detect unauthorized image alterations instead of taking corrective action.
Prasad et al and Harrington do not explicitly teach the processor configured to receive the image authenticity detection configuration from a server, wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file, and generate a second image file based on the set of changes.
However, Felt et al teaches the processor configured to receive the image authenticity detection configuration from a server (par [0013], lines 5-15, which discloses an image storage server providing information to a user device that the user is authorized to modify an image), wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file (par [0013], lines 7-17, which discloses the information specifying what modifications the user may perform on the image, including inserting an animation in the image, inserting a sub-image within the image, inserting a sound in the image, etc.), and generate a second image file based on the set of changes (fig. 5 & par [0038], which disclose publishing the modified, updated image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Felt et al with the teachings of Prasad et al and Harrington in order to improve image modification in collaborative environments by permitting authenticated users to add/superimpose layers, thereby ensuring that collaborating users may only modify images to the extent and level to which they are authorized (as disclosed in par [0011] of Felt et al).
Regarding claim 34, Prasad et al and Harrington teach the limitations of claim 33.
Prasad et al further teaches wherein the first image file indicates a check with at least one of a payee (fig. 7 & col. 11, lines 31-35), an amount of payment (col. 11, lines 60-65), a date, a check number, a routing number (col. 11, lines 60-65), or a bank account number (col. 11, lines 60-65).
Regarding claim 36, Prasad et al and Harrington teach the limitations of claim 33.
Prasad et al further teaches wherein the set of changes include information about the device or an operating system running on the device (fig. 6, ‘620, col. 6, lines 1-5, col. 6, lines 55-67, and col. 11, lines 3-8 which disclose performing conversion of check image files based on software operating on the device and mobile device OS).
Prasad et al does not explicitly teach wherein the set of changes include a checksum of information about the image capturing device used to generate the first image file.
Harrington further teaches wherein the set of changes include a checksum of information about the image capturing device used to generate the first image file (par [0094-0095], which discloses detecting alterations corresponding to the updated digital image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Harrington with the embodiment disclosed by Prasad et al according to the motivation disclosed regarding claim 33.
Regarding claim 37, Prasad et al teaches a computer-implemented method, the method comprising:
generating, by an image capturing device of the device, a first image file (fig. 2, ‘207) including a set of pixels (fig. 2, ‘315), wherein the first image file is generated based on one or more parameters indicated by the image authenticity detection configuration (col. 7, lines 2-13, which discloses that the check image file is created according to required image verification-related specifics), wherein the image authenticity detection configuration is determined based on at least one of device data related to the device, user data about a user (col. 7, lines 20-36, which discloses the user’s signature verification on the check to validate the captured image of the check), or environment data about an environment of the user or the device (col. 9, lines 10-28, which discloses the check image verification being based on the background of the captured check image).
Prasad et al does not explicitly teach wherein the set of changes comprises at least one of changes to the set of pixels included in the first image file or changes to one or more other components of the first image file.
Harrington further teaches wherein the set of changes comprises at least one of changes to the set of pixels included in the first image file (par [0091], lines 1-5, which discloses digital image alteration causing isolated pixel changes in a reconstructed image) or changes to one or more other components of the first image file (par [0091], lines 3-11, which discloses applying filters, such as a noise-removing filter and a median filter, to a digital image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Harrington with the embodiment disclosed by Prasad et al because implementing the hidden embedded digital image within an original image (as disclosed in par [0006-0008] of Harrington) within Prasad et al would enable Prasad et al to automatically reconstruct an original digital image after an unauthorized set of changes, a great improvement over embodiments that only detect unauthorized image alterations instead of taking corrective action.
Prasad et al and Harrington do not explicitly teach the processor configured to receive the image authenticity detection configuration from a server, wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file, and generate a second image file based on the set of changes.
However, Felt et al teaches the processor configured to receive the image authenticity detection configuration from a server (par [0013], lines 5-15, which discloses an image storage server providing information to a user device that the user is authorized to modify an image), wherein the image authenticity detection configuration further indicates a set of changes that the device is authorized to make to the first image file (par [0013], lines 7-17, which discloses the information specifying what modifications the user may perform on the image, including inserting an animation in the image, inserting a sub-image within the image, inserting a sound in the image, etc.), and generate a second image file based on the set of changes (fig. 5 & par [0038], which disclose publishing the modified, updated image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Felt et al with the teachings of Prasad et al and Harrington in order to improve image modification in collaborative environments by permitting authenticated users to add/superimpose layers, thereby ensuring that collaborating users may only modify images to the extent and level to which they are authorized (as disclosed in par [0011] of Felt et al).
Regarding claim 38, Prasad et al and Harrington teach the limitations of claim 37.
Prasad et al further teaches wherein the first image file indicates a check with at least one of a payee (fig. 7 & col. 11, lines 31-35), an amount of payment (col. 11, lines 60-65), a date, a check number, a routing number (col. 11, lines 60-65), or a bank account number (col. 11, lines 60-65).
Regarding claim 40, Prasad et al and Harrington teach the limitations of claim 37.
Prasad et al further teaches wherein the set of changes include information about the device or an operating system running on the device (fig. 6, ‘620, col. 6, lines 1-5, col. 6, lines 55-67, and col. 11, lines 3-8 which disclose performing conversion of check image files based on software operating on the device and mobile device OS).
Prasad et al does not explicitly teach wherein the set of changes include a checksum of information about the image capturing device used to generate the first image file.
Harrington further teaches wherein the set of changes include a checksum of information about the image capturing device used to generate the first image file (par [0094-0095], which discloses detecting alterations corresponding to the updated digital image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Harrington with the embodiment disclosed by Prasad et al according to the motivation disclosed regarding claim 37.
Regarding claim 41, Harrington and Felt et al do not explicitly teach wherein the device data includes information about a camera installed on the device, resolutions of the camera, a device model number, or device operating system information.
However, Prasad et al teaches wherein the device data includes information about a camera installed on the device, resolutions of the camera, a device model number (fig. 4, ‘456), or device operating system information.
6. Claims 24-25, 35, and 39 are rejected under 35 U.S.C. 103 as being unpatentable over Prasad et al (US 10,380,562) in view of Harrington (US 2004/0086196), further in view of Felt et al (US 2015/0113441), and further in view of Amorium (US 2009/0236412).
Regarding claim 24, Prasad et al teaches wherein the first image file includes a check image (col. 12, lines 5-10, “image file pertaining to the check”).
Prasad et al, Harrington, and Felt et al do not explicitly teach wherein the set of changes include an extra line of pixels along a border of the check image displayed based on the first image file.
Amorium further teaches wherein the set of changes include an extra line of pixels along a border of the check image displayed based on the first image file (par [0028], lines 10-18, which discloses providing a modified arrangement of pixels associated with a check image).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Amorium with the embodiment disclosed by Prasad et al, Harrington, and Felt et al in order to provide improved image authentication by automatically embedding alphanumeric data in addition to the data provided by the user that submitted the image (as disclosed in fig. 4B, ‘130 of Amorium). This feature would allow the image data provided by Prasad et al, Harrington, and Felt et al to be processed with more clarity, since the data automatically appended to each image provides each entity viewing the received image with more in-depth data verifying the account associated with the image.
Regarding claim 25, Prasad et al, Harrington, and Felt et al do not explicitly teach wherein the extra line of pixels along the border of the check image includes a visible text related to the one or more parameters used to generate the first image file.
Amorium further teaches wherein the extra line of pixels along the border of the check image includes a visible text related to the one or more parameters used to generate the first image file (fig. 6-7).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Amorium with the embodiment disclosed by Prasad et al, Harrington, and Felt et al according to the motivation disclosed regarding claim 24.
Regarding claim 35, Prasad et al teaches wherein the first image file includes a check image (col. 12, lines 5-10, “image file pertaining to the check”).
Prasad et al, Harrington, and Felt et al do not explicitly teach wherein the set of changes include an extra line of pixels along a border of the check image displayed based on the first image file and wherein the extra line of pixels along the border of the check image includes a visible text related to the one or more parameters used to generate the first image file.
Amorium further teaches wherein the set of changes include an extra line of pixels along a border of the check image displayed based on the first image file (par [0028], lines 10-18, which discloses providing a modified arrangement of pixels associated with a check image) and wherein the extra line of pixels along the border of the check image includes a visible text related to the one or more parameters used to generate the first image file (fig. 6-7).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Amorium with the embodiment disclosed by Prasad et al, Harrington, and Felt et al in order to provide improved image authentication by automatically embedding alphanumeric data in addition to the data provided by the user that submitted the image (as disclosed in fig. 4B, ‘130 of Amorium). This feature would allow the image data provided by Prasad et al, Harrington, and Felt et al to be processed with more clarity, since the data automatically appended to each image provides each entity viewing the received image with more in-depth data verifying the account associated with the image.
Regarding claim 39, Prasad et al teaches wherein the first image file includes a check image (col. 12, lines 5-10, “image file pertaining to the check”).
Prasad et al, Harrington, and Felt et al do not explicitly teach wherein the set of changes include an extra line of pixels along a border of the check image displayed based on the first image file and wherein the extra line of pixels along the border of the check image includes a visible text related to the one or more parameters used to generate the first image file.
Amorium further teaches wherein the set of changes include an extra line of pixels along a border of the check image displayed based on the first image file (par [0028], lines 10-18, which discloses providing a modified arrangement of pixels associated with a check image) and wherein the extra line of pixels along the border of the check image includes a visible text related to the one or more parameters used to generate the first image file (fig. 6-7).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Amorium with the embodiment disclosed by Prasad et al, Harrington, and Felt et al in order to provide improved image authentication by automatically embedding alphanumeric data in addition to the data provided by the user that submitted the image (as disclosed in fig. 4B, ‘130 of Amorium). This feature would allow the image data provided by Prasad et al, Harrington, and Felt et al to be processed with more clarity, since the data automatically appended to each image provides each entity viewing the received image with more in-depth data verifying the account associated with the image.
Conclusion
Applicant's amendment necessitated the new grounds of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Randy A. Scott, whose telephone number is (571) 272-3797. The examiner can normally be reached Monday-Thursday, 7:30 am-5:00 pm, and second Fridays, 7:30 am-4:00 pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Luu Pham can be reached on (571) 270-5002. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RANDY A SCOTT/
Primary Examiner, Art Unit 2439
20260305