Prosecution Insights
Last updated: April 19, 2026
Application No. 18/428,291

System and method for routing images of interaction request documents of different types

Final Rejection — §102, §103
Filed
Jan 31, 2024
Examiner
SHIN, SOO JUNG
Art Unit
2667
Tech Center
2600 — Communications
Assignee
BANK OF AMERICA CORPORATION
OA Round
2 (Final)
Grant Probability: 87% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 4m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 87% — above average (527 granted / 604 resolved; +25.3% vs TC avg)
Interview Lift: +16.0% for resolved cases with an interview
Typical Timeline: 2y 4m average prosecution; 28 applications currently pending
Career History: 632 total applications across all art units
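The headline figures in this panel reduce to simple ratios over the examiner's career counts. A minimal sketch in Python (the implied Tech Center average is back-derived from the "+25.3% vs TC avg" delta; it is not stated directly on the page):

```python
# Career allow rate from the raw counts shown in this panel.
granted = 527
resolved = 604

allow_rate = granted / resolved * 100  # percent
print(f"Career allow rate: {allow_rate:.1f}%")  # 87.3%, displayed as 87%

# The "+25.3% vs TC avg" delta implies the Tech Center baseline:
delta_vs_tc = 25.3
tc_avg = allow_rate - delta_vs_tc
print(f"Implied TC average: {tc_avg:.1f}%")  # 62.0%
```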

Statute-Specific Performance

§101: 7.6% (-32.4% vs TC avg)
§103: 37.5% (-2.5% vs TC avg)
§102: 19.9% (-20.1% vs TC avg)
§112: 24.2% (-15.8% vs TC avg)
Tech Center average estimated from career data across 604 resolved cases.
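A quick consistency check on the statute-specific deltas: each one back-solves to the same Tech Center baseline, suggesting the "vs TC avg" comparison uses a single flat estimate rather than per-statute averages (a Python sketch; the ~40% figure is inferred from the numbers above, not stated on the page):

```python
# (examiner rate %, delta vs TC avg %) for each statute, as listed above.
stats = {
    "§101": (7.6, -32.4),
    "§103": (37.5, -2.5),
    "§102": (19.9, -20.1),
    "§112": (24.2, -15.8),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # recover the baseline the delta was measured against
    print(f"{statute}: examiner {rate}% vs TC avg {tc_avg:.1f}%")
# Every row recovers the same ~40.0% baseline.
```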

Office Action

Rejection grounds: §102, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

Response to Amendment

The amendment filed on 27 February 2026 has been entered. The amendment of claims 1, 2, 6-10, 14-16, and 20 has been acknowledged.

Response to Arguments

Applicant's arguments filed on 27 February 2026, with respect to the pending claims, have been fully considered but they are not persuasive. Applicant’s Representative submits that the amended claims do not teach sending the first front and back images to the first and second image processing systems simultaneously. Applicant’s Representative further submits that the prior art does not teach receiving a first notification indicating that the second converted front and back images of the first front and back images are not generated successfully. The examiner respectfully disagrees. The prior art of record teaches sending the first front and back images to the first and second image processing systems simultaneously as indicated in the previous Office action. Please see Strange ¶¶0076: “The mobile device 102 is connected with an image processing unit 104 over a network so that the mobile device 102 can transmit captured images or image data to the image processing unit 104”; Strange Figs. 
8-11: the captured images are cropped and transformed to generate a rectangular image; Strange ¶¶0138-¶¶0139: “A geometrical transformation of the document subimage can be performed using the perspective transformation built in step 1210 (step 1215). The geometrical transformation corrects the perspective distortion present in the document subimage … A ‘dewarping’ operation can also be performed on the document subimage (step 1220)”; Strange ¶¶0210: “operations that are either performed sequentially or concurrently” (emphasis added). The cited operations are directed to two different instances of image processing and are performed either at the same time or sequentially, and the images fed into the operations of Fig. 21 are already cropped/transformed as described in Strange Figs. 8-11 & ¶¶0138-¶¶0139. The claims do not specify what the first and second image processing are except that they are both directed to an interaction type and, therefore, any image processing that interacts with the images is considered to satisfy this limitation. In addition, the prior art teaches receiving a first notification indicating that the second converted front and back images of the first front and back images are not generated successfully. Please see Strange ¶¶0243: “a test can be flagged as ‘affects overall status.’ These tests are also referred to here as ‘critical’ tests. If a mobile image fails a critical test, the MDIPE 2100 rejects the image and can provide detailed information to the mobile device user explaining why the image was not of a high enough quality for the mobile application and that provides guidance for retaking the image to correct the defects that caused the mobile document image to fail the test, in the event that the defect can be corrected by retaking the image” (emphasis added). An image that does not meet the quality requirement is an image that is not successfully generated.

Claim Rejections - 35 USC § 102

Claim(s) 1-4, 6-12, 14-18, and 20 is/are rejected under 35 U.S.C. 
102(a)(1) and 102(a)(2) as being anticipated by Strange et al. (US 2013/0297353 A1), hereinafter referred to as Strange. Regarding claim 1, Strange teaches a system comprising: an interceptor system communicatively coupled to an interaction processing system, a first image processing system, and a second image processing system (Strange Abstract: “An application on a mobile device provides for the initiation and submission of an insurance claim by capturing information and images of documents”; Strange Fig. 2: Complete and Verify Claim Request 214 – the verification process occurs, i.e., the claim request is intercepted, before the claim is approved), wherein: the first image processing system is configured to process images of interaction request documents of a first type (Strange Abstract discussed above; Strange Fig. 4 & ¶¶0074: “by capturing images of documents and other information needed for the insurance claim using the image capture capability, then processing the images to extract content which is transmitted to an insurance company for processing of the claim”); the second image processing system is configured to process images of interaction request documents of a second type (Strange Abstract, Fig. 4, & ¶¶0074 discussed above; Strange ¶¶0009: “location-based services”; Strange ¶¶0083: “The vehicle information may be obtained from several different documents and locations which can be captured by an image capture device on the mobile device, including an insurance card, a vehicle registration card or a vehicle identification number (VIN) on the vehicle itself”; Strange ¶¶0086: “any location-based services on the mobile device may be utilized to aid in preparing and submitting the claim information”); and the interceptor system comprises a processor configured to: receive a first front side image and a first back side image of a first interaction request document from a user device of a user (Strange Fig. 
17 & ¶¶0252: “a test can be performed that tests whether images of a front and back of a check are actually images of the same document can be performed. The test engine can receive both an image of the front of the check and an image of the back of the check from the preprocessing unit 2110 and use both of these images when executing the test”), wherein: the first interaction request document corresponds to a first interaction to be performed between the user and a first interaction requestor (Strange Abstract & ¶¶0074 discussed above – the interaction is between the user and insurance company); and the first interaction request document is of the first type (Strange ¶¶0009 & ¶¶0083 discussed above – the interaction is location-based in which different locations and documents are obtained/processed; also see Strange Fig. 54); send the first front side image and the first back side image to the first image processing system for processing (Strange ¶¶0076: “The mobile device 102 is connected with an image processing unit 104 over a network so that the mobile device 102 can transmit captured images or image data to the image processing unit 104”; Strange Figs. 8-11: the captured images are cropped and transformed to generate a rectangular image; Strange ¶¶0138-¶¶0139: “A geometrical transformation of the document subimage can be performed using the perspective transformation built in step 1210 (step 1215). The geometrical transformation corrects the perspective distortion present in the document subimage … A ‘dewarping’ operation can also be performed on the document subimage (step 1220)”); send the first front side image and the first back side image to the second image processing system for processing, wherein the first front side image and the first back side image are sent to the second image processing system simultaneously, with the sending of the first front side image and the first back side image to the first image processing system (Strange Figs. 
8-11 & ¶¶0076, ¶¶0138-¶¶0139 discussed above already teaches transmitting and processing the document images to generate transformed front and back side images; Strange ¶¶0210: “operations that are either performed sequentially or concurrently” – two different processes are performed either sequentially or simultaneously); receive a first converted front side image and a first converted back side image from the first image processing system (Strange Figs. 8-11 & ¶¶0138-¶¶0139 discussed above); receive a first notification from the second image processing system, wherein the first notification indicates that a second converted front side image of the first front side image and a second converted back side image of the first back side image are not generated successfully (Strange ¶¶0243: “a test can be flagged as ‘affects overall status.’ These tests are also referred to here as ‘critical’ tests. If a mobile image fails a critical test, the MDIPE 2100 rejects the image and can provide detailed information to the mobile device user explaining why the image was not of a high enough quality for the mobile application and that provides guidance for retaking the image to correct the defects that caused the mobile document image to fail the test, in the event that the defect can be corrected by retaking the image”); and send the first converted front side image and the first converted back side image to the interaction processing system (Strange Abstract discussed above; Strange Fig. 23-24 & 27). Regarding claim 2, Strange teaches the system of claim 1, wherein the processor is further configured to: receive a second front side image and a second back side image of a second interaction request document from the user device of the user (Strange Fig. 17 & ¶¶0252 discussed above), wherein: the second interaction request document corresponds to a second interaction to be performed between the user and a second interaction requestor (Strange Abstract, Fig. 
4, & ¶¶0009, ¶¶0074, ¶¶0083 discussed above); and the second interaction request document is of the second type (Strange Abstract, Figs. 4, 54, & ¶¶0009, ¶¶0074, ¶¶0083 discussed above; also see Strange ¶¶0091: “time and date may be pre-populated with the current time and date stored in the mobile device, and the location information fields may be prepopulated with the current location of the mobile device, as detected by global positioning satellite (GPS) sensors or cellular or wireless location-based sensors in the mobile device”); send the second front side image and the second back side image to the first image processing system and the second image processing system for processing (Strange Figs. 8-11, & ¶¶0138-¶¶0139 discussed above); receive a third converted front side image and a third converted back side image from the second image processing system (Strange Figs. 8-11 & ¶¶0138-¶¶0139 discussed above); receive a second notification from the first image processing system, wherein the second notification indicates that processing of the second front side image and the second back side image failed (Strange ¶¶0243 discussed above); and send the third converted front side image and the third converted back side image to the interaction processing system (Strange Abstract, Figs. 23-24 & 27 discussed above). Regarding claim 3, Strange teaches the system of claim 1, wherein the processor is further configured to: receive a second front side image and a second back side image of a second interaction request document from the user device of the user (Strange Fig. 17 & ¶¶0252 discussed above), wherein: the second interaction request document corresponds to a second interaction to be performed between the user and a second interaction requestor (Strange Abstract, Fig. 4, & ¶¶0009, ¶¶0074, ¶¶0083 discussed above); and the second interaction request document is of the first type (Strange Abstract, Figs. 
4, 54, & ¶¶0009, ¶¶0074, ¶¶0083, ¶¶0091 discussed above); send the second front side image and the second back side image to the first image processing system and the second image processing system for processing (Strange Figs. 8-11, & ¶¶0138-¶¶0139 discussed above); receive a second notification from the first image processing system, wherein the second notification indicates that processing of the second front side image and the second back side image failed (Strange ¶¶0243 discussed above); receive a third notification from the second image processing system, wherein the third notification indicates that processing of the second front side image and the second back side image failed (Strange ¶¶0243 discussed above); send the second notification or the third notification to the interaction processing system (Strange Abstract, Figs. 23-24 & 27 discussed above). Regarding claim 4, Strange teaches the system of claim 1, wherein the processor is further configured to: receive a second front side image and a second back side image of a second interaction request document from the user device of the user (Strange Fig. 17 & ¶¶0252 discussed above), wherein: the second interaction request document corresponds to a second interaction to be performed between the user and a second interaction requestor (Strange Abstract, Fig. 4, & ¶¶0009, ¶¶0074, ¶¶0083 discussed above); and the second interaction request document is of the second type (Strange Abstract, Figs. 4, 54, & ¶¶0009, ¶¶0074, ¶¶0083, ¶¶0091 discussed above); send the second front side image and the second back side image to the first image processing system and the second image processing system for processing (Strange Figs. 
8-11, & ¶¶0138-¶¶0139 discussed above); receive a second notification from the first image processing system, wherein the second notification indicates that processing of the second front side image and the second back side image failed (Strange ¶¶0243 discussed above); receive a third notification from the second image processing system, wherein the third notification indicates that processing of the second front side image and the second back side image failed (Strange ¶¶0243 discussed above); and send the second notification or the third notification to the interaction processing system (Strange Abstract, Figs. 23-24 & 27 discussed above). Regarding claim 6, Strange teaches the system of claim 1, wherein the first type is a first geographical region (Strange ¶¶0009 discussed above that the system performs location-based services). Regarding claim 7, Strange teaches the system of claim 6, wherein the second type is a second geographical region different from the first geographical region (Strange ¶¶0009, ¶¶0086, & ¶¶0091 discussed above teach that the system performs the method based on the location). Claim 8 recites the same limitations as claim 1 corresponding to a method. Strange teaches that the system performs the method (Strange ¶¶0009: “Systems and methods are provided”). Therefore, claim 8 is rejected using the same rationale as applied to claim 1 discussed above. Claim 9 is rejected using the same rationale as applied to claim 2 discussed above. Claim 10 is rejected using the same rationale as applied to claim 7 discussed above. Claim 11 is rejected using the same rationale as applied to claim 3 discussed above. Claim 12 is rejected using the same rationale as applied to claim 4 discussed above. Claim 14 is rejected using the same rationale as applied to claim 6 discussed above. Claim 15 recites the same limitations as claim 1 corresponding to a non-transitory computer-readable storage medium. 
Strange teaches a non-transitory computer-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform the method described in claim 1 (Strange ¶¶0385: “Mobile device 4200 includes a processor 4410. The processor 4410 can be a microprocessor or the like that is configurable to execute program instructions stored in the memory 4420 and/or the data storage 4440. The memory 4420 is a computer-readable memory that can be used to store data and or computer program instructions that can be executed by the processor 4410”). Therefore, claim 15 is rejected using the same rationale as applied to claim 1 discussed above. Claim 16 is rejected using the same rationale as applied to claim 2 discussed above. Claim 17 is rejected using the same rationale as applied to claim 3 discussed above. Claim 18 is rejected using the same rationale as applied to claim 4 discussed above. Claim 20 is rejected using the same rationale as applied to claim 6 discussed above.

Claim Rejections - 35 USC § 103

Claim(s) 5, 13, and 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Strange et al. (US 2013/0297353 A1), in view of Popadic et al. (US 2007/0156438 A1), hereinafter referred to as Strange and Popadic, respectively. Regarding claim 5, Strange teaches the system of claim 1, wherein the images may have different image formats (Strange ¶¶0078: “Images taken using, for example, a mobile device's camera, can be 24 bit per pixel (24 bit/pixel) JPG images. However, that many other types of images might also be taken using different cameras, mobile devices, etc.”). However, Strange does not appear to explicitly teach that the image and the converted image have different image formats. 
Pertaining to the same field of endeavor, Popadic teaches that the image and the converted image have different image formats (Popadic ¶¶0020: “It is useful to note when banks produce an account statement 255 (in Account Statement Prep 245) for their customers, they may use a different format than that used for image clearing with other bank”; Popadic ¶¶0090: “Existing banks remote check image capture systems are designed to work with specific check image capture devices. These systems have either been developed by individual banks, or use commercially available hardware and software. Their data/image formats are different from those used for faxes”; Popadic Fig. 3). Strange and Popadic are considered to be analogous art because they are directed to document image processing. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the mobile document processing method and system (as taught by Strange) to use different image formats (as taught by Popadic) because the combination allows the system to interface with commercially available hardware/software that already exist (Popadic ¶¶0090). Claim 13 is rejected using the same rationale as applied to claim 5 discussed above. Claim 19 is rejected using the same rationale as applied to claim 5 discussed above.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SOO J SHIN whose telephone number is (571)272-9753. The examiner can normally be reached M-F, 10-6.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella, can be reached at (571)272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). 
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Soo Shin/Primary Examiner, Art Unit 2667 571-272-9753 soo.shin@uspto.gov

Prosecution Timeline

Jan 31, 2024
Application Filed
Dec 04, 2025
Non-Final Rejection — §102, §103
Feb 27, 2026
Response Filed
Mar 11, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602768
SURFACE DEFECT DETECTION MODEL TRAINING METHOD, AND SURFACE DEFECT DETECTION METHOD AND SYSTEM
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12586411
TARGET IDENTIFICATION DEVICE, ELECTRONIC DEVICE, TARGET IDENTIFICATION METHOD, AND STORAGE MEDIUM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12586204
Detecting Optical Discrepancies In Captured Images
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12586216
METHOD OF DETERMINING A MOTION OF A HEART WALL
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12573021
ULTRASONIC DEFECT DETECTION AND CLASSIFICATION SYSTEM USING MACHINE LEARNING
Granted Mar 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 87%
With Interview: 99% (+16.0%)
Median Time to Grant: 2y 4m
PTA Risk: Moderate
Based on 604 resolved cases by this examiner. Grant probability derived from career allow rate.
