Prosecution Insights
Last updated: April 19, 2026
Application No. 18/395,262

METHOD OF NATIVELY PERFORMING PATTERN MATCHING/LOCATE OBJECT JOB SETUP ON IMAGING BASED DATA CAPTURE DEVICE USING AIMER

Final Rejection §102
Filed: Dec 22, 2023
Examiner: YE, LIN
Art Unit: 2638
Tech Center: 2600 — Communications
Assignee: Zebra Technologies Corporation
OA Round: 2 (Final)
Grant Probability: 26% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 2y 5m
With Interview: 40%

Examiner Intelligence

This examiner grants only 26% of cases.

Career Allow Rate: 26% (18 granted / 68 resolved; -35.5% vs TC avg)
Interview Lift: +14.0% (moderate lift, measured across resolved cases with interview)
Avg Prosecution: 2y 5m (typical timeline; 8 currently pending)
Total Applications: 76 (career history, across all art units)
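The figures above follow from simple arithmetic on the raw counts reported on this page (18 granted of 68 resolved, -35.5% versus the Tech Center average). A minimal sketch, assuming the allow rate is just granted over resolved; the helper name `allow_rate` is illustrative, not from any real API:

```python
# Derive the "Career Allow Rate" card from raw counts (18 granted / 68 resolved).
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

rate = allow_rate(18, 68)
print(f"Career allow rate: {rate:.0f}%")  # rounds to the 26% shown above

# The page reports -35.5% vs TC avg, which implies a Tech Center average of:
tc_avg = rate + 35.5
print(f"Implied Tech Center average: {tc_avg:.1f}%")
```

Note that 18/68 is 26.5% before rounding, so the implied Tech Center average is roughly 62%.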

Statute-Specific Performance

§101: 4.3% (-35.7% vs TC avg)
§103: 51.1% (+11.1% vs TC avg)
§102: 34.2% (-5.8% vs TC avg)
§112: 6.5% (-33.5% vs TC avg)

Deltas are measured against a Tech Center average estimate; based on career data from 68 resolved cases.

Office Action

§102
Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

2. Applicant's arguments filed on 12/01/2025 have been fully considered but they are not persuasive.

3. The Applicant submits the following arguments: Regarding claims 1-26, in the remarks on pages 1-3, the Applicant stated that Rhoads fails to teach identifying a model region based on the position of an aiming pattern, and that the Examiner's characterization "find a barcode region within the field of view" is a mischaracterization of the claimed invention.

4. In response to the arguments, the Examiner respectfully disagrees with the Applicant for the reasons set forth below: The Applicant's claims are broad, and the Applicant did not clearly define what exactly the model region and the aiming pattern are in the claims. Applicant's specification [0035] also discloses that an example data capture device 180 includes an aiming assembly, may be referred to as a presentation scanner or barcode (indicia) reader, and includes an imaging-based data capture assembly. It is able to generate an aiming pattern in a FOV of the capture device 180. The barcode or look-up product information can reasonably be considered the claimed "aiming pattern," and the field of view (FOV) can reasonably be considered the claimed "model region." Therefore, Rhoads teaches "identifying a model region based on the position of an aiming pattern" (see Rhoads, page 31, lines 12-13).

Claim Rejections - 35 U.S.C. § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless -

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 13-26 and 1-12 are rejected under 35 U.S.C. 102(a)(1)/(a)(2) as being anticipated by Rhoads (CA 2792336).

In regard to claim 13, note Rhoads discloses a data capture device (smart phones and other mobile devices with a barcode reader application, see page 32, lines 16-21) comprising: an imaging-based data capture assembly (phone's camera, see page 1, lines 36-37) configured to capture image data over a field of view; one or more processors (cell phone processor) connected to the imaging assembly; and one or more memories (RAM) storing instructions thereon that, when executed by the one or more processors, are configured to cause the one or more processors to (see Figure 25): responsive to a triggering event at the imaging-based data capture device, enter a job setup mode on the imaging-based data capture device; generate, by the imaging-based data capture assembly, an aiming pattern (e.g., the subject is a barcode or look up product information), and capture image data of a field of view of the imaging-based data capture assembly where the image data includes the aiming pattern; determine, from the image data, a position of the aiming pattern within the image data; identify, at the data capture device, a model region of the image data, the model region being identified, at least partially, based on the position of the aiming pattern (find a barcode region within the field of view, see page 31, lines 12-13); generate, at the data capture device, a model data corresponding to the model region, the model data being a subset of the image data (extract the region of interest in the image frame); and store the model data for access by a pattern matching process executable at the data capture device during a job deployment mode on the imaging-based data capture device (barcode regions of imagery may be retained, see page 32, lines 25-33; page 62, lines 13-19; and page 77, lines 37-40).

In regard to claim 14, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: responsive to a subsequent triggering event at the data capture device, exit the job setup mode (image capture mode) and enter the job deployment mode (extract an image from a sequence range) of the imaging-based data capture device (see page 62, lines 13-19 and page 74, lines 1-30).

In regard to claim 15, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: in the job deployment mode, capture subsequent image data (capturing many frames) over the field of view, send the subsequent image data to the pattern matching process, and execute the pattern matching process to determine a match between the model data and the subsequent image data (see page 32, lines 25-33; page 62, lines 13-19; and page 21, lines 15-32).

In regard to claim 16, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: in the job deployment mode, determine a position of the aiming pattern in the subsequent image data and send the position of the aiming pattern in the subsequent image data to the pattern matching process (see page 22, lines 7-11).
In regard to claim 17, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: in the job deployment mode, perform image processing on the subsequent image data prior to sending the subsequent image data to the pattern matching process, the image processing comprising a feature detection or an optical character recognition (see page 68, lines 7-9 and page 22, lines 7-11).

In regard to claim 18, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: center the model region on the position of the aiming pattern (move the camera to re-position a particular subject in the center of the image frame, see page 3, lines 15-25).

In regard to claim 19, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: capture at least one subsequent image data including the aiming pattern; for each of the at least one subsequent image data, determine a position of the aiming pattern within the subsequent image data, forming a plurality of positions of the aiming pattern; and identify the model region of the image data based on the plurality of positions of the aiming pattern (see page 22, lines 7-11).

In regard to claim 20, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: generate the model data corresponding to the model region by performing an image processing on at least a portion of the image data, the image processing comprising a feature detection or an optical character recognition (see page 68, lines 7-9; page 22, lines 7-11; and page 32, lines 25-33).
In regard to claim 21, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: determine the position of the aiming pattern relative to an imaging sensor of the imaging assembly (camera) (see page 3, lines 15-25 and pages 28-31).

In regard to claim 22, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: identify a feature in the image data and determine the position of the aiming data relative to the feature (the user's interest in that feature may be recognized, see page 7, lines 4-18).

In regard to claim 23, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: generate, at the data capture device, at least one additional model data corresponding to the model region, the at least one additional model data having a different shape and/or position from the model data; and store the model data and the at least one additional model data in a ranked manner for access by the pattern matching process (see page 15, lines 1-23 and page 52, lines 20-24).

In regard to claim 24, note Rhoads discloses wherein the instructions, when executed by the one or more processors, further cause the one or more processors to: utilize, based on the ranked manner of the model data (images' metadata) and the at least one additional model data, a desired model data for use during the job deployment mode of the imaging-based data capture device (see page 52, lines 20-24).

In regard to claim 25, note Rhoads discloses wherein the imaging assembly is a barcode reader (smart phones and other mobile devices with a barcode reader application, see page 32, lines 16-21).
In regard to claim 26, note Rhoads discloses the limitation (the camera has machine vision functions, see page 49, lines 7-17 and page 8, lines 7-13).

In regard to claim 1, this is a method claim corresponding to the apparatus of claim 13. Therefore, claim 1 has been analyzed and rejected as previously discussed with respect to claim 13.

In regard to claim 2, this is a method claim corresponding to the apparatus of claim 14. Therefore, claim 2 has been analyzed and rejected as previously discussed with respect to claim 14.

In regard to claim 3, this is a method claim corresponding to the apparatus of claim 15. Therefore, claim 3 has been analyzed and rejected as previously discussed with respect to claim 15.

In regard to claim 4, this is a method claim corresponding to the apparatus of claim 16. Therefore, claim 4 has been analyzed and rejected as previously discussed with respect to claim 16.

In regard to claim 5, this is a method claim corresponding to the apparatus of claim 17. Therefore, claim 5 has been analyzed and rejected as previously discussed with respect to claim 17.

In regard to claim 6, this is a method claim corresponding to the apparatus of claim 18. Therefore, claim 6 has been analyzed and rejected as previously discussed with respect to claim 18.

In regard to claim 7, note Rhoads discloses wherein the model region has a geometric shape (see page 51, lines 21-28).

In regard to claim 8, this is a method claim corresponding to the apparatus of claim 19. Therefore, claim 8 has been analyzed and rejected as previously discussed with respect to claim 19.

In regard to claim 9, this is a method claim corresponding to the apparatus of claim 20. Therefore, claim 9 has been analyzed and rejected as previously discussed with respect to claim 20.

In regard to claim 10, this is a method claim corresponding to the apparatus of claim 21. Therefore, claim 10 has been analyzed and rejected as previously discussed with respect to claim 21.
In regard to claim 11, this is a method claim corresponding to the apparatus of claim 22. Therefore, claim 11 has been analyzed and rejected as previously discussed with respect to claim 22.

In regard to claim 12, this is a method claim corresponding to the apparatus of claim 23. Therefore, claim 12 has been analyzed and rejected as previously discussed with respect to claim 23.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Lin Ye, whose telephone number is (571) 272-7372. The examiner can normally be reached M-F, 9:00-5:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users.
To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/LIN YE/
Supervisory Patent Examiner, Art Unit 2638
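The reply-deadline rules in the Conclusion reduce to simple date arithmetic: a three-month shortened statutory period from the mailing date, extendable with fees under 37 CFR 1.136(a), but never beyond six months. A minimal sketch of that calculation, using the final action's mailing date from the timeline (Jan 28, 2026); the `add_months` helper is illustrative, not from any USPTO tool:

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    """Shift a date forward by whole months; clamp the day for short months."""
    years, month_index = divmod(d.month - 1 + months, 12)
    try:
        return d.replace(year=d.year + years, month=month_index + 1)
    except ValueError:
        # e.g. Jan 31 + 3 months has no Apr 31; clamp to a safe day.
        return d.replace(year=d.year + years, month=month_index + 1, day=28)

mailed = date(2026, 1, 28)              # final rejection mailing date (from timeline)
shortened = add_months(mailed, 3)       # reply due without extension fees
statutory_max = add_months(mailed, 6)   # absolute deadline; no extension past this

print(shortened)       # 2026-04-28
print(statutory_max)   # 2026-07-28
```

The advisory-action interplay described above can only move the fee calculation date, never the six-month statutory maximum.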

Prosecution Timeline

Dec 22, 2023: Application Filed
May 27, 2025: Non-Final Rejection (§102)
Dec 01, 2025: Response Filed
Jan 28, 2026: Final Rejection (§102, current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12581187: IMAGE PICKUP APPARATUS CONTROLLING AN OPERATING MODE, ITS CONTROL METHOD, AND STORAGE MEDIUM (granted Mar 17, 2026; 2y 5m to grant)
Patent 12513383: ELECTRONIC DEVICE (granted Dec 30, 2025; 2y 5m to grant)
Patent 12457406: MOBILE SINGLE LEAF SCANNER (granted Oct 28, 2025; 2y 5m to grant)
Patent 12407938: ADAPTIVE SYNCHRONIZATION FOR AUTOMATIC EXPOSURE CONTROL (AEC) (granted Sep 02, 2025; 2y 5m to grant)
Patent 12389106: IMAGING APPARATUS, CONTROL METHOD FOR THE SAME, AND STORAGE MEDIUM (granted Aug 12, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 26%
With Interview: 40% (+14.0%)
Median Time to Grant: 2y 5m
PTA Risk: Moderate
Based on 68 resolved cases by this examiner. Grant probability derived from career allow rate.
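The "With Interview" projection appears to be the base grant probability plus the reported interview lift. A minimal sketch of that arithmetic, assuming a simple additive model (the variable names are illustrative):

```python
# Projection arithmetic for this examiner, assuming an additive interview lift.
base_grant_prob = 0.26   # career allow rate (18 granted / 68 resolved)
interview_lift = 0.14    # reported lift for resolved cases with an interview

with_interview = base_grant_prob + interview_lift
print(f"Grant probability with interview: {with_interview:.0%}")  # 40%
```

An additive lift is the simplest reading of "+14.0%"; a multiplicative lift (26% × 1.14 ≈ 30%) would not reproduce the 40% figure shown.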
