DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continuation Data
2. This application claims priority to provisional applications 63/745,700 and 63/558,261, filed January 15, 2025, and February 27, 2024, respectively.
Information Disclosure Statement
3. The Information Disclosure Statement filed on December 10, 2025 has been considered. An initialed copy of the Form 1449 is enclosed herewith.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
4. Claims 1, 3, 5, 10, 11, 16, 18, and 19 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Wesner et al. (US 2023/0281887), hereinafter Wesner.
The applied reference has a common assignee with the instant application. Based upon the earlier effectively filed date of the reference, it constitutes prior art under 35 U.S.C. 102(a)(2). This rejection under 35 U.S.C. 102(a)(2) might be overcome by: (1) a showing under 37 CFR 1.130(a) that the subject matter disclosed in the reference was obtained directly or indirectly from the inventor or a joint inventor of this application and is thus not prior art in accordance with 35 U.S.C. 102(b)(2)(A); (2) a showing under 37 CFR 1.130(b) of a prior public disclosure under 35 U.S.C. 102(b)(2)(B) if the same invention is not being claimed; or (3) a statement pursuant to 35 U.S.C. 102(b)(2)(C) establishing that, not later than the effective filing date of the claimed invention, the subject matter disclosed in the reference and the claimed invention were either owned by the same person or subject to an obligation of assignment to the same person or subject to a joint research agreement.
With respect to claims 1, 5, and 18, Wesner teaches in paragraph 0031, and illustrates in Figure 1, a system comprising a camera (image 112 is captured by a camera); a display 110; one or more processors; and one or more computer readable storage media storing instructions that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: capturing, using the camera (image 112 is captured by a camera), a plurality of images of a real scene (real world scene) comprising a plurality of optical patterns 114, wherein the plurality of optical patterns comprises a first optical pattern and a second optical pattern. Paragraphs 0182-0183 and 0185 teach calculating a similarity score for each of the plurality of images and comparing similarity scores for each of the plurality of images to a threshold value (threshold criterion). Paragraphs 0186-0187 disclose selecting a first image, from the plurality of images, based on the first image having a similarity score that meets or exceeds the threshold value; detecting the first optical pattern and the second optical pattern in the first image; calculating an optical-pattern score for the first optical pattern in the first image; decoding the first optical pattern, after detecting the plurality of optical patterns in the first image, based on the optical-pattern score and without decoding the second optical pattern, so that the first optical pattern is the only pattern of the plurality of optical patterns decoded in the first image; and presenting on the display a second image and a graphical overlay (paragraphs 0043 and 0116) on the second image to indicate the first optical pattern is intended by a user of the system to be decoded (see paragraphs 0110 and 0191-0194).
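For illustration only, the image-selection and single-pattern decoding flow mapped above can be sketched as follows. All names, data shapes, and scoring choices here are hypothetical examples for the reader's convenience; they are not taken from Wesner, the claims, or the application as filed.

```python
# Hypothetical sketch: pick the first frame whose similarity score meets the
# threshold, then decode only the detected pattern closest to a configurable
# point (one possible optical-pattern score), leaving all others undecoded.

def select_and_decode(frames, threshold, focal_point):
    # Select the first image whose similarity score meets or exceeds the threshold.
    first = next((f for f in frames if f["similarity"] >= threshold), None)
    if first is None or not first["patterns"]:
        return None  # no qualifying frame, or no patterns detected in it

    # Score each detected pattern by its distance from the configurable point.
    def distance(p):
        dx, dy = p["x"] - focal_point[0], p["y"] - focal_point[1]
        return (dx * dx + dy * dy) ** 0.5

    # Decode only the best-scoring (closest) pattern; skip the rest.
    best = min(first["patterns"], key=distance)
    return best["payload"]


frames = [
    {"similarity": 0.4, "patterns": []},
    {"similarity": 0.9, "patterns": [
        {"x": 10, "y": 10, "payload": "A"},
        {"x": 2, "y": 1, "payload": "B"},
    ]},
]
select_and_decode(frames, 0.8, (0, 0))  # decodes "B" (closest to the point)
```

In this sketch the "optical-pattern score" is modeled as Euclidean distance to the configurable point, which is only one of the bases recited (size of the pattern being another).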
With respect to claims 3 and 10, Wesner teaches in paragraph 0036, the system, wherein the optical-pattern score is based on a size of the first optical pattern in the first image or on a distance from a configurable point of the first image.
With respect to claims 11 and 19, Wesner teaches in paragraph 0118, wherein the optical-pattern score is based on a distance from a configurable point of the first image.
With respect to claim 16, Wesner teaches in paragraph 0005, wherein the first optical pattern is in a group of optical patterns that are decoded.
With respect to claim 18, see Wesner's teachings above regarding claims 1 and 5, including the teaching of paragraph 0110.
Allowable Subject Matter
5. Claims 2, 4, 6-9, 12-15, 17 and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form, including all of the limitations of the base claim and any intervening claims.
The following is an examiner's statement of reasons for allowance: Although Wesner teaches smart barcode detection, which includes capturing, using a camera, a plurality of images of a real scene comprising a plurality of optical patterns, the above-identified prior art of record, taken alone or in combination with any other prior art, fails to teach or fairly suggest the specific features of claims 2, 4, 6-9, 12-15, 17, and 20 of the present claimed invention.
Specifically, the prior art fails to teach the claimed system, wherein: the plurality of images is a first plurality of images; the threshold value is a first threshold value; and the operations further comprise: capturing a second plurality of images using the camera, after decoding the first optical pattern in the first image, wherein the second plurality of images comprises the first optical pattern; tracking the first optical pattern in the second plurality of images; presenting the second plurality of images on the display with the graphical overlay, without decoding the first optical pattern in the second plurality of images, based on tracking the first optical pattern in the second plurality of images; calculating similarity scores for each of the second plurality of images; comparing the similarity scores for each of the second plurality of images to a second threshold value; and not decoding the first optical pattern in the second plurality of images based on the similarity scores for each of the second plurality of images meeting or exceeding the second threshold value; or wherein: the threshold value is a first threshold value; and the operations further comprise: identifying a subset of images of the plurality of images having similarity scores that meet or exceed the first threshold value; calculating image scores for each image of the subset of images; comparing the image scores for each image of the subset of images to a second threshold value; and selecting the first image based on an image score of the first image meeting or exceeding the second threshold value.
The prior art also fails to teach the claimed method, wherein: the plurality of images is a first plurality of images; the threshold value is a first threshold value; and the method further comprises: capturing a second plurality of images using the camera, after decoding the first optical pattern in the first image, wherein the second plurality of images comprises the first optical pattern; calculating similarity scores for each of the second plurality of images; comparing the similarity scores for each of the second plurality of images to a second threshold value; and not decoding the first optical pattern in the second plurality of images based on the similarity scores for each of the second plurality of images meeting or exceeding the second threshold value; or wherein: the plurality of images is a first plurality of images; and the method further comprises: capturing a second plurality of images using the camera, after decoding the first optical pattern in the first image, wherein the second plurality of images comprises the first optical pattern; tracking the first optical pattern in the second plurality of images; and presenting the second plurality of images on the display with the graphical overlay, without decoding the first optical pattern in the second plurality of images, based on tracking the first optical pattern in the second plurality of images. The prior art further fails to teach the method comprising: calculating an optical-pattern score for the second optical pattern, based on how far the second optical pattern is from the configurable point of the first image; comparing the optical-pattern score of the first optical pattern to the optical-pattern score of the second optical pattern; ascertaining that the first optical pattern is closer to the configurable point of the first image than the second optical pattern based on comparing the optical-pattern score of the first optical pattern to the optical-pattern score of the second optical pattern; and decoding the first optical pattern and not the second optical pattern in the first image based on the first optical pattern being closer to the configurable point of the first image than the second optical pattern.
The prior art further fails to teach the claimed method wherein: the threshold value is a first threshold value; and the method further comprises: identifying a subset of images of the plurality of images having similarity scores that meet or exceed the first threshold value; calculating image scores for each image of the subset of images; comparing the image scores for each image of the subset of images to a second threshold value; and selecting the first image based on an image score of the first image meeting or exceeding the second threshold value; and wherein: the plurality of optical patterns comprises three or more optical patterns that are detected in the first image; and the first optical pattern is the only pattern of the plurality of optical patterns decoded in the first image.
Lastly, the prior art fails to teach the claimed memory device wherein: the plurality of images is a first plurality of images; the threshold value is a first threshold value; and the operations further comprise: capturing a second plurality of images using the camera, after decoding the first optical pattern in the first image, wherein the second plurality of images comprises the first optical pattern; calculating similarity scores for each of the second plurality of images; comparing the similarity scores for each of the second plurality of images to a second threshold value; and not decoding the first optical pattern in the second plurality of images based on the similarity scores for each of the second plurality of images meeting or exceeding the second threshold value. The above limitations are not disclosed in the prior art and, moreover, one of ordinary skill in the art would not have been motivated to arrive at the claimed invention.
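For illustration only, the track-without-redecoding operation recited in the dependent claims above can be sketched as follows. All names and data shapes are hypothetical examples chosen for readability; they do not appear in the claims or the prior art of record.

```python
# Hypothetical sketch: after an initial decode, follow-up frames whose
# similarity score meets the second threshold and in which the decoded
# pattern is still being tracked are shown with the overlay only, without
# re-decoding; otherwise the pattern is decoded again.

def process_followup_frames(frames, decoded_id, second_threshold):
    actions = []
    for f in frames:
        if f["similarity"] >= second_threshold and decoded_id in f["tracked_ids"]:
            actions.append("overlay_only")  # pattern tracked; skip re-decode
        else:
            actions.append("redecode")      # tracking or similarity lost
    return actions


frames = [
    {"similarity": 0.9, "tracked_ids": {"B"}},   # tracked -> overlay only
    {"similarity": 0.5, "tracked_ids": {"B"}},   # similarity too low -> redecode
    {"similarity": 0.9, "tracked_ids": set()},   # tracking lost -> redecode
]
process_followup_frames(frames, "B", 0.8)
# -> ["overlay_only", "redecode", "redecode"]
```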
Conclusion
6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: See attached PTO Form 892, References Cited.
7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Allyson N. Trail, whose telephone number is (571) 272-2406. The examiner can normally be reached between the hours of 7:30 AM and 4:00 PM, Monday through Friday.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael G. Lee, can be reached on (571) 272-2398. The fax phone number for this Group is (571) 273-8300.
Communications via Internet e-mail regarding this application, other than those under 35 U.S.C. 132 or which otherwise require a signature, may be used by the applicant and should be addressed to allyson.trail@uspto.gov.
Information regarding the status of an application may be obtained from Patent Center. Status information for published applications is available to the public; status information for unpublished applications is available through Patent Center to authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form.
/ALLYSON N TRAIL/Primary Examiner, Art Unit 2876
January 29, 2026