Prosecution Insights
Last updated: April 19, 2026
Application No. 18/691,427

OBJECT LOCALIZATION AND INFORMATION ENCODING SYSTEM

Non-Final OA (§103, §DP)
Filed
Mar 12, 2024
Examiner
CODRINGTON, SHANE WRENSFORD
Art Unit
2667
Tech Center
2600 — Communications
Assignee
Apple Inc.
OA Round
1 (Non-Final)
Grant Probability: 100% — Favorable
OA Rounds: 1-2
To Grant: 2y 9m
With Interview: 0%

Examiner Intelligence

Career Allow Rate: 100% (1 granted / 1 resolved; +38.0% vs TC avg) — grants above average
Interview Lift: -100.0% (minimal lift among resolved cases with interview)
Typical Timeline: 2y 9m average prosecution; 14 currently pending
Career History: 15 total applications across all art units

Statute-Specific Performance

§101: 5.3% (-34.7% vs TC avg)
§103: 60.5% (+20.5% vs TC avg)
§102: 23.7% (-16.3% vs TC avg)
§112: 7.9% (-32.1% vs TC avg)

Black line = Tech Center average estimate • Based on career data from 1 resolved case

Office Action

§103 §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 03/12/2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159.
See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 1, 2, 6, 8, 9, 11, and 20 are rejected on the grounds of nonstatutory double patenting as being unpatentable over claims 1, 6, 10, 11, and 13 of U.S. Patent No. 11,709,372 B2 in view of Naimark et al. (US 7,231,063 B2, hereinafter “Naimark”).
As per claim 1, the ‘372 patent teaches a system (claim 1: “a system”), comprising: a camera (claim 1: “a camera”) comprising a camera lens and an image sensor (claim 1: “a camera lens and an image sensor”); a transparent element on an object side of the camera lens (claim 1: “a cover glass between an object field and the camera lens”), the transparent element including a fiducial pattern configured to affect light received from an object field to cause a diffraction pattern in images formed by the camera lens at a surface of the image sensor (claim 1: “the cover glass comprising a fiducial pattern configured to affect light received through the cover glass from the object field to cause a diffraction pattern in images formed by the camera lens at a surface of the camera sensor”); and one or more processors configured to process two or more images captured by the camera to extract the diffraction pattern and to determine locations of the diffraction sub-patterns on the image sensor (claim 1: “one or more processors configured to: apply a correlation technique to at least one image captured by the camera to locate a centroid of the diffraction pattern on the camera sensor”). The ‘372 patent does not explicitly teach “wherein the fiducial pattern comprises two or more fiducial sub-patterns each comprising one or more markers, and wherein the diffraction pattern comprises two or more diffraction sub-patterns corresponding to the fiducial sub-patterns”. However, Naimark teaches using patterned optical features that generate multiple diffraction components, i.e., sub-patterns and sub-pattern diffraction features, whose locations may be analyzed to determine positional information for alignment, calibration, or offset determination.
Abstract: “The design includes "solid" outside mono-color ring and 2-D dense inside coding scheme”; Summary: “The mark may also include a two dimensional (2D) array of encoding areas”; column 2, line 33: “the invention features a method of detecting fiducial marks including capturing an image of a fiducial mark, the fiducial mark including a circular outer perimeter, patterns on an inside of the outer perimeter for reading an identity code including higher information density per area than a pattern of concentric rings, a two dimensional (2D) array of encoding areas,”. Naimark thus shows that the fiducial area includes multiple distinct pattern regions via the identity code patterns and a 2D array of encoding areas. This qualifies as two or more fiducial sub-patterns, and the individual encoding areas function as “markers” or features within the sub-patterns. (Looking at Figures 3 and 4, we can see that the perimeter of the image is the first instance of the fiducial marker and the inside geometries are the sub-patterns.) Note: Naimark also teaches image processing to locate fiducial features by centroid determination and via sub-pattern (column 2, line 12: “The encoding areas inside the circular outer perimeter may be contiguous with the circular outer ring to segment as one large blob, and may also include one or more small blobs inside the circular outer perimeter designated as indexing features. The features may include a feature used to establish an origin, a feature used to establish an x-axis position, and a feature used to establish a y-axis position. (13) In another aspect, the invention features a method for finding a centroid of an outer rim of a fiducial including capturing an image of the fiducial”). Capturing an image of the fiducial and calculating the centroid provides a multiple fiducial sub-pattern marker (encoding areas) and an image-based localization-via-centroid-calculation workflow.
Thus, Naimark teaches the only material distinction between examined application 18/691,427 and US 11,709,372 B2, namely the use of diffraction sub-patterns and associated positional information. Accordingly, it would have been obvious to one of ordinary skill in the art at the time this invention was effectively filed to modify the system of claim 1 of US 11,709,372 B2 to employ the sub-pattern arrangement taught by Naimark so that, instead of relying only on a diffraction pattern centroid generally, the system could use centroids of diffraction sub-patterns to provide more granular positional information for locating fiducial markers and determining offsets of the transparent cover glass element relative to the camera lens. This modification would have predictably improved feature localization and boosted calibration precision by allowing multiple structured diffraction-derived reference points rather than a single undifferentiated diffraction signal. As such, claim 1 of examined application 18/691,427 is patentably indistinct from claim 1 of U.S. Patent No. 11,709,372 B2 in view of Naimark, and claims 2, 6, 8, 9, 11, and 20 likewise are not patentably distinct because their additional limitations do not render them patentably distinct over the patent claims as modified in view of Naimark.
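The correlation-and-centroid workflow recited in claim 1 of the ‘372 patent (“apply a correlation technique to at least one image captured by the camera to locate a centroid of the diffraction pattern”) can be sketched in a few lines. This is a hypothetical illustration, not code from the application or either reference; the kernel shape, image size, and pattern placement are invented for the example.

```python
# Illustrative sketch only: locate a diffraction (sub-)pattern by brute-force
# normalized cross-correlation with a known kernel, then report the centroid
# as the center of the best-matching window. All values here are hypothetical.
import numpy as np

def locate_subpattern(image: np.ndarray, kernel: np.ndarray) -> tuple[int, int]:
    """Return (row, col) of the centroid of the best match of `kernel` in `image`."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    k = kernel - kernel.mean()          # demean the kernel once
    best, best_pos = -np.inf, (0, 0)
    for r in range(ih - kh + 1):        # slide the kernel over every placement
        for c in range(iw - kw + 1):
            patch = image[r:r + kh, c:c + kw]
            score = np.sum((patch - patch.mean()) * k)
            if score > best:
                best, best_pos = score, (r, c)
    # Centroid = center of the matched window.
    return best_pos[0] + kh // 2, best_pos[1] + kw // 2

# Synthetic example: a bright cross-shaped pattern planted in a dark frame.
kernel = np.zeros((5, 5)); kernel[2, :] = 1.0; kernel[:, 2] = 1.0
image = np.zeros((32, 32))
image[10:15, 20:25] = kernel            # planted at rows 10-14, cols 20-24
row, col = locate_subpattern(image, kernel)
print(row, col)                         # centroid of the planted pattern: (12, 22)
```

A production system would use an FFT-based correlation rather than this O(n⁴) loop; the brute-force form is used here only to make the correlation step explicit.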
As per claim 2, the ‘372 patent teaches wherein the one or more processors are further configured to: determine offsets of the transparent element with respect to the camera lens from the determined locations (claim 1: “determine offsets of the cover glass with respect to the camera lens from the located centroid;”); and apply the determined offsets to one or more images captured by the camera during processing of the one or more images to account for distortion in the one or more images caused by a corresponding shift in the transparent element with respect to the camera lens (claim 1: “apply the determined offsets to one or more images captured by the camera to account for distortion in the one or more images caused by a corresponding shift in the cover glass with respect to the camera lens during processing of the one or more image”). As per claim 6, the ‘372 patent teaches wherein the transparent element is a cover glass, and the camera and the cover glass are components of a head-mounted device (HMD) (claim 10: “wherein the camera and the cover glass are components of a head-mounted device (HMD)”).
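Claim 2's determine-offsets-then-compensate step can likewise be sketched. This is a minimal sketch under assumed simplifications: a known nominal centroid position, a pure-translation distortion model, and `np.roll` standing in for whatever resampling the real system uses. `NOMINAL_CENTROID` and all pixel values are hypothetical.

```python
# Illustrative sketch only: derive a cover-glass offset from centroid drift
# and shift a captured image back to compensate. Hypothetical values throughout.
import numpy as np

NOMINAL_CENTROID = (16, 16)   # assumed factory-calibrated centroid position

def determine_offset(located, nominal=NOMINAL_CENTROID):
    """Offset of the transparent element, in pixels, from centroid drift."""
    return located[0] - nominal[0], located[1] - nominal[1]

def apply_offset(image, offset):
    """Undo the measured shift. np.roll is a stand-in for real resampling."""
    return np.roll(image, shift=(-offset[0], -offset[1]), axis=(0, 1))

img = np.zeros((32, 32))
img[18, 21] = 1.0                       # feature displaced by (+2, +5)
off = determine_offset((18, 21))        # measured centroid at (18, 21)
corrected = apply_offset(img, off)      # feature restored to (16, 16)
print(off, np.argwhere(corrected == 1.0)[0])
```

Real distortion from a tilted or shifted cover glass would not be a pure translation, so a deployed system would fold the offsets into its full camera/distortion model rather than roll pixels.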
As per claim 8, the ‘372 patent teaches wherein the fiducial pattern encodes information about the transparent element (claim 6: “wherein the diffraction pattern is a sine-modulated two-dimensional (2D) Barker code diffraction pattern”; this shows the diffraction pattern from the fiducial pattern is encoded), and wherein the one or more processors are configured to further process the extracted diffraction pattern to: locate the markers in the fiducial pattern corresponding to centroids of the diffraction sub-patterns (claim 1: “one or more processors configured to: apply a correlation technique to at least one image captured by the camera to locate a centroid of the diffraction pattern”); and determine the information about the transparent element from the located markers and corresponding centroids (claim 11: “determine offsets of the cover glass with respect to the camera lens from the located centroids.”). As per claim 9, the ‘372 patent teaches wherein the encoded information includes one or more of an identifier and a serial number for the transparent element.
(Claim 6: “wherein the diffraction pattern is a sine-modulated two-dimensional (2D) Barker code diffraction pattern, wherein the correlation kernel is a 2D Barker code”). As per claim 11, the ‘372 patent teaches wherein the one or more processors are configured to cause mechanical or software adjustments in the system based on the determined information about the transparent element to adapt the system to the particular transparent element (claim 13: “determining, by the one or more processors, a shift of the cover glass with respect to the camera lens from the located centroid; and adjusting processing of one or more images captured by the camera to account for the determined shift in the cover glass with respect to the camera lens.”). As per claim 20, the ‘372 patent teaches a device, comprising: a camera comprising a camera lens and an image sensor (claim 1: “a camera lens and an image sensor”); a transparent element on an object side of the camera lens (claim 1: “a cover glass between an object field and the camera lens”), the transparent element including a fiducial pattern configured to affect light received from an object field to cause a diffraction pattern in images formed by the camera lens at a surface of the image sensor (claim 1: “the cover glass comprising a fiducial pattern configured to affect light received through the cover glass from the object field to cause a diffraction pattern in images formed by the camera lens at a surface of the camera sensor”); determine offsets of the transparent element with respect to the camera lens from the diffraction patterns in two or more images captured by the camera (claim 11: “…locate centroids of the diffraction patterns on the camera sensor; and determine offsets of the cover glass with respect to the camera lens from the located centroids.”); and apply the determined offsets to one or more images captured by the camera during processing of the one or more images to account for distortion in the one or more images caused by a corresponding shift in the transparent element with respect to the camera lens (claim 1: “apply the determined offsets to one or more images captured by the camera to account for distortion in the one or more images caused by a corresponding shift in the cover glass with respect to the camera lens during processing of the one or more image”). The ‘372 patent does not explicitly teach “wherein the fiducial pattern comprises two or more fiducial sub-patterns each comprising one or more markers, and wherein the diffraction pattern comprises two or more diffraction sub-patterns corresponding to the fiducial sub-patterns”. However, Naimark teaches using patterned optical features that generate multiple diffraction components, i.e., sub-patterns and sub-pattern diffraction features, whose locations may be analyzed to determine positional information for alignment, calibration, or offset determination. Abstract: “The design includes "solid" outside mono-color ring and 2-D dense inside coding scheme”; Summary: “The mark may also include a two dimensional (2D) array of encoding areas”; column 2, line 33: “the invention features a method of detecting fiducial marks including capturing an image of a fiducial mark, the fiducial mark including a circular outer perimeter, patterns on an inside of the outer perimeter for reading an identity code including higher information density per area than a pattern of concentric rings, a two dimensional (2D) array of encoding areas,”. Naimark thus shows that the fiducial area includes multiple distinct pattern regions via the identity code patterns and a 2D array of encoding areas. This qualifies as two or more fiducial sub-patterns, and the individual encoding areas function as “markers” or features within the sub-patterns. (Looking at Figures 3 and 4, we can see that the perimeter of the image is the first instance of the fiducial marker and the inside geometries are the sub-patterns.)
Note: Naimark also teaches image processing to locate fiducial features by centroid determination and via sub-pattern (column 2, line 12: “The encoding areas inside the circular outer perimeter may be contiguous with the circular outer ring to segment as one large blob, and may also include one or more small blobs inside the circular outer perimeter designated as indexing features. The features may include a feature used to establish an origin, a feature used to establish an x-axis position, and a feature used to establish a y-axis position. (13) In another aspect, the invention features a method for finding a centroid of an outer rim of a fiducial including capturing an image of the fiducial”). Capturing an image of the fiducial and calculating the centroid provides a multiple fiducial sub-pattern marker (encoding areas) and an image-based localization-via-centroid-calculation workflow. Thus, Naimark teaches the only material distinction between examined application 18/691,427 and US 11,709,372 B2, namely the use of diffraction sub-patterns and associated positional information. Accordingly, it would have been obvious to one of ordinary skill in the art at the time this invention was effectively filed to modify the system of claim 1 of US 11,709,372 B2 to employ the sub-pattern arrangement taught by Naimark so that, instead of relying only on a diffraction pattern centroid generally, the system could use centroids of diffraction sub-patterns to provide more granular positional information for locating fiducial markers and determining offsets of the transparent cover glass element relative to the camera lens. This modification would have predictably improved feature localization and boosted calibration precision by allowing multiple structured diffraction-derived reference points rather than a single undifferentiated diffraction signal. As such, claim 20 of examined application 18/691,427 is patentably indistinct from claim 1 of U.S. Patent No. 11,709,372 B2 in view of Naimark, and claims 2, 6, 8, 9, and 11 likewise are not patentably distinct because their additional limitations do not render them patentably distinct over the patent claims as modified in view of Naimark.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 8, 9, 13, 14, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Marsh et al. (US 11,380,008 B2, hereinafter “Marsh”) in view of Naimark et al. (US 7,231,063 B2, hereinafter “Naimark”). As per claim 1, Marsh teaches a system comprising a camera comprising a camera lens and an image sensor (Figures 3-6, 9, and 15, labels 905 and 906.
A photodetector and a lens constitute the minimum hardware of a camera), the transparent element including a fiducial pattern configured to affect light received from an object field to cause a diffraction pattern in images formed by the camera lens at a surface of the image sensor (Fig. 2B, label 202; paragraph [0027]: “The position of the shadow on the photodetector moves in direct correspondence to the motion of a fiducial 202 in the contact lens subsystem 200. In another embodiment, optical conditioning element 104 is a lens that focuses the received light 301 onto photodetector element 105 and the position of the focused light spot on the detector moves in direct correspondence to the motion of the fiducial 202 on or within the contact lens.”). Marsh also discusses processors configured to process images captured by the camera to extract the diffraction pattern and to determine locations of the diffraction pattern (Abstract: “And the electronics process the photodetector output signal to calculate the position of the fiducial.” and column 4, line 44: “The electronics 106 in the transceiver subsystem 100 may be incorporated into the transceiver module or they may be remote. The purpose of the electronics is to provide a drive signal to the optional light emitter 101 and to process the output signals from the photodetector element 105 in order to extract the instantaneous position of the fiducial 202 on the contact lens”; the “electronics” (106) used to calculate (determine) the position (location) of the pattern is a printed circuit board. Column 7, line 7: “PCB (printed circuit board) 106 contains the transceiver electronics.
In some embodiments, the complete subsystem is included in a housing 107.”). Moreover, Marsh shows a camera-based optical tracking arrangement in which a fiducial associated with a transparent optical element is viewed through an optical conditioning element, and states in column 10, line 47: “The optical element acts on the received light to create a light distribution on the photodetector system from which the instantaneous position of the fiducial…can be determined”. Marsh further states in column 1, line 51 that “The photodetector element receives a light signal from the fiducial and provides a photodetector output signal. The light signal provides a light intensity pattern at the photodetector. The optical conditioning element receives the light signal and provides a variation in the light intensity pattern on the photodetector in response to changes in the position of the fiducial.” Marsh is not relied upon for the fiducial pattern comprising two or more sub-patterns each comprising one or more markers, nor for the diffraction pattern comprising two or more diffraction sub-patterns. Naimark teaches the fiducial pattern comprising two or more fiducial sub-patterns each comprising one or more markers (Abstract: “The design includes "solid" outside mono-color ring and 2-D dense inside coding scheme”; Summary: “The mark may also include a two dimensional (2D) array of encoding areas”; column 2, line 33: “the invention features a method of detecting fiducial marks including capturing an image of a fiducial mark, the fiducial mark including a circular outer perimeter, patterns on an inside of the outer perimeter for reading an identity code including higher information density per area than a pattern of concentric rings, a two dimensional (2D) array of encoding areas,”. Naimark shows that the fiducial area includes multiple distinct pattern regions via the identity code patterns and a 2D array of encoding areas.
This qualifies as two or more fiducial sub-patterns, and the individual encoding areas function as “markers” or features within the sub-patterns. Looking at Figures 3 and 4, we can see that the perimeter of the image is the first instance of the fiducial marker and the inside geometries are the sub-patterns). Note: Naimark also teaches image processing to locate fiducial features by centroid determination and via sub-pattern (column 2, line 12: “The encoding areas inside the circular outer perimeter may be contiguous with the circular outer ring to segment as one large blob, and may also include one or more small blobs inside the circular outer perimeter designated as indexing features. The features may include a feature used to establish an origin, a feature used to establish an x-axis position, and a feature used to establish a y-axis position. (13) In another aspect, the invention features a method for finding a centroid of an outer rim of a fiducial including capturing an image of the fiducial”). Capturing an image of the fiducial and calculating the centroid provides a multiple fiducial sub-pattern marker (encoding areas) and an image-based localization-via-centroid-calculation workflow. Naimark supplies the multiple sub-pattern marker structure while Marsh supplies the transparent element and the diffraction behavior. A person of ordinary skill in the art would expect multiple distinct fiducial sub-regions on a diffractive transparent element to produce corresponding distinguishable diffraction-derived components. This combination allows the diffraction pattern to comprise two or more diffraction sub-patterns corresponding to the fiducial sub-patterns.
In regards to “one or more processors configured to process two or more images captured by the camera to extract the diffraction pattern and to determine locations of the diffraction sub-patterns on the image sensor”: Marsh already provides the processor determining pattern position from detector output, and Naimark provides the multi-feature fiducial and centroid-based feature localization. In this modified/combined system the processor necessarily processes captured image data to extract the fiducial optical pattern and determine the locations of its components on the detector. Marsh’s workflow gives the processor architecture and pattern position determination while Naimark supplies the multiple fiducial sub-pattern features and the image-based centroid location logic used to determine their position. Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to modify Marsh’s fiducial diffractive framework with Naimark’s multi-feature coded fiducial sub-pattern structure because doing so would predictably increase the amount of spatially distinguishable information available to the imaging/processing system, thereby improving coding capability and density without changing Marsh’s underlying system. Marsh already teaches that the optical element transforms received light into a pattern on the photodetector from which fiducial position is determined and that this transformation can be carried out by diffraction. Once Naimark’s fiducial is substituted for Marsh’s generic fiducial, the multiple encoded regions of Naimark’s fiducial would predictably produce corresponding distinguishable components in the detector pattern; Marsh’s system expressly derives fiducial position from the transformed detector pattern generated by the optical system.
Essentially, Naimark supplies the multiple sub-patterns/markers while Marsh supplies the transparent element, the diffractive light distribution generation on the detector, and the processing to determine position from that pattern. This Naimark/Marsh system arrives at the last two limitations without hindsight. The multiple fiducial sub-patterns of Naimark would yield multiple corresponding detector pattern components in Marsh’s diffractive system. Marsh’s disclosed processing of the detector output to calculate fiducial position, together with Naimark’s sub-pattern-based features, enables processing captured images to extract the pattern and determine locations of corresponding sub-pattern components on the sensor. This gives the advantage of improved localization accuracy and feature discrimination by using multi-feature fiducials in a diffraction-pattern-based architecture. As per claim 8, the Marsh/Naimark modified system covers all claim limitations previously rejected in claim 1’s 103 rejection; see claim 1’s 103 rejection. Thus, the Marsh/Naimark modified system already provides a base system in which a fiducial-patterned transparent element generates a sensor-detected optical pattern that is processed by electronics. Naimark supplies the structured fiducial/marker encoding as well as the centroid-based decoding logic.
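The per-sub-pattern centroid localization attributed to the combined Marsh/Naimark system can be illustrated with a toy example. Everything below is a hypothetical sketch: the sensor layout, blob placement, and fixed search windows are invented for illustration, and a real system would segment the detector components rather than assume known windows.

```python
# Illustrative sketch only: each fiducial sub-pattern produces a distinct
# component on the sensor, and an intensity-weighted centroid is computed
# for each component. Layout and values are hypothetical.
import numpy as np

def weighted_centroid(patch: np.ndarray) -> tuple[float, float]:
    """Intensity-weighted centroid of one sub-pattern window, in patch coords."""
    total = patch.sum()
    rows, cols = np.indices(patch.shape)
    return (rows * patch).sum() / total, (cols * patch).sum() / total

sensor = np.zeros((40, 40))
sensor[5:8, 5:8] = 1.0        # sub-pattern component A: uniform 3x3 blob
sensor[20:25, 30:35] = 1.0    # sub-pattern component B: uniform 5x5 blob

# Each window isolates one component (assumed-known layout for this sketch).
windows = {"A": (slice(0, 15), slice(0, 15)), "B": (slice(15, 40), slice(25, 40))}
for name, (rs, cs) in windows.items():
    r, c = weighted_centroid(sensor[rs, cs])
    print(name, r + rs.start, c + cs.start)   # centroid in sensor coordinates
```

With two or more such centroids, relative geometry (spacing, rotation) becomes available in addition to a single translation, which is the "more granular positional information" the rejection attributes to the sub-pattern arrangement.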
Naimark expressly teaches encoded information in the fiducial pattern (column 2, line 66: “a fiducial mark for detection by a camera including a circular outer perimeter, and patterns on an inside of the outer perimeter for reading an identity code including higher information density per area…The mark may also include a two-dimensional (2D) array of encoding areas,”); processing the extracted diffraction pattern to locate the markers in the fiducial pattern corresponding to centroids of the diffraction sub-patterns (column 2, line 19: “…method for finding a centroid of an outer rim of a fiducial including capturing an image of the fiducial…and calculating the centroid of the one solid ellipse-like shape” and column 2, line 28: “the invention features a method including determining a centroid of a framing object and all interior objects of a fiducial image, and calculating a centroid…”; because Naimark already shows processor-based centroid calculation from an image of the fiducial, the Marsh/Naimark system shows the pattern detected from the transparent element is already the extracted optical pattern, and Naimark provides the settled teaching to use centroid-based localization to identify where the fiducial marker features are); as well as a determination of information about the transparent element from the located markers and corresponding centroids (“capturing an image of a fiducial… determining an origin of the image from a first feature, determining an x-axis position of the image from a second feature, and determining a y-axis position of the image from a third feature…” to read an identity code. Naimark shows that once the fiducial’s structural features are localized, the processor uses those located features to read the identity code. This is the same functional relationship of using the located markers/centroids to determine encoded information about the element.)
Through Marsh, the Marsh/Naimark system allows for a system to use the fiducial diffraction pattern at the detector and the actual electronics used to process the detector output signal to calculate the position of the fiducial. This is how Naimark’s centroid/marker logic is reasonably applied to the extracted pattern in the combined system. Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to start with Marsh’s camera-based transparent element tracking system, in which a fiducial diffraction pattern at the detector is processed to calculate the position of the fiducial, and incorporate Naimark’s encoded fiducial structure and centroid-based decoding techniques. Naimark expressly teaches a fiducial whose internal regions identify a code and teaches calculating centroids as well as using localized fiducial features to determine the encoded information. Marsh already relies on processor analysis of a detected optical pattern to determine fiducial position. A person of ordinary skill in the art would have recognized that using Naimark’s structured fiducial with centroid-based feature localization would predictably improve the system by allowing the same detected pattern to also provide encoded identity information, not just positional information about the transparent element, giving it a dual function. This allows the detected optical signature to be linked to a specific identity or configuration while preserving the same optical architecture. It allows for improved feature discrimination and information density of the transparent element. As per claim 9, the Marsh/Naimark modified system covers all claim limitations previously rejected in claim 8’s 103 rejection; see claim 8’s 103 rejection.
Naimark teaches wherein the encoded information includes one or more of an identifier and a serial number (as previously stated, Naimark discloses “patterns on an inside of the outer perimeter for reading an identity code including higher information density per area than a pattern of concentric rings, a two-dimensional (2D) array of encoding areas”; an “identity code” is an identifier. Naimark’s fiducial encodes an identity code via encoding areas. Furthermore, a serial number is a specific identifier, and Naimark’s “identity code”, which uniquely identifies the marked item, reads on this. Using that identity code as a serial number is a straightforward application of the same encoded identity field). Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to configure the encoded information to include one or more of an identifier and a serial number for the transparent element (provided by Marsh). Naimark expressly teaches encoding and reading an identity code, which in itself is an identifier. A serial number in turn is a well-known identifier used to uniquely distinguish individual units. This implementation enables unambiguous identification and traceability of the particular transparent element associated with the detected fiducial. This then provides the predictable advantage of improved element management and configuration reliability without altering the fundamental diffraction pattern extraction and decoding operation in the Marsh/Naimark system. As per claim 13, Marsh and Naimark cover all claim limitations previously rejected in claim 1’s 103 rejection; please see claim 1’s 103 rejection.
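Claim 6 of the ‘372 patent, quoted earlier, recites a sine-modulated 2D Barker code diffraction pattern with a 2D Barker correlation kernel. As background on why such a kernel localizes well, one conventional construction (an assumption here, not taken from the application) is the outer product of a 1-D Barker sequence with itself; Barker sequences have aperiodic autocorrelation sidelobes of magnitude at most 1, so the correlation peak stands far above everything else. The sine modulation recited in the claim is omitted from this sketch.

```python
# Background sketch only: build a 2D kernel from the length-13 Barker sequence
# and verify the sharp 1-D autocorrelation peak that makes Barker codes good
# correlation kernels. The 2D outer-product construction is an assumption.
import numpy as np

BARKER_13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

kernel2d = np.outer(BARKER_13, BARKER_13)   # 13x13 bipolar correlation kernel

# Aperiodic autocorrelation of the 1-D sequence: peak 13, sidelobes at most 1.
auto = np.correlate(BARKER_13, BARKER_13, mode="full")
print(auto.max(), np.abs(np.delete(auto, auto.argmax())).max())
```

The 13:1 peak-to-sidelobe ratio is what lets a correlation-based detector pick out the pattern centroid unambiguously even in a cluttered image.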
Naimark teaches that the fiducial pattern includes one or more circular or irregular rings of fiducial sub-patterns (Fig 2, Fig 3, Fig 5A and Fig 5B; column 3, line 1: “…features a two-dimensional (2D) fiducial for image processing including an outer ring of 1 u width, two data rings, each data ring having a 1 u width, and an inner ring with 1 u width, wherein u is one length unit.” and column 3, line 59: “a two dimensional (2D) fiducial design for image processing. The design includes a sequence of concentric rings, each of the rings including a plurality of mono-colored information sectors, each sector being of approximately a same size such that inner rings have fewer sectors than outer rings.”). Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to implement the fiducial marks in the Marsh/Naimark system with Naimark's circular or irregular rings of fiducial sub-patterns. A person of ordinary skill in the art would have recognized that arranging the sub-patterns in a ring geometry would improve detection: circular and ring-oriented arrangements provide an easier and more consistent tracking path for detecting the fiducial features. This improves feature detectability without changing the underlying encoding or decoding mechanics. As per claim 14, Marsh and Naimark cover all claim limitations previously rejected in claim 13's 103 rejection; please see claim 13's 103 rejection. Naimark shows wherein the one or more markers in each fiducial sub-pattern are arranged in a same pattern (column 3, line 59: “a two dimensional (2D) fiducial design for image processing.
The design includes a sequence of concentric rings, each of the rings including a plurality of mono-colored information sectors, each sector being of approximately a same size such that inner rings have fewer sectors than outer rings.” and column 8, line 22: “a two dimensional (2D) fiducial design for image processing. The design includes a sequence of concentric rings, each of the rings including a plurality of mono-colored information sectors, each sector being of approximately a same size such that inner rings have fewer sectors than outer rings.”). Accordingly, it would have been obvious to a person of ordinary skill in the art at the time this invention was effectively filed to incorporate into the Marsh/Naimark system Naimark's concept of sub-patterns within the fiducial pattern having the same repeating pattern. Naimark teaches a fiducial architecture whose encoding-area sub-regions are of approximately equal size. A person of ordinary skill in the art would know that using the same internal arrangement in the sub-patterns promotes uniform detection, repeatable decoding, and consistent location extraction, allowing the Marsh/Naimark system to interpret the fiducial structure more easily and efficiently. As per claim 15, Marsh and Naimark cover all claim limitations previously rejected in claim 13's 103 rejection; please see claim 13's 103 rejection. Naimark teaches markers in at least two fiducial sub-patterns arranged in a different pattern (column 14, line 56: “A two dimensional (2D) fiducial design for image processing comprising: a sequence of concentric rings, each of the rings including a plurality of mono-colored information sectors, each sector being of approximately a same size such that inner rings have less sectors than outer rings.” Naimark shows that as the rings approach the center of the pattern they become smaller yet hold the same-size information sectors.
The inner rings having fewer information sectors than the preceding outer ring constitutes differing patterns within the sub-patterns. Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to include sub-patterns arranged in differing patterns. Naimark shows that its fiducial patterns are not limited to strict uniformity but can also use varied arrangements of encoding areas across the sub-patterns. This variation allows for stronger feature distinction, better detectability of spatial differences within the fiducial, and flexibility when unique identification is desired, and it reduces decoding ambiguity in the identity and interpretation of fiducial features. Claims 2, 11, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Marsh et al (Marsh hereinafter, US 11380008 B2) in view of Naimark et al (Naimark hereinafter, US 7231063 B2) in further view of Lasaruk et al (Lasaruk, EP 3293701 A1, METHOD AND APPARATUS FOR THE COMPENSATION OF STATIC IMAGE DISTORTIONS INTRODUCED BY A WINDSHIELD ONTO AN ADAS CAMERA). As per claim 2, Marsh and Naimark cover all claim limitations previously rejected in claim 1's 103 rejection; please see claim 1's 103 rejection. Neither Marsh nor Naimark is relied upon for the claim limitations of determining offsets of the transparent element with respect to the camera lens from the determined locations, or of applying the determined offsets to one or more images captured by the camera during processing of the one or more images to account for distortion in the one or more images caused by a corresponding shift in the transparent element with respect to the camera lens.
Lasaruk teaches determining offsets of the transparent element with respect to the camera lens from the determined locations (Paragraph [0038]: “One aspect of the invention is directed at a method for calibration of a camera-based system of a vehicle including a windshield pane, the method comprising the steps of placing an imaging target in form of board with a known pattern in the field of view of a camera of the camera based system, such that the camera can acquire a calibration image of the board through the windshield pane, acquiring a calibration image of the board with the camera, comparing the calibration image to the known pattern, calculating a windshield distortion which is introduced by the windshield pane, and storing the windshield distortion in the camera-based system.” Lasaruk shows estimation of a transparent element distortion that is correlated to the windshield's pose, based on observable pattern mapping. Lasaruk then more explicitly ties the distortion to physical parameters in paragraph [0047]: “The physical parameters are preferably the tilt and / or the refractive index and / or the thickness of the windshield, especially preferably all of these parameters combined.” These two concepts are essentially offset/pose parameters of a transparent element relative to the camera.) and the application of the determined offsets to one or more images captured by the camera during processing of the one or more images to account for distortion in the one or more images caused by a corresponding shift in the transparent element with respect to the camera lens.
(“…rectification of images acquired by at least one camera of a camera-based system of a vehicle with a windshield pane…acquiring an image with the camera, and calculating, using the windshield distortion stored in the camera-based system, a set of points in space that is projected to a location on the image.” This is the concept of applying the determined offsets/distortion parameters to images to compensate for distortion introduced by the transparent element. The distortion parameter of “tilt,” or a change of “tilt,” can reasonably be read as the claimed “shift.” Furthermore, Lasaruk explains in paragraph [0001]: “the new method focuses on compensation of static effects of a windshield to the images of an ADAS camera system. Here, the word "static" refers to the fact that, once calibrated, the windshield related parameters of the proposed camera model are valid for multiple unmeasured windshields of the same type.” This indicates that differences among unmeasured transparent elements can cause distortion; it also suggests the transparent element is interchangeable, may vary in exact fit with respect to the camera system, and may shift.) Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to further modify the Marsh/Naimark system (in which a fiducial associated with a transparent optical element produces a detector pattern that is processed to determine fiducial position, and in which fiducial features/centroids are localized from image data) to incorporate Lasaruk's calibration-based transparent element compensation techniques, so that the system determines offset information for the transparent element from observed feature locations and then applies that information when processing subsequent images.
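As a purely illustrative sketch (not taken from Lasaruk or the claims), applying stored offsets during image processing can be modeled as a per-pixel remap using a previously stored offset field; the nearest-neighbor remap and all names below are hypothetical simplifications:

```python
# Hypothetical sketch: rectifying a later image with a stored, per-pixel
# offset field of the kind a calibration step would produce.
def rectify(image, offsets):
    """Fill each output pixel from image[r+dr][c+dc] using stored offsets."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            dr, dc = offsets[r][c]
            src_r, src_c = r + dr, c + dc
            if 0 <= src_r < h and 0 <= src_c < w:
                out[r][c] = image[src_r][src_c]
    return out

# A uniform one-column shift introduced by the transparent element is
# undone by storing a (0, 1) offset at every pixel:
img = [[0, 10, 20],
       [0, 30, 40],
       [0, 50, 60]]
shift = [[(0, 1)] * 3 for _ in range(3)]
# rectify(img, shift)[0] -> [10, 20, 0]
```

Real windshield models are parametric (tilt, refractive index, thickness) rather than tabulated per pixel, but the principle of "determine once, store, apply to subsequent images" is the same.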
Marsh already shows that the optical conditioning element changes the pattern intercepted at the detector and that electronics process the output to calculate fiducial position, while Naimark shows precise fiducial feature localization through centroid-based image analysis. Lasaruk then provides the next step: comparing observed image information to a known pattern to calculate the distortion introduced by the transparent pane, storing that distortion, and using it to rectify acquired images. By applying Lasaruk's distortion rectification approach to the Marsh/Naimark system, the pipeline gains accuracy and image fidelity by adapting the rectification of shift processing to the particular transparent element. As per claim 11, Marsh and Naimark cover all claim limitations previously rejected in claim 8's 103 rejection; see claim 8's 103 rejection. Neither Marsh nor Naimark is relied upon for the claim limitation that the one or more processors are configured to cause mechanical or software adjustments in the system based on determined information about the transparent element to adapt the system to the particular element. Lasaruk teaches the one or more processors are configured to cause mechanical or software adjustments in the system (Page 14, line 52: “rectification of images acquired by at least one camera of a camera-based system of a vehicle with a windshield pane, the method comprising the steps of…acquiring an image with the camera…calculating, using the windshield distortion stored in the camera-based system, a set of points in space that is projected to a location on the image.” The adjustment here is the calculation of a corrected image, a software adjustment to the system's imaging pipeline.), based on the determined information about the transparent element to adapt the system to the particular transparent element (Paragraph [0098]: “…where this transformation depends on the calculated windshield distortion.
This mapping is given by an inversion of the above equations, which may or must be carried out numerically in practice…advantageously, this allows for an analytically precise modelling and rectification of the distortion introduced by a windshield pane with the respective physical parameters.” This shows that the distortion pattern is determined, stored, and used to rectify the image, which adapts the system to the particular transparent element's orientation and features. The system behaves differently depending on which distortion parameters are stored; this is adaptation to the particular transparent element.) Accordingly, a person of ordinary skill in the art would have found it obvious to start with the Marsh/Naimark system (in which the processor determines information about the transparent element from the observed fiducial pattern, including location and encoded information) and configure the processor to make software adjustments based on that determined, element-specific information, as taught by Lasaruk. Naimark expressly teaches fiducials with encoded information, including an identity code and encoding areas, and Marsh teaches locating the fiducial; the base system therefore already provides determined information about the fiducial element. Lasaruk then teaches determining information specific to a transparent element (a windshield) by comparing a captured calibration image to a stored known pattern, calculating the distortion of the windshield, storing that distortion, and later using the stored information during rectification processing. A person of ordinary skill in the art would have incorporated Lasaruk's transparent element compensation pipeline into the Marsh/Naimark workflow because, once the transparent element information has been determined, it is desirable to use that information to adapt system processing to the particulars of the transparent element to reduce distortions (such as diffraction spread) and improve fidelity.
The system not only identifies the transparent element's information but also adapts its processing behavior for compensation, without changing the underlying fiducial-based sensing architecture. As per claim 20, claim 20 is the device-form parallel of the already rejected claim 1 and claim 2 chain: Marsh and Naimark provide the base device of claim 1, and Lasaruk provides the added offset correction functionality. Marsh teaches a detector/camera-based system in which an optical element acts on received light to create a light distribution on the photodetector “from which the instantaneous position of the fiducial…can be determined” and teaches electronics that “process the output signals from the photodetector element 105 in order to extract the instantaneous position of the fiducial 202”. Marsh also discloses a photodetector embodiment “such as the sensor in a digital camera”. Naimark teaches a fiducial having a “two-dimensional array of encoding areas” and teaches “capturing an image of the fiducial… and calculating the centroid”. Lasaruk teaches “comparing the calibration image to the known pattern calculating a windshield distortion” and a “method for rectification of images” that uses stored windshield distortion for later image processing. Again, Marsh and Naimark provide the camera, the transparent element, and the pattern location system, while Lasaruk provides determining offsets of the transparent element from the determined locations and pattern information and then applying those offsets to downstream image processing to rectify distortion caused by shift. A person of ordinary skill in the art would have been motivated to incorporate Lasaruk's transparent element calibration and rectification into the Marsh/Naimark system.
Marsh already processes detector output to obtain fiducial patterns, Naimark provides structured fiducial features and centroid-based image localization, and Lasaruk enables the system to compare images against a known stored pattern to calculate transparent element distortion and then correct that distortion in a rectification pipeline. The resulting device not only localizes the optical pattern but also uses that information to adapt image processing to the particular transparent element to rectify distortion, improving image fidelity and efficiency. Claims 3, 4, 17, and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Marsh et al (Marsh hereinafter, US 11380008 B2) in view of Naimark et al (Naimark hereinafter, US 7231063 B2) in further view of Lasaruk et al (Lasaruk, EP 3293701 A1, METHOD AND APPARATUS FOR THE COMPENSATION OF STATIC IMAGE DISTORTIONS INTRODUCED BY A WINDSHIELD ONTO AN ADAS CAMERA) in further view of Slinger et al (Slinger hereinafter, US 7888626 B2). As per claim 3, Marsh, Naimark, and Lasaruk cover all claim limitations previously rejected in claim 2's 103 rejection; please see claim 2's 103 rejection. Marsh, Naimark, and Lasaruk do not teach the deconvolution step. Slinger teaches application of a deconvolution technique to recover a response corresponding to the diffraction pattern (column 8, line 15: “The system may therefore include a processor for decoding the output of the detector array to produce an image… the decoding algorithm may comprise a deconvolution algorithm”. This is direct disclosure that the processor uses a deconvolution technique as part of its primary decoding and reconstruction pipeline. Moreover, Slinger states in column 8, line 27: “The object of all decoding algorithms is therefore to recover the scene image by using knowledge of the mask pattern, for instance by performing a deconvolution or cross correlation.
Where diffraction effects are significant however the intensity pattern at the detector array no longer corresponds directly to the aperture function. Instead, the diffraction pattern formed at the detector array is in effect a blurred version of the mask pattern. Thus, a decoding algorithm based on the aperture function of the coded aperture array will result in a blurred image.” This corresponds to the claimed concept of “recover a response corresponding to the diffraction pattern” because Slinger records a diffraction pattern at the detector array and describes it as a “blurred version” of the known pattern; in this context, deconvolution is a standard technique to recover the actual image from the blurred diffraction measurement. The Marsh/Naimark/Lasaruk/Slinger four-part modified system works as follows: Marsh supplies the base camera system and creates the pattern at the detector, Naimark provides the sub-pattern fiducial markers and centroid localization, Slinger improves recovery of the detector pattern via deconvolution, and Lasaruk uses the improved recovered pattern locations to compute transparent element distortion and rectify the image. Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to modify the transparent element calibration and rectification workflow provided by Lasaruk in the Marsh/Naimark/Lasaruk system to further incorporate the deconvolution technique provided by Slinger. The practitioner would do so to recover a response corresponding to the diffraction-coded optical pattern before using that same response for feature localization, transparent element distortion analysis, and downstream rectification. Naimark teaches fiducials having encoded sub-patterns and centroid-based localization of fiducial features.
Lasaruk teaches determining distortion introduced by a transparent pane by comparing captured image information to a known pattern and then using the determined distortion for image rectification. Before that downstream rectification, Slinger shows that decoding an encoded optical pattern may be performed using a deconvolution algorithm. The skilled practitioner could predict that applying Slinger's deconvolution recovery to the optical pattern signal before shift rectification yields more accurate recovery of the detector response and improves the quality of the localized fiducial features once rectified. This recognizes the problem that blur or encoded optical spreading in the detector signal can degrade feature location accuracy. The four-part system provides the predictable advantage of improved fidelity in determining transparent-element-induced distortion without changing the underlying fiducial-based imaging system. In essence, Marsh gives the optical pattern generation on the detector, Naimark gives the fiducial sub-pattern structure and centroid localization, Lasaruk gives the transparent element calibration/distortion problem and its correction through rectification, and Slinger gives the deconvolution tool used to improve recovery of the pattern before those locations are used for transparent element shift/distortion compensation. As per claim 4, Marsh, Slinger, Naimark, and Lasaruk cover all claim limitations previously rejected in claim 3's 103 rejection; please see claim 3's 103 rejection. Slinger teaches background noise removal using multiple mask patterns/frames, i.e., filtering images to remove background prior to applying said deconvolution technique (Slinger explicitly shows removal of background noise in column 2, line 62: “The use of two complimentary patterns is useful for eliminating background noise”.
Slinger shows that the filtering is done via complementary mask pattern acquisitions as background suppression. We know this background filtering is done prior to deconvolution because Slinger states: “It is well known that accurate deconvolution is susceptible to noise, so detector noise may affect this algorithm”. This explains why filtering to remove background is performed before deconvolution. Slinger also states in column 16, line 14: “Decoding occurs using a deconvolution…While computationally efficient, F(A(x,y)) can have small terms (a general property of large binary arrays, for example), resulting in a noisy reconstruction. Appropriate mask design will minimize this effect.” This is the complementary patterns filtering noise/background; mask design is done prior to deconvolution. Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have been motivated to further use these aspects of Slinger in the Marsh/Naimark/Lasaruk/Slinger system. It would have been obvious to such a person to further configure the processors to filter the two or more captured images to remove background prior to applying the deconvolution, as recited in claim 4. Slinger shows that multi-pattern acquisition can be used to eliminate background noise, and further teaches that deconvolution is susceptible to noise and that noise suppression should be performed beforehand. Background suppression (filtering) prior to deconvolution predictably improves the signal-to-noise ratio and reduces reconstruction artifacts. This enables the expected advantage of more accurate recovery for the rectification of the transparent element (Lasaruk's step) and better localization, without changing the underlying modified Marsh/Naimark/Lasaruk/Slinger system and its operating principles of the diffraction/sub-pattern decoding pipeline.
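As a purely illustrative sketch (not taken from Slinger), the order of operations discussed above, background removal from complementary-mask frames first, deconvolution second, can be shown on a 1-D toy signal. The subtraction step, the toy blur kernel, and all names are hypothetical; a real system would use Wiener- or FFT-based deconvolution:

```python
# Hypothetical sketch: background suppression via two complementary-mask
# frames, followed by naive 1-D deconvolution against a known blur kernel.
def remove_background(frame_a, frame_b):
    """Subtract frames so a background common to both cancels out."""
    return [a - b for a, b in zip(frame_a, frame_b)]

def deconvolve(signal, kernel):
    """Recover x from y = conv(x, kernel) by forward substitution.

    Assumes kernel[0] != 0 and len(signal) == len(x) + len(kernel) - 1.
    """
    n = len(signal) - len(kernel) + 1
    x = []
    for i in range(n):
        acc = signal[i]
        for j in range(1, len(kernel)):
            if 0 <= i - j < len(x):
                acc -= kernel[j] * x[i - j]
        x.append(acc / kernel[0])
    return x

frame_a = [5.0, 6.0, 7.0, 6.0]   # background level 5 + blurred signal
frame_b = [5.0, 5.0, 5.0, 5.0]   # background only (complementary mask)
clean = remove_background(frame_a, frame_b)   # [0.0, 1.0, 2.0, 1.0]
# Deconvolving the cleaned signal against kernel [0.5, 0.5] recovers the
# original pattern [0.0, 2.0, 2.0] before any noise was amplified.
```

Doing the subtraction first matters for the reason Slinger gives: deconvolution amplifies whatever noise is still in the signal, so the background is removed before, not after, the recovery step.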
As per claim 17, Marsh, Naimark, and Slinger cover all claim limitations previously rejected in claim 16's 103 rejection; please see claim 16's 103 rejection. The Marsh/Naimark/Slinger system of claim 16 is not relied upon for the claim limitations of “determining, by the one or more processors, a shift of the transparent element with respect to the camera lens from the determined locations; and adjusting processing of one or more additional images captured by the camera to account for the determined shift in the transparent element with respect to the camera lens.” Lasaruk teaches determining, by the one or more processors, a shift of the transparent element with respect to the camera lens from the determined locations (Paragraph [0038]: “comparing the calibration image to the known pattern, calculating a windshield distortion which is introduced by the windshield pane, and storing the windshield distortion in the camera-based system.” Lasaruk determines distortion of the transparent pane via comparison of observed image feature locations to known locations. This operation determines distortion/misalignment (shift) of the transparent element relative to the camera optics from detected locations; it uses observed calibration feature locations compared to known locations to compute shift.) and adjusting processing of one or more additional images captured by the camera to account for the determined shift in the transparent element with respect to the camera lens (Page 14, line 51: “Method for rectification of images acquired by at least one camera of a camera-based system …comprising the steps of acquiring an image with the camera, and calculating, using the windshield distortion”. Lasaruk uses the previously determined windshield distortion to process subsequently acquired images for rectification. This is the claimed adjusting of processing of one or more additional images to account for the determined shift.)
Accordingly, it would have been obvious to a person of ordinary skill in the art at the time this invention was effectively filed to further modify the Marsh/Naimark/Slinger method of claim 16 with the transparent element calibration and rectification teachings of Lasaruk, in order to determine the shift of the transparent element with respect to the camera lens from the determined pattern feature locations and to adjust processing of subsequently captured images to account for that shift. The Marsh/Naimark/Slinger system of claim 16 already provides a system in which a transparent element having a structured fiducial pattern produces an optical, diffraction-derived detector pattern, the detector output is processed over multiple captured images, and feature locations are recovered and localized. Lasaruk then fills in the next step of comparing the observed image feature locations with known calibration locations to calculate distortion and using the difference for rectification. Lasaruk teaches “comparing the calibration image to the known pattern, calculating a windshield distortion” introduced by the windshield pane; that distortion is stored in the system and then used for rectification. That teaching suggests to the person of ordinary skill in the art that, once the Marsh/Naimark/Slinger system of claim 16 determines the pattern features on the sensor, those locations can be used to determine misalignment, and that shift information can be used to rectify or compensate the optical error of the transparent element. The system then has the advantage of improved image fidelity and consistency in downstream processing by accounting for transparent element shift effects in the processing pipeline. As per claim 18, the Marsh/Naimark/Lasaruk/Slinger system covers all claim limitations previously rejected in claim 17's 103 rejection; please see claim 17's 103 rejection.
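As a purely illustrative sketch (not taken from Lasaruk or the claims), the shift determination discussed for claim 17, comparing observed feature locations against a calibration baseline, can be modeled as an average displacement over matched points; the mean-displacement model and all names below are hypothetical:

```python
# Hypothetical sketch: estimating transparent-element shift by comparing
# observed sensor locations with known locations from a calibration step.
def estimate_shift(observed, calibrated):
    """Average (dx, dy) displacement between matched feature locations."""
    if len(observed) != len(calibrated) or not observed:
        raise ValueError("need matched, non-empty location lists")
    n = len(observed)
    dx = sum(o[0] - c[0] for o, c in zip(observed, calibrated)) / n
    dy = sum(o[1] - c[1] for o, c in zip(observed, calibrated)) / n
    return (dx, dy)

# Every fiducial feature appears 2 px right and 1 px down of its
# calibration position, so the estimated shift is (2.0, 1.0):
cal = [(10, 10), (40, 12), (25, 30)]
obs = [(12, 11), (42, 13), (27, 31)]
# estimate_shift(obs, cal) -> (2.0, 1.0)
```

A real calibration would fit the physical pane parameters (tilt, thickness, refractive index) rather than a single translation, but the comparison-against-known-locations logic is the same.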
Lasaruk teaches that determining the shift of the transparent element with respect to the camera lens from the determined locations comprises comparing the determined locations on the image sensor to known locations on the image sensor determined during a calibration process (Paragraph [0038]: “…placing an imaging target…with a known pattern…comparing the calibration image to the known pattern, calculating a windshield distortion…” Lasaruk shows a calibration process in which an image is acquired and then compared to a known pattern; that is the same comparison logic being claimed.) Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have further modified the Marsh/Naimark/Slinger/Lasaruk system to include Lasaruk's concept of comparing the determined image sensor locations to known locations established during calibration. Once the modified system detects and localizes diffraction-derived pattern features from a captured image, a person of ordinary skill in the art would recognize the need for a reference baseline to convert those measured feature locations into a reliable estimate of transparent element shift. Lasaruk provides that known solution: the system uses a known calibration pattern to establish expected image sensor locations and then compares later observed locations against those known locations to infer the deviation. This improves the system by replacing a relative estimate with a calibrated comparison, enabling more accurate and repeatable determination of transparent element shift and therefore more reliable downstream rectification of images. Claims 6 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Marsh et al (Marsh hereinafter, US 11380008 B2) in view of Naimark et al (Naimark hereinafter, US 7231063 B2) in further view of Peroz et al (Peroz hereinafter, US 10823894 B2).
As per claim 6, Marsh and Naimark teach the claim limitations previously rejected in claim 1's 103 rejection; see claim 1's 103 rejection. The Marsh/Naimark system establishes the camera, the patterned/sub-patterned transparent element, and the computational decoding chain. Naimark specifically has an HMD context (column 4, line 61: “In an example, the IMU 12 and SC devices 16 are housed in an assembly generally referred to as a sensor assembly (not shown), which is often adapted to mount on the head of a user” and column 6, line 15: “The graphics processor generates an augmented reality environment by adding to the images provided by the SC device 16 computer generated graphics. These composite images are then displayed on a head mounted display (not shown) by the graphics processor”) but does not supply the hardware arrangement with respect to cover glass and HMD camera integration. Peroz teaches that the transparent element is a cover glass (column 116, line 56: “The eyepiece can include a cover glass 9810”) and that the camera and the cover glass are components of a head mounted device (Fig 19, Fig 98, Fig 16B). Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to integrate the Marsh/Naimark workflow, which already provides a camera-based fiducial diffraction process, into Peroz's head mounted display format, in which a cover glass is the transparent element with the claimed hardware arrangement with respect to the camera. Peroz expressly teaches a head mounted display or wearable system and further teaches an eyepiece structure that directly specifies a cover glass as the transparent element.
Placing the already modified fiducial pipeline in an HMD augmented reality environment, where cameras are commonly integrated for world capture and tracking and where cover glass is used in conjunction with optical elements, enables the predictable advantage of a headset-integrated implementation without changing the underlying diffraction fiducial processing principles of the Marsh/Naimark system. The modified Marsh/Naimark/Peroz system provides a self-calibrating head mounted imaging platform capable of maintaining accurate spatial alignment and image fidelity despite mechanical variation, offset, or environmental disturbance. As per claim 12, Marsh and Naimark teach the claim limitations previously rejected in claim 1's 103 rejection; see claim 1's 103 rejection. Neither Marsh nor Naimark is relied upon for the fabrication of the fiducial pattern on the transparent element. Peroz teaches the fiducial pattern being formed on a surface of the transparent element using a nano imprinting lithography process (column 77, line 28: “It will be noted that lithographic process, including UV, EBL or nanoimprint, can be used to pattern the hard mask layer with the desired diffractive structure.” Column 76, line 63 states “the hard mask layer is formed using SiO.sub.2”, which is a transparent material.) Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have used Peroz's nanoimprint lithography to form the fiducial pattern on the transparent element within the Marsh/Naimark/Peroz system (which requires a fiducial pattern on a transparent element to generate diffraction patterns for camera-based processing). Peroz shows that nanoimprint lithography is available and can be used to pattern a layer with a desired structure.
Nanoimprint lithography is a high-resolution, repeatable fabrication technique that can create fine fiducial features at scale, allowing improved feature fidelity for the surface fiducial pattern without changing the system's underlying diffraction-based detection and processing. Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Marsh et al (Marsh hereinafter, US 11380008 B2) in view of Naimark et al (Naimark hereinafter, US 7231063 B2) in further view of Lohr et al (Lohr hereinafter, US 7324156 B2). Marsh and Naimark cover all claim limitations previously rejected in claim 1's 103 rejection; see claim 1's 103 rejection. Neither Marsh nor Naimark is relied upon for the claim limitations that the transparent element is a lens attachment, wherein the camera is a component of a device that includes a cover glass in front of the camera, and wherein the lens attachment is configured to be attached to an inside surface or an outside surface of the cover glass. Lohr teaches a lens attachment (column 2, line 11: “The optical accessory 122 may be a variety of optical devices designed to enhance camera lens utility. Examples of such optical devices are, but not limited to, a lens cover for protecting the camera lens 110, a zoom lens, and an optical filter.” This shows the optical accessory is literally a lens attachment), wherein the camera is a component of a device that includes a cover glass in front of the camera, and wherein the lens attachment is configured to be attached to an inside surface or an outside surface of the cover glass.
(Lohr discloses attachable mounting in Column 2 line 4: “… attachable portability enhancement device 100 is shown to attach to the portable electronic device 102 by coupling the coupling interface 120 in a receiving groove 126 of the housing 104 around the camera lens 110.” And that the optical element is positioned over the camera lens in Column 2 line 4: “When the attachable portability enhancement device 100 is attached to the portable electronic device 102, the optical accessory 122 lines up with the camera lens 110, and provides an added function to the camera lens 1” Thus Lohr shows the mounting-location limitation: the attachment is physically at the device exterior at the camera region while being in the camera lens’s line of sight.) Accordingly, a person of ordinary skill in the art at the time the invention was effectively filed, given the already-modified Marsh/Naimark system (directed to camera-based imaging and processing of the claimed optical information), would have found it obvious to further provide an attachable lens attachment as taught by Lohr. Lohr teaches an attachable optical accessory that attaches to the device at the camera region and aligns with the camera lens. This modification solves the practical problem of enabling optional optical functions without permanently changing the base imaging system, and provides the advantage of allowing the user to select or swap optical functionality while preserving the underlying image processing pipeline of the Marsh/Naimark system. Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Marsh et al (Marsh hereinafter, US 11380008 B2) in view of Naimark et al (Naimark hereinafter, US 7231063 B2) in further view of Dangelmaier et al (Dangelmaier hereinafter, US 20170261765 A1). Marsh and Naimark cover all claim limitations previously rejected in claim 8’s 103 rejection. See claim 8’s 103 rejection.
Neither reference is relied upon for the narrowing of what the transparent element is and what the encoded information actually contains. Dangelmaier teaches the transparent element being a lens formed according to a prescription for a user (Paragraph [0027]: Fig. 1 is a perspective view of progressive power spectacles with spectacle lenses) and wherein the encoded information includes prescription information for the transparent element (Fig. 3 shows encoded information, and paragraph [0040]: “The information from the data matrix code of the marking 32 individualizes the spectacles lens … information in the marking 32 is comprised of a database address for a database in which specifications of the spectacles lens manufacturer in respect of the spectacles lens are stored…Alternatively, data matrix code of the marking 32 can contain the information…the data matrix code of the marking … comprise the information in respect of the material of the spectacles lens, the refractive index … the value of the curvatures of the spectacles lens 4 on the front surface and back surface, at the far and near reference points (14, 16) or at the positions opposite these points.”). Accordingly, a person of ordinary skill in the art would have further modified the Marsh/Naimark system (directed towards camera-captured diffractive fiducial pattern processing with centroid marker localization and recovery of encoded information) by implementing Dangelmaier’s concepts: having the transparent element specifically be a prescription lens and encoding prescription information on that lens in a readable pattern. Dangelmaier expressly teaches storing optical parameters and lens reference point coordinates in a data matrix code on the prescription lens, and further teaches capturing that code using a digital camera with light passing through the lens and decoding the stored information.
This modified system allows for reliable identification of user-specific prescription optics within the imaging system. This gives automatic lens identification and configuration and reduces mismatch errors without changing the underlying diffraction pattern extraction process or the centroid-based marker localization operations. Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Marsh et al (Marsh hereinafter, US 11380008 B2) in view of Naimark et al (Naimark hereinafter, US 7231063 B2) in further view of Slinger et al (Slinger hereinafter, US 7888626 B2). Marsh teaches receiving light from an object field at a transparent element on an object side of a camera lens, the transparent element including a fiducial pattern (Figure 3, Figure 4, and Figure 9; note: claim 16 describes the same front-end optical structure already relied upon in claim 1); refracting, by the camera lens, the light received through the transparent element to form an image at a surface of an image sensor (Figure 2A, Figure 3, Figure 4; Column 3 line 43: “optical conditioning element 104 is a lens that focuses the received light 301 onto photodetector element 105 and the position of the focused light spot on the detector moves in direct correspondence to the motion of the fiducial”); and the fiducial pattern affects the light to cause a diffraction pattern in the image (Column 1 line 51: “The photodetector element receives a light signal from the fiducial and provides a photodetector output signal.
The light signal provides a light intensity pattern at the photodetector”). Marsh also discusses processors configured to process images captured by the camera to extract the diffraction pattern and to determine locations of the diffraction pattern (Abstract: “And the electronics process the photodetector output signal to calculate the position of the fiducial.” And Column 4 line 44: “The electronics 106 in the transceiver subsystem 100 may be incorporated into the transceiver module or they may be remote. The purpose of the electronics is to provide a drive signal to the optional light emitter 101 and to process the output signals from the photodetector element 105 in order to extract the instantaneous position of the fiducial 202 on the contact lens” The “electronics” (106) used to calculate (determine) the position (location) of the pattern is a printed circuit board: Column 7 line 7, “PCB (printed circuit board) 106 contains the transceiver electronics. In some embodiments, the complete subsystem is included in a housing 107.”). Naimark teaches the fiducial pattern comprising two or more fiducial sub-patterns, each comprising one or more markers (Abstract: “The design includes "solid" outside mono-color ring and 2-D dense inside coding scheme”; Column 2 line 33: “the invention features a method of detecting fiducial marks including capturing an image of a fiducial mark, the fiducial mark including a circular outer perimeter, patterns on an inside of the outer perimeter for reading an identity code including higher information density per area than a pattern of concentric rings, a two dimensional (2D) array of encoding areas,” Naimark shows that the fiducial area includes multiple distinct pattern regions via the identity code patterns and a 2D array of encoding areas. This qualifies as two or more fiducial sub-patterns, and the individual encoding areas function as “markers” or features within the sub-patterns.
Looking at figures 3 and 4, we can see that the perimeter of the image is the first instance of the fiducial marker, and the inside geometries are the sub-patterns). As stated in claim 1, the limitation that the diffraction pattern comprises two or more diffraction sub-patterns corresponding to the fiducial sub-patterns is covered by Marsh in view of Naimark. Naimark supplies the multiple sub-pattern marker structure whilst Marsh supplies the transparent element and the diffraction behavior. A person of ordinary skill in the art would expect multiple distinct fiducial sub-regions on a diffractive transparent element to produce corresponding distinguishable diffraction-derived components. This combination allows for the diffraction pattern to comprise two or more diffraction sub-patterns corresponding to the fiducial sub-patterns. In regards to determining, by the one or more processors, locations of the diffraction sub-patterns on the image sensor: Marsh already provides the processor determining pattern position from detector output, and Naimark provides the multi-feature fiducial and centroid-based feature localization. In this modified/combined system the processor necessarily processes captured image data to extract the fiducial optical pattern and determine the locations of its components on the detector. Marsh’s workflow gives the processor architecture and pattern position determination, while Naimark supplies the multiple fiducial sub-pattern features and the image-based centroid localization logic used to determine their positions. Neither Marsh nor Naimark is relied upon for applying, by one or more processors, a deconvolution technique to the two or more images to recover a response corresponding to the diffraction pattern.
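The centroid-based localization credited to Naimark above can be illustrated with a minimal sketch. The flood-fill labeling, threshold, and synthetic sensor image below are illustrative assumptions, not taken from either reference:

```python
# Minimal sketch: locate fiducial sub-pattern markers by computing the
# centroid of each connected bright region in a thresholded sensor image.
# All data here is synthetic; the threshold and image are illustrative only.

def find_marker_centroids(image, threshold=1):
    """Return (row, col) centroids of connected regions of pixels >= threshold."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                # Flood-fill one connected region (4-connectivity).
                stack, pixels = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # Centroid = mean pixel coordinate of the region.
                cy = sum(p[0] for p in pixels) / len(pixels)
                cx = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cy, cx))
    return centroids

# Two separated 2x2 bright blocks -> two sub-pattern centroids.
img = [[0] * 8 for _ in range(8)]
for r, c in [(1, 1), (1, 2), (2, 1), (2, 2), (5, 5), (5, 6), (6, 5), (6, 6)]:
    img[r][c] = 1

print(sorted(find_marker_centroids(img)))  # → [(1.5, 1.5), (5.5, 5.5)]
```

Each distinct fiducial sub-region yields its own centroid, which is the sense in which multiple sub-patterns produce separately localizable components on the sensor.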
Slinger teaches capturing, by the image sensor, two or more images (Column 21 line 30: “the processor is arranged to combine multiple frames of data from the detector array output”) and applying a deconvolution technique to the two or more images to recover a response corresponding to the diffraction pattern (Column 11 line 5: “The method preferably comprises the step of decoding the output of the detector array to provide an image. This can be done either directly on the output of the detector array by a local processor or the output can be transmitted for remote decoding or recorded and processed later. The step of decoding comprises applying one or more of a deconvolution ” and Column 19 line 35: “Fortunately, it has been found that the impact of diffraction on decoded image quality is not as severe as might be expected: even with simple deconvolution kernels, good images can be recovered”). Accordingly, it would have been obvious to a person of ordinary skill in the art to combine Marsh, Naimark, and Slinger to arrive at the method of claim 16. Marsh supplies the transparent optical element and the detector pattern generation behavior, including the diffractive optical behavior already relied upon in the claim 1 rejection. Naimark supplies the fiducial having a multiple encoded sub-pattern marker structure, and also supplies centroid-based localization of fiducial features. Slinger supplies the explicit teaching of processing multiple data frames and using a deconvolution algorithm to recover a response from an encoded optical detector pattern.
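The multi-frame capture and deconvolution step attributed to Slinger can be sketched in miniature. The 1-D sequential deconvolution below is one simple illustration under stated assumptions: the signal, kernel, and frame count are invented for the example, and real systems would use 2-D kernels and noise-robust methods:

```python
# Minimal sketch of the frame-combining + deconvolution step: several
# detector frames are averaged, then a known 1-D blur kernel is inverted
# by sequential (polynomial long-division) deconvolution. The response,
# kernel, and frame count are synthetic illustrations only.

def convolve(signal, kernel):
    """Full discrete convolution of two 1-D sequences."""
    out = [0.0] * (len(signal) + len(kernel) - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

def deconvolve(observed, kernel):
    """Recover the input sequence given the observation and the kernel."""
    n = len(observed) - len(kernel) + 1
    recovered = []
    for i in range(n):
        acc = observed[i]
        # Subtract the contributions of already-recovered samples.
        for j in range(1, len(kernel)):
            if 0 <= i - j < len(recovered):
                acc -= kernel[j] * recovered[i - j]
        recovered.append(acc / kernel[0])
    return recovered

response = [0.0, 1.0, 4.0, 1.0, 0.0]     # "diffraction response" to recover
kernel = [0.5, 0.3, 0.2]                 # known blur kernel
frames = [convolve(response, kernel) for _ in range(3)]  # repeated captures
avg = [sum(col) / len(frames) for col in zip(*frames)]   # combine frames
recovered = deconvolve(avg, kernel)      # recovered ≈ response
```

Combining frames before inversion is the step Slinger's "combine multiple frames of data" language points at; the deconvolution then undoes the known optical encoding.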
A person of ordinary skill in the art would have been motivated to use Naimark’s encoded sub-pattern fiducial in Marsh’s diffractive transparent element system because doing so predictably increases the amount of distinguishable fiducial structure available for optical detection and feature extraction. As previously argued for claim 1, multiple distinct fiducial sub-regions on a diffractive transparent element would yield corresponding distinguishable diffractive sub-patterns at the detector. Slinger then supplies the known signal-recovery step of applying deconvolution to multiple captured diffractive frames in order to recover the response corresponding to the encoded optical pattern, with the understanding that there can be a quality drop due to the nature of diffraction itself. Naimark’s centroid-based localization in this modified system also suggests determining the locations of the recovered diffraction-derived components on the sensor. The combination addresses recovering and locating structured diffraction-derived fiducial information, with the advantage of improved recovery fidelity and accuracy in the transparent element system. As per claim 19, Marsh, Naimark, and Slinger cover the claim limitations rejected in claim 16’s 103 rejection. Please refer to claim 16’s 103 rejection. The Marsh/Naimark/Slinger system provides the underlying method in which a transparent element with a fiducial pattern produces a detector image pattern, multiple images are processed, deconvolution is applied, and locations of recovered pattern features are determined. Naimark teaches that the fiducial pattern encodes information about the transparent element (Column 13 line 36: “a pattern of contrasting regions on an inside of the outer perimeter that is not symmetrical about a central point of the fiducial mark, and that identifies an identity code” An identity code is encoded information carried by the fiducial pattern.
Under broadest reasonable interpretation, that identity code is information about the element carrying the fiducial.) Naimark also teaches locating the markers in the fiducial pattern corresponding to centroids of the diffraction sub-patterns (Column 2 line 21: “capturing an image of the fiducial…and calculating the centroid of the one solid ellipse-like shape.” And, more generally, “determining a centroid of a framing object and all interior objects of a fiducial image”. This shows Naimark expressing centroid-based localization of fiducial features. In the already-established modified/combined system of claim 16, the recovered diffraction-derived components are what provide the centroids. The additional step of claim 19 is to then use those centroids to locate the corresponding fiducial markers. Naimark gives the marker-to-centroid localization logic.), and determining the information about the transparent element from the located markers and corresponding centroids (as stated previously, Naimark shows patterns on an inside of the outer perimeter for reading an identity code, and teaches a 2D array of encoding areas with features for establishing a reference frame for reading the patterns. Once the fiducial features are located, Naimark teaches reading the encoded identity information from those fiducial features. This supports determining the information about the transparent element from the located markers and their corresponding centroid locations.) Accordingly, a person of ordinary skill in the art at the time this invention was effectively filed would have found it obvious to incorporate Naimark’s concepts so that the recovered diffraction pattern information is used not only for localization but also for reading encoded information from the fiducial pattern.
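The read-out step described above — using located markers and centroids as a reference frame for sampling encoding areas — can be sketched as follows. The grid pitch, origin, and bit packing are hypothetical choices for illustration, not Naimark's actual coding scheme:

```python
# Minimal sketch of reading an identity code: once located markers supply a
# reference frame, sample the 2-D array of encoding areas at a known pitch
# and pack the sampled bits into an integer. Geometry and bit layout are
# hypothetical, not taken from any cited reference.

def read_identity_code(image, origin, cell, shape):
    """Sample shape=(rows, cols) cells starting at origin, spaced cell pixels
    apart, and pack the bits (row-major, MSB first) into an integer."""
    oy, ox = origin
    code = 0
    for r in range(shape[0]):
        for c in range(shape[1]):
            y, x = oy + r * cell, ox + c * cell
            bit = 1 if image[y][x] > 0 else 0
            code = (code << 1) | bit
    return code

# 2x2 encoding areas at a cell pitch of 2 pixels, anchored at an origin
# derived from a located marker centroid: bits 1,0,1,1 -> 0b1011 = 11.
img = [
    [0, 0, 0, 0, 0],
    [0, 1, 0, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
]
print(read_identity_code(img, origin=(1, 1), cell=2, shape=(2, 2)))  # → 11
```

The point of the sketch is that the same located components serve double duty: their centroids anchor the sampling grid, and the sampled cells carry the element-specific information.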
Naimark teaches fiducials whose internal pattern identifies an identity code, has a 2D array of encoding areas, and teaches centroid-based processing of fiducial features, including the calculation of the centroid for said fiducial objects. Once the Marsh/Naimark/Slinger system recovers and localizes diffraction-based pattern components from the transparent element, it is obviously desirable to use those located components to identify which particular transparent element is present or what information is associated with that element. This allows the detected diffraction fiducial signal to be linked to element-specific information, and provides the advantage of enabling the same recovered pattern data to support both localization and information extraction without changing the fundamental diffraction pattern pipeline.

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHANE WRENSFORD CODRINGTON whose telephone number is (571)272-8130. The examiner can normally be reached 8:00am-5pm. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Bella, can be reached at (571) 272-7778. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SHANE WRENSFORD CODRINGTON/
Examiner, Art Unit 2667

/TOM Y LU/
Primary Examiner, Art Unit 2667

Prosecution Timeline

Mar 12, 2024
Application Filed
Mar 24, 2026
Non-Final Rejection — §103, §DP (current)
