Detailed Action
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendment filed 01/11/2026 has been entered. Claims 1-20 remain pending in the application. Applicant’s amendments to the Claims have overcome each and every objection and 112(b) rejection previously set forth in the Non-Final Office Action mailed 09/11/2025.
Response to Arguments
Applicant’s arguments filed 01/11/2026 with respect to claim 1 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Given the amendments to claims 1, 10, and 19, the Kopylov reference is relied upon to teach dependent claims 2-7, 11-16, and 20 more consistently with the instant claim language, as shown below.
Given the amendments to claims 9 and 18, the Mastrototaro reference is relied upon to teach claims 9 and 18 more consistently with the instant claim language, as shown below.
Claim Objections
Claims 1-2, 4, 10-11, 13, and 19-20 are objected to because of the following informalities:
In claims 1-2, 10-11, and 19-20, “the at least one marker” should be “the at least one internal marker” for clarity.
In claims 4, 13, and 20, “the material object” should be “the target object material” for clarity.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-8, 10-17, and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kopylov et al. (US 20240005472 A1, published January 4, 2024 with a priority date of June 30, 2022) in view of P. Lawley, "Applications of Ultrasonic Non Destructive Testing in 3D Printing", The Journal of Undergraduate Research, vol. 13, no. 4, pp. 27-42, 2015, hereinafter referred to as Kopylov and Lawley, respectively.
Regarding claim 1, and similarly for claims 10 and 19, Kopylov teaches a method comprising:
performing imaging to scan an interior of a target object (Fig. 4A; see para. 0136 – “For model application workflow 417, according to one embodiment, a camera of a defect detection system generates image data 448, which includes images of a dental appliance generated from different viewpoints (e.g., different rotation settings of a platform).”);
providing scanned images generated by the ultrasound imaging as input to a machine learning software system configured to host a convolutional neural network (CNN) model (Fig. 4A; see para. 0137 – “Image data 448 may be input into defect estimator 467, which may include a trained neural network.”; see para. 0139 – “In one embodiment, a single machine learning model is trained and used for model application workflow 417, where the single machine learning model is a convolutional neural network that takes as input a set of multiple neighboring images.”);
detecting, by the CNN model, a location of at least one internal marker inside the target object using the scanned images (Fig. 4A; see para. 0137 – “Based on the image data 448, defect estimator 467 [CNN model] outputs initial feature vectors 470, which may be defect predictions [internal marker] that can include defect location and/or size (e.g., in the form of coordinates of a bounding box or other bounding shape around a defect) for each predicted defect and a defect confidence score for each predicted defect.”),
wherein detecting the location of the at least one internal marker comprises at least one of the CNN determining internal intensity variations at a location corresponding to the at least one internal marker and the CNN determining internal intensity variations at a separate location than the location corresponding to the at least one internal marker (see para. 0115 – “Deep neural networks may learn in a supervised (e.g., classification) and/or unsupervised (e.g., pattern analysis [which inherently includes intensity variations]) manner.”; see para. 0124 – “For example, for an artificial neural network being trained to perform defect classification, there may be a first class (defect), and a second class (no defect). Moreover, the class, prediction, etc. may be determined for each pixel in the image, or may be determined for each region or group of pixels of the image. For pixel level segmentation, for each pixel in the image, the final layer applies a probability that the pixel of the image belongs to the first class, a probability that the pixel belongs to the second class, and so on.”); and
extracting identification information stored in the at least one marker, wherein extracting identification information comprises identifying spatial patterns of the at least one marker within the target object (see para. 0115 – “Deep neural networks may learn in a supervised (e.g., classification) and/or unsupervised (e.g., pattern analysis [which inherently includes intensity variations]) manner.”; see para. 0140 – “Each defect estimator 467 and defect classifier 472 may be trained to detect a particular type or class of defect.”).
Kopylov teaches imaging the interior of a target object, but does not explicitly teach where the imaging is ultrasound imaging.
Whereas, Lawley, in the same field of endeavor, teaches performing, using an ultrasound scanner, ultrasound imaging to scan an interior of a target object (here, a 3D printed object) (Figs. 6, 8-9, and 12, C-scan images of 3D printed object as ultrasound images; see pg. 29, para. 1 "By using this testing method over a surface, a C-scan image (a 2D image in which colors represent height or depth) can be determined by analyzing the waveform signal, which describes the features of a specimen along a plane of the part or on a surface."; see pg. 40, para. 2 "The C-scans obtained of the initial test specimen and the internal features specimen identified that ultrasonic testing is able to detect backside surface and internal features of 3D printed parts.").
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified imaging to scan the interior of a target object, as disclosed in Kopylov, by performing, using an ultrasound scanner, ultrasound imaging to scan the interior of a target object, as disclosed in Lawley. One of ordinary skill in the art would have been motivated to make this modification in order to further verify the accuracy and success of a print for manufacturers who are unable to measure tolerances and part features within an assembly or printed part, as taught in Lawley (see pg. 40, para. 2).
Furthermore, regarding claims 2 and 11, Kopylov further teaches wherein the at least one marker includes at least one code or material defect (see para. 0113 – “FIG. 4A illustrates a model training workflow 405 and a model application workflow 417 for defect detection [material defect] of dental appliances, in accordance with an embodiment of the present disclosure.”).
Furthermore, regarding claims 3 and 12, Kopylov further teaches wherein the at least one code is an embedded bar code or an embedded quick response (QR) code (see para. 0191 – “For example, after an appliance is thermoformed, the aligner may be laser marked with a part number identifier (e.g., serial number, barcode, or the like). In some embodiments, the system may be configured to read (e.g., optically, magnetically, or the like) an identifier (barcode, serial number, electronic tag or the like) of the mold to determine the part number associated with the aligner formed thereon.”).
Furthermore, regarding claims 4 and 13, Kopylov further teaches wherein the at least one code is used to authenticate the material object (see para. 0191 – “For example, after an appliance is thermoformed, the aligner may be laser marked with a part number identifier (e.g., serial number, barcode, or the like). In some embodiments, the system may be configured to read (e.g., optically, magnetically, or the like) an identifier (barcode, serial number, electronic tag or the like) of the mold to determine the part number associated with the aligner formed thereon [authenticate object].”).
Furthermore, regarding claims 5 and 14, Kopylov further teaches wherein the target object comprises a three dimensional (3D) printed part (see para. 0083 – “In some embodiments, a mold for a dental appliance [target object] may be fabricated using additive manufacturing techniques (also referred to herein as “3D printing”).”).
Furthermore, regarding claims 6 and 15, Kopylov further teaches wherein the target object comprises one or more of a metallic material, a plastic material, or a composite material (see para. 0175 – “The appliance [target object] can include a shell (e.g., a continuous polymeric shell or a segmented shell)…”).
Furthermore, regarding claims 7 and 16, Kopylov further teaches wherein the CNN model utilizes explainable artificial intelligence (XAI) (see para. 0137 – “Based on the image data 448, defect estimator 467 outputs initial feature vectors 470, which may be defect predictions that can include defect location and/or size (e.g., in the form of coordinates of a bounding box or other bounding shape around a defect) for each predicted defect and a defect confidence score for each predicted defect.”; see para. 0140 – “Each defect estimator 467 and defect classifier 472 may be trained to detect a particular type or class of defect.” The explanation of what influences (i.e., location, size, and type of defect) the machine learning model output (i.e., the defect score) is interpreted as explainable AI).
Furthermore, regarding claims 8 and 17, Lawley further teaches wherein the ultrasound scanner comprises a transducer element and the ultrasound imaging is conducted using the transducer element (Fig. 1; see pg. 28, para. 3 "The waves sent from the transducer diffuse through a couplant, often water or a custom couplant product, and then through a test specimen. The waves then reflect off discontinuities in the material, which can be a surface, flaw, feature, or defect. As the transducer receives the wave signal after reflection, the time difference between the sent and received waves can be correlated to a depth or distance from the transducer at which the reflection took place." Where ultrasound imaging via an ultrasound transducer is inherent).
Furthermore, regarding claim 20, Kopylov further teaches wherein the at least one marker includes at least one code or material defect, wherein the at least one code is an embedded bar code or an embedded quick response (QR) code, and wherein the at least one code is used to authenticate the material object (see para. 0191 – “For example, after an appliance is thermoformed, the aligner may be laser marked with a part number identifier (e.g., serial number, barcode, or the like). In some embodiments, the system may be configured to read (e.g., optically, magnetically, or the like) an identifier (barcode, serial number, electronic tag or the like) of the mold to determine the part number associated with the aligner formed thereon.”).
The motivation to combine for claims 8 and 17 is the same as that set forth above for claims 1 and 10.
Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Kopylov in view of Lawley, as applied to claims 1 and 10 above, respectively, and in further view of Mastrototaro et al. (US 20080161664 A1, published July 3, 2008), hereinafter referred to as Mastrototaro.
Regarding claims 9 and 18, Kopylov in view of Lawley teaches all of the elements of claims 1 and 10 as disclosed above, and
Lawley teaches wherein the ultrasound scanner comprises at least one sensor (Fig. 1; see pg. 28, para. 3 "The waves sent from the transducer diffuse through a couplant, often water or a custom couplant product, and then through a test specimen. The waves then reflect off discontinuities in the material, which can be a surface, flaw, feature, or defect. As the transducer receives the wave signal after reflection, the time difference between the sent and received waves can be correlated to a depth or distance from the transducer at which the reflection took place." Where the transducer is a sensor).
Kopylov in view of Lawley teaches receiving ultrasound imaging measurements from a sensor, but does not explicitly teach determining that the ultrasound imaging measurements from the sensor fail.
Whereas, Mastrototaro, in an analogous field of endeavor, teaches determining whether the at least one sensor has failed, comprising determining that a sequence of ultrasound imaging measurements from the at least one sensor fails at least one variance test by an error amount exceeding a threshold error (see para. 0063 – “If the total error Et is greater than the threshold, the algorithm will indicate a sensor failure…”; see para. 0039 – “In further alternative embodiments, the sensor, controller, and/or infusion communication systems may utilize a cable, a wire, fiber optic lines, RF, IR, or ultrasonic transmitters and receivers, or the like instead of the electrical traces.”).
It would have been obvious for one of ordinary skill in the art before the effective filing date of the claimed invention to have modified receiving ultrasound imaging measurements from a sensor, as disclosed in Kopylov in view of Lawley, by also determining that the ultrasound imaging measurements from the sensor fail, as disclosed in Mastrototaro. One of ordinary skill in the art would have been motivated to make this modification in order to improve the dependability and reliability of the sensor, as taught in Mastrototaro (see para. 0048).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Reutzel et al. (US 20230234137 A1, published July 27, 2023 with a priority date of August 12, 2020) discloses that machine learning can be used to train a neural network (NN) to correlate sensor data to a defect label or a non-defect label by looking for certain patterns in the sensor data at an x, y, z location to identify a defect in the CT scan at that location. The NN can then be used to predict where defects are or will occur during an actual build of a part.
Pickerd et al. (US 20230306578 A1, published September 28, 2023 with a priority date of March 28, 2022) discloses the scanned object is determined to contain a defect when a distance between first and second output feature maps is large.
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Nyrobi Celestine whose telephone number is 571-272-0129. The examiner can normally be reached on Monday - Thursday, 7:00AM - 5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Pascal Bui-Pho can be reached on 571-272-2714. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/N.C./Examiner, Art Unit 3798