Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This action is in response to the applicant’s communication filed on 01/11/2024.
Claims 1-8, 10-13 are pending.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1, 5-7 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ong et al. USPGPUB 2021/0031458 A1 (hereinafter Ong).
Regarding claim 1, Ong teaches a 3D printer (Fig. 1-7, Par. [0028] “methods S200 and S300 can be executed by an additive manufacturing system (hereinafter the "system") to control and adjust build parameters in real-time and to detect build failure based on the specific geometry of a build and data collected by the system 100 during one or more build cycles of an additive manufacturing process S100” – system 100 is interpreted as a 3D printer) comprising:
a vat (Fig. 2A, Par. [0056] “resin reservoir contained in the build tray” – interpreted as a vat) having an at least partially transparent bottom for receiving liquid photoreactive resin for producing a solid component (Fig. 1, 5, Par. [0045] “the build window 110 is substantially transparent (e.g., exhibiting greater than 85% transmittance) to the emissive spectrum of the projection system and thus passes electromagnetic radiation output by the projection system 120 into the resin above the build window 110 and separation membrane 160” – build window is interpreted as a transparent bottom);
a building platform for pulling the solid component out of the vat layer by layer (Par. [0089] “build platform 106 (adhered to the first layer) in preparation for photocuring a subsequent layer”; Par. [0063] “the system 100 can retract the build platform 106 upward by a first distance in order to separate the current layer of the build from the separation membrane 160 and then advance the build platform 106 downward-by a second distance less than or equal to the first distance-in preparation for curing a successive layer of the build”);
a projector for projecting the layer geometry onto the transparent bottom (Par. [0040] “The projection system 120 is electrically coupled to the controller; receives potentially software-modified frames corresponding to full or partial cross-sections of a three-dimensional model of the build; and projects electromagnetic radiation through the build window 110 and separation membrane 160 in the engaged configuration (and during the photocuring phase) to selectively photocure volumes of the resin according to build parameters and the received frames” – full or partial cross-sections of a three-dimensional model of the build is interpreted to be the layer geometry; Par. [0099] “the controller instructs the projection system 120 to irradiate selective areas of the resin between the separation membrane 160 and the build platform 106 corresponding to a first layer of the build”); a transport apparatus for moving the building platform at least downward and upward in the vat (Par. [0063] “the system 100 can retract the build platform 106 upward by a first distance in order to separate the current layer of the build from the separation membrane 160 and then advance the build platform 106 downward-by a second distance less than or equal to the first distance-in preparation for curing a successive layer of the build”); and
a control device for controlling the projector and the transport apparatus (Par. [0036] “imbedded computational device running computer code (hereinafter the "controller"), which electronically actuates the build platform 106 (e.g., via a linear actuation system) and controls the projection system 120”),
wherein the control device optimally feed forward controls the pull-off movement of the building platform in the 3D printer using a neural network (Par. [0181] “the system 100, executes a retraction feedback model to control the set of retraction phase parameters including retraction delay, initial retraction speed, and retraction distance” – retraction is interpreted as pull-off movement; Par. [0181] “More specifically, the system 100 can execute the retraction feedback model … as a machine learning model defining an input vector representing the set of sensor data streams and an output vector representing values of the set of retraction phase parameters for the subsequent build cycle”; Par. [0194] “machine learning model (e.g., such as an artificial neural network) that receives an input vector representing the set of sensor data streams” – Ong’s model predicts values of the set of retraction phase parameters for subsequent build cycles based on sensor input data, which constitutes predictive model-based control applied prior to actuation and therefore represents feed-forward control.).
Regarding claim 5, Ong teaches all the limitations of the base claims as outlined above.
Ong further teaches a user interface for inputting information about the nature or type of liquid photoreactive resin currently being used (Par. [0146] “the system 100 can receive a set of target build characteristics from a user interface in order to select a set of build parameters based on the geometry of the build and the build material for additive manufacturing process S100”).
Regarding claim 6, Ong teaches all the limitations of the base claims as outlined above.
Ong further teaches wherein the neural network has been trained with data describing a time of detachment of a component and forces occurring in a layer during detachment (Fig. 12 – Time of detachment of the component can be seen in the initial separation timing graph and force profile where the maximum differential pressure occurs. Force profile and initial separation timing can also be seen to come from the set of sensor data streams, which is the data source used in the neural network model.; Par. [0178] “system 100 can execute the pressurization feedback model … as a machine learning model defining an input vector representing the set of sensor data streams and an output vector representing values of the set of pressurization phase parameters for the subsequent build cycle”; Par. [0194] “machine learning model (e.g., such as an artificial neural network) that receives an input vector representing the set of sensor data streams”; Par. [0141] “system 100 can prevent physical destruction of the newly-photocured build upon separation of the separation membrane 160 from the build window 110 and separation of the separation membrane 160 from the current layer of the build.” – newly-photocured build serves as the component being detached), as well as at least one of the following characteristic values:
(i) properties of the liquid photoreactive resin as material,
(ii) an area solidified in the respective exposed layer,
(iii) an energy distribution introduced in the area to be solidified, in order to enable the neural network to predict the forces and detachment times that will occur with this material, so that the output of the neural network can be used to optimize the pull-off movement (Par. [0182] “system 100 can calculate an initial separation speed, via the retraction feedback model, based on an adhesive force (per unit area), a green strength of the photocured build material, the type of the build material, and/or the build geometry (including any support material).”).
Regarding claim 7, Ong teaches all the limitations of the base claims as outlined above.
Ong further teaches, wherein the 3D printer has a force measuring device for detection of data of the time of detachment of the solid component and the forces occurring during detachment (Par. [0184] “system 100 can execute feedback techniques by measuring the force experienced at the build platform with a z-axis load cell installed in the build platform.” Fig. 12 – Time of detachment can be seen in the initial separation timing graph and force profile where the maximum differential pressure occurs), wherein the data of said force measuring device, in combination with at least one of the following characteristic values:
(i) properties of the liquid photoreactive resin as material,
(ii) the area solidified in the respective exposed layer,
(iii) the energy distribution introduced into the area to be solidified, are made available by the force measuring device for training the neural network, and the neural network trains itself or is trained further on the basis of these data (Par. [0182] “system 100 can calculate an initial separation speed, via the retraction feedback model, based on an adhesive force (per unit area), a green strength of the photocured build material, the type of the build material, and/or the build geometry (including any support material).”; Par. [0184] “The system 100 can then correlate the z-axis force measured at the build platform to local, geometry specific, stress and strain on the build based on the three-dimensional geometry of the build and according to the finite element model”; Par. [0149] “system 100 can capture temperature profile including a bulk resin temperature (via a temperature sensor) and an interface temperature distribution (via thermographic camera), a force profile (via a load cell in the build platform) representing the force applied to the build platform over time…”).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim(s) 2-3, 8, 10, 12-13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ong et al. USPGPUB 2021/0031458 A1 (hereinafter Ong) in view of Wang et al. (Data-driven simulation for fast prediction of pull-up process in bottom-up stereo-lithography, 2018) (hereinafter Wang).
Regarding claim 2, Ong teaches all the limitations of the base claims as outlined above.
Ong further teaches using at least one of the following characteristic values in a neural network:
(i) properties of the liquid photoreactive resin as material,
(ii) an area solidified in the respective exposed layer,
(iii) an energy distribution introduced in the area to be solidified (Par. [0182] “system 100 can calculate an initial separation speed, via the retraction feedback model, based on an adhesive force (per unit area), a green strength of the photocured build material, the type of the build material, and/or the build geometry (including any support material).”),
wherein the control device feed forward controls the pull-off movement of the building platform in the 3D printer using the neural network (Par. [0185] “The system 100 can also execute the retraction feedback model to control the retraction speed of the build platform as it actuates away from the build window (after separation of the build from the build window)”).
Ong does not explicitly teach wherein the neural network determines the degree of adhesion of the solid component and the control device controls the pull-off movement based on the degree of adhesion.
However, Wang teaches wherein the neural network determines the degree of adhesion of the solid component and the control device controls the pull-off movement based on the degree of adhesion (Page 31 “To simulate the pull-up separation process, three important components, built part, PDMS silicone film, and the bonding between the part and film, have to be modeled.”; Page 30 “NN based prediction model for quickly predicting separation stress distribution during the pull-up process of a bottom-up SLA system”; Page 30 “goal of the proposed predictive scheme is to adaptively adjust the pull-up speed according to the predicted attachment stresses” – Separation stress represents the force required to detach the cured layer from the build surface and therefore corresponds to the degree of adhesion between the cured layer and the build surface).
Ong and Wang are analogous art because they are from the same field of endeavor and contain functional similarities. They both relate to 3D printing.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the above 3D printer using a neural network, as taught by Ong, and incorporate controlling the pull-up speed based on a neural network determining separation stress on the built part, as taught by Wang.
One of ordinary skill in the art would have been motivated to improve the throughput and reliability of the bottom-up projection-based additive manufacturing process as suggested by Wang (Page 30).
Regarding claim 3, the combination of Ong and Wang teaches all the limitations of the base claims as outlined above.
Ong further teaches wherein the neural network calculates a force profile in accordance with the degree of adhesion, wherein the force is specified as a function of the travelled stroke and/or the time (Fig. 12 – force profile shown to include force over time; Fig. 13 – shows calculation of retraction speed, distance, and force from force profile; Par. [0110] “the system 100 applies, via the linear actuation system, force over time according to a material specific force profile consistent with the green strength and geometry of the build,”), and wherein the control device additionally feed forward controls the pull-off movement of the building platform in the 3D printing using the neural network on the basis of the calculated force profile (Par. [0151] “the system 100 can: capture a force profile during a build cycle; extract characteristics of the force profile representing each stage of the retraction phase in the additive manufacturing process; compare the identified characteristics of the force profile to a set of target characteristics; and adjust the build parameters of the system 100 to achieve a set of target build characteristics.”; Par. [0186] “system 100 can calculate a retraction distance for the subsequent build cycle based on the force profile captured during the previous build cycle.”).
Regarding claim 8, Ong teaches a neural network for controlling a 3D printer (Fig. 1-7, Par. [0028] “methods S200 and S300 can be executed by an additive manufacturing system (hereinafter the "system") to control and adjust build parameters in real-time and to detect build failure based on the specific geometry of a build and data collected by the system 100 during one or more build cycles of an additive manufacturing process S100” – system 100 is interpreted as a 3D printer; Par. [0181] “More specifically, the system 100 can execute the retraction feedback model … as a machine learning model defining an input vector representing the set of sensor data streams and an output vector representing values of the set of retraction phase parameters for the subsequent build cycle”; Par. [0194] “machine learning model (e.g., such as an artificial neural network) that receives an input vector representing the set of sensor data streams”) that comprises:
a vat (Fig. 2A, Par. [0056] “resin reservoir contained in the build tray” – interpreted as a vat) having an at least partially transparent bottom for receiving liquid photoreactive resin for producing a solid component (Fig. 1, 5, Par. [0045] “the build window 110 is substantially transparent (e.g., exhibiting greater than 85% transmittance) to the emissive spectrum of the projection system and thus passes electromagnetic radiation output by the projection system 120 into the resin above the build window 110 and separation membrane 160” – build window is interpreted as a transparent bottom);
a building platform for holding and pulling out the solid component layer by layer from the vat (Par. [0089] “build platform 106 (adhered to the first layer) in preparation for photocuring a subsequent layer”; Par. [0063] “the system 100 can retract the build platform 106 upward by a first distance in order to separate the current layer of the build from the separation membrane 160 and then advance the build platform 106 downward-by a second distance less than or equal to the first distance-in preparation for curing a successive layer of the build”);
a projector for projecting the layer geometry onto the transparent bottom (Par. [0040] “The projection system 120 is electrically coupled to the controller; receives potentially software-modified frames corresponding to full or partial cross-sections of a three-dimensional model of the build; and projects electromagnetic radiation through the build window 110 and separation membrane 160 in the engaged configuration (and during the photocuring phase) to selectively photocure volumes of the resin according to build parameters and the received frames” – full or partial cross-sections of a three-dimensional model of the build is interpreted to be the layer geometry); a transport apparatus for at least moving the building platform downward and upward in the vat (Par. [0063] “the system 100 can retract the build platform 106 upward by a first distance in order to separate the current layer of the build from the separation membrane 160 and then advance the build platform 106 downward-by a second distance less than or equal to the first distance-in preparation for curing a successive layer of the build”); and
a control device for controlling the projector and the transport apparatus characterized in that by means of the neural network via the control device, the pull-off movement of the building platform in 3D printing is optimally feed forward controlled (Par. [0036] “imbedded computational device running computer code (hereinafter the "controller"), which electronically actuates the build platform 106 (e.g., via a linear actuation system) and controls the projection system 120”),
wherein the neural network accounts for at least one of the following characteristic values:
(i) properties of the liquid photoreactive resin as material,
(ii) an area solidified in the respective exposed layer,
(iii) an energy distribution introduced into the area to be solidified, wherein by means of the neural network via the control device, the pull-off movement of the building platform in the 3D printer is feed forward controlled on the basis of the determined degree of adhesion (Par. [0181] “the system 100, executes a retraction feedback model to control the set of retraction phase parameters including retraction delay, initial retraction speed, and retraction distance” – retraction is interpreted as pull-off movement; Par. [0181] “More specifically, the system 100 can execute the retraction feedback model … as a machine learning model defining an input vector representing the set of sensor data streams and an output vector representing values of the set of retraction phase parameters for the subsequent build cycle”; Par. [0194] “machine learning model (e.g., such as an artificial neural network) that receives an input vector representing the set of sensor data streams” – Ong’s model predicts values of the set of retraction phase parameters for subsequent build cycles based on sensor input data, which constitutes predictive model-based control applied prior to actuation and therefore represents feed-forward control.; Par. [0182] “system 100 can calculate an initial separation speed, via the retraction feedback model, based on an adhesive force (per unit area), a green strength of the photocured build material, the type of the build material, and/or the build geometry (including any support material)”).
Ong does not explicitly teach wherein the neural network determines a degree of adhesion of the solid component to the transparent bottom.
However, Wang teaches wherein the neural network determines a degree of adhesion of the solid component to the transparent bottom (Page 31 “To simulate the pull-up separation process, three important components, built part, PDMS silicone film, and the bonding between the part and film, have to be modeled.”; Page 30 “NN based prediction model for quickly predicting separation stress distribution during the pull-up process of a bottom-up SLA system”; Page 30 “goal of the proposed predictive scheme is to adaptively adjust the pull-up speed according to the predicted attachment stresses” – Separation stress represents the force required to detach the cured layer from the build surface and therefore corresponds to the degree of adhesion between the cured layer and the build surface).
Ong and Wang are analogous art because they are from the same field of endeavor and contain functional similarities. They both relate to 3D printing.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the above 3D printer using a neural network, as taught by Ong, and incorporate determining separation stress on the built part using a neural network, as taught by Wang.
One of ordinary skill in the art would have been motivated to improve the throughput and reliability of the bottom-up projection-based additive manufacturing process as suggested by Wang (Page 30).
Regarding claim 10, the combination of Ong and Wang teaches all the limitations of the base claims as outlined above.
Ong further teaches wherein the neural network calculates a force profile in accordance with the degree of adhesion, wherein the force is specified as a function of the travelled stroke and/or the time (Fig. 12 – force profile shown to include force over time; Fig. 13 – shows calculation of retraction speed, distance, and force from force profile; Par. [0110] “the system 100 applies, via the linear actuation system, force over time according to a material specific force profile consistent with the green strength and geometry of the build,”), and wherein by using the neural network via the control device the pull-off movement of the build platform in the 3D printing is feed forward controlled on the basis of the calculated force profile (Par. [0151] “the system 100 can: capture a force profile during a build cycle; extract characteristics of the force profile representing each stage of the retraction phase in the additive manufacturing process; compare the identified characteristics of the force profile to a set of target characteristics; and adjust the build parameters of the system 100 to achieve a set of target build characteristics.”; Par. [0186] “system 100 can calculate a retraction distance for the subsequent build cycle based on the force profile captured during the previous build cycle.”).
Regarding claim 12, the combination of Ong and Wang teaches all the limitations of the base claims as outlined above.
Ong further teaches wherein the 3D printer comprises a user interface for inputting information about a nature or type of liquid photoreactive resin currently being used, wherein the neural network takes this input into account (Par. [0146] “the system 100 can receive a set of target build characteristics from a user interface in order to select a set of build parameters based on the geometry of the build and the build material for additive manufacturing process S100”).
Regarding claim 13, the combination of Ong and Wang teaches all the limitations of the base claims as outlined above.
Ong further teaches wherein the 3D printer has a force measuring device for detection of data of a time of detachment of the solid component and forces occurring in a layer during detachment (Par. [0184] “system 100 can execute feedback techniques by measuring the force experienced at the build platform with a z-axis load cell installed in the build platform.” Fig. 12 – Time of detachment can be seen in the initial separation timing graph and force profile where the maximum differential pressure occurs; Par. [0141] “system 100 can prevent physical destruction of the newly-photocured build upon separation of the separation membrane 160 from the build window 110 and separation of the separation membrane 160 from the current layer of the build.” – newly-photocured build serves as the component being detached), wherein the data in combination with at least one of the following characteristic values:
(i) properties of the liquid photoreactive resin as material,
(ii) an area solidified in the respective exposed layer,
(iii) an energy distribution introduced into the area to be solidified are made available by the force measuring device for training the neural network, characterized in that the neural network is trained on the basis of these data (Par. [0182] “system 100 can calculate an initial separation speed, via the retraction feedback model, based on an adhesive force (per unit area), a green strength of the photocured build material, the type of the build material, and/or the build geometry (including any support material).”; Par. [0184] “The system 100 can then correlate the z-axis force measured at the build platform to local, geometry specific, stress and strain on the build based on the three-dimensional geometry of the build and according to the finite element model”; Par. [0149] “system 100 can capture temperature profile including a bulk resin temperature (via a temperature sensor) and an interface temperature distribution (via thermographic camera), a force profile (via a load cell in the build platform) representing the force applied to the build platform over time…”).
Claim(s) 4, 11 is/are rejected under 35 U.S.C. 103 as being unpatentable over Ong et al. USPGPUB 2021/0031458 A1 (hereinafter Ong) in view of Wang et al. (Data-driven simulation for fast prediction of pull-up process in bottom-up stereo-lithography, 2018) (hereinafter Wang), and further in view of Chen et al. USPGPUB 2013/0295212 A1 (hereinafter Chen).
Regarding claim 4, the combination of Ong and Wang teaches all the limitations of the base claims as outlined above.
Ong and Wang teach a neural network, but do not explicitly teach wherein the neural network takes into account another movement in the horizontal axis.
However, Chen teaches wherein the neural network takes into account another movement in the horizontal axis (Par. [0032] “relative movement between the translation stage and the vat that applies the shearing force may include horizontal, longitudinal movement”; Par. [0033] “The process controller may, while in both the first and the second horizontal positions, causes the mask image projection system to project a two dimensional image of the highest un-solidified layer of the three-dimensional object through the bottom of the vat and into the liquid resin, thereby causing at least a portion of the liquid resin to solidify in the shape of the two-dimensional image”).
Ong, Wang, and Chen are analogous art because they are from the same field of endeavor and contain functional similarities. They all relate to 3D printing.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the above 3D printer using a neural network, as taught by Ong and Wang, and incorporate taking into account another movement in the horizontal axis, as taught by Chen.
One of ordinary skill in the art would have been motivated to improve building time as suggested by Chen (Par. [0112]).
Regarding claim 11, the combination of Ong and Wang teaches all the limitations of the base claims as outlined above.
Ong teaches a neural network, but does not explicitly teach wherein the neural network, in addition to a movement of the building platform in a vertical axis, also takes into account another movement in a horizontal axis.
However, Chen teaches wherein the neural network, in addition to a movement of the building platform in a vertical axis, also takes into account another movement in a horizontal axis (Par. [0032] “relative movement between the translation stage and the vat that applies the shearing force may include horizontal, longitudinal movement”; Par. [0033] “The process controller may, while in both the first and the second horizontal positions, causes the mask image projection system to project a two dimensional image of the highest un-solidified layer of the three-dimensional object through the bottom of the vat and into the liquid resin, thereby causing at least a portion of the liquid resin to solidify in the shape of the two-dimensional image”).
Ong, Wang, and Chen are analogous art because they are from the same field of endeavor and contain functional similarities. They all relate to 3D printing.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to a person of ordinary skill in the art to modify the above 3D printer using a neural network, as taught by Ong and Wang, and incorporate taking into account another movement in the horizontal axis, as taught by Chen.
One of ordinary skill in the art would have been motivated to improve building time as suggested by Chen (Par. [0112]).
Citation of Pertinent Prior Art
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Chander et al. [USPGPUB 2019/0054700 A1] teaches machine learning for additive manufacturing.
Mehr et al. [USPGPUB 2018/0341248 A1] teaches real-time adaptive control of additive manufacturing processes using machine learning.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PETER XU whose telephone number is (571)272-0792. The examiner can normally be reached Monday-Friday 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Mohammad Ali, can be reached at (571) 272-4105. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PETER XU/ Examiner, Art Unit 2119
/MOHAMMAD ALI/ Supervisory Patent Examiner, Art Unit 2119