DETAILED ACTION
In the Reply filed 3/17/2025, claims 1 and 3-7 are pending, and claims 1 and 7 are amended. Claims 1 and 3-7 are considered in the current Office Action.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
Claims 1 and 7 are rejected under 35 U.S.C. 103 as being unpatentable over US 20170355147 (“Buller”), in view of US 20130328227 (“McKinnon”).
Regarding claim 1, Buller teaches an apparatus for additively manufacturing ([0142], a 3D printing system), comprising:
a chamber (Fig. 6, a chamber 607);
a work table (Fig. 6, substrate 609), wherein a build region is provided on the work table (Fig. 6, “material bed 604” is on the work table 609);
a base plate (Fig. 6, a base 602), wherein the base plate is disposed in the build region (Fig.6, base plate 602 is disposed within the build region of powder bed 604);
a material layer forming apparatus ([0159], layer dispensing mechanism (e.g., recoater));
an irradiation apparatus ([0008], energy source that provides a directional energy beam);
an imaging apparatus ([0068], an optical detector which comprises cameras) comprising an overall imaging camera ([0204] imaging system can image an overall surface of the build region) and a partial imaging camera ([0204] imaging system can image a surface that includes an exposed surface of a material bed), wherein the overall imaging camera and the partial imaging camera are disposed above the build region (Fig. 8, detector 820 is located above the build region 804);
the overall imaging camera is fixed ([0221] the sensor can be stationary) in the chamber (Fig. 8, detector 820 is located in the chamber) and images the first region (Fig. 2, region 200) to obtain the first image including an outer edge of the base plate ([0192] the detector can capture markers 203 at the outer edge of the surface), the partial imaging camera is movably provided ([0221] the sensor can be moving) in the chamber (Fig. 8, detector 820 is located in the chamber) and images the second region ([0204] an exposed surface of a material bed) to obtain the second image ([0204] imaging system can image a surface that includes an exposed surface of a material bed);
an image processing apparatus ([0068], an image processor);
a control apparatus ([0007], The control includes computer control);
and a camera moving apparatus ([0221], sensor can be moving; Fig. 21 detector moving within the chamber, which necessarily includes a moving apparatus coupled to the moving detector) disposed above the build region (Fig. 21, moving detector 2116 disposed above the build region) and the partial imaging camera is attached to the camera moving apparatus (Fig. 8, since the camera moving apparatus moves the partial imaging camera, it would be expected that the partial imaging camera is connected to the camera moving apparatus);
wherein the chamber covers the build region (Fig. 6, “material bed 604” is disposed in the chamber 607);
the material layer forming apparatus forms a material layer on an upper surface of the base plate by supplying material powder (Fig. 6, material dispensing mechanism 616 dispenses material on an upper surface of the base 602 to form a material bed 604);
the irradiation apparatus ([0009] the transforming energy beam) forms a solidified layer by irradiating the material layer ([0009] transforming and hardening a powder material in the material bed into a 3D object) with a laser beam or an electron beam ([0016], energy source includes an infrared (IR) beam array);
the imaging apparatus images ([0203] optical sensor including a camera generates images) a first region including the entire build region to obtain a first image ([0204] image include any surface, such as an overall surface of the build region),
the image processing apparatus analyzes the first image including an outer edge of the base plate ([0192] the detector can capture markers 203 at the outer edge of the surface) to obtain position information of the base plate in the build region ([0204], using the triangulation measurements and/or image processing to evaluate the roughness of a surface and any components used to effectuate these measurements; roughness evaluation of the base plate necessarily involves position information of the base plate in the build region because the camera has to ascertain the position of the base plate in order to begin the evaluation);
the imaging apparatus images ([0203] optical sensor including a camera generates images) a second region which is a part of the first region ([0160], FIGS. 25D-25F shows a portion of the material bed being imaged) and includes the first calculated coordinates to obtain a second image ([0200], images of any surface at various positions are taken and since the second image is a portion of the first image, the coordinates that are located in the first images can be included);
and the image processing apparatus analyzes the second image to obtain position information of the detection target point located on an outer edge of the base plate ([0192] the detector can capture and subsequently analyze markers 203 at the outer edge of the surface) in the build region ([0204], using the triangulation measurements and/or image processing to evaluate the roughness of a surface and any components used to effectuate these measurements; roughness evaluation of the base plate necessarily involves position information of the base plate in the build region because the camera has to ascertain the position of the base plate in order to begin the evaluation);
the control apparatus creates a movement command ([0192] movement is controlled by the controller) of the partial imaging camera using the first calculated coordinates ([0192] the detector is moved along a predetermined path which includes calculated coordinates), and the camera moving apparatus moves the partial imaging camera according to the movement command ([0185], the detector is coupled to and moved by the controller according to the command).
Buller does not teach calculating coordinates of at least one detection target point located on the outer edge of the base plate in the plan view as first calculated coordinates from the position information of the base plate and calculating coordinates of the detection target point located on the outer edge of the base plate in the plan view as second calculated coordinates from the position information of the detection target point located on the outer edge of the base plate in the plan view to set a building coordinates system based on the base plate disposed in the build region prior to building.
McKinnon teaches an apparatus for additive manufacturing ([0003]), comprising an imaging device ([0029], “the camera”) capable of calculating coordinates of at least one detection target point located on the outer edge of the base plate in the plan view (Fig. 2D, locations on the outer edge of the base plate are captured and analyzed) as first calculated coordinates ([0029] calculate distance between the camera and the parallax reference pattern which involves calculating the coordinates of the pattern, because coordinates are necessary for determining the position of the pattern relative to the camera) from the position information of the base plate ([0029], distance calculation is determined from the distance of the tool platform reference plane);
and calculating coordinates of the detection target point located on the outer edge of the base plate in the plan view (Fig. 2D, locations on the outer edge of the base plate are captured and analyzed) as second calculated coordinates ([0029] calculate a second distance between the camera and the parallax reference pattern which involves calculating the coordinates of the pattern, because coordinates are necessary for determining the position of the pattern relative to the camera) from the position information of the detection target point ([0029], distance calculation is determined from the distance of the tool platform reference plane) to set a building coordinates system based on the base plate disposed in the build region ([0097], determining the vertical displacement and the degree of parallelism of the workpiece platform and the tool platform reference plane) prior to building ([0095], determination takes place before fabrication of the composite model).
McKinnon and Buller are both considered to be analogous to the claimed invention because they are in the same field of additive manufacturing devices. It would have been obvious to one with ordinary skill in the art before the effective filing date to modify the controller for the imaging apparatus in Buller to incorporate calculating coordinates of a detection target point based on the position information obtained from the captured image as taught by McKinnon as detailed above, in order to calibrate the imaging surface and the tool platform reference plane (McKinnon, [0097]).
Regarding claim 7, Buller teaches a method for additively manufacturing an object (Abstract, methods of three-dimensional printing process), comprising:
a material layer forming step ([0159], pre-transformed material is moved by a leveling mechanism);
a solidifying step ([0009], hardening powder material into 3D object);
first ([0199], optical detectors image a surface of the material bed comprising untransformed material) and second image obtaining steps ([0199], optical detectors image a surface of the material bed comprising a portion of the 3D object);
first ([0200], process image to determine topography, roughness, and/or reflectivity of the surface comprising the untransformed material) and second image analysis steps ([0200], process image to determine topography, roughness, and/or reflectivity of the surface comprising the 3D object);
and first ([0204] using the triangulation measurements to evaluate the roughness of an exposed surface of a material bed) and second calculation steps ([0204] using the triangulation measurements to evaluate the roughness of a surface of the printed 3D object),
wherein in the material layer forming step, in a chamber covering a build region provided on a work table (Fig. 6, “material bed 604” is on the work table 609 and disposed in the chamber 607), material powder is supplied to an upper surface of a base plate disposed in the build region to form a material layer (Fig. 6, material dispensing mechanism 616 dispenses material on an upper surface of the base 602 to form a material bed 604; [0159], pre-transformed powder material is flattened/layered from the pre-transformed material source to the material bed by a leveling mechanism),
in the solidifying step, the material layer is irradiated with a laser beam or an electron beam to form a solidified layer ([0009], using the transforming energy beam to harden into the three dimensional object; [0016], The energy source is an infrared beam),
a first region including the entire build region ([0204] image include any surface, such as an overall surface of the build region) is imaged to obtain a first image in the first image obtaining step ([0203] optical sensor including a camera generates images),
the first image is analyzed to obtain position information of the base plate in the build region in the first image analysis step ([0204], using the triangulation measurements and/or image processing to evaluate the roughness of a surface and any components used to effectuate these measurements; roughness evaluation of the base plate necessarily involves position information of the base plate in the build region because the camera has to ascertain the position of the base plate in order to begin the evaluation),
a second region which is a part of the first region and includes the first calculated coordinates ([0200], images of any surface at various positions are taken and since the second image is a portion of the first image, the coordinates that are located in the first images can be included) is imaged ([0160], FIGS. 25D-25F shows a portion of the material bed being imaged) to obtain a second image by a partial imaging camera in the second image obtaining step ([0203] optical sensor including a camera generates images),
the second image is analyzed to obtain position information of the detection target point in the build region in the second image analysis step ([0204], using the triangulation measurements and/or image processing to evaluate the roughness of a surface and any components used to effectuate these measurements; roughness evaluation of the base plate necessarily involves position information of the base plate in the build region because the camera has to ascertain the position of the base plate in order to begin the evaluation).
Buller does not teach calculating coordinates of at least one detection target point located on the outer edge of the base plate in the plan view as first calculated coordinates from the position information of the base plate and calculating coordinates of the detection target point located on the outer edge of the base plate in the plan view as second calculated coordinates from the position information of the detection target point located on the outer edge of the base plate in the plan view to set a building coordinates system based on the base plate disposed in the build region prior to building.
McKinnon teaches an apparatus for additive manufacturing ([0003]), comprising an imaging device ([0029], “the camera”) capable of calculating coordinates of at least one detection target point located on the outer edge of the base plate in the plan view (Fig. 2D, locations on the outer edge of the base plate are captured and analyzed) as first calculated coordinates ([0029] calculate distance between the camera and the parallax reference pattern which involves calculating the coordinates of the pattern, because coordinates are necessary for determining the position of the pattern relative to the camera) from the position information of the base plate ([0029], distance calculation is determined from the distance of the tool platform reference plane);
and calculating coordinates of the detection target point located on the outer edge of the base plate in the plan view (Fig. 2D, locations on the outer edge of the base plate are captured and analyzed) as second calculated coordinates ([0029] calculate a second distance between the camera and the parallax reference pattern which involves calculating the coordinates of the pattern, because coordinates are necessary for determining the position of the pattern relative to the camera) from the position information of the detection target point ([0029], distance calculation is determined from the distance of the tool platform reference plane) to set a building coordinates system based on the base plate disposed in the build region ([0097], determining the vertical displacement and the degree of parallelism of the workpiece platform and the tool platform reference plane) prior to building ([0095], determination takes place before fabrication of the composite model).
It would have been obvious to one with ordinary skill in the art before the effective filing date to modify the controller for the imaging apparatus in Buller to incorporate calculating coordinates of a detection target point based on the position information obtained from the captured image as taught by McKinnon as detailed above, in order to calibrate the imaging surface and the tool platform reference plane (McKinnon, [0097]).
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over US 20170355147 (“Buller”), in view of US 20130328227 (“McKinnon”), as applied in claim 1, further in view of US 20200391443 (“Thorpe”).
Regarding claim 3, Buller teaches a fully automatic mode and a semi-automatic mode as operation modes ([0303], control may be manual and/or programmed), and the control apparatus comprises an input part ([0268], a user interface (UI) provides status of one or more components) and a calculation part ([0178], performing image analysis; [0256], processed using a triangulation technique).
Buller does not teach in the fully automatic mode, the calculation part calculates coordinates of the detection target point as first calculated coordinates from the position information of the base plate to create the movement command, and in the semi-automatic mode, the calculation part calculates coordinates of the detection target point as first calculated coordinates from additional position information of the base plate input by the input part to create the movement command, and a mode switching part that switches the operation mode.
McKinnon teaches an apparatus for additive manufacturing ([0003]), comprising an imaging device ([0029], “the camera”) capable of calculating coordinates of at least one detection target point as first calculated coordinates ([0029] calculate distance between the camera and the parallax reference pattern which involves calculating the coordinates of the pattern, because coordinates are necessary for determining the position of the pattern relative to the camera) from the position information of the base plate ([0029], distance calculation is determined from the distance of the tool platform reference plane),
and calculating coordinates of the detection target point as first calculated coordinates ([0029] calculate distance between the camera and the parallax reference pattern which involves calculating the coordinates of the pattern, because coordinates are necessary for determining the position of the pattern relative to the camera) from additional position information of the base plate ([0029], distance calculation is determined from the distance of the tool platform reference plane) input by the input part to create the movement command ([0268], a user interface (UI) provides status of one or more components; [0188] user can specify states/properties, including the properties of the platform reference plane within the material bed).
It would have been obvious to one with ordinary skill in the art before the effective filing date to modify the controller for the imaging apparatus in Buller to incorporate calculating coordinates of a detection target point based on the position information obtained from the captured image as taught by McKinnon as detailed above, in order to calibrate the imaging surface and the tool platform reference plane (McKinnon, [0097]).
Thorpe teaches an additive manufacturing device ([0002], a handheld 3D printing device), comprising a mode switching part that switches the operation mode ([0046], the user interface 51 switches between at least two operating modes, for example manual and automatic), and an input part that receives input ([0052], actuators 52 receives user inputs).
Thorpe and Buller are both considered to be analogous to the claimed invention because they are in the same field of additive manufacturing devices. It would have been obvious to one with ordinary skill in the art before the effective filing date to modify the controller of the printing device in Buller to incorporate a mode switching part and an input part that receives input as taught by Thorpe as detailed above, in order to allow for an easy way to control and/or change the settings during use of the device (Thorpe, [0006]).
Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over US 20170355147 (“Buller”), in view of US 20130328227 (“McKinnon”) and US 20200391443 (“Thorpe”), as applied in claim 3, further in view of US 20220277511 (“Ogasawara”).
Regarding claim 4, Buller teaches a manual mode as the operation mode ([0303], the control may be manual);
an operation part ([0120] motor or scanner) for operating the camera moving apparatus ([0120], sensors and/or detectors can be movable by coupling with the aid of a motor and/or a scanner);
and the camera moving apparatus moves the partial imaging camera ([0120], sensors and/or detectors can be movable by coupling with the aid of a motor and/or a scanner) according to the movement command ([0192], the movement is controlled) in the fully automatic mode and the semi-automatic mode ([0303], the control may be manual and/or programmed).
Buller does not teach moving the partial imaging camera according to an operation of the operation part by an operator in the manual mode.
Ogasawara teaches an additive manufacturing device ([0032], a 3D printer), comprising moving the partial imaging camera according to an operation ([0015] operating the position and attitude (orientation) of a camera corresponding to a virtual viewpoint) of the operation part by an operator in the manual mode ([0015], a virtual viewpoint is generated by an end user).
Ogasawara and Buller are both considered to be analogous to the claimed invention because they are in the same field of additive manufacturing devices. It would have been obvious to one with ordinary skill in the art before the effective filing date to modify the movement controller of the imaging camera in Buller to incorporate an operation part which is operable by an operator in manual mode as taught by Ogasawara as detailed above, in order to either generate a virtual viewpoint image corresponding to the same 3D model as the 3D model used to generate the product or to obtain information related thereto (Ogasawara, [0015]).
Claims 5 and 6 are rejected under 35 U.S.C. 103 as being unpatentable over US 20170355147 (“Buller”), in view of US 20130328227 (“McKinnon”), as applied in claim 1, further in view of US 20220339883 (“Moran”).
Regarding claim 5, Buller does not teach a detection target point is located on an outer edge of the base plate in a plan view.
Moran teaches an additive manufacturing device ([0032], a 3D printer), wherein the detection target point is located on an outer edge of the base plate in a plan view (Fig. 4 shows a plan view of the reference component or target 430 located on an outer edge of the base plate).
Moran and Buller are both considered to be analogous to the claimed invention because they are in the same field of additive manufacturing devices. It would have been obvious to one with ordinary skill in the art before the effective filing date to modify the base plate in Buller to incorporate a detection target point located on an outer edge of the base plate in a plan view as taught by Moran, in order to calibrate the printer either before fabrication of a new product, or in real time or near-real time during fabrication of the product (Moran, [0026]).
Regarding claim 6, Buller does not teach a detection target point is located at a corner of the base plate in a plan view.
Moran teaches wherein the detection target point is located at a corner of the base plate in a plan view (Fig. 4 shows a plan view of the reference component or target 430 located at a corner of the base plate).
It would have been obvious to one with ordinary skill in the art before the effective filing date to modify the base plate in Buller to incorporate a detection target point located at a corner of the base plate in a plan view as taught by Moran, in order to calibrate the printer either before fabrication of a new product, or in real time or near-real time during fabrication of the product (Moran, [0026]).
Response to Arguments
Applicant's arguments filed 3/17/2025 have been fully considered but they are not persuasive.
Regarding claim 1, applicant argues that Buller fails to teach the “overall imaging camera” and “partial imaging camera” because the detector 820 detects the emitted energy beam of the energy source 817 installed outside the chamber instead of imaging a region to obtain an image, citing [0028] of Buller which reads “the detector located inside the enclosure (e.g., 820) may be disposed at a position that may detect the emitted energy beam by the second energy source (e.g., 817)....One or more detector (e.g., sensor) may evaluate the cleanliness and/or contamination of the optical window by measuring an alteration in a penetration and/or reflection of an energy beam (e.g., light beam) that is projected (e.g., shined) onto the optical window (e.g., at least one surface of the optical window).” (Remarks, pg. 3, para. 2).
The examiner respectfully disagrees. Buller teaches detector 820, which is capable of detecting the energy beam emitted by energy source 817. However, Buller also teaches the detector can be an optical detector which comprises cameras ([0204] imaging system can image an overall surface of the build region or a portion of the build region).
Moreover, the applicant argues that the material bed 604 is smaller than the base 602, whereas according to amended claims 1 and 7, the base plate is in the build region and therefore the build region is bigger than the base plate. The claim recites “the base plate is disposed in the build region” and “the overall imaging camera images a first region to obtain a first image including an outer edge of the base plate”. If the base plate were larger than the build region, the prior art would fail to meet these limitations.
The examiner respectfully disagrees. As shown in Fig. 6, the material bed 604 encompasses the shaded region, which is larger than the base 602. Furthermore, as shown in Fig. 6, the base 602 is located in the build region 604. Buller teaches a work table (Fig. 6, substrate 609), wherein a build region is provided on the work table (Fig. 6, “material bed 604” is on the work table 609) and a base plate (Fig. 6, a base 602), wherein the base plate is disposed in the build region (Fig. 6, base plate 602 is disposed within the build region of powder bed 604). Buller also teaches the overall imaging camera images the first region (Fig. 2, region 200) to obtain the first image including an outer edge of the base plate ([0192] the detector can capture markers 203 at the outer edge of the surface).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TIFFANY YU HUANG whose telephone number is (571)272-2643. The examiner can normally be reached 9:00AM - 5:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Susan Leong can be reached at (571) 270-1487. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
TIFFANY YU HUANG
Examiner
Art Unit 1754
/SUSAN D LEONG/Supervisory Patent Examiner, Art Unit 1754