DETAILED ACTION
TELE-INSPECTION SYSTEM AND METHOD
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
No information disclosure statement (IDS) was submitted.
Oath/Declaration
The Oath/Declaration submitted on 01/26/2024 is noted by the Examiner.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-20 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-20 of copending Application Nos. 18/425,184, 18/418,671, and 18/418,714. Although the claims at issue are not identical, they are not patentably distinct from each other. This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented. The subject matter claimed in the instant application is fully disclosed in the reference applications and is covered by them, since the reference applications and the instant application claim common subject matter, as follows:
The conflicting claims are reproduced below, in order: the instant application, followed by Applications 18/425,184, 18/418,671, and 18/418,714.
Instant Application, Claim 1:
A method for controlling an inspection process used in a manufacturing environment, comprising:
(a) installing equipment used for or related to an inspection process in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the inspection equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the inspection equipment; and
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the inspection equipment by the processor, wherein the inspection equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing inspection process.
Application 18/425,184, Claim 1:
A method for controlling a material gouging process used in a manufacturing environment, comprising:
(a) installing equipment used for or related to a material gouging process [corresponding to the instant “inspection process”] in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the gouging equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the gouging equipment; and
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the arc gouging equipment by the processor, wherein the gouging equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the gouging process.
Application 18/418,671, Claim 1:
A method for controlling a material removal process used in a manufacturing environment, comprising:
(a) installing equipment used for or related to a material removal process in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the material removal equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the material removal equipment; and
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the material removal equipment by the processor, wherein the material removal equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the material removal process.
Application 18/418,714, Claim 1:
A method for programming equipment used for or related to a manufacturing process, comprising:
(a) installing equipment in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment;
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process; and
(e) using the software to save a teachpoint in a program file.
Instant Application, Claim 11:
A method for controlling an inspection process used in a manufacturing environment, comprising:
(a) installing equipment used for or related to an inspection process in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the inspection equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the inspection equipment;
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the inspection equipment by the processor, wherein the inspection equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing inspection process; and
(e) displaying a real-time video of the manufacturing environment to the user during the inspection process.
Application 18/425,184, Claim 12:
A method for controlling an arc gouging process used in a manufacturing environment, comprising:
(a) installing equipment used for or related to an arc gouging process [corresponding to the instant “inspection process”] in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the arc gouging equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the arc gouging equipment;
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the arc gouging equipment by the processor, wherein the arc gouging equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the arc gouging process; and
(e) providing a computer network across which the processor communicates with the arc gouging equipment.
Application 18/418,671, Claim 11:
A method for controlling a material removal process used in a manufacturing environment, comprising:
(a) installing equipment used for or related to a weld grinding process in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the weld grinding equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the weld grinding equipment; and
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the weld grinding equipment by the processor, wherein the weld grinding equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the weld grinding process.
Application 18/418,714, Claim 13:
A method for remotely programming equipment used for or related to a manufacturing process, comprising:
(a) installing equipment in a manufacturing environment;
(b) positioning a plurality of sensors within the manufacturing environment in proximity to the equipment, wherein the plurality of sensors are configured to gather data from the manufacturing environment;
(c) connecting at least one processor to the plurality of sensors, wherein the at least one processor includes software for receiving data from the plurality of sensors and the equipment;
(d) connecting at least one manual controller to the processor, wherein the at least one manual controller receives motion input from a user of the manual controller, wherein the software on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the equipment by the processor, wherein the equipment, which is physically remote from the at least one controller, executes the motion commands in real-time during the manufacturing process; and
(e) saving a teachpoint in a program file, wherein the software:
(i) determines whether the equipment has moved a predetermined minimum distance from a previously saved teachpoint location; and
(ii) adds the teachpoint to the program file if the equipment moved the predetermined minimum distance from the previously saved teachpoint location.
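The distance-gated teachpoint save recited in limitations (e)(i)-(ii) above can be sketched as follows. This is purely illustrative; the threshold value, function name, and point representation are assumptions, not drawn from the claims:

```python
import math

# Hypothetical minimum move required before a new teachpoint is saved.
MIN_TEACHPOINT_DISTANCE = 5.0  # e.g., millimeters

def maybe_save_teachpoint(program_file, current_position, last_saved):
    """Append current_position to program_file only when the equipment has
    moved at least MIN_TEACHPOINT_DISTANCE from the previously saved
    teachpoint location; otherwise leave the program file unchanged."""
    if last_saved is None or math.dist(current_position, last_saved) >= MIN_TEACHPOINT_DISTANCE:
        program_file.append(current_position)
        return current_position  # new reference teachpoint
    return last_saved  # moved too little: teachpoint not added
```

Under this sketch, jogging the equipment a short distance does not clutter the program file with near-duplicate teachpoints; only moves at or beyond the threshold are recorded.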
Although conflicting claims 1-20 in the instant application and claims 1-20 in the copending applications are not identical, they are not patentably distinct from each other because they are substantially similar in scope and use similar limitations to produce the same end results of a remotely controlled manufacturing device and method. For example, it is clear that the gouging process is identical or similar to the “inspection process” mentioned in the instant application.
It would have been obvious to a person having ordinary skill in the art at the time the invention was made to modify or omit the additional elements of the claims mentioned in the table above of the instant application and copending Application Nos. 18/425,184, 18/418,671, and 18/418,714 to arrive at claims 1-20 of the instant application, because the person would have realized that the remaining elements would perform the same functions as before. “Omission of an element and its function in a combination is an obvious expedient if the remaining elements perform the same functions as before.” See In re Karlson, 136 USPQ 184 (CCPA 1963).
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1 and 11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wang et al. (US 2016/0229050) [hereinafter Wang].
Regarding claim 1, Wang discloses a method for controlling an inspection process used in a manufacturing environment, comprising: (a) installing equipment used for or related to an inspection process in a manufacturing environment (Fig. 2); (b) positioning a plurality of sensors (12c) within the manufacturing environment in proximity to the inspection equipment (Fig. 2), wherein the plurality of sensors (12c) are configured to gather data from the manufacturing environment (¶0012-¶0013); (c) connecting at least one processor (14c) to the plurality of sensors (12c), wherein the at least one processor (14c) includes software for receiving data from the plurality of sensors and the inspection
equipment (¶0017-¶0018); and (d) connecting at least one manual controller (12b) to the processor (14c), wherein the at least one manual controller (12b) receives motion input from a user of the manual controller (12b), wherein the software (“a robot controller 12b that includes a data interface that accepts motion commands and provides actual motion data”; ¶0012) on the processor (14c) mathematically transforms the motion input into corresponding motion commands that are sent to the inspection equipment by the processor (14c, see claim 3), wherein the inspection equipment (Fig. 2), which is physically remote from the at least one controller (12b), executes the motion commands in real-time during the manufacturing inspection process (¶0012).
Regarding claim 11, Wang discloses a method for controlling an inspection process used in a manufacturing environment, comprising: (a) installing equipment used for or related to an inspection process in a manufacturing environment (Fig. 2); (b) positioning a plurality of sensors (12c) within the manufacturing environment in proximity to the inspection equipment (Fig. 2), wherein the plurality of sensors (12c) are configured to gather data from the manufacturing environment (¶0012-¶0013); (c) connecting at least one processor (14c) to the plurality of sensors (12c), wherein the at least one processor includes software for receiving data from the plurality of sensors and the inspection equipment (¶0017-¶0018); (d) connecting at least one manual controller (12b) to the processor (14c), wherein the at least one manual controller (12b) receives motion input from a user of the manual controller (12b), wherein the software (“a robot controller 12b that includes a data interface that accepts motion commands and provides actual motion data”; ¶0012) on the processor mathematically transforms the motion input into corresponding motion commands that are sent to the inspection equipment by the processor (see claim 3), wherein the inspection equipment (Fig. 2), which is physically remote from the at least one controller (12b), executes the motion commands in real-time during the manufacturing inspection process (¶0012); and (e) displaying (14b) a real-time video of the manufacturing environment to the user during the inspection process (¶0017-¶0018).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 2-3 and 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Wang et al. (US 2016/0229050) [hereinafter Wang].
Regarding claims 2 and 12, Wang further discloses the inspection equipment includes a robot (12a) having an end effector (12d) for evaluating the integrity of welded material [(12e); ¶0014], and wherein the end effector includes a sensor [(12c); Fig. 2].
Wang fails to explicitly disclose the end effector includes a PAUT probe, an Eddy current probe, or combinations thereof.
However, Eddy current probes are devices used to detect surface and near-surface defects in conductive materials by inducing eddy currents.
Wang discloses [“sensors 12c includes a wrist force sensor. The haptic control loop sends the force measurement from the force sensor to the controlling device 14a.”; (¶0028)].
Therefore, it would have been obvious to one having ordinary skill in the art at the time Applicant's invention was filed, in the field of methods for controlling an inspection process used in a manufacturing environment, to modify Wang to include a PAUT probe, an Eddy current probe, or combinations thereof, as Wang already discloses a force sensor, for the benefit of providing devices, such as cameras, microphones, position sensors, proximity sensors, and force sensors, that observe the robot station 12.
Regarding claims 3 and 13, Wang further discloses (a) using at least one of the sensors in the plurality of sensors to measure a distance between the end effector and the welded material (¶0012); and (b) disabling the user's control of the inspection equipment if the distance varies from a predetermined operating distance range (¶0019 & ¶0031).
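The distance-based interlock mapped for claims 3 and 13, measuring standoff and disabling the user's control outside a predetermined range, can be sketched as follows. This is an illustrative sketch only; the range values and names are assumptions, not taken from Wang or the claims:

```python
# Hypothetical predetermined operating distance range between the end
# effector and the welded material (e.g., millimeters).
OPERATING_RANGE = (10.0, 50.0)

def user_control_enabled(measured_distance, operating_range=OPERATING_RANGE):
    """Return True while the sensed distance stays inside the predetermined
    operating range; return False (disable the user's control of the
    inspection equipment) when the distance varies outside that range."""
    low, high = operating_range
    return low <= measured_distance <= high
```

In such a scheme the check would run on every sensor update, so control is cut as soon as the measured standoff leaves the permitted band.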
Allowable Subject Matter
Claims 4-10 and 14-20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDI N HOPKINS whose telephone number is (571) 270-7042. The examiner can normally be reached M & F 9-5 and T-Th 6-4.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kristina Deherrera, can be reached at (303) 297-4237. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRANDI N HOPKINS/Primary Examiner, Art Unit 2855