DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This Office action is in response to patent application No. 18/426,565, filed on January 30, 2024. Claims 1-6 are presented for examination. Claims 1 and 6 are independent.
Information Disclosure Statement
The Information Disclosure Statements filed on 1/30/2024 and 10/17/2025 have been considered. Initialed copies of the Form 1449 are enclosed herewith.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55, received on February 29, 2024. This application claims foreign priority to JP 2023-024009 (Japan), filed February 20, 2023.
Claim Rejections - 35 USC § 101
35 U.S.C. § 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-6 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.
Claim 1 is directed to “a work training support system” (i.e., a machine) and claim 6 is directed to “a work training support method” (i.e., a process); hence, the claims are directed to one of the four statutory categories (i.e., process, machine, manufacture, or composition of matter). In other words, Step 1 of the subject-matter eligibility analysis is “Yes.”
However, the claims are drawn to the abstract idea of “providing work training support.” This concept falls within the grouping of “certain methods of organizing human activity,” as managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions), or alternatively within the grouping of “mental processes,” as processes that can be performed in the human mind (including an observation, evaluation, judgment, or opinion) even when “performed on a computer” (per MPEP § 2106.04(a)(2)(III)(C), “A Claim That Requires a Computer May Still Recite a Mental Process”).
Regardless of grouping, the claims are reasonably understood as either “certain methods of organizing human activity” or “mental processes,” as they recite the following limitations (reproduced from representative claim 6):
“detecting… a movement of a tool during a first period, wherein the first period is a period in which an instructor performs work using the tool;
generating a first image using a detection result… during the first period, wherein the first image represents the movement of the tool during the first period; and
displaying, during a second period, the first image in a field of view of a trainee… wherein the second period is a period in which the trainee performs the work using the tool.”
These limitations describe a process of data gathering and manipulation, which is analogous to “collecting information, analyzing it, and displaying certain results of the collection and analysis” (see Electric Power Group, LLC v. Alstom S.A., 830 F.3d 1350, 119 U.S.P.Q.2d 1739 (Fed. Cir. 2016)). Hence, these limitations are akin to concepts that the courts have identified as abstract ideas. In other words, Step 2A, Prong 1 of the subject-matter eligibility analysis is “Yes.”
Furthermore, the claims do not include additional elements that, either alone or in combination, are sufficient to integrate the judicial exception into a practical application. To the extent that, e.g., “a work training support system,” “a detector,” “a controller,” and “a head mounted display” are claimed, these elements merely add insignificant extra-solution activity to the judicial exception (e.g., data gathering) and/or do no more than generally link the use of the judicial exception to a particular technological environment or field of use. In other words, the claimed “providing work training support” is not integrated into a practical application; thus, Step 2A, Prong 2 of the subject-matter eligibility analysis is “No.”
Likewise, the claims do not include additional elements that, either alone or in combination, are sufficient to amount to significantly more than the judicial exception. To the extent that, e.g., “a work training support system,” “a detector,” “a controller,” and “a head mounted display” are claimed, these are all generic, well-known, and conventional computing elements. As evidence, Applicant’s specification discloses them in a manner indicating that the additional elements are sufficiently well known that the specification need not describe their particulars to satisfy 35 U.S.C. § 112(a), per MPEP § 2106.07(a)(III)(A), which satisfies the Examiner’s evidentiary burden under the Berkheimer memorandum.
Specifically, the Applicant’s claimed “a work training support system” is described in paragraph [0009] as follows: “The work training support system 10 includes a camera 100, a sensor 150, a controller 200, and a display 300. The camera 100, the sensor 150, and the display 300 are connected to the controller 200 through wired communication or wireless communication.” Paragraph [0010] states that “the camera 100 and the sensor 150 is sometimes referred to as a detector.” Paragraph [0011] describes the controller as “including a processor 201, a memory 202, an input/output interface 203, and an internal bus 204.” These elements are all components of a generic computer.
Furthermore, the claimed “head mounted display” is described as a generic head mounted display device having only the most basic features of being worn on the head of a user, displaying content, and communicating with a computer.
Therefore, these elements are all reasonably interpreted as generic computers or generic computing components, providing no details beyond ubiquitous standard equipment. As such, the claimed limitations do not provide anything significantly more than the judicial exception. Therefore, Step 2B of the subject-matter eligibility analysis is “No.”
In addition, dependent claims 2-5 neither integrate the judicial exception into a practical application nor amount to significantly more than the judicial exception. As such, dependent claims 2-5 are also rejected under 35 U.S.C. § 101 based on their dependency from independent claim 1.
Therefore, claims 1-6 are rejected under 35 U.S.C. § 101 as being directed to non-statutory subject matter.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-6 are rejected under 35 U.S.C. 102(a)(1) and 35 U.S.C. 102(a)(2) as being anticipated by Meess et al. (hereinafter “Meess,” US 2018/0130376).
Regarding claim 1, and substantially similar limitations in claim 6, Meess discloses a work training support system comprising:
a detector detecting a movement of a tool during a first period, wherein the first period is a period in which an instructor performs work using the tool (Meess [0087], “when a welder performs a weld (e.g., expert welder, instructor, a trainee, etc.), the position, orientation and movement of welding tool 460 of the welder and welding process parameters such as welding voltage, welding current, wire feed speed, etc. are recorded. After completing the weld, the welder can select an appropriate menu item that “clones” the procedure. The “cloned” procedure is then stored and can serve as a reference for future welding procedures,” an instructor records a weld during a first period, then clones it for a second period);
a controller programmed to generate a first image using a detection result of the detector during the first period, wherein the first image represents the movement of the tool during the first period (Meess [0087-0088], “The “cloned” procedure is then stored and can serve as a reference for future welding procedures. Preferably, the upper and lower target thresholds, target values or preferred variations can be entered manually by the welder and, more preferably, are automatically entered using default values, e.g., ±5% or some other appropriate value. Preferably, the upper and lower target thresholds, target values or preferred variations are configurable by the user… Preferably, when the position, orientation or movement of the welding tool 460 and/or the welding process parameters fall outside the upper and lower target thresholds, target values or preferred variations, the programmable processor-based subsystem 410 changes an attribute, e.g., color, shape, size, intensity or brightness, position and/or some other characteristic, of the appropriate visual cue. For example, the programmable processor-based subsystem 410 can change the color of the visual cue from green to yellow to red depending on the amount of deviation, and/or the visual cues can graphically show the amount of deviation from the target,” displaying visual cues that represent the movement of the tool cloned from the instructor or expert; also Meess [0092], “The stored target weld data can be that of a successful weld by an expert welder or even a successful prior welding run by the user. In some embodiments, the stored target weld data is based on computer modeling for the specific type of weld and/or testing of similar prior welds. Preferably, the stored target weld data includes information related to the weld weave pattern, TIG welding information such as filler frequency, welding start and stop indicators, weld length, welding sequencing, and welding progression. 
When a welding operation is started, visual and audio cues based on the stored target weld data will aid the user in creating the new weld,” providing visual cues based on the stored target weld data); and
a head mounted display worn on a head of a trainee, wherein the head mounted display displays the first image in a field of view of the trainee during a second period, wherein the second period is a period in which the trainee performs the work using the tool (see Meess Fig. 2 and [0042], “a welding helmet 12 may include a camera 26 mounted at or proximate to the point of view of the welder. In the example where the visual display 24 is a video monitor, the camera 26 may provide video pictures of the welding work area 20 to the display 24. Further, the camera 26 can be used to record the welding operation as it is ongoing, so that the welding operation can be viewed at a later time.”; Meess [0087], “The “cloned” procedure is then stored and can serve as a reference for future welding procedures.”; also Meess [0109], “the overlaid virtual weld object 902 is a representation of an ideal weld profile. Preferably, the virtual weld object 903 is partially transparent so that the user can readily see the underlying weld 480D.”).
Regarding claim 2, Meess discloses wherein the controller generates the first image representing the movement of the tool during the first period and a path of the movement of the tool during the first period (Meess [0109], “by overlaying the ideal virtual weld object 902 over the weld 480D, the user is able to see a complete profile of the actual weld as compared to an ideal weld and not just the potential problem areas.”).
Regarding claim 3, Meess discloses wherein the detector detects the movement of the tool during the second period, the controller generates a second image using a detection result of the detector during the second period, wherein the second image represents the movement of the tool during the second period, and the head mounted display displays the first image and the second image superimposed on each other after the second period (Meess [0109], “the virtual weld object 903 is partially transparent so that the user can readily see the underlying weld 480D. By overlaying the virtual weld object 902 on top of the completed weld 408D, the user and/or another observer can visually see a comparison of the actual weld produced by the user with an ideal weld.”).
Regarding claim 4, Meess discloses wherein the controller generates the first image representing the movement of the tool during the first period and a path of the movement of the tool during the first period, wherein at least one of a thickness and a color of each part of the path in the first image is represented depending on a first physical quantity related to the movement of the tool during the first period, and the controller generates the second image representing the movement of the tool during the second period and a path of the movement of the tool during the second period, wherein at least one of a thickness and a color of each part of the path in the second image is represented depending on a second physical quantity related to the movement of the tool during the second period (see Meess Fig. 23, showing an ideal weld path together with the color coded actual weld path; also Meess [0106], “the color, shape, size and/or intensity (or brightness) of the curves 904A, B of the virtual weld object 904 can change to aid the user. In the embodiment of FIG. 23, the attributes 950A and 950B along the curves 904A, B, respectively, of the virtual weld object 904 can be displayed differently from the rest of the virtual weld object 904 and/or can change when the welding tool 460 is at (or immediately prior to) the curve 904A and/or 904B in order to inform the user that there is a change in direction of the weld path. For example, the attributes 950A and 950B can be set to a different color than the rest of the virtual weld 904. In addition, different colors can be assigned to the attributes 950A and 950B to indicate the direction of the change in the weld path of joint 480C. For example, 950A can be red to indicate a change to the right and 950B can be blue to indicate a change to the left. Of course, the use of the color attribute is exemplary and other attributes can be displayed or changed as desired. 
For example, 950A, B can have a different intensity than the rest of the virtual weld object 904.”; also Meess [0109], “By overlaying the virtual weld object 902 on top of the completed weld 408D, the user and/or another observer can visually see a comparison of the actual weld produced by the user with an ideal weld.”).
Regarding claim 5, Meess discloses wherein the detector detects the movement of the tool during the second period, the controller generates a third image using a detection result of the detector during the first period and a detection result of the detector during the second period, wherein the third image represents a physical quantity related to the movement of the tool during the first period and a physical quantity related to the movement of the tool during the second period side by side, and the head mounted display displays the first image and the third image during the second period (Meess [0088], “the programmable processor-based subsystem 410 can change the color of the visual cue from green to yellow to red depending on the amount of deviation, and/or the visual cues can graphically show the amount of deviation from the target. For example, as seen section A of FIG. 18, the travel speed 708 shows the target value T and the amount of deviation from the target value 730 using a pointer 732 against a meter-type graphic. As seen in section A, the current travel speed is slightly fast. For CTWD 704, the indicator 734 moves relative to target 736 as shown by arrow 738. For work angle 706, the cross-hairs 740 move relative to target circle 742. Of course, the type of graphic and method of showing deviation is not limited to just these preferred embodiments. Preferably, one or more of the visual cues 700 can be programmed to change one or more attributes such as, e.g., color, shape, size, intensity or brightness, position and/or some other characteristic of the visual cue 700 when the tracking parameter and/or welding process parameter associated with the visual cue deviates from a target, e.g. falls outside of upper and lower target thresholds, target values or preferred variations for the type of welding process. 
Preferably, the amount of deviation is taken into account when determining a change in one or more attributes of the visual cue,” wherein the physical quantity is the deviation between the user’s tool movements and the ideal movements, and visual cues represent the deviation as the third image).
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Asikainen et al. (US 2021/0008413) Interactive personal training system
Becker (US 2022/0258268) Weld tracking systems
Wallace et al. (US 2023/0419855) Simulator for skill-oriented training of a healthcare practitioner
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Stephen Alvesteffer whose telephone number is (571)272-8680. The examiner can normally be reached M-F 8:00-6:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Peter Vasat can be reached at 571-270-7625. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/STEPHEN ALVESTEFFER/Examiner, Art Unit 3715