DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
CLAIM INTERPRETATION
2. The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
3. Use of the word “means” (or “step for”) in a claim with functional language creates a rebuttable presumption that the claim element is to be treated in accordance with 35 U.S.C. § 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that § 112(f) (pre-AIA § 112, sixth paragraph) is invoked is rebutted when the function is recited with sufficient structure, material, or acts within the claim itself to entirely perform the recited function.
Absence of the word “means” (or “step for”) in a claim creates a rebuttable presumption that the claim element is not to be treated in accordance with 35 U.S.C. § 112(f) (pre-AIA 35 U.S.C. 112, sixth paragraph). The presumption that § 112(f) (pre-AIA § 112, sixth paragraph) is not invoked is rebutted when the claim element recites function but fails to recite sufficiently definite structure, material or acts to perform that function.
Claim elements in this application that use the word “means” (or “step for”) are presumed to invoke § 112(f) except as otherwise indicated in an Office action. Similarly, claim elements that do not use the word “means” (or “step for”) are presumed not to invoke § 112(f) except as otherwise indicated in an Office action.
4. Claim limitations “user detection means”, “firearm detection means”, “target detection means” and “data management and analysis unit” in claim 1 have been interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because they use a generic placeholder “means” or “unit” coupled with functional language “suitable for” without reciting sufficient structure to achieve the function. Furthermore, the generic placeholder is not preceded by a structural modifier.
Since these claim limitations invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, claim 1 has been interpreted to cover the corresponding structure described in the specification that achieves the claimed function, and equivalents thereof.
A review of the specification shows that the following appears to be the corresponding structure described in the specification for the 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph limitations:
user detection means – wearable sensors with cardiac, sweat, respiratory, etc. detection groups;
firearm detection means – holster detection, grip detection, etc.;
target detection means – targets and data collector in Fig. 2;
data management and analysis unit – computer 9 – Fig. 1a.
5. If applicant wishes to provide further explanation or dispute the examiner’s interpretation of the corresponding structure, applicant must identify the corresponding structure with reference to the specification by page and line number, and to the drawing, if any, by reference characters in response to this Office action.
If applicant does not intend to have the claim limitations treated under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may amend the claims so that they will clearly not invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, or present a sufficient showing that the claims recite sufficient structure, material, or acts for performing the claimed function to preclude application of 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
For more information, see MPEP § 2173 et seq. and Supplementary Examination Guidelines for Determining Compliance With 35 U.S.C. 112 and for Treatment of Related Issues in Patent Applications, 76 FR 7162, 7167 (Feb. 9, 2011).
Claim Rejections - 35 USC § 112
6. The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
7. Claims 1-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 1, 6, 7, 9, 11, 13, 14 and 20 utilize the term “suitable for” in describing functionality of items such as the user detection means, data management and analysis unit, etc. This term renders the claims indefinite because it does not make clear whether the recited items must actually be configured to perform these functions, or whether they need only be capable of being configured to perform them.
Dependent claims 2-13 and 15-20 inherit the deficiencies of their respective parent claims through their dependencies, and are thus rejected for the same reasons.
Claim Rejections - 35 USC § 102
8. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
9. Claims 1-5, 8, 11, 12, 14-17 and 19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Stanley (US 2009/0111073 A1).
Regarding claims 1-5, 8, 11, 12, 14-17 and 19, Stanley discloses a training system for using a firearm, comprising:
a training region, in which training sessions are executable, wherein the training region comprises fixed or movable targets (see Fig. 5);
a firearm (120 – Par. 42);
user detection means suitable for detecting user physical status data (e.g. ocular sensor 116, recognition sensor 120 – Par’s. 40-41);
firearm detection means suitable for detecting firearm status data (firearm sensor 122 – Par. 42);
target detection means suitable for detecting target data (Par. 81); and
a data management and analysis unit (102), operatively connected to the user detection means, to the firearm detection means and to the target detection means, suitable for receiving the user physical status data, the firearm status data and the target data (see Fig. 3 – computer 102 for performance monitoring and reporting – Par. 43),
wherein the data management and analysis unit is configured to create a virtual training model (scenario), as a function of set/stored user features (e.g. prior performance/results), as a function of set/stored environmental features (environment type), as a function of set/stored training session features (stimulus type, delay period, movement pattern, etc.), and as a function of expected results (threshold), to compare the user physical status data, the firearm status data and the target data with the virtual training model (Par’s. 44-48) (as per claim 1),
the data management and analysis unit comprises a memory and for each training session, the data management and analysis unit collects, in the memory, the user physical status data, the firearm status data and the target data, and updates the set/stored user features, the set/stored environmental features, the set/stored training session features and the virtual training model for comparing collected new user physical status data, new firearm status data and new target data with the updated virtual training model (performance data stored and used to determine next phase of scenario – Par’s. 48-49) (as per claim 2),
the data management and analysis unit updates the virtual training model between one training session and the other and/or during a training session in real time (Par’s. 48-49) (as per claim 3),
the data management and analysis unit sets a training session as a function of initial user physical status data and/or as a function of the expected results (Par’s. 48-49) (as per claim 4),
the data management and analysis unit is connected to an external electronic device, and wherein, through said external electronic device, the data in the data management and analysis unit is accessible, modifiable and integrable, for modifying the virtual training model (network of computers 102, also operator interface 130 coupled with computer 102 – Par’s. 40, 43) (as per claim 5),
the user detection means are positioned on the user and/or are remote with respect to the user (ocular sensor 38 – see Fig. 1) (as per claim 8),
the target detection means are suitable for detecting whether and how a shot performed with the firearm hits a target (Par. 81) (as per claim 11),
the target detection means are positioned on a user and/or on the firearm, and/or the target detection means are remote with respect to the firearm, optionally the target detection means being positioned in the training region and/or on the targets (Par. 81) (as per claim 12),
a training method for using a firearm by a training system comprising: a training region, in which training sessions are executable, wherein the training region comprises fixed or movable targets; a firearm (120); user detection means suitable for detecting user physical status data (116, 120); firearm detection means suitable for detecting firearm status data (122); target detection means suitable for detecting target data (Par. 81); and a data management and analysis unit, operatively connected to the user detection means, to the firearm detection means and to the target detection means, suitable for receiving the user physical status data, the firearm status data and the target data, wherein the data management and analysis unit is configured to create a virtual training model, as a function of set/stored user features, as a function of set/stored environmental features, as a function of set/stored training session features, and as a function of expected results, to compare the user physical status data, the firearm status data and the target data with the virtual training model (Par’s. 44-48), the method comprising: detecting the user physical status data, the firearm status data and the target data; creating a virtual training model as a function of set/stored user features, as a function of set/stored environmental features, as a function of set/stored training session features and as a function of the expected results; and comparing the user physical status data, the firearm status data and the target data with the virtual training model (Par’s. 44-49) (as per claim 14),
collecting the user physical status data, the firearm status data and the target data; updating the set/stored user features, the set/stored environmental features, the set/stored training session features; and updating the virtual training model to compare new collected user physical status data, new firearm status data and new target data with the updated virtual training model (performance data stored and used to determine next phase of scenario – Par’s. 48-49) (as per claim 15),
updating the virtual training model is performed between one training session and the other, and/or during a training session in real time (Par’s. 48-49) (as per claim 16),
the external electronic device is a computer, a tablet, or a smartphone (Par’s. 40, 43) (as per claim 17), and
the user detection means (ocular tracker 38) are positioned in the training region (see Fig. 1, Par. 37) (as per claim 19).
Claim Rejections - 35 USC § 103
10. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
11. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
12. Claims 6 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Stanley (US 2009/0111073 A1) in view of Miller (US 2015/0330749 A1).
Regarding claims 6 and 18, Stanley further discloses the firearm is a simulacrum firearm (Par. 80), but does not appear to disclose the training system further comprises devices wearable by a user and comprising haptic devices suitable for producing haptic signals on the user, wherein said devices wearable by the user are operatively connected to the data management and analysis unit, and wherein the data management and analysis unit controls an actuation of the haptic devices (as per claim 6), and the user detection means are worn by the user (as per claim 18). However, Miller discloses a similar shooting training system that includes haptic devices worn by a user that vibrate in response to user shooting events (Par’s. 32-33). Accordingly, it would have been obvious to one skilled in the art before the effective filing date of the invention to modify the teachings of Stanley by providing wearable haptic devices, as taught by Miller, as such a modification would involve combining prior art elements according to known methods to yield predictable results.
13. Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Stanley (US 2009/0111073 A1) in view of Wynne (US 2021/0275094 A1).
Regarding claim 7, Stanley does not appear to disclose, but Wynne, in a similar simulation system that includes firearm simulations (Par. 83), does disclose that the user detection means comprise: a cardiac status detection group suitable for detecting a user's heartbeat and/or electrocardiogram; a respiratory status detection group suitable for detecting a respiratory rate; a sweat detection group suitable for detecting a galvanic skin response (GSR); a limb status detection group suitable for performing a myography of limbs and main muscles of the limbs; a position detection group suitable for detecting the user's position in the training region; and a user activity detection group suitable for detecting at least one of the user's speed, acceleration, or angular speed in the training region (see Par. 206). It would have been obvious to one skilled in the art before the effective filing date of the invention to modify the teachings of Stanley by detecting this information of the trainee, to obtain predictable results of training the user to manage stress during training scenarios.
14. Claims 9, 10 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Stanley (US 2009/0111073 A1) in view of Amis et al. (US 2015/0153130 A1) and Deng et al. (US 2020/0355457 A1).
Regarding claims 9, 10 and 20, Stanley further discloses firearm detection means comprise: a firearm status detection group suitable for detecting an operating status of the firearm, including presence of the firearm in a holster, or pointing of the firearm (aiming – Par. 42); and a shooting and shooting mode detection group suitable for detecting shooting of the firearm and/or shooting modes (firearm sensor 122 detects laser shot fired by firearm – Par. 81). Furthermore, Amis discloses a firearm training system with a firearm configuration detection group suitable for detecting safe configuration of the firearm, or configuration of the firearm in a semi-automatic or automatic mode, or an armed configuration of the firearm (Par. 92, last 6 lines), and a firearm activity detection group suitable for detecting at least one of speed, acceleration, or angular speed of the firearm in the training region (Par. 14), and Deng discloses a firearm grip detection group suitable for detecting modes in which the firearm is gripped by a user, and a trigger guard engagement detection group suitable for detecting presence of the user's finger in a trigger guard of the firearm (Par. 356). Accordingly, it would have been obvious to one skilled in the art before the effective filing date of the invention to modify the teachings of Stanley by providing these additional firearm detection means taught by Amis and Deng, as such a modification would involve combining prior art elements according to known methods to yield predictable results of attaining additional data regarding the trainee’s firearm handling.
Regarding claim 10, Stanley further discloses the firearm detection means are positioned on the firearm and/or are remote with respect to the firearm, optionally the firearm detection means being positioned in the training region (Par. 42).
Regarding claim 20, Stanley further discloses the shooting and shooting mode detection group is suitable for detecting an actuation on a trigger performed by the user (Par. 42).
15. Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Stanley (US 2009/0111073 A1) in view of Neeter (US 2021/0019215 A1).
Regarding claim 13, Stanley does not appear to disclose the training region comprises training region status sensors suitable for detecting region environmental conditions, including temperature, humidity, and wind, and wherein the data management and analysis unit updates the virtual training model as a function of what is detected by the training region status sensors. However, Neeter discloses a similar system for generating virtual training scenes in real-world physical areas, that includes sensors for detecting temperature, humidity and wind (Par. 39). Accordingly, it would have been obvious to one skilled in the art before the effective filing date of the invention to modify the teachings of Stanley by utilizing the environmental sensors of Neeter, to obtain predictable results of accounting for environmental factors in trainee performance data.
Conclusion
16. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. See attached PTO-892.
17. Any inquiry concerning this communication or earlier communications from the examiner should be directed to PETER EGLOFF whose telephone number is (571)270-3548. The examiner can normally be reached on Monday - Friday 9:00 am - 5:00 pm.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Xuan Thai can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Peter R Egloff/
Primary Examiner, Art Unit 3715