DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5, 12, 17-18, 20-21 and 23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Gabrielle Rios et al., US 2019/0130792 A1 (hereinafter "Rios").
Regarding independent claim 1, Rios discloses a virtual reality-based treatment system for performing treatment on at least one condition of a subject, comprising:
a virtual reality device arranged to be fitted to the subject and for immersing the subject in a virtual reality environment (i.e. The computing system can be configured to generate an augmented environment for display on a pair of augmented reality glasses in electrical communication with the computing system – Para 28);
at least one tracking camera configured to capture physical traits and movement of the body of the subject (i.e. Images of the patient's face can be acquired through use of a camera or scanner 220 of the injection aid system 200 – Para 93);
a processor communicating with the virtual reality device and the at least one tracking camera (i.e. computing system, e.g. processor - Fig. 1A “104”);
a monitor in communication with the processor, and including a user interface (Fig. 1A “108”; Fig. 1B), wherein the processor is programmed to:
generate a dynamic virtual representation of the body of the subject based on the captured physical traits and movement of the body of the subject, and to render the virtual representation of the body in the virtual reality environment via the virtual reality device, wherein the dynamic virtual representation is synchronised with the movement of the body of the subject (i.e. The platform can comprise an augmented reality projection configured to provide at least a virtual visual representation of a portion of an internal anatomy of a patient's body portion overlaid or projected onto a patient's body portion and synchronized with movements of the patient's body portion – Para 28);
generate a virtual representation of the at least one condition of the subject in response to one or more inputs (i.e. provide user pictures including past pictures – Para 80; injection sites (for example, a display of a target injection site) can be superimposed on the simulated patient's face – Para 62);
overlay or render the virtual representation of the condition of the subject on the virtual representation of the body of the subject (i.e. receive and process information relating to the patient's past and present appearances so that the computing system further recommends and outputs a simulated outcome of an injection procedure – Para 21); and
receive and process one or more inputs representing one or more attributes of the condition to adjust the virtual representation of the condition of the subject in the virtual reality environment to thereby assist the subject to visualise and resolve the condition (i.e. receive one or more user inputs and/or filters to apply to anatomical features to output a simulation of outcomes of an injection – Para 68).
Regarding independent claim 2, the claim is similar in scope to independent claim 1. Therefore, the rationale applied in the rejection of claim 1 applies herein.
Regarding claim 3, Rios discloses the method of claim 2, further comprising: generating virtual representations of multiple layers or components of the virtual body selected from at least two of a skin layer or component, a muscle layer or component, a nerves layer or component, an organs layer or component, a vascular layer or component, a respiratory layer or component and a skeleton layer or component, and enabling switching between virtual representations of the layers or components (i.e. The computer-generated image(s) can correspond to one or more layers of anatomy (for example, bones, nerves, blood vessels, or the like) for the specific target - Para 62).
Regarding claim 5, Rios discloses the method of claim 2, wherein the captured physical traits include at least three of body shape, face shape, skin colour, hair colour/style, eye colour, height, weight, and gender (i.e. the processor or the computing system 104 can receive one or more of the inputs, such as user or patient inputs and/or filters; The inputs can include, for example, readings from a facial recognition software, past and present photos of the patient; filters can include, for example, a gender filter – Para 68).
Regarding claim 12, Rios discloses the method of claim 2, wherein the condition is one of pain, chronic pain, a physical or mental ailment or disability, including amputeeism and various levels of paralysis or palsy, and a physical or mental state which requires enhancing or therapy including muscle condition, mental acuity, or stress (i.e. medical conditions, such as cancer and dental treatment, but may be expanded to treating aesthetic imperfections, restorative cosmetic procedures, procedures for treating migraine, depression, lung aspirations, epidurals, orthopedic procedures, self-administered injections, in vitro procedures, or other therapeutic procedures – Para 5).
Regarding independent claim 17, the claim is similar in scope to independent claim 1. Therefore, the rationale applied in the rejection of claim 1 applies herein. In addition:
Rios discloses a virtual subject creator module to capture physical traits of the body of a subject and render a virtual subject including those traits (i.e. provide user pictures including past pictures – Para 80);
a virtual subject controller module to capture movement of the body of the subject and render a moving virtual subject using the virtual subject from the virtual human creator module (i.e. The platform can comprise an augmented reality projection configured to provide at least a virtual visual representation of a portion of an internal anatomy of a patient's body portion overlaid or projected onto a patient's body portion and synchronized with movements of the patient's body portion – Para 28);
a virtual condition module to generate a virtual representation of the at least one condition of the subject in response to one or more inputs, and layer it on the moving virtual subject (i.e. injection sites (for example, a display of a target injection site) can be superimposed on the simulated patient's face – Para 62; the face includes the present appearance of the user – Para 80); and
a virtual environment module for providing a selectable virtual environment for the subject (i.e. The injection training platform can also include a gaming or training application, which can allow an injector, such as an injector using the injection systems or platforms described herein, to learn different injection techniques that are unique for different ethnicities. – Para 70; specify the ethnicity for use in training – Para 79).
Regarding independent claim 18, the claim is similar in scope to independent claim 1. Therefore, the rationale applied in the rejection of claim 1 applies herein.
Regarding independent claim 20, the claim is similar in scope to independent claim 1. Therefore, the rationale applied in the rejection of claim 1 applies herein.
Regarding independent claim 21, the claim is similar in scope to independent claim 17. Therefore, the rationale applied in the rejection of claim 17 applies herein.
Regarding claim 23, Rios discloses the extended reality (XR) based treatment system of claim 18, wherein the extended reality based treatment system is selected from the group comprising at least one of virtual reality (VR), augmented reality (AR), and mixed reality (MR) (i.e. outputs to an augmented reality viewing system – Para 14).
Claims 4, 6-11 and 24-27 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHANTE HARRISON, whose telephone number is (571) 272-7659. The examiner can normally be reached Monday - Friday, 8:00 am to 5:00 pm EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington, can be reached at 571-272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHANTE E HARRISON/Primary Examiner, Art Unit 2615