DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
In response to the amendment filed 2/2/2026, claims 1-12 are pending.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-6 and 10-11 are rejected under 35 U.S.C. 103 as being unpatentable over Rios et al. (US 2017/0252108 A1) in view of Tian et al. (US 2020/0126297 A1), and further in view of Li et al. (CN 101958079 A, published 07/22/2010, with machine translation).
Re claim 1:
1. A virtual reality acupuncture system (Rios, Abstract; [0041], “performing a variety of medical injection procedures including prophylactic, curative, therapeutic, acupuncture, or cosmetic treatments”), comprising:
(a) a virtual reality device comprising a head-mounted display (Rios, figs. 1 - 4), wherein the head-mounted display is used for viewing a virtual scene model and a human acupoint model controlled by both hands of an operator for tracking hand movements to determine a needle insertion (Rios, [0056], “The control gloves 300 may be configured to sense the position, orientation and motion of the user's hand or hands as the user 304 interacts with at least one of the virtual environments”);
(b) an acupoint database for storing acupoint information (Rios, [0109]; [0028]; [0052], “the location of a distal tip of the injection tool 112 can be viewed on the computer-generated image”; fig. 12, 1240); and
(c) a computing host linked to the virtual reality device and the acupoint database, wherein the computing host is used for performing virtual reality and data transmission, and for executing an acupuncture virtual reality application program (Rios, fig. 12; [0051], “a computer-generated three-dimensional image(s)”; [0098], “The computing device 1000 can take one or more of different forms, including, by way of non-limiting examples, a laptop computer, a stand-alone personal computer, a server, a tablet, a workstation, a handheld device, a mobile device (such as a smartphone), and a consumer electronic device ( such as a video game console), to name a few”),
wherein the acupuncture virtual reality application program builds the virtual scene model, the human acupoint model, and a skin distance model wherein the virtual scene model is used to simulate the actual condition of the clinic, the human acupoint model is a virtual 3D stereoscopic model and has a spherical acupoint model, the skin distance model allows the needle to provide the operator with a sense of a distance depth after touching the skin to determine whether the acupuncture position is not located outside the sphere (Rios, [0010], “The injection information can comprise at least one of a position of the injection tool, an angle of the injection tool, an injection location, an injection depth, an injection angle, an injection volume, a position of the treatment target, a medication type, a recommended injection schedule, and a comparison of a detected injection to the recommended injection schedule”; [0054], “as the injection tool 112 approaches the treatment target 116, sensory indicators may be displayed, in an overlay manner for example, to indicate proximity of the virtual tool 114 to the desired target location”; [0091], “If the tip is located outside the treatment target, the process moves to step 828 and determines the process passed”; [0083], “systems, devices and methods may be used to render polygon-based objects as a model mesh 700 of the patient, as illustrated in FIG. 7. The model mesh 700 may include a three-dimensional structure including a series of polygons representing different features of the patient, such as bone, blood vessels, fat pads, muscles or nerves. These series of polygons may be rendered in different colors or with different graphical textures to make them distinct from each other or to match anatomical coloring”).
Note: in Rios, [0052], “The ability to visualize the tissue layers as the user is performing a procedure can help the user perform the procedure at an optimal location”; [0060], “The computer-generated image can correspond the facial features (or other anatomical feature) of an actual patient (e.g., the skin tone, skin type, or facial features) or any layer of anatomy (e.g., bones, nerves, blood vessels, or the like)”. Furthermore, Rios teaches that a computer-generated image(s) may be used in conjunction with the medical injection treatment so that the locations of structures in the image correspond to the actual location of those structures in the treatment target 116. The ability to visualize the tissue layers as the user is performing a procedure can help the user perform the procedure at an optimal location. For example, the user can be instructed by the injection aid system 100 via the display device 108 to guide the injection tool 112 to an appropriate depth for a therapeutic injection. The movement of the injection tool 112 can be projected on the display device 108 in real-time during the treatment procedure (Rios, [0052]), and the treatment target can provide tactile feedback by applying force (actively or passively) in response to the injection motions or other physical interaction of the user (Rios, [0044]). Rios further suggests that an augment to the alarms/alerts on the injection system can provide audio, visual or tactile feedback confirming the correct injection technique (Rios, [0120]), wherein the audio, visual or tactile feedback can be interpreted as a “sense of distance” after touching the skin. Therefore, Rios teaches a skin distance model which is capable of providing audio, visual and tactile feedback that guides the user to reach the optimal location and depth.
Rios does not explicitly disclose a joystick controller. Tian et al. (US 2020/0126297 A1) teaches a system and method for generating acupuncture points on a reconstructed 3D human body mesh for physical treatment (Tian, Abstract). Tian teaches a virtual reality device, which comprises a joystick (Tian, [0059], “the human operator 108 uses the haptic-enabled input device 111 to interact with virtualized surfaces on the 3D human body mesh of the patient's body 107, and mark the virtualized version of the patient's body with acupuncture points or adjust the locations of the automatically generated acupuncture points”; [0057], “in addition to the display generation component 103, one or more input devices (e.g., a touch-sensitive surface, such as a touch-sensitive remote control, or a touch-screen display that also serves as the display generation component, a mouse, a joystick, a wand controller”). Therefore, in view of Tian, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system described in Rios, by providing a joystick as taught by Tian, since it was known in the art to use a joystick control for quick and precise movement when manipulating a virtual tool.
Li teaches a positioning model of channel acupuncture points in three-dimensional virtual human anatomical tissue, based on the technologies of modern medical anatomy and digital imaging (Li, Abstract). Li teaches wherein the acupuncture virtual reality application program builds the virtual scene model, the human acupoint model, and a skin distance model … and the spherical acupoint model is a circular sphere area under a certain depth of skin surface of a subject, and is defined as a 3D acupoint located with a diameter of 0.1-5 cm (Li, pgs. 11-12, figs. 1-6; pgs. 16-20 show the virtual scene model and acupoint model; figs. 21a-21e show a circular sphere area under the skin wherein a number of acupoints follow the contour of the head and muscles (i.e., fig. 21c); pg. 28, “horizontal coordinate included angle 6 is approximately 30° from the point 6 of the acupuncture point skin acupuncture point skin entry point 6, and the depth of about 0.8 inches in the puncture is that the dummy door acupuncture point 5 is located, which is not capable of piercing or deep piercing (overrunning 1 inch), so as to avoid injury and delay. For another example, if the data is introduced into the needle, the needle is pushed from the next one inch into the needle, and the straight puncture depth is about 0.5-1 inch”; “0.5-1 inch” is within the claimed range of 0.1-5 cm).
Therefore, in view of Li, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system described in Rios, by providing the acupoint model and sphere model as taught by Li, since the positioning model is operated and shown in the modes of human virtual simulation three-dimensional digital images and an acupuncture point acupuncturing and inquiring system on a computer monitor, and has clear images, vivid human bodies, a three-dimensional visual angle, emulational cartoon, free section view and accurate acupoint selection (Li, pg. 1, Abstract). In a computer screen simulation, anatomical tissue images such as the three-dimensional virtual human skeleton, muscles, and skin are drawn, and a positioning model of the acupuncture points in the three-dimensional virtual human anatomical tissue is created. The model may display the three-dimensional spatial position of a given meridian and acupoint within the surrounding anatomical tissue, guide an acupuncturist to accurately enter the needle from the three-dimensional direction of the human body, directly hit acupuncture points at a certain depth under the skin, and improve the efficacy of clinical acupuncture. Moreover, it is beneficial to study, for the first time in three-dimensional virtual human anatomical tissue, the relationship between each meridian and acupoint and the surrounding anatomical tissue or even the whole human body, in order to further explore the meridian and acupuncture mechanism of traditional Chinese medicine and lay a three-dimensional spatial foundation for exploring the movement of human substances, which has great scientific and application value (Li, pg. 26).
Re claim 2:
2. The system of claim 1, wherein the human body acupoint model makes different position changes to pose different actions according to different meridians or acupoints (Rios, [0053], “the user can change perspective views of the virtual anatomy by, for example, zooming in or out, panning left or right, or rotating up, down or around the virtual anatomy”; [0056]).
Re claim 3:
Rios does not explicitly disclose 3. The system of claim 2, wherein the position is sitting in a middle position with elbow flexion, sitting in a middle position with elbows flexion and palms up, sitting in a middle position with elbows flexion and palms down, sitting in a middle position with shoulder abduction, supine position, prone position or lateral position. Tian teaches the missing features (Tian, [0049]; [0101], “the posture of the 3D human body mesh shown in target region selection region is adjustable in response to the remote expert's input”; [0106], “the 2D image includes at least a portion of the patient 104, such as an upper body of the human subject, a lower body of the human subject, or a significant portion of the human body for which pose/posture change is sufficiently discernable in the images, or which includes important acupuncture points for the current physical treatment, such as an arm of the patient, a leg of the patient, the torso of the patient, the head of the patient, an ear of the patient, etc.”). Therefore, in view of Tian, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system in Rios, by providing different postures as taught by Tian, in order to allow a patient to move his/her body and/or change his/her posture (e.g., due to application of force, sensitivity or discomfort caused by the treatment, and/or fatigue caused by a fixed posture, etc.) (Tian, [0009]; [0052]), and to provide annotations that are particular to certain parts of the body that are relevant to the treatment, as well as the posture of the patient's body, including the locations and identities of joints, such as joints on the knees, hips, ankles, toes, wrists, fingers, shoulders, spine, etc. (Tian, [0067]).
Re claim 4:
4. The system of claim 1, wherein the spherical acupoint model is observed from the translucent body surface of the human acupoint model to allow the operator looking the condition of the acupoint under the translucent body surface, including the anatomical structure around the acupoint (Rios, [0051], “The computer-generated image(s) can correspond to one or more layers of anatomy (e.g. bones, nerves, blood vessels, or the like) for the specific target. The images can be obtained using a CT scan, an MRI scan, a photographic image, an X-ray, and/or the like”; [0060], “The computer-generated image can correspond the facial features (or other anatomical feature) of an actual patient (e.g., the skin tone, skin type, or facial features) or any layer of anatomy (e.g., bones, nerves, blood vessels, or the like)”).
Re claim 5:
5. The system of claim 1, wherein the acupuncture virtual reality application program is executed to the construct for clarify teaching needs, functional planning, process design, situational scene model, 3D human body model, posture position, acupoint position and information, practice and test mode, or UI/UX interface design (Rios, [0081] – [0082]).
Re claim 6:
6. The system of claim 1, which performs a basic test mode comprising selecting a position posture to achieve a simulated real situation and matching an acupoint after completing the position selection (Rios, [0054], “The user can simulate an injection by moving the injection tool 112 in the physical world and having such motion depicted in the virtual world”; [0071], “The injection aid system 400 may simulate the under skin anatomical features as a see through view able to convey critical tissue information such as arteries, veins, nerves, fat and muscles as the injection tool 412 is projected on the image of the treatment target 416”).
Re claims 10 – 11:
10. The system of claim 1, wherein the circular sphere area is defined as a 3D acupoint sphere with a diameter of 0.1-3.0 cm. 11. The system of claim 1, wherein the circular sphere area is defined as a 3D acupoint sphere with a diameter of 0.5 cm (Li, pgs. 11-12, figs. 1-6; pgs. 16-20 show the virtual scene model and acupoint model; figs. 21a-21e show a circular sphere area under the skin wherein a number of acupoints follow the contour of the head and muscles (i.e., fig. 21c); pg. 28, “horizontal coordinate included angle 6 is approximately 30° from the point 6 of the acupuncture point skin acupuncture point skin entry point 6, and the depth of about 0.8 inches in the puncture is that the dummy door acupuncture point 5 is located, which is not capable of piercing or deep piercing (overrunning 1 inch), so as to avoid injury and delay. For another example, if the data is introduced into the needle, the needle is pushed from the next one inch into the needle, and the straight puncture depth is about 0.5-1 inch”; “0.5-1 inch” is within the claimed range of 0.1-3.0 cm). It would have been an obvious matter of design choice to a person of ordinary skill in the art to provide a diameter of 0.5 cm, because Applicant has not disclosed that such a size provides an advantage, is used for a particular purpose, or solves a stated problem. One of ordinary skill in the art, furthermore, would have expected the cited prior art and Applicant's invention to perform equally well with either diameter.
Claims 7-9 are rejected under 35 U.S.C. 103 as being unpatentable over Rios, Tian and Li as applied to claim 1 above, and further in view of Guo et al. (CN 110827597, published 02/21/2020).
Re claims 7 – 9:
Rios does not explicitly disclose an advance test, scoring and clinical examination case construction.
Guo's invention relates to the field of traditional Chinese medicine acupuncture teaching, and specifically to a simulation-based human acupuncture teaching method, system, platform, and storage medium. Guo teaches 7. The system of claim 1, which performs an advance test mode comprising appearing the acupoints of different meridians in the test and showing the acupoint position of the different positioning conditions along with the different acupoints in the test. 8. The system of claim 1, which performs an objective structured clinical examination case construction and scoring comprising establishing a real classroom in virtual reality and conducting the examination after giving the case situation the same. 9. The system of claim 8, wherein the clinical examination case construction and scoring comprising hand washing, self-introduction, indication contraindication identification, positioning requirements, acupuncture process, after taking the acupuncture process and post-acupuncture sanitary education (Guo, pg. 2, “call existing acupuncture manipulation data in the database, and the acupuncture manipulation data identifying classified existing acupuncture manipulation data in the database for comparison, finally displaying the acupuncture manipulation data”; pg. 4, “the database is stored with acupuncture manipulation all human meridian-acupoint data and expert, a teacher and student”; pg. 6, “the data comparing unit further comprises: value assigning module, for imparting to score the acupuncture manipulation data comparison result”; pg. 12, “step S401, to score the acupuncture manipulation data comparison result”; pg. 7, “by the intelligent algorithm and database comparison analysis acupuncture training process and result, brings more accurate acupuncture point position recognition and improve the learning efficiency of students”).
Therefore, in view of Guo, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system described in Rios, by evaluating a student’s clinical performance as taught by Guo, in order to provide comprehensive student acupuncture training process evaluation of information, further combining the intelligent algorithm and database technology to students of acupuncture practice process in real time and for visual display by a visual manner to improve the study efficiency (Guo, pg. 3).
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Rios, Tian and Li as applied to claim 1 above, and further in view of Peterson (US 2014/0214134 A1).
Re claim 12:
Rios does not explicitly disclose 12. The system of claim 1, wherein the acupoint information stored in the acupoint database is the acupoint information published according to the World Health Organization's standard acupuncture point positioning. Peterson teaches the missing features (Peterson, [0004]). Therefore, in view of Peterson, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the system described in Rios, by providing the WHO standard acupuncture point positioning as taught by Peterson, so that the acupuncture locations are certified by a known organization to ensure the safety and effectiveness of the treatment.
Response to Arguments
Applicant's arguments filed 2/2/2026 have been fully considered but they are not persuasive.
Applicant argues Rios discloses tracking injection depth (please see Rios, paragraph [0010] on page 2) but does not teach a skin distance model that provides the operator with a sense of distance depth after touching the skin … The combination of Rios and Miller fails to teach or suggest this critical feature, i.e., a skin distance model.
The examiner submits that Rios teaches a skin distance model which includes a plurality of layers (Rios, [0052], “The ability to visualize the tissue layers as the user is performing a procedure can help the user perform the procedure at an optimal location”; [0060], “The computer-generated image can correspond the facial features (or other anatomical feature) of an actual patient (e.g., the skin tone, skin type, or facial features) or any layer of anatomy (e.g., bones, nerves, blood vessels, or the like)”). Furthermore, Rios teaches that a computer-generated image(s) may be used in conjunction with the medical injection treatment so that the locations of structures in the image correspond to the actual location of those structures in the treatment target 116. The ability to visualize the tissue layers as the user is performing a procedure can help the user perform the procedure at an optimal location. For example, the user can be instructed by the injection aid system 100 via the display device 108 to guide the injection tool 112 to an appropriate depth for a therapeutic injection. The movement of the injection tool 112 can be projected on the display device 108 in real-time during the treatment procedure (Rios, [0052]), and the treatment target can provide tactile feedback by applying force (actively or passively) in response to the injection motions or other physical interaction of the user (Rios, [0044]). Rios further suggests that an augment to the alarms/alerts on the injection system can provide audio, visual or tactile feedback confirming the correct injection technique (Rios, [0120]), wherein the audio, visual or tactile feedback can be interpreted as a “sense of distance” after touching the skin. Therefore, Rios teaches a skin distance model which is capable of providing audio, visual and tactile feedback that guides the user to reach the optimal location and depth.
Applicant argues: Miller's spherical phantoms (diameter 25 mm or 8 mm) are merely static calibration objects for medical imaging equipment. (please see Miller, paragraph [0056] and [0060] on page 4) Miller does not teach constructing a spherical acupoint model based on acupuncture depth data, nor integrating such a model with a skin distance model for needle depth simulation. A person having ordinary skill in the art would not be motivated to apply Miller's imaging calibration phantoms to construct functional acupoint models in a VR acupuncture training system … Although Rios focuses on aid, training, evaluation, and certification of health-care professionals performing a variety of medical injection procedures including prophylactic, curative, therapeutic, acupuncture, or cosmetic treatments (please see Rios, paragraph [0041] on page 4), and Tian discusses general VR input devices, Miller relates to imaging equipment calibration. A person having ordinary skill in the art would not be motivated to combine these disparate references to arrive at the claimed acupuncture training system with its specific technical features.
The Office submits that the newly cited reference, Li et al. (CN 101958079 A, with machine translation), teaches the cited limitation of a spherical acupoint model, as set forth in the rejection of claim 1 above. Accordingly, the arguments directed to Miller are moot because Miller is no longer relied upon.
Applicant argues: Guo discloses a general acupuncture teaching system with database comparison and automatic scoring but does not teach integrating such evaluation features with the specific 3D spherical acupoint model constructed based on depth data and the skin distance model that provides tactile depth sensation as claimed in the present invention.
The Office submits that Rios in combination with Li teaches the required limitations: a 3D spherical acupoint model constructed based on depth data, and a skin distance model that provides tactile depth sensation, as discussed in the rejection of claim 1 above.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JACK YIP whose telephone number is (571) 270-5048. The examiner can normally be reached Monday through Friday, 9:00 AM - 5:00 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, XUAN THAI can be reached at (571) 272-7147. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JACK YIP/ Primary Examiner, Art Unit 3715