Prosecution Insights
Last updated: April 19, 2026
Application No. 18/237,111

Method For Training And Quantifying Specific Motor Skills And Cognitive Processes In Persons By Analysing Oculomotor Patterns Using A 3-D Virtual Reality Device With Embedded Eye-Tracking Technology, Specific Visual Stimuli, And Sensors To Show The Movement Of Limbs

Non-Final OA: §101, §103, §112
Filed: Aug 23, 2023
Examiner: HOFFPAUIR, ANDREW ELI
Art Unit: 3791
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: VIEWMIND, INC.
OA Round: 1 (Non-Final)
Grant Probability: 39% (At Risk)
Estimated OA Rounds: 1-2
Estimated Time To Grant: 3y 12m
Grant Probability With Interview: 80%

Examiner Intelligence

Career Allow Rate: 39% (29 granted / 75 resolved; -31.3% vs TC avg)
Interview Lift: +41.1% (allowance rate among resolved cases with an interview vs without)
Avg Prosecution: 3y 12m (typical timeline); 61 applications currently pending
Career History: 136 total applications across all art units
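The headline figures in this panel are simple ratios over the examiner's resolved cases. A minimal sketch of the arithmetic (the function names and the with/without-interview comparison are illustrative assumptions, not the dashboard's actual implementation):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate: granted cases as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(rate_with_interview: float, rate_without: float) -> float:
    """Percentage-point gap between allowance rates with and without an interview."""
    return rate_with_interview - rate_without

# Figures shown above: 29 granted out of 75 resolved cases.
rate = allow_rate(29, 75)
print(f"Career allow rate: {rate:.1f}%")  # 38.7%, displayed as 39%
```

The reported +41.1% interview lift is consistent with this framing: an 80% allowance rate for interviewed cases against a baseline near the 39% career rate implies a lift of roughly 41 percentage points.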

Statute-Specific Performance

§101: 18.4% (-21.6% vs TC avg)
§103: 44.5% (+4.5% vs TC avg)
§102: 8.4% (-31.6% vs TC avg)
§112: 27.4% (-12.6% vs TC avg)

Tech Center averages are estimates; figures are based on career data from 75 resolved cases.
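Each delta above is just the examiner's per-statute rate minus the Tech Center average. A quick reconstruction (a sketch; the dict literals simply restate the figures shown, and the delta definition is an assumption) shows the implied TC average works out to the same 40.0% for every statute, consistent with a single overall baseline estimate:

```python
# Examiner's statute-specific rates and reported deltas vs. Tech Center average.
examiner_rate = {"§101": 18.4, "§103": 44.5, "§102": 8.4, "§112": 27.4}
delta_vs_tc   = {"§101": -21.6, "§103": 4.5, "§102": -31.6, "§112": -12.6}

# Implied TC average, assuming delta = examiner rate - TC average.
tc_average = {s: round(examiner_rate[s] - delta_vs_tc[s], 1) for s in examiner_rate}
print(tc_average)  # every statute implies a 40.0% TC average
```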

Office Action

Rejections under §101, §103, and §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Priority

Applicant’s claim for the benefit of a prior-filed application under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, 365(c), or 386(c) is acknowledged. Applicant has not complied with one or more conditions for receiving the benefit of an earlier filing date under 35 U.S.C. 119(e) or under 35 U.S.C. 120, 121, 365(c), or 386(c) as follows: The later-filed application must be an application for a patent for an invention which is also disclosed in the prior application (the parent or original nonprovisional application or provisional application). The disclosure of the invention in the parent application and in the later-filed application must be sufficient to comply with the requirements of 35 U.S.C. 112(a) or the first paragraph of pre-AIA 35 U.S.C. 112, except for the best mode requirement. See Transco Products, Inc. v. Performance Contracting, Inc., 38 F.3d 551, 32 USPQ2d 1077 (Fed. Cir. 1994). The disclosure of the prior-filed applications, Application No. 18/227,577 and Provisional Application No. 62/373,228, fails to provide adequate support or enablement in the manner provided by 35 U.S.C. 112(a) or pre-AIA 35 U.S.C. 112, first paragraph, for one or more claims of this application.
Regarding claims 1 and 8, the limitations “identify[ing] selected ones of the measured eye and limb movements that are related to the performance, motor skills and cognitive capabilities of the person; determin[ing] expected eye and limb movements of the person while the person is viewing the virtual objects and performing the tasks and comparing the expected eye and limb movements to the selected ones of the measured eye and limb movements to determine deviations therebetween; and evaluating the performance, motor skills and cognitive capabilities of the person based on the deviations” are not supported in the disclosure as originally filed in the prior-filed applications. The disclosure at the time of the effective filing date of Application No. 18/227,577 does not explicitly disclose a 3-D virtual reality environment, motion sensors, or measuring limb movements, and does not disclose identify[ing] selected ones of the measured eye and limb movements that are related to the performance, motor skills and cognitive capabilities of the person; determin[ing] expected eye and limb movements of the person while the person is viewing the virtual objects and performing the tasks and comparing the expected eye and limb movements to the selected ones of the measured eye and limb movements to determine deviations therebetween; and evaluating the performance, motor skills and cognitive capabilities of the person based on the deviations. The disclosure at the time of the effective filing date of Provisional Application No. 62/373,228 does disclose a 3-D virtual reality environment (page 2), motion sensors to measure arm and leg movement/touch of objects with hands and feet (pages 1-3), and using eye tracking for registering/measuring saccades/eye movements (pages 1-2). However, the disclosure at the time of the effective filing date of Provisional Application No.
62/373,228 does not explicitly disclose the term “comparing” and does not disclose identify[ing] selected ones of the measured eye and limb movements that are related to the performance, motor skills and cognitive capabilities of the person; determin[ing] expected eye and limb movements of the person while the person is viewing the virtual objects and performing the tasks and comparing the expected eye and limb movements to the selected ones of the measured eye and limb movements to determine deviations therebetween; and evaluating the performance, motor skills and cognitive capabilities of the person based on the deviations. Accordingly, claims 1-8 are not entitled to the benefit of the prior applications and will be treated with an effective filing date of August 23, 2023.

Drawings

The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description: “25” and “35” in fig. 1; and “535” in fig. 5 (should be 530, as mentioned in [0157] of the PGPUB). Corrected drawing sheets in compliance with 37 CFR 1.121(d), or an amendment to the specification to add the reference character(s) to the description in compliance with 37 CFR 1.121(b), are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Objections

Claims 5 and 7 are objected to because of the following informalities: Claim 5, line 5: “an average saccadic latency” should recite “the average saccadic latency representing …”. Claim 7, line 17: the punctuation “.” should be replaced by “;”. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-7 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 (and claims 2-7 by virtue of dependency) recites the limitation “the subject” in lines 6 and 10. There is insufficient antecedent basis for this limitation in the claim. The limitation is suggested to recite “the person” and will be interpreted as such for examination purposes.

Regarding claim 4, the phrase “although not limited to” renders the claim indefinite because it is unclear whether the limitation(s) following the phrase are part of the claimed invention. See MPEP § 2173.05(d). The limitation is suggested to recite “the different specified feature of the virtual objects is color”.

Regarding claim 5, the phrase “i.e.” renders the claim indefinite because it is unclear whether the limitation(s) (how many times the person touches the incorrect objects) following the phrase are part of the claimed invention.
See MPEP § 2173.05(d). The limitation is suggested to recite “wherein the inhibition process errors is how many times the person touches the incorrect objects”.

Claim 7 recites the limitations “said stimulus image”, “the gaze duration”, “the visual stimulus”, “the left eye, the right eye”, “the hands and/or the feet”, “the subject”, “the hands”, “the touched green objects”, and “the average time” in lines 5, 6, 9, 11, 13, 15, 20, 21, and 23, respectively. There is insufficient antecedent basis for these limitations in the claim.

Claim 7 is rejected as failing to define the invention in the manner required by 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph. The claim is narrative in form and replete with indefinite language. The structure which goes to make up the device/method must be clearly and positively specified. The structure/method must be organized and correlated in such a manner as to present a complete operative device/method. The claim(s) must be in one sentence form only. Note the format of the claims in the patent(s) cited.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-8 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.

STEP 1

Claim 1 recites a method, including determining expected eye and limb movements of the person while the person is viewing the virtual objects and performing the tasks and comparing the expected eye and limb movements to the selected ones of the measured eye and limb movements to determine deviations therebetween, and claim 8 recites a machine including a 3D virtual reality device, which are both statutory categories of invention.
Claims 1-8 are all within at least one of the four categories.

STEP 2A, PRONG ONE

The claim is then analyzed to determine whether it is directed to any judicial exception. The above claim limitations (identifying, determining, evaluating) constitute an abstract idea that is part of the Mathematical Concepts and/or Mental Processes groups identified in the 2019 Revised Patent Subject Matter Eligibility Guidance published in the Federal Register (84 FR 50) on January 7, 2019. The claimed steps of identifying, determining, and evaluating can be practically performed in the human mind using mental steps or basic critical thinking, which are types of activities that have been found by the courts to represent abstract ideas. “[T]he ‘mental processes’ abstract idea grouping is defined as concepts performed in the human mind, and examples of mental processes include observations, evaluations, judgments, and opinions.” MPEP 2106.04(a)(2) III. The pending claims merely recite steps for evaluating performance, motor skills and cognitive capabilities that include identifying, determining, and evaluating. Examples of ineligible claims that recite mental processes include: a claim to “collecting information, analyzing it, and displaying certain results of the collection and analysis,” where the data analysis steps are recited at a high level of generality such that they could practically be performed in the human mind, Electric Power Group, LLC v. Alstom, S.A.; claims to “comparing BRCA sequences and determining the existence of alterations,” where the claims cover any way of comparing BRCA sequences such that the comparison steps can practically be performed in the human mind, University of Utah Research Foundation v. Ambry Genetics Corp.; and a claim to collecting and comparing known information, which are steps that can be practically performed in the human mind, Classen Immunotherapies, Inc. v. Biogen IDEC. See pp. 7-8 of the October 2019 Update: Subject Matter Eligibility.
Regarding the dependent claims, the dependent claims are directed to either 1) steps that are also abstract or 2) additional data gathering/output that is well-understood, routine and previously known to the industry. Although the dependent claims are further limiting, they do not recite significantly more than the abstract idea. A narrow abstract idea is still an abstract idea, and an abstract idea with additional well-known equipment/functions is not significantly more than the abstract idea. Claims 2-7 are directed to more abstract ideas.

STEP 2A, PRONG TWO

Next, the claim as a whole is analyzed to determine whether the claim recites additional elements that integrate the judicial exception into a practical application. The claim fails to recite an additional element or a combination of additional elements that apply, rely on, or use the judicial exception in a manner that imposes a meaningful limitation on the judicial exception. This judicial exception (abstract idea) in Claims 1-8 is not integrated into a practical application because: The abstract idea amounts to simply implementing the abstract idea on a computer. For example, the recitations regarding the generic computing components for identifying, determining, and evaluating merely invoke a computer as a tool. The data-gathering steps (requesting, repeating, measuring) and the data-output step do not add a meaningful limitation to the method as they are insignificant extra-solution activity. There is no improvement to a computer or other technology. “The McRO court indicated that it was the incorporation of the particular claimed rules in computer animation that ‘improved [the] existing technological process’, unlike cases such as Alice where a computer was merely used as a tool to perform an existing process.” MPEP 2106.05(a) II. The claims recite a computer that is used as a tool for identifying, determining, and evaluating.
The claims do not apply the abstract idea to effect a particular treatment or prophylaxis for a disease or medical condition. Rather, the abstract idea is utilized to determine a relationship among data to evaluate the performance, motor skills and cognitive capabilities of a person. The claims do not apply the abstract idea to a particular machine. “Integral use of a machine to achieve performance of a method may provide significantly more, in contrast to where the machine is merely an object on which the method operates, which does not provide significantly more.” MPEP 2106.05(b) II. “Use of a machine that contributes only nominally or insignificantly to the execution of the claimed method (e.g., in a data gathering step or in a field-of-use limitation) would not provide significantly more.” MPEP 2106.05(b) III. The pending claims utilize a computer for identifying, determining, and evaluating. The claims do not apply the obtained calculation to a particular machine. Rather, the data is merely output in a post-solution step.

STEP 2B

Next, the claim as a whole is analyzed to determine whether any element, or combination of elements, is sufficient to ensure that the claim amounts to significantly more than the exception. Besides the abstract idea, the additional elements are identified as follows: three-dimensional (3D) virtual reality device; eye tracker; motion sensors; processor. Those in the relevant field of art would recognize the above-identified additional elements as being well-understood, routine, and conventional means for data-gathering and computing, as demonstrated by Applicant’s specification (e.g., para.
[0073-0074, 0134]), which discloses that the three-dimensional (3D) virtual reality device, eye tracker, and processor comprise generic computer components that are configured to perform data-gathering and the generic computer functions (e.g., identifying, determining, evaluating) that are well-understood, routine, and conventional activities previously known to the pertinent industry; by Applicant’s Background in the specification; and by the non-patent literature of record in the application:

Fooken et al., “Decoding go/no-go decisions from eye movements,” Journal of Vision 2019;19(2):5. https://doi.org/10.1167/19.2.5
Clay V, König P, König S., “Eye Tracking in Virtual Reality,” J Eye Mov Res. 2019 Apr 5;12(1). doi: 10.16910/jemr.12.1.3. PMID: 33828721; PMCID: PMC7903250
Kim, J., Jang, H., Kim, D., & Lee, J. (2023), “Exploration of the Virtual Reality Teleportation Methods Using Hand-Tracking, Eye-Tracking, and EEG,” International Journal of Human–Computer Interaction, 39(20), 4112–4125. https://doi.org/10.1080/10447318.2022.2109248 (published online 17 Aug 2022)
Laivuori, N. (2021), “Eye and Hand Tracking in VR Training Application,” Theseus.fi. http://www.theseus.fi/handle/10024/503405

Thus, the claimed additional elements “are so well-known that they do not need to be described in detail in a patent application to satisfy 35 U.S.C. § 112(a).” Berkheimer Memorandum, III. A. 3. Furthermore, the court decisions discussed in MPEP § 2106.05(d)(II) note the well-understood, routine and conventional nature of such additional generic computer components as those claimed. See option III. A. 2. in the Berkheimer memorandum. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the units associated with the steps do not add a meaningful limitation to the abstract idea. A computer, processor, memory, or equivalent hardware is merely used as a tool for executing the abstract idea(s).
The process claimed does not reflect an improvement in the functioning of the computer. When considered in combination, the additional elements (i.e., the generic computer functions and conventional equipment/steps) do not amount to significantly more than the abstract idea. Looking at the claim limitations as a whole adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3.
Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 1-4 and 8 are rejected under 35 U.S.C. 103 as being unpatentable over Ettenhofer (US 20160022136 A1) in view of Josephson (US 20220270509 A1).

Regarding claim 1, Ettenhofer discloses a method for evaluating performance, motor skills and cognitive capabilities of a person (“performance”; “neurocognitive function”; “motor function”, Abstract, para. [0046, 0188, 0145]), comprising: requesting a person to perform a task (“visual test … task”, para. [0054-0058, 0136]), the task requesting the person to virtually touch specified virtual objects each having a different specified feature (“virtual button … touch-sensitive screen … target … press the button”; “different locations and timings”, para. [0058, 0065, 0079-0081, 0086, 0150-0164], figs. 1-2), the specified virtual objects being presented in a three-dimensional (3D) virtual environment (“visual environment”; “three-dimensional embodiments … target signal … cue signals”, para.
[0047, 0061-0062]); repeating the requesting by requesting the person to perform the task a plurality of times for different ones of the virtual objects having different specified features (“series of visual tests”; “different locations and timings”; “varying task characteristics across multiple trials”; “cue signals … different trial types were mixed within each subtest”, para. [0054, 0086, 0144-0145, 0150-0164], figs. 4-5); measuring eye movements and limb movements of the person while the subject is viewing the virtual objects and performing the tasks (“track eye movement … body part movements … detected”; “eye movement latency”; “manual response time … body part movement latency”, para. [0077-0081, 0096-0097, 0104-0105]); identifying selected ones of the measured eye and limb movements that are related to the performance, motor skills and cognitive capabilities of the person (“eye movement latency”; “body part movement latency”, para. [0096, 0104, 0123]); determining expected eye and limb movements of the person while the person is viewing the virtual objects and performing the tasks (“normative database … individual's own previous performance on the same assessment”, para. [0096, 0104, 0123-0125]) and comparing the expected eye and limb movements to the selected ones of the measured eye and limb movements to determine deviations therebetween (“metrics … compared to a normative database … identify changes in performance over time … significantly different … standardized score representing the degree of similarity of an individual to a specified population”; “scores that are relevant to neurological and/or psychological status”; “differential patterns”, para.
[0030, 0117, 0123-0125, 0188]); evaluating the performance, motor skills and cognitive capabilities of the person based on the deviations (“scores that are relevant to neurological and/or psychological status”; “diagnostic algorithm was applied … variability … normal … marginal … impaired … number and severity of impairment”, para. [0122-0128, 0181, 0185]).

Ettenhofer does not disclose the specified virtual objects being presented in a three-dimensional (3D) virtual reality environment, the virtual objects moving toward or away from the subject with a defined speed, acceleration and direction. However, Josephson, directed to constructing training programs or routines and predictive training programs and routines implemented in VR, AR, MR or XR environments including virtual objects (para. [0005, 0056]), discloses specified virtual objects being presented in a three-dimensional (3D) virtual reality environment (“selectable objects a-jj”, para. [0005, 0056, 0367-0368], fig. 3A), the virtual objects moving toward or away from the subject with a defined speed, acceleration and direction (“object … controllable”; “toward … velocity … acceleration … direction”, para. [0056, 0088, 0097]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Ettenhofer to include a 3D virtual reality environment such that the specified virtual objects are presented in the three-dimensional (3D) virtual reality environment, the virtual objects moving toward or away from the subject with a defined speed, acceleration and direction, in view of the teachings of Josephson, as this will aid in constructing training programs and routines for conducting assessments of the neurological and/or psychological status of a subject.
Regarding claim 2, Ettenhofer, as modified by Josephson hereinabove, discloses the method of claim 1, wherein the eye movements that are measured include at least one of a saccade amplitude, fixation duration and pupil behavior (“subject’s pupil’s dilation and/or constriction is detected … indicate distraction”; “visual reaction time … beginning of fixation”, para. [0077-0078, 0096, 0137-0138], fig. 1).

Regarding claim 3, Ettenhofer, as modified by … hereinabove, discloses the method of claim 1, wherein the limb movements that are measured include a limb reaction time needed to perform a requested task (“manual reaction time … body movement part latency”, para. [0104]).

Regarding claim 4, Ettenhofer, as modified by … hereinabove, discloses the method of claim 1, wherein the different specified feature of the virtual objects is color (although not limited to) (“color”; “red”, para. [0063, 0067, 0081]).

Regarding claim 8, Ettenhofer discloses a system for evaluating performance, motor skills and cognitive capabilities of a person (“performance”; “neurocognitive function”; “motor function”, Abstract, para. [0046, 0188, 0145]), comprising: a device (“computer system … electronic screen”, para. [0060-0061, 0140-0143, 0146]) configured to establish a 3D virtual environment in which a plurality of virtual objects is presented to the person (“visual environment”; “displays … three-dimensional embodiments … target signal … cue signals”; “two or more cue signals, simultaneously”, para. [0047, 0060-0062, 0065]), the objects having at least one feature that differs from one another (“visual cue signals … target signals … different locations and timings”, para. [0058, 0079-0081, 0086, 0150-0164], figs. 1-2); an eye-tracker (“eye tracker”, para. [0171]) configured to measure eye movements of the person while the person is viewing the virtual objects and performing requested tasks (“track eye movement … visual reaction time … eye movement latency”, para.
[0077-0081, 0096-0097, 0171]), the requested tasks including multiple requests requesting the person to virtually touch specified virtual objects each having one of the specified features (“series of visual tests”; “different locations and timings”; “varying task characteristics across multiple trials”; “cue signals … different trial types were mixed within each subtest”, para. [0054, 0086, 0144-0145, 0150-0164], figs. 4-5); one or more motion sensors (“button, joystick, virtual button”; “body movement sensors”, para. [0079, 0104, 0171]) configured to measure limb movements of the person while the person performs the requested tasks (“body part movements … detected”; “manual response time … body part movement latency”, para. [0059, 0077-0081, 0096-0097, 0104-0105]); a processor (“processor”, para. [0140-0141]) configured to receive data from the device, the eye-tracker and the one or more motion sensors while the person is performing the requested tasks (“automatically … store … data”; “retrieval”, para. [0060-0061, 0127, 0143-0148]) and being further configured to (i) identify selected ones of the measured eye and limb movements that are related to the performance, motor skills and cognitive capabilities of the person (“eye movement latency”; “body part movement latency”, para. [0096, 0104, 0123]); (ii) determine expected eye and limb movements of the person while the person is viewing the virtual objects and performing the requested tasks (“normative database … individual's own previous performance on the same assessment”, para.
[0096, 0104, 0123-0125]) and compare the expected eye and limb movements to the selected ones of the measured eye and limb movements to determine deviations therebetween (“metrics … compared to a normative database … identify changes in performance over time … significantly different … standardized score representing the degree of similarity of an individual to a specified population”; “scores that are relevant to neurological and/or psychological status”; “differential patterns”, para. [0030, 0117, 0123-0125, 0188]); and (iii) evaluate the performance, motor skills and cognitive capabilities of the person based on the deviations (“scores that are relevant to neurological and/or psychological status”; “diagnostic algorithm was applied … variability … normal … marginal … impaired … number and severity of impairment”, para. [0122-0128, 0181, 0185]).

Ettenhofer does not disclose a three-dimensional (3D) virtual reality device configured to establish a 3D virtual reality environment in which a plurality of virtual objects is presented to the person, the objects moving toward or away from the person with a defined speed, acceleration and direction. However, Josephson, directed to constructing training programs or routines and predictive training programs and routines implemented in VR, AR, MR or XR environments including virtual objects (para. [0005, 0056]), discloses a three-dimensional (3D) virtual reality device (“computer”; apparatus 300, para. [0005, 0056, 0367-0368]) configured to establish a 3D virtual reality environment (3D environment 302; “3D, and/or nD virtual reality (VR) environments”, para. [0005, 0056, 0367-0368]) in which a plurality of virtual objects is presented to the person (“selectable objects a-jj”, para. [0005, 0056, 0368], fig.
3A), the objects moving toward or away from the person with a defined speed, acceleration and direction (“object … controllable”; “toward … velocity … acceleration … direction”, para. [0056, 0088, 0097]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ettenhofer such that the system comprises the three-dimensional (3D) virtual reality device configured to establish a 3D virtual reality environment in which a plurality of virtual objects is presented to the person, the objects moving toward or away from the person with a defined speed, acceleration and direction, in view of the teachings of Josephson, as this will aid in constructing training programs and routines for conducting assessments of the neurological and/or psychological status of a subject.

Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Ettenhofer in view of Josephson, as applied to claim 1 above, and further in view of Fernandez (US 20210174959 A1).

Regarding claim 5, Ettenhofer, as modified by Josephson, discloses the method of claim 1, wherein the evaluating includes determining metrics that include: i. an inhibition process error (i.e., how many times the person touches the incorrect objects) (“manual omission errors … manual inhibition errors”, para. [0105-0107, 0145, 0148]); ii. a saccadic latency, the saccadic latency representing an amount of time needed for the person to initiate a saccade to view a successively viewed object (“Saccadic reaction time (RT) … time … is calculated”, para. [0096-0098]). Ettenhofer does not disclose determining metrics that include an average saccadic latency. However, Fernandez, directed to systems useful for detecting neurological disorders and for measuring general cognitive performance, discloses determining metrics that include an average saccadic latency (“average saccadic latency”, para. [0068, 0258]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ettenhofer, as modified by Josephson hereinabove, such that the evaluating includes determining metrics that include an average saccadic latency, in view of the teachings of Fernandez, as this will aid in assessing the neurological and/or psychological status of a subject based on the average saccadic latency.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Ettenhofer in view of Josephson, as applied to claim 1 above, and further in view of Reneker (US 20220313143 A1), and further in view of Charvat (US 10188337 B1).

Regarding claim 6, Ettenhofer, as modified by Josephson hereinabove, discloses the method of claim 1, wherein the evaluating further comprises determining (ii) a degree of compromise in executive processes, with increased inhibition error (“trends … inhibition error trend: more or less over time”; “neuropsychological evaluation … evaluation executive functions”; “normal … marginal … impaired … scores”, para. [0064, 0101, 0107, 0121-0122, 0145, 0181]). Ettenhofer, as modified by Josephson hereinabove, does not disclose wherein the evaluating further comprises determining (i) a degree of compromise in processing speed with increased changes in the speed at which successive objects are presented to the person. However, Reneker, directed to a method and device for testing sensorimotor control to detect neurological impairment, discloses determining (i) a degree of compromise in sensorimotor control with increased changes in the speed at which successive objects are presented to the person (“trials … object increasing in speed”; “degree of sensorimotor control”, para. [0036, 0065, 0080-0083]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ettenhofer, as modified by Josephson hereinabove, such that the evaluating further comprises determining (i) a degree of sensorimotor control in processing speed with increased changes in the speed at which successive objects are presented to the person, in view of the teachings of Reneker, in order to identify the degree of neurological impairment.

Ettenhofer, as modified by Josephson and Reneker hereinabove, does not disclose a degree of compromise in processing speed. However, Charvat, directed to computer-implemented measurement of responses to neuropsychiatric tests and automated analysis, discloses a degree of compromise in processing speed (“generate … processing speed values or scores … brain type assessment”, col. 16, line 62 – col. 17, line 67). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ettenhofer, as modified by Josephson and Reneker hereinabove, to determine a compromise in processing speed, in view of the teachings of Charvat, as this will aid in identifying the degree of neurological impairment.

Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Ettenhofer in view of Josephson, as applied to claim 1 above, and further in view of Baeuerle (US 20190307384 A1).

Regarding claim 7, Ettenhofer, as modified by Josephson hereinabove, discloses the method of claim 1 further comprising obtaining one or more additional measurements while the person is viewing the virtual objects and performing the tasks (“additionally … overshoot error”; “additional sensors … inputs … heart rate”; “additional variables”; para. [0078, 0102, 0171, 0181]). Ettenhofer, as modified by Josephson hereinabove, does not disclose the one or more additional measurements being selected from the group consisting of: i.
an amplitude of pupillary dilatation of the person; ii. a number of fixations made by the person on said stimulus image; iii. the gaze duration by the person on the stimulus image; iv. binocular disparity by the person while performing visual exploration and objects visualization; v. target touched by person and fixations of where the visual stimulus was before; vi. number of consecutive object touched by person when performing a trial; vii. Number of blinks coming from the left eye, the right eye or from both eyes; viii. Time taken to visually detect objects; x. Time since the person start to move the hands and/or feet up the time the person touches or tries to touch the object; xi. Number of times the subject touch -or not- the virtual objects; xii. Optimal place for target visualization and places where visualizing objects is less efficient. xii. Tracking Accuracy in maintaining visual focus on moving objects; xiii. Hand-Reach Depth towards the objects during the act of touching; xiv. Max Hand Velocity achieved by the hands during the movement towards the touched green objects; xv. Dominant Hand Ratio of preferred hand usage during manual interactions; xvi. Prediction Time, measuring the average time it takes for the person to anticipate and initiate a response following visual cues; and xii. Microsaccades.

However, Baeuerle, directed to identifying and measuring bodily states and feedback systems, discloses the one or more additional measurements being selected from the group consisting of: i. an amplitude of pupillary dilatation of the person (“pupillometry”, para. [0099]); ii. a number of fixations made by the person on said stimulus image (“fixations”, para. [0042]); iii. the gaze duration by the person on the stimulus image (“gaze tracking … duration”, para. [0041-0042]); iv. binocular disparity by the person while performing visual exploration and objects visualization; v. target touched by person and fixations of where the visual stimulus was before (“number of targets acquired”, para. [0080]); vi. number of consecutive object touched by person when performing a trial (para. [0080]); vii. Number of blinks coming from the left eye, the right eye or from both eyes (“blink frequency”, para. [0042]); viii. Time taken to visually detect objects; x. Time since the person start to move the hands and/or feet up the time the person touches or tries to touch the object (target acquisition time, para. [0080]); xi. Number of times the subject touch -or not- the virtual objects (“number of targets hit as a percentage”, para. [0080]); xii. Optimal place for target visualization and places where visualizing objects is less efficient. xii. Tracking Accuracy in maintaining visual focus on moving objects; xiii. Hand-Reach Depth towards the objects during the act of touching; xiv. Max Hand Velocity achieved by the hands during the movement towards the touched green objects; xv. Dominant Hand Ratio of preferred hand usage during manual interactions; xvi. Prediction Time, measuring the average time it takes for the person to anticipate and initiate a response following visual cues; and xii. Microsaccades (“micro-saccades”, para. [0042]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ettenhofer, as modified by Josephson hereinabove, such that the one or more additional measurements being selected from the group consisting of: i. an amplitude of pupillary dilatation of the person; ii. a number of fixations made by the person on said stimulus image; iii. the gaze duration by the person on the stimulus image; iv. binocular disparity by the person while performing visual exploration and objects visualization; v. target touched by person and fixations of where the visual stimulus was before; vi. number of consecutive object touched by person when performing a trial; vii.
Number of blinks coming from the left eye, the right eye or from both eyes; viii. Time taken to visually detect objects; x. Time since the person start to move the hands and/or feet up the time the person touches or tries to touch the object; xi. Number of times the subject touch -or not- the virtual objects; xii. Optimal place for target visualization and places where visualizing objects is less efficient. xii. Tracking Accuracy in maintaining visual focus on moving objects; xiii. Hand-Reach Depth towards the objects during the act of touching; xiv. Max Hand Velocity achieved by the hands during the movement towards the touched green objects; xv. Dominant Hand Ratio of preferred hand usage during manual interactions; xvi. Prediction Time, measuring the average time it takes for the person to anticipate and initiate a response following visual cues; and xii. Microsaccades, in view of the teachings of Baeuerle, as this will aid in identifying and measuring bodily states/physiological status based on measures of performance dynamics.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure: Fooken et al., directed to decoding go/no-go decisions from eye movements.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREW ELI HOFFPAUIR, whose telephone number is (571) 272-4522. The examiner can normally be reached Monday-Friday, 8:00-5:00.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Charles Marmor II, can be reached at (571) 272-4730. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.

For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/A.E.H./Examiner, Art Unit 3791
/AURELIE H TU/Primary Examiner, Art Unit 3791

Prosecution Timeline

Aug 23, 2023
Application Filed
Feb 02, 2026
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this examiner for similar technology

Patent 12593987
FOREHEAD TEMPERATURE MEASUREMENT SYSTEM WITH HIGH ACCURACY
2y 5m to grant · Granted Apr 07, 2026
Patent 12564423
SYSTEMS AND METHODS FOR ACCESSING A RENAL CAPSULE FOR DIAGNOSTIC AND THERAPEUTIC PURPOSES
2y 5m to grant · Granted Mar 03, 2026
Patent 12533043
DEVICE FOR PROCESSING AND VISUALIZING DATA OF AN ELECTRIC IMPEDANCE TOMOGRAPHY APPARATUS FOR DETERMINING AND VISUALIZING REGIONAL VENTILATION DELAYS IN THE LUNGS
2y 5m to grant · Granted Jan 27, 2026
Patent 12521023
TEMPERATURE SELF-COMPENSATION INTERVENTIONAL OPTICAL FIBER PRESSURE GUIDEWIRE AND WIRELESS FFR MONITOR
2y 5m to grant · Granted Jan 13, 2026
Patent 12502514
Vascular Access Device Adapter
2y 5m to grant · Granted Dec 23, 2025
Based on the 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
39%
Grant Probability
80%
With Interview (+41.1%)
3y 12m
Median Time to Grant
Low
PTA Risk
Based on 75 resolved cases by this examiner. Grant probability derived from career allow rate.
