Prosecution Insights
Last updated: April 19, 2026
Application No. 18/261,386

METHOD AND APPARATUS FOR DIAGNOSING DIZZINESS THROUGH EYE MOVEMENT MEASUREMENT BASED ON VIRTUAL REALITY, RECORDING MEDIUM STORING PROGRAM FOR REALIZING THE SAME, AND COMPUTER PROGRAM STORED IN RECORDING MEDIUM

Non-Final OA: §101, §103, §112
Filed: Jul 13, 2023
Examiner: HEALY, NOAH MICHAEL
Art Unit: 3791
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Neuroears Co. Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 69% (Favorable)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 3y 4m
Grant Probability With Interview: 99%

Examiner Intelligence

Grants: 69% career allow rate (above average); 25 granted / 36 resolved; -0.6% vs Tech Center average
Interview lift: strong, +40.7% allowance lift in resolved cases with an interview vs. without
Typical timeline: 3y 4m average prosecution; 48 applications currently pending
Career history: 84 total applications across all art units
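The headline figures above are simple ratios over the examiner's resolved cases. A minimal sketch of how an analytics tool might derive them (the with/without-interview split below is hypothetical; only the aggregate +40.7% lift is reported in the source data):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Aggregate figures shown on the dashboard.
rate = allow_rate(25, 36)
print(f"Career allow rate: {rate:.0f}%")  # prints "Career allow rate: 69%"

# Interview lift = allow rate with interview minus allow rate without.
# The per-bucket counts are HYPOTHETICAL illustrations chosen to be
# consistent with the reported +40.7% lift; they are not source data.
with_interview = allow_rate(14, 15)      # hypothetical: 14 of 15 allowed
without_interview = allow_rate(11, 21)   # hypothetical: 11 of 21 allowed
lift = with_interview - without_interview
print(f"Interview lift: {lift:+.1f} percentage points")
```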

Statute-Specific Performance

§101: 12.1% (-27.9% vs TC avg)
§103: 38.6% (-1.4% vs TC avg)
§102: 18.6% (-21.4% vs TC avg)
§112: 27.9% (-12.1% vs TC avg)
Tech Center averages are estimates • Based on career data from 36 resolved cases
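The per-statute deltas above can be sanity-checked against the implied Tech Center baseline. Assuming each delta is the examiner's rate minus the TC average (an assumption about how the dashboard computes it), every statute implies the same ~40% baseline:

```python
# Examiner per-statute rate (%) and reported delta vs the TC average (%).
stats = {
    "101": (12.1, -27.9),
    "103": (38.6, -1.4),
    "102": (18.6, -21.4),
    "112": (27.9, -12.1),
}

for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # implied Tech Center average
    print(f"§{statute}: examiner {rate}%, implied TC avg {tc_avg:.1f}%")
    # Every statute implies a 40.0% baseline, so the deltas are
    # internally consistent with a single TC-average estimate.
```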

Office Action

Rejections: §101, §103, §112
DETAILED ACTION

Claims 1-17 are pending and under examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Specification

The disclosure is objected to because of the following informalities: on page 2, line 14, “nystagmography” is repeated twice. Appropriate correction is required.

Claim Objections

Claims 5 and 12 are objected to because of the following informalities: “(horizon/vertical/torsion)” should read “(horizontal/vertical/torsional)”. Appropriate correction is required.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. That presumption is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitations use a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are:

- “a head movement detection unit”, first recited in claim 1;
- “an eye movement video acquisition unit”, first recited in claim 1;
- “an eye movement video transmitting unit”, first recited in claim 1;
- “a nystagmus detection unit”, first recited in claim 1;
- “a diagnostic unit”, first recited in claim 1;
- “an arithmetic unit”, first recited in claim 3;
- “a graph output unit”, first recited in claim 5;
- “a gaze guidance gazing point providing unit”, first recited in claim 6; and
- “a posture adjustment guide unit”, first recited in claim 8.

The identified structure for each claim limitation is as follows:

“a head movement detection unit” appears to have no clear structure recited, but for examination purposes it will be interpreted according to the broad descriptions provided in the specification.
The specification describes “configured to receive a signal detected from a head tracker attached to a virtual reality device and detecting a head position and movement of a patient to detect head movement of the patient” (Page 4); “the head tracker 11 includes a sensor that detects the position and movement of the head … the head movement detection unit 17 detects the position and movement of the head tracked by the head tracker 11” (Page 9); and “the head movement detection unit 17 is configured separately from the head tracker 11 and may be integrated into the head tracker 11. That is, the head tracker 11 is composed of a sensor that detects the position and movement of the head, and the head movement detection unit 17 is software that receives a detection signal detected through the sensor to detect the position and movement of the head” (Page 10).

“an eye movement video acquisition unit” appears to have no clear structure recited, but for examination purposes it will be interpreted according to the broad descriptions provided in the specification. The specification describes “configured to receive raw data” (Page 4), “may acquire the eye movement video by processing the raw data of the eye movement” (Page 4), and “may be integrated into the eye tracker 12” (Page 10).

“an eye movement video transmitting unit” appears to have no clear structure recited, but for examination purposes it will be interpreted according to the broad descriptions provided in the specification. The specification describes “transmits the acquired eye movement image through the eye movement video transmitting unit 19” (Page 12).

“a nystagmus detection unit” appears to have no clear structure recited, but for examination purposes it will be interpreted according to the broad descriptions provided in the specification. The specification describes “the nystagmus detection unit 20 is for detecting the oculomotor function, and detects meaningful nystagmus, which is required to diagnose dizziness from the eye movement in the eye movement video” (Page 12).

“a diagnostic unit” appears to have no clear structure recited, but for examination purposes it will be interpreted according to the broad descriptions provided in the specification. The specification describes “configured to diagnose the dizziness of the patient based on the meaningful nystagmus detected by the nystagmus detection unit” (Page 4), “may determine that there is an abnormality in a vestibular function when the relative ratio calculated by the arithmetic unit is not ‘1’” (Page 5), and “The diagnostic unit 24 may diagnose the type of dizziness of the patient based on the nystagmus detected by the nystagmus detection unit 20” (Page 16).

“an arithmetic unit” appears to have no clear structure recited, but for examination purposes it will be interpreted according to the broad descriptions provided in the specification. The specification describes “an arithmetic unit configured to calculate a relative ratio between the head movement and the eye movement by dividing a movement angle of a head” (Page 5).

“a graph output unit” is identified as “a graph output unit configured to display the nystagmus detected by the nystagmus detection unit in three axes” (Page 5) and “the graph output unit 21 generates and outputs a nystagmus graph for the patient” (Page 13).

“a gaze guidance gazing point providing unit” is identified as “a gaze guidance gazing point providing unit configured to provide a gazing point for gaze guidance through a display installed in the virtual reality device” (Page 5) and “The gaze guidance gazing point providing unit 22 provides a gazing point (target point) for gaze guidance in virtual reality to measure optokinetic nystagmus” (Page 14).

“a posture adjustment guide unit” appears to have no clear structure recited, but for examination purposes it will be interpreted according to the broad descriptions provided in the specification. The specification describes “a posture adjustment guide unit configured to provide the patient with an accurate posture required for each test through the virtual reality device” (Page 5) and “a posture adjustment guide unit 25 that provides a test method for an operation of each test and guides a guide in a posture required for an accurate test” (Page 16).

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof. If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.
The following is a quotation of the first paragraph of pre-AIA 35 U.S.C. 112:

The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor of carrying out his invention.

Claims 1-17 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, or for applications subject to pre-AIA 35 U.S.C. 112, the inventor(s), at the time the application was filed, had possession of the claimed invention.

The specification is devoid of any description defining what “a head movement detection unit”, “an eye movement video acquisition unit”, “an eye movement video transmitting unit”, “a nystagmus detection unit”, “a diagnostic unit”, “an arithmetic unit”, and “a posture adjustment guide unit” are. A “head movement detection unit” appears to be directed towards a “sensor”; however, there is no disclosure of any specific sensor for detecting head movement. Is any sensor applicable? “An eye movement video acquisition unit” is configured to receive raw data; does that mean it is capturing the data via a camera as disclosed in the specification, or is it a receiver of some kind? “An eye movement video transmitting unit” is configured to transmit data, but there is no clear structure on what this unit is. Is it only a transmitter?

“A nystagmus detection unit”, “a diagnostic unit”, and “an arithmetic unit” all appear to be directed towards some kind of processing device or processing step; however, a processor is only mentioned once in the specification (Page 22), with regard to computer programmable instructions to carry out the method. What is the structure of these units?

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Regarding claims 1 and 9, it is unclear what the term “to filter” means. In claim 1, the nystagmus detection unit is configured to receive eye movement video to “filter saccadic oscillation in the eye movement video to detect nystagmus”. Is the saccadic oscillation filtered out to detect only nystagmus, or are both saccadic oscillation and nystagmus detected? Similarly, in claim 9, the “distinguishing” step is unclear. Are saccadic oscillation and nystagmus separately determined, or are they both considered for detecting nystagmus? Additionally, the “meaningless eye movement” is “filtered”. Is it removed (i.e., filtered out)? How is it removed if it “is not measured”? For examination purposes, the claims will be interpreted to detect nystagmus and saccadic oscillation together for “nystagmus detection”.
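For context on the filtering the claims describe, the limitation can be read as velocity-thresholding eye-position samples: drop blink-covered frames, drop segments too slow to measure, and drop saccadic jumps. A hypothetical sketch under that reading (the thresholds and data format are illustrative assumptions; the application discloses no specific algorithm):

```python
# HYPOTHETICAL sketch of the claimed filtering. Eye positions are
# (time_s, angle_deg) samples; None marks frames where the pupil is
# covered by a blink.
def meaningful_nystagmus(samples, min_velocity=5.0, saccade_velocity=100.0):
    """Keep slow-phase segments plausibly attributable to nystagmus:
    drop blink gaps, drop too-slow drift, drop saccadic jumps."""
    kept = []
    for (t0, a0), (t1, a1) in zip(samples, samples[1:]):
        if a0 is None or a1 is None:
            continue  # eye covered due to blinking: not measurable
        velocity = abs(a1 - a0) / (t1 - t0)  # deg/s
        if velocity < min_velocity:
            continue  # eye moves too slowly to measure reliably
        if velocity > saccade_velocity:
            continue  # saccadic oscillation, filtered out
        kept.append((t0, t1, velocity))
    return kept
```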
Also, since data cannot be filtered if it is not measured, detecting nystagmus and saccadic oscillation will also read on the limitation of “filter meaningless eye movement that is not measured”. Claims 2-8 and 10-17 are also rejected due to their dependence on claims 1 and 9.

Regarding claim 1, it is unclear whether the head movement detection unit receives a signal from a head tracker, wherein the head tracker detects a head position/movement, or whether the head movement detection unit processes the signal data to determine head position/movement. Applicant should clarify the language between what the head tracker and the head movement detection unit do. For examination purposes, the claim will be interpreted such that the head movement detection unit measures head position/movement of a patient.

Regarding claim 1, it is unclear whether the eye movement video acquisition unit receives a signal from an eye tracker, wherein the eye tracker detects eye movement, or whether the eye movement video acquisition unit processes the signal data to determine eye movement. Applicant should clarify the language between what the eye tracker and the eye movement video acquisition unit do. For examination purposes, the claim will be interpreted such that the eye movement video acquisition unit measures eye movement of a patient.

Claim limitations “a head movement detection unit”, “an eye movement video acquisition unit”, “an eye movement video transmitting unit”, “a nystagmus detection unit”, “a diagnostic unit”, “an arithmetic unit”, and “a posture adjustment guide unit” invoke 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The disclosure is devoid of any structure that performs the function in the claim. For example, as described above, a “head movement detection unit” appears to be directed towards a “sensor”; however, there is no disclosure of any specific sensor for detecting head movement. “An eye movement video acquisition unit” is configured to receive raw data, but there is no disclosure of structure as to what is acquiring the data. “An eye movement video transmitting unit” is configured to transmit data, but there is no clear structure indicating how it transmits data. “A nystagmus detection unit”, “a diagnostic unit”, and “an arithmetic unit” all appear to be directed towards some kind of processing device or processing step, but only a generic processor is recited in the specification (not directly tied to these units). The limitations of “a graph output unit” and “a gaze guidance gazing point providing unit” are directed towards a display, and Examiner notes these two units as examples of having sufficient structure that performs the function in the claims.

Therefore, the claims are indefinite and are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph. Applicant may:

(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).

If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:

(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function.

For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claim 16 is rejected under 35 U.S.C. § 101 because the claimed invention is directed to non-statutory subject matter. The claim does not fall within at least one of the four categories of patent eligible subject matter because it claims a “computer-readable recording medium”. Without specifying that the computer-readable medium is non-transitory, the claimed invention could be implemented as data incorporated on a digital signal; however, signals are not patentable subject matter. When a claim covers both statutory and non-statutory embodiments, it is proper to reject it as including non-statutory subject matter.

Claims 1-17 are rejected under 35 U.S.C.
101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more.

Analysis of independent claims 1 and 9:

Step 1 of the subject matter eligibility test (see MPEP 2106.03): Claim 1 is directed to a system, which falls within one of the four statutory categories of patentable subject matter, i.e., a machine. Claim 9 is directed to a computer-implemented method, which falls within one of the four statutory categories of patentable subject matter, i.e., a process. Therefore, further consideration of the claims is necessary.

Step 2A of the subject matter eligibility test (see MPEP 2106.04):

Prong One: Claims 1 and 9 recite an abstract idea. In particular, the claims generally recite the following: “a diagnostic unit configured to diagnose a dizziness of the patient based on the meaningful nystagmus detected by the nystagmus detection unit” (claim 1) and “diagnosing a type of dizziness of the patient based on the detected meaningful nystagmus” (claim 9). This element is drawn to an abstract idea since it is directed to mental processes: concepts performed in the human mind, including an observation, evaluation, judgment, or opinion (see MPEP § 2106.04(a)(2), subsection III). “Diagnosing a type of dizziness” is a mental process that can be practically performed in the human mind, with the aid of pen and paper or a generic computer. A person of ordinary skill in the art could reasonably review collected data of eye movement and/or head movement and determine a type of dizziness. There is nothing to suggest an undue level of complexity in “diagnosing a dizziness” or “a type of dizziness”.

Prong Two: Claims 1 and 9 do not recite additional elements that integrate the exception into a practical application; therefore, the claims are “directed to” the abstract idea. The additional elements merely add insignificant extra-solution activity, namely the pre-solution activity of using generic data gathering components:

- “a head movement detection unit configured to receive a signal detected from a head tracker attached to a virtual reality device and detecting a head position and movement of a patient to detect head movement of the patient” (claim 1);
- “receiving a signal detected from a head tracker attached to a virtual reality device and detecting a head position and movement of a patient to detect the head movement of the patient” (claim 9);
- “an eye movement video acquisition unit configured to receive raw data obtained by capturing eye movement of the patient from an eye tracker attached to the virtual reality device to acquire eye movement video” (claim 1);
- “receiving raw data obtained by capturing the eye movement of the patient from an eye tracker attached to the virtual reality device to acquire and transmit eye movement video” (claim 9);
- “an eye movement video transmitting unit configured to transmit the eye movement video acquired by the eye movement video acquisition unit” (claim 1);
- “a nystagmus detection unit configured to receive the eye movement video to filter saccadic oscillation in the eye movement video to detect nystagmus and filter meaningless eye movement that is not measured since an eye moves too slow or is covered due to blinking in the detected nystagmus to detect only meaningful nystagmus” (claim 1); and
- “distinguishing saccadic oscillation and nystagmus in the eye movement video and filtering meaningless eye movement that is not measured since an eye moves too slow or is covered due to blinking in the detected nystagmus to detect only meaningful nystagmus” (claim 9).

As a whole, the additional elements merely serve to gather information to be used by the abstract idea, while generically implementing it on a computer. There is no practical application because the abstract idea is not applied, relied on, or used in a meaningful way. The processing performed remains in the abstract realm, i.e., the result is not used for a treatment. No improvement to the technology is evident. Therefore, the additional elements, alone or in combination, do not integrate the abstract idea into a practical application.

Step 2B of the subject matter eligibility test (see MPEP 2106.05): Claims 1 and 9 do not include additional elements, alone or in combination, that are sufficient to amount to significantly more than the judicial exception (i.e., an inventive concept), for the same reasons as described above. E.g., all elements are directed to implementing the abstract idea on generic processing components, the pre-solution activity of using generic data-gathering components, and generic post-solution activities, which merely facilitate the abstract idea. Per the Berkheimer requirement, the additional elements are well-understood, routine, and conventional. For example, “a virtual reality device” as disclosed in the Applicant's specification on Page 8, line 24 – Page 9, line 3, is implemented “using a commercialized virtual reality device, for example, a mounted display (HMD)”, and “a head movement detection unit” as discussed above appears to be directed towards a generic sensor. Similarly, as pointed out in the 112(b) rejections above, there is no clear structure in the disclosure for the various detection/acquisition units of claims 1 and 9.

As such, these elements do not qualify as significantly more because these limitations simply append well-understood, routine, and conventional activities previously known in the industry, specified at a high level of generality, to the judicial exception, e.g., a claim to an abstract idea requiring no more than a generic computer to perform generic computer functions that are well-understood, routine, and conventional activities previously known in the industry (see Electric Power Group, 830 F.3d 1350 (Fed. Cir. 2016); Alice Corp. v. CLS Bank Int'l, 110 USPQ2d 1976 (2014)) and/or a claim to an abstract idea requiring no more than being stored on a computer-readable medium, which is a well-understood, routine, and conventional activity previously known in the industry (see Electric Power Group, 830 F.3d 1350 (Fed. Cir. 2016); Alice Corp. v. CLS Bank Int'l, 110 USPQ2d 1976 (2014); SAP Am. v. InvestPic, 890 F.3d 1016 (Fed. Cir. 2018)).

In view of the above, the additional elements individually do not integrate the exception into a practical application and do not amount to significantly more than the judicial exception (the abstract idea). Looking at the limitations as an ordered combination (that is, as a whole) adds nothing that is not already present when looking at the elements taken individually. There is no indication that the combination of elements improves the functioning of a computer or any other technology, permits automation of specific tasks that previously could not be automated, or provides a particular solution to a computer-based problem or a particular way to achieve a desired computer-based outcome. Rather, the collective functions of the claimed invention merely provide conventional computer implementation, i.e., the computer is simply a tool to perform the process.
Analysis of the dependent claims:

Claims 2-8 and 10-17 depend from the independent claims. Dependent claims 2-8 and 10-17 merely further define the abstract idea and are, therefore, directed to an abstract idea for similar reasons. They merely:

Further describe the abstract idea (“a relative ratio between the head movement and the eye movement is calculated by dividing a movement angle of a head detected in response to a change in a head in the operation (a) by the eye movement angle moving in response to the change in the head, and it is determined that there is abnormality in a vestibular function when the calculated relative ratio does not reach ‘1’ or a normal value that is predetermined” (claims 3, 4, and 11); “A computer-readable recording medium in which a program for realizing the method of diagnosing dizziness through eye movement measurement based on virtual reality” (claim 16); and “A computer program stored in a computer-readable recording medium for realizing the method of diagnosing dizziness through eye movement measurement based on virtual reality” (claim 17));

Further describe the pre-solution activity (“the eye movement video is acquired and transmitted by processing the raw data of the eye movement captured using an internal camera of the eye tracker using a processing program” (claims 2 and 10); “providing a gazing point for gaze guidance through the virtual reality device” (claims 6 and 13); “wherein the gazing point has a curtain shape or a point shape” (claims 7 and 14); and “providing the patient with an accurate posture required for each test through the virtual reality device” (claims 8 and 15)); and

Further describe the post-solution activity (“displaying the nystagmus detected in the operation (c) in three axes (horizon/vertical/torsion axes)” (claims 5 and 12)).
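The “relative ratio” limitation quoted for claims 3, 4, and 11 reduces to a gain computation: divide the head movement angle by the compensating eye movement angle and flag a vestibular abnormality when the result departs from 1. A minimal sketch under that reading (the 10% tolerance is an assumption; the claims recite only that the ratio “does not reach ‘1’” or a predetermined normal value):

```python
def relative_ratio(head_angle_deg: float, eye_angle_deg: float) -> float:
    """Relative ratio per the claim language: head movement angle
    divided by the compensating eye movement angle."""
    return head_angle_deg / eye_angle_deg

def vestibular_abnormal(ratio: float, normal: float = 1.0,
                        tolerance: float = 0.1) -> bool:
    """Flag an abnormality when the ratio departs from the normal value.
    The 10% tolerance is an illustrative assumption, not claim language."""
    return abs(ratio - normal) > tolerance

# A healthy vestibulo-ocular reflex moves the eyes opposite the head
# by roughly the same angle, so the ratio stays close to 1.
print(vestibular_abnormal(relative_ratio(10.0, 9.8)))  # prints False
print(vestibular_abnormal(relative_ratio(10.0, 6.0)))  # prints True
```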
Taken alone or in combination, the additional elements do not integrate the judicial exception into a practical application, at least because the abstract idea is not applied, relied on, or used in a meaningful way. The additional elements do not add anything significantly more than the abstract idea. The collective functions of the additional elements merely provide computer/electronic implementation and processing, with no additional elements beyond those of the abstract idea. There is no indication that the combination of elements permits automation of specific tasks that previously could not be automated, improves the functioning of a computer or output device, or improves technology outside the technical field of the claimed invention. The result of the abstract idea does not cause the computing device and/or application to perform differently, and does not cause output of the user-accessible output. The user-accessible output does not cause or confirm that the user adjusts their posture. Therefore, the claims are rejected as being directed to non-statutory subject matter.

Examiner notes that additional elements such as a virtual reality device or a head tracker are not positively recited in claims 1-17. As such, they are not considered for placing the judicial exception into a practical application or amounting to significantly more. Additionally, Examiner reiterates that the additional elements identified in the 112(b) rejection and 112(f) invocation above lack clear sufficient structure to perform the functions in the claims. Thus, they are considered well-understood, routine, and conventional. Claims 1-17 are rejected.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness. This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention. Claim(s) 1-17 is/are rejected under 35 U.S.C. 103 as being unpatentable over Krueger (US 20180008141) and Suh (US 10881319). Regarding claim 1, Krueger discloses an apparatus for diagnosing dizziness through eye movement measurement based on virtual reality, the apparatus comprising: a head movement detection unit configured to receive a signal detected from a head tracker attached to a virtual reality device (Figs. 3A-B, head-worn VR device 300) and detecting a head position and movement of a patient to detect head movement of the patient (Fig. 3A, head orientation sensor 108); an eye movement video acquisition unit configured to receive raw data obtained by capturing eye movement of the patient from an eye tracker attached to the virtual reality device to acquire eye movement video (Fig. 3A, eye tracking video cameras 210/211); an eye movement video transmitting unit configured to transmit the eye movement video acquired by the eye movement video acquisition unit (Fig. 5, data sent to eye camera video processor 124); a nystagmus detection unit configured to receive the eye movement video to filter saccadic oscillation in the eye movement video to detect nystagmus and filter meaningless eye movement that is not measured since an eye moves too slow or is covered due to blinking in the detected nystagmus to detect only meaningful nystagmus (Fig. 
6, measure eye orientation changes 642 to determine vestibulo-ocular performance 644; Paragraph 0028; Paragraph 0079, “ In embodiments of the present invention, vestibular ocular performance (VOP), saccades, visual pursuit performance, nystagmus, vergence, eyelid closure, dynamic visual acuity, dynamic visual stability, retinal image stability, foveal fixation stability, and focused position of the eyes could be measured in a VR, AR or synthetic 3D environment”); and While Krueger discusses that individuals with VOR/DVA [vestibulo-ocular reflex/dynamic visual acuity] abnormality may experience dizziness (Paragraph 0325), Krueger does not diagnose dizziness. However, Suh teaches an apparatus for treating dizziness, wherein after nystagmus is determined, a type of dizziness is diagnosed (Col 5, lines 60-64), which Suh discusses is useful for providing different treatment/alleviation methods and for dizziness prevention (Col 2, lines 29-33). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the virtual reality device for measuring nystagmus of Krueger to incorporate the type of dizziness determination from nystagmus detection of Suh, and the results would have been predictable to one of ordinary skill in the art. Regarding claim 2, Krueger further discloses wherein the eye movement video acquisition unit acquires the eye movement video by processing the raw data of the eye movement (Fig. 3A, eye camera video processor 124) captured using an internal camera (Fig. 3A, eye tracking cameras 210/211) of the eye tracker using a processing program (Paragraphs 0130 and 0279). 
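As context for the filtering step mapped to claim 1 above (discarding saccadic oscillation, blink-covered frames, and eye movement too slow to measure), the following is a minimal illustrative sketch of how such a filter could be implemented. The thresholds (`slow_limit`, `saccade_limit`) and the NaN-for-blink convention are assumptions for illustration, not values from the application or from Krueger.

```python
import numpy as np

def filter_nystagmus(angles_deg, fs_hz, slow_limit=2.0, saccade_limit=400.0):
    """Keep only 'meaningful' eye-movement samples: drop frames where the
    pupil is hidden by a blink (recorded here as NaN), where the eye moves
    too slowly to measure, or where the speed is in the saccadic range.
    All thresholds are illustrative assumptions."""
    angles = np.asarray(angles_deg, dtype=float)
    velocity = np.gradient(angles) * fs_hz        # angular velocity in deg/s
    visible = ~np.isnan(angles)                   # blink -> camera loses the pupil
    speed = np.abs(velocity)                      # NaN speeds compare False below
    meaningful = visible & (speed >= slow_limit) & (speed <= saccade_limit)
    return angles[meaningful], velocity[meaningful]
```

Samples adjacent to a blink inherit a NaN velocity from `np.gradient` and are dropped as well, since NaN comparisons evaluate to False.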
Regarding claim 3, Krueger further discloses an arithmetic unit configured to calculate a relative ratio between the head movement and the eye movement by dividing a movement angle of a head detected by the head movement detection unit in response to a change in the patient's head by the eye movement angle obtained in response to the change in the head (Paragraphs 0272-0274, wherein a gain is detected which is a comparison between the movement of the head to the movement of the eyes); Regarding claim 4, Krueger further discloses wherein the diagnostic unit determines that there is an abnormality in a vestibular function when the relative ratio calculated by the arithmetic unit is not "1." (Paragraph 0160, wherein the graphs of Figs. 9A-C and 10A-C depict the gain. A gain of less than 1.0 shows “the velocity of the eyes is slower than that of the head”; Paragraph 0272, wherein a gain of -1 or 1 is “perfect”, thus a gain outside of this range is abnormal). Regarding claim 5, Krueger further discloses comprising a graph output unit configured to display the nystagmus detected by the nystagmus detection unit in three axes (horizon/vertical/torsion axes) (Paragraph 0247; Figs. 9-10; Examiner notes that Krueger does not explicitly disclose displaying the detected nystagmus; however, Krueger discloses measuring these movements in the horizontal, vertical, and torsional axes and also graphs other measurements of eye movement. Thus, one of ordinary skill would be able to graph these detected eye movements). Regarding claim 6, Krueger further discloses a gaze guidance gazing point providing unit configured to provide a gazing point for gaze guidance through a display installed in the virtual reality device (Figs. 11-15). Regarding claim 7, Krueger further discloses wherein the gazing point has a curtain shape or a point shape (Fig. 11B; Fig. 13, tennis ball 920; Figs. 14-15). 
Regarding claim 8, Krueger further discloses a posture adjustment guide unit configured to provide the patient with an accurate posture required for each test through the virtual reality device (Fig. 7, boxes 622, 624, and 626; Paragraphs 0109-0116). Regarding claim 9, Krueger discloses a method of diagnosing dizziness through eye movement measurement based on virtual reality, the method comprising: (a) receiving a signal detected from a head tracker attached to a virtual reality device (Figs. 3A-B, head-worn VR device 300) and detecting a head position and movement of a patient to detect head movement of the patient (Fig. 3A, head orientation sensor 108); (b) receiving raw data obtained by capturing the eye movement of the patient from an eye tracker attached to the virtual reality device to acquire and transmit eye movement video (Fig. 3A, eye tracking video cameras 210/211; Fig. 5, data sent to eye camera video processor 124); (c) distinguishing saccadic oscillation and nystagmus in the eye movement video and filtering meaningless eye movement that is not measured since an eye moves too slow or is covered due to blinking in the detected nystagmus to detect only meaningful nystagmus (Fig. 6, measure eye orientation changes 642 to determine vestibulo-ocular performance 644; Paragraph 0028; Paragraph 0079, “In embodiments of the present invention, vestibular ocular performance (VOP), saccades, visual pursuit performance, nystagmus, vergence, eyelid closure, dynamic visual acuity, dynamic visual stability, retinal image stability, foveal fixation stability, and focused position of the eyes could be measured in a VR, AR or synthetic 3D environment”); While Krueger discusses that individuals with VOR/DVA [vestibulo-ocular reflex/dynamic visual acuity] abnormality may experience dizziness (Paragraph 0325), Krueger does not diagnose dizziness. 
However, Suh teaches an apparatus for treating dizziness, wherein after nystagmus is determined, a type of dizziness is diagnosed (Col 5, lines 60-64), which Suh discusses is useful for providing different treatment/alleviation methods and for dizziness prevention (Col 2, lines 29-33). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the virtual reality device for measuring nystagmus of Krueger to incorporate the type of dizziness determination from nystagmus detection of Suh, and the results would have been predictable to one of ordinary skill in the art. Regarding claim 10, Krueger further discloses wherein, in the operation (b), the eye movement video is acquired and transmitted by processing the raw data of the eye movement (Fig. 3A, eye camera video processor 124) captured using an internal camera (Fig. 3A, eye tracking cameras 210/211) of the eye tracker using a processing program (Paragraphs 0130 and 0279). Regarding claim 11, Krueger further discloses wherein, in the operation (d), a relative ratio between the head movement and the eye movement is calculated by dividing a movement angle of a head detected in response to a change in a head in the operation (a) by the eye movement angle moving in response to the change in the head, and it is determined that there is abnormality in a vestibular function when the calculated relative ratio does not reach "1" or a normal value that is predetermined (Paragraphs 0272-0274, wherein a gain is detected which is a comparison between the movement of the head to the movement of the eyes and wherein a gain of -1 or 1 is “perfect”, thus a gain outside of this range is abnormal; Paragraph 0160, wherein the graphs of Figs. 9A-C and 10A-C depict the gain. A gain of less than 1.0 shows “the velocity of the eyes is slower than that of the head”). 
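The relative-ratio test recited in claims 3-4 and 11 reduces to a simple computation: divide the detected head movement angle by the corresponding eye movement angle and flag a vestibular abnormality when the ratio departs from 1. A minimal sketch follows; the ratio direction (head divided by eye) tracks the claim language, and the 0.2 tolerance band is an assumed stand-in for the "predetermined" normal value, which the claims leave unspecified.

```python
def vor_gain(head_angle_deg, eye_angle_deg, tolerance=0.2):
    """Compute the head/eye relative ratio per the claim language and
    flag an abnormality when it strays from 1 by more than an assumed
    tolerance (the claims only recite '1' or a predetermined value)."""
    ratio = head_angle_deg / eye_angle_deg
    abnormal = abs(ratio - 1.0) > tolerance
    return ratio, abnormal
```

For example, a head rotation of 10 degrees matched by only 5 degrees of compensatory eye rotation yields a ratio of 2.0 and would be flagged as abnormal under this assumed tolerance.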
Regarding claim 12, Krueger further discloses comprising (e) displaying the nystagmus detected in the operation (c) in three axes (horizon/vertical/torsion axes) (Paragraph 0247; Figs. 9-10; Examiner notes that Krueger does not explicitly disclose displaying the detected nystagmus; however, Krueger discloses measuring these movements in the horizontal, vertical, and torsional axes and also graphs other measurements of eye movement. Thus, one of ordinary skill would be able to graph these detected eye movements). Regarding claim 13, Krueger further discloses comprising, before the operation (b), (f) providing a gazing point for gaze guidance through the virtual reality device (Figs. 11-15). Regarding claim 14, Krueger further discloses wherein the gazing point has a curtain shape or a point shape (Fig. 11B; Fig. 13, tennis ball 920; Figs. 14-15). Regarding claim 15, Krueger further discloses comprising, after the operation (a), (g) providing the patient with an accurate posture required for each test through the virtual reality device (Fig. 7, boxes 622, 624, and 626; Paragraphs 0109-0116). Regarding claim 16, Krueger further discloses a computer-readable recording medium in which a program for realizing the method of diagnosing dizziness through eye movement measurement based on virtual reality of claim 9 (Paragraph 0276; Paragraph 0279). Regarding claim 17, Krueger further discloses a computer program stored in a computer-readable recording medium for realizing the method of diagnosing dizziness through eye movement measurement based on virtual reality of claim 9 (Paragraph 0276; Paragraph 0279). Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to NOAH MICHAEL HEALY whose telephone number is (703)756-5534. The examiner can normally be reached Monday - Friday 8:30am - 5:30pm ET. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Sims can be reached at (571)272-7540. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /NOAH M HEALY/Examiner, Art Unit 3791 /JASON M SIMS/Supervisory Patent Examiner, Art Unit 3791

Prosecution Timeline

Jul 13, 2023
Application Filed
Dec 15, 2025
Non-Final Rejection — §101, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12588821
BODY TEMPERATURE ESTIMATION SYSTEM AND METHOD BASED ON ONE-CHANNEL TEMPERATURE SENSOR
2y 5m to grant Granted Mar 31, 2026
Patent 12569150
METHODS, DEVICES AND SYSTEMS FOR BIOPHYSICAL SENSING
2y 5m to grant Granted Mar 10, 2026
Patent 12558011
DEVICE AND A SYSTEM FOR VOIDING DYSFUNCTION DIAGNOSIS
2y 5m to grant Granted Feb 24, 2026
Patent 12544534
Foley Catheter System with Specimen Sampling Port Disinfectant Cap and Corresponding Tray Packaging Systems and Drainage Products
2y 5m to grant Granted Feb 10, 2026
Patent 12533053
Photoplethysmography Based Non-Invasive Blood Glucose Prediction by Neural Network
2y 5m to grant Granted Jan 27, 2026


Prosecution Projections

1-2
Expected OA Rounds
69%
Grant Probability
99%
With Interview (+40.7%)
3y 4m
Median Time to Grant
Low
PTA Risk
Based on 36 resolved cases by this examiner. Grant probability derived from career allow rate.
