Prosecution Insights
Last updated: April 19, 2026
Application No. 17/079,119

METHODS, SYSTEMS, AND COMPUTER READABLE MEDIA FOR TESTING VISUAL FUNCTION USING VIRTUAL MOBILITY TESTS

Final Rejection §103

Filed: Oct 23, 2020
Examiner: JORDAN, DANIEL JEFFERY
Art Unit: 2872
Tech Center: 2800 — Semiconductors & Electrical Systems
Assignee: The Trustees of the University of Pennsylvania
OA Round: 4 (Final)

Grant Probability: 62% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 3y 9m
Grant Probability with Interview: 62%

Examiner Intelligence

Grants 62% of resolved cases

Career Allow Rate: 62% (30 granted / 48 resolved; -5.5% vs TC avg)
Interview Lift: +0.0% (minimal; resolved cases with vs. without interview)
Typical Timeline: 3y 9m avg prosecution (41 currently pending)
Career History: 89 total applications across all art units

Statute-Specific Performance

§103: 51.9% (+11.9% vs TC avg)
§102: 22.9% (-17.1% vs TC avg)
§112: 25.2% (-14.8% vs TC avg)
Tech Center averages are estimates • Based on career data from 48 resolved cases
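The per-statute lifts above imply the Tech Center baselines the page is comparing against. A minimal sketch, assuming "vs TC avg" is a simple percentage-point difference (examiner rate minus TC average):

```python
# Derive the implied Tech Center average allow rate per statute from the
# examiner's per-statute rate and the reported lift vs the TC average.
# Figures are taken from the table above; the subtraction is the only logic.
rates = {"103": (51.9, +11.9), "102": (22.9, -17.1), "112": (25.2, -14.8)}

for statute, (examiner_rate, lift_vs_tc) in rates.items():
    implied_tc_avg = round(examiner_rate - lift_vs_tc, 1)
    print(f"§{statute}: implied TC average = {implied_tc_avg}%")
```

Under that assumption all three statutes back out to the same ~40.0% baseline, which suggests a single Tech Center-wide estimate rather than per-statute averages.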

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

2. Applicant's arguments (see Remarks dated 09/10/2025) with respect to claims 1-20 have been considered, but are moot because of the new grounds of rejection.

Claim Objections

3. Claims 2-3 and 11-12 are objected to because of the following informalities: Each of claims 2 and 11 should read "[[a]] the vision condition"; claim 3 should read "or measuring one or more symptoms of the vision condition"; claim 12 should read "or measuring one or more symptoms of [[a]] the vision condition". Appropriate correction is required.

Claim Rejections - 35 USC § 103

4. The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.

5. Claims 1-20 are rejected under 35 USC § 103 as being unpatentable over Blaha et al. (US 20170340200 A1, of record) in view of Leung et al. (US 10568502 B2, of record) and Na et al. (KR 20170012615 A, of record), and further in view of Banken ("Voretigene Neparvovec for Biallelic RPE65-Mediated Retinal Disease: Effectiveness and Value, Final Evidence Report," Institute for Clinical and Economic Review, 2018).

Regarding claim 1, Blaha discloses a system (Fig. 2, 200) comprising: at least one processor ([0022]); and a memory ([0022]), wherein the system is configured for: configuring a virtual mobility test for testing visual function of a user ([0056]); generating the virtual mobility test ([0058]), wherein generating the virtual mobility test includes displaying, via a user display, virtual objects ([0058], "object(s) being displayed on the VR device"); and analyzing performance of the user during the virtual mobility test for determining the visual function of the user based on user interaction with the objects in the virtual mobility test using data from body movement detection sensors ([0063]-[0065]), and repeating the virtual mobility test in which the virtual objects are displayed "with different visual properties" ([0110]-[0111]).

Blaha fails to explicitly disclose wherein generating the virtual mobility test includes displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects and directing the user to intentionally touch the virtual objects, and repeating the virtual mobility test in which the virtual objects are displayed at different luminance levels, and wherein analyzing the performance of the user comprises comparing accuracy and/or times for the user to complete the virtual mobility test at the different luminance levels.

However, Leung teaches a vision assessment system that includes virtual reality and augmented reality environments, wherein generating a virtual mobility test includes displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects (Figs. 8A-B), repeating the virtual mobility test in which the virtual simulation is displayed at different luminance levels (column 12 lines 42-45), and wherein analyzing the performance of the user comprises comparing accuracy and/or times for the user to complete the virtual mobility test at the different luminance levels (column 1 lines 42-54). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine Blaha and Leung such that generating the virtual mobility test included displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects and directing the user to intentionally touch the virtual objects, motivated by allowing for assessment of user spatial awareness and hand-eye coordination.

Modified Blaha fails to explicitly disclose directing the user to intentionally touch the virtual objects as the user walks through a physical room following a displayed virtual pathway or course. However, Na teaches a virtual reality simulation device, wherein a user is directed to intentionally touch virtual objects (Fig. 2, 110 when a golf course or hole is shown) as the user walks through a physical room following a displayed virtual pathway or course ([0074], "golfer moves in a virtual reality space … through a walking zone of a real space"). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine modified Blaha and Na such that the user walked through a physical room following a displayed virtual pathway or course, motivated by more effectively assessing a user's spatial awareness and hand-eye coordination.

Modified Blaha fails to explicitly disclose wherein the user has received a gene therapy for a vision condition. However, Banken teaches a multi-luminance visual mobility test (Page ES4), and discloses wherein patients are tested before and after undergoing gene therapy trials for RPE65-associated inherited retinal dystrophy (Page 8, Outcomes). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine modified Blaha and Banken such that a visual mobility test was administered to a patient who has received a gene therapy for a vision condition, motivated by checking the effectiveness of the gene therapy.

Regarding claim 2, modified Blaha discloses wherein configuring the virtual mobility test includes configuring the virtual mobility test based on the user or the vision condition (Blaha - [0093]); or wherein configuring the virtual mobility test includes configuring the virtual mobility test to test a right eye, a left eye, or both eyes (Blaha - [0060]).

Regarding claim 3, modified Blaha discloses wherein analyzing the performance of the user includes evaluating the effectiveness of the gene therapy for the vision condition of the user (Banken - Page ES4, "The primary efficacy endpoint for the Phase III trial was change in bilateral MLMT performance"), or measuring one or more symptoms of the vision condition (Blaha - [0110]-[0111]).

Regarding claim 4, modified Blaha discloses wherein configuring the virtual mobility test includes configuring luminance (Blaha - [0127]), shadow (Blaha - [0132]), color (Blaha - [0130]), contrast (Blaha - [0132]), gradients of contrast or color on the surface of the objects (Blaha - [0132]), reflectance or color of borders of the objects (Blaha - [0140]), or a lighting condition associated with one or more of the objects in the virtual mobility test based on configuration information (Blaha - [0130]).

Regarding claim 5, modified Blaha discloses wherein configuring the virtual mobility test includes: configuring the height of one or more of the objects in the virtual mobility test and/or configuring the size of one or more of the objects in the virtual mobility test based on configuration information (Blaha - [0130]).

Regarding claim 6, modified Blaha discloses wherein the configuration information includes the height or other attributes of the user, condition-based information, user-inputted information, operator-inputted information, or dynamic information (Blaha - [0132]).

Regarding claim 7, modified Blaha discloses wherein generating the virtual mobility test includes providing auditory or haptic feedback to the user when a feedback condition occurs (Blaha - [0163]), wherein the feedback condition includes a collision between a user and an obstacle in the virtual mobility test, a user leaves a designated path or course associated with the virtual mobility test, a user touches an obstacle, or a predetermined amount of progress has not occurred in a predetermined amount of time (Blaha - [0059] and [0163]).

Regarding claim 8, modified Blaha discloses wherein generating the virtual mobility test includes capturing the data from the body movement detection sensors (Blaha - [0063]) and using the data to output a video of the user's progress through the virtual mobility test (Blaha - [0088] and [0096]).

Regarding claim 9, modified Blaha discloses wherein the objects in the virtual mobility test may include a tile, an obstacle, a box obstacle, a step-over obstacle, a hanging or swinging obstacle, a floating obstacle, a starting line, a finish line, a finish flag, a guide arrow, or a button obstacle (Blaha - [0157] and [0171]).

Regarding claim 10, Blaha discloses a method (Fig. 1, 100), the method comprising: configuring a virtual mobility test for testing visual function of a user ([0056]); generating the virtual mobility test ([0058]), wherein generating the virtual mobility test includes displaying, via a user display, virtual objects ([0058], "object(s) being displayed on the VR device"); and analyzing performance of the user during the virtual mobility test for determining the visual function of the user based on user interaction with objects in the virtual mobility test using data obtained from body movement detection sensors ([0063]-[0065]), repeating the virtual mobility test in which the virtual objects are displayed "with different visual properties" ([0110]-[0111]).

Blaha fails to explicitly disclose wherein generating the virtual mobility test includes displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects and directing the user to intentionally touch the virtual objects, and repeating the virtual mobility test in which the virtual objects are displayed at different luminance levels and wherein analyzing the performance of the user comprises comparing accuracy and/or times for the user to complete the virtual mobility test at the different luminance levels.

However, Leung teaches a vision assessment system that includes virtual reality and augmented reality environments, wherein generating a virtual mobility test includes displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects (Figs. 8A-B), repeating the virtual mobility test in which the virtual simulation is displayed at different luminance levels (column 12 lines 42-45), and wherein analyzing the performance of the user comprises comparing accuracy and/or times for the user to complete the virtual mobility test at the different luminance levels (column 1 lines 42-54). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine Blaha and Leung such that generating the virtual mobility test included displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects and directing the user to intentionally touch the virtual objects, motivated by allowing for assessment of user spatial awareness and hand-eye coordination.

Modified Blaha fails to explicitly disclose directing the user to intentionally touch the virtual objects as the user walks through a physical room following a displayed virtual pathway or course. However, Na teaches a virtual reality simulation device, wherein a user is directed to intentionally touch virtual objects (Fig. 2, 110 when a golf course or hole is shown) as the user walks through a physical room following a displayed virtual pathway or course ([0074], "golfer moves in a virtual reality space … through a walking zone of a real space"). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine modified Blaha and Na such that the user walked through a physical room following a displayed virtual pathway or course, motivated by more effectively assessing a user's spatial awareness and hand-eye coordination.

Modified Blaha fails to explicitly disclose wherein the user has received a gene therapy for a vision condition. However, Banken teaches a multi-luminance visual mobility test (Page ES4), and discloses wherein patients are tested before and after undergoing gene therapy trials for RPE65-associated inherited retinal dystrophy (Page 8, Outcomes). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine modified Blaha and Banken such that a visual mobility test was administered to a patient who has received a gene therapy for a vision condition, motivated by checking the effectiveness of the gene therapy.

Regarding claim 11, modified Blaha discloses wherein configuring the virtual mobility test includes configuring the virtual mobility test based on the user or the vision condition (Blaha - [0093]); or wherein configuring the virtual mobility test includes configuring the virtual mobility test to test a right eye, a left eye, or both eyes (Blaha - [0060]).

Regarding claim 12, modified Blaha discloses wherein analyzing the performance of the user includes evaluating the effectiveness of the gene therapy for the vision condition of the user (Banken - Page ES4, "The primary efficacy endpoint for the Phase III trial was change in bilateral MLMT performance"), or measuring one or more symptoms of the vision condition (Blaha - [0110]-[0111]).

Regarding claim 13, modified Blaha discloses wherein configuring the virtual mobility test includes configuring luminance (Blaha - [0127]), shadow (Blaha - [0132]), color (Blaha - [0130]), contrast (Blaha - [0132]), gradients of contrast or color on the surface of the objects (Blaha - [0132]), reflectance or color of borders of the objects (Blaha - [0140]), or a lighting condition associated with one or more of the objects in the virtual mobility test based on configuration information (Blaha - [0130]).

Regarding claim 14, modified Blaha discloses wherein configuring the virtual mobility test includes: configuring the height of one or more of the objects in the virtual mobility test and/or configuring the size of one or more of the objects in the virtual mobility test based on configuration information (Blaha - [0130]).

Regarding claim 15, modified Blaha discloses wherein the configuration information includes the height or other attributes of the user, condition-based information, user-inputted information, operator-inputted information, or dynamic information (Blaha - [0132]).

Regarding claim 16, modified Blaha discloses wherein generating the virtual mobility test includes providing auditory or haptic feedback to the user when a feedback condition occurs (Blaha - [0163]), wherein the feedback condition includes a collision between a user and an obstacle in the virtual mobility test, a user leaves a designated path or course associated with the virtual mobility test, a user touches the one or more of the objects, or a predetermined amount of progress has not occurred in a predetermined amount of time (Blaha - [0059] and [0163]).

Regarding claim 17, modified Blaha discloses wherein generating the virtual mobility test includes capturing the data from the body movement detection sensors (Blaha - [0063]) and using the data to output a video of the user's progress through the virtual mobility test (Blaha - [0088] and [0096]).

Regarding claim 18, modified Blaha discloses wherein the objects in the virtual mobility test may include a tile, an obstacle, a box obstacle, a step-over obstacle, a hanging or swinging obstacle, a floating obstacle, a starting line, a finish line, a finish flag, a guide arrow, or a button obstacle (Blaha - [0157] and [0171]).

Regarding claim 19, Blaha discloses a non-transitory computer readable medium having stored thereon executable instructions that when executed by at least one processor of a computer cause the computer to perform steps ([0022]) comprising: configuring a virtual mobility test for testing visual function of a user ([0056]); generating the virtual mobility test ([0058]), wherein generating the virtual mobility test includes displaying, via a user display, virtual objects ([0058], "objects being displayed on the VR device"); and analyzing performance of the user during the virtual mobility test for determining the visual function of the user based on user interaction with objects in the virtual mobility test using data obtained from body movement detection sensors ([0063]-[0065]), repeating the virtual mobility test in which the virtual objects are displayed "with different visual properties" ([0110]-[0111]).

Blaha fails to explicitly disclose wherein generating the virtual mobility test includes displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects and directing the user to intentionally touch the virtual objects, and repeating the virtual mobility test in which the virtual objects are displayed at different luminance levels and wherein analyzing the performance of the user comprises comparing accuracy and/or times for the user to complete the virtual mobility test at the different luminance levels.

However, Leung teaches a vision assessment system that includes virtual reality and augmented reality environments, wherein generating a virtual mobility test includes displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects (Figs. 8A-B), repeating the virtual mobility test in which the virtual simulation is displayed at different luminance levels (column 12 lines 42-45), and wherein analyzing the performance of the user comprises comparing accuracy and/or times for the user to complete the virtual mobility test at the different luminance levels (column 1 lines 42-54). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine Blaha and Leung such that generating the virtual mobility test included displaying virtual objects for the user to purposefully and virtually touch by physically moving a body part to physical locations corresponding to virtually displayed locations of the virtual objects and directing the user to intentionally touch the virtual objects, motivated by allowing for assessment of user spatial awareness and hand-eye coordination.

Modified Blaha fails to explicitly disclose directing the user to intentionally touch the virtual objects as the user walks through a physical room following a displayed virtual pathway or course. However, Na teaches a virtual reality simulation device, wherein a user is directed to intentionally touch virtual objects (Fig. 2, 110 when a golf course or hole is shown) as the user walks through a physical room following a displayed virtual pathway or course ([0074], "golfer moves in a virtual reality space … through a walking zone of a real space"). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine modified Blaha and Na such that the user walked through a physical room following a displayed virtual pathway or course, motivated by more effectively assessing a user's spatial awareness and hand-eye coordination.

Modified Blaha fails to explicitly disclose wherein the user has received a gene therapy for a vision condition. However, Banken teaches a multi-luminance visual mobility test (Page ES4), and discloses wherein patients are tested before and after undergoing gene therapy trials for RPE65-associated inherited retinal dystrophy (Page 8, Outcomes). It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine modified Blaha and Banken such that a visual mobility test was administered to a patient who has received a gene therapy for a vision condition, motivated by checking the effectiveness of the gene therapy.

Regarding claim 20, modified Blaha discloses wherein configuring the virtual mobility test includes configuring the virtual mobility test based on the user or the vision condition (Blaha - [0093]); or wherein configuring the virtual mobility test includes configuring the virtual mobility test to test a right eye, a left eye, or both eyes (Blaha - [0060]).

Conclusion

6. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Daniel Jeffery Jordan, whose telephone number is 571-270-7641. The examiner can normally be reached 9:30a-6:00p.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Stephone Allen, can be reached at 571-272-2434. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/D. J. J./
Examiner, Art Unit 2872

/STEPHONE B ALLEN/
Supervisory Patent Examiner, Art Unit 2872

Prosecution Timeline

Oct 23, 2020
Application Filed
Sep 28, 2023
Non-Final Rejection — §103
Mar 27, 2024
Response Filed
Jul 12, 2024
Final Rejection — §103
Sep 18, 2024
Response after Non-Final Action
Oct 18, 2024
Response after Non-Final Action
Oct 18, 2024
Examiner Interview (Telephonic)
Nov 14, 2024
Examiner Interview (Telephonic)
Nov 14, 2024
Examiner Interview Summary
Nov 25, 2024
Request for Continued Examination
Dec 05, 2024
Response after Non-Final Action
Jun 03, 2025
Non-Final Rejection — §103
Sep 10, 2025
Response Filed
Jan 06, 2026
Final Rejection — §103 (current)
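As a cross-check on the timeline above, the application's pendency at the current final rejection can be computed from the docket dates. A rough sketch (the 365-day year and 30-day month are approximations for display only):

```python
from datetime import date

# Docket dates taken from the timeline above.
filed = date(2020, 10, 23)          # Application Filed
current_final = date(2026, 1, 6)    # Final Rejection — §103 (current)

pending = (current_final - filed).days          # 1901 days
years, rem_days = divmod(pending, 365)          # approximate breakdown
print(f"Pendency at current final: ~{years}y {rem_days // 30}m")  # ~5y 2m
```

At roughly five years pending, the case is already well past this examiner's 3y 9m median time to grant, which is consistent with the elevated PTA risk noted below.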

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591113
LENS ASSEMBLY AND ELECTRONIC APPARATUS INCLUDING THE SAME
2y 5m to grant • Granted Mar 31, 2026
Patent 12566316
CAMERA OPTICAL LENS
2y 5m to grant • Granted Mar 03, 2026
Patent 12461343
OPTICAL IMAGING LENS
2y 5m to grant • Granted Nov 04, 2025
Patent 12429711
OPHTHALMIC DEVICE WITH BUILT-IN SELF-TEST CIRCUITRY FOR TESTING AN ADJUSTABLE LENS
2y 5m to grant • Granted Sep 30, 2025
Patent 12429715
Synthesis and Application of Light Management with Thermochromic Hydrogel Microparticles
2y 5m to grant • Granted Sep 30, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 62%
With Interview: 62% (+0.0%)
Median Time to Grant: 3y 9m
PTA Risk: High
Based on 48 resolved cases by this examiner. Grant probability derived from career allow rate.
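The headline figures can be reproduced from the examiner's career counts (30 granted / 48 resolved). A minimal sketch, assuming "allow rate" means granted divided by resolved, as the note above implies:

```python
# Reproduce the grant probability from the examiner's career counts.
granted, resolved = 30, 48          # figures from the Examiner Intelligence card

allow_rate = granted / resolved     # 0.625, i.e. 62.5%
grant_probability = int(allow_rate * 100)   # 62; the page appears to truncate

interview_lift = 0.0                # reported lift is +0.0%
with_interview = grant_probability + interview_lift

print(f"Grant probability: {grant_probability}%")   # 62%
print(f"With interview:    {with_interview:.0f}%")  # 62%
```

Note the raw rate is 62.5%; whether the page truncates or rounds to 62% is an assumption here, since both displays show 62%.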
