Prosecution Insights
Last updated: April 19, 2026
Application No. 17/948,577

SYSTEMS, METHODS, AND DEVICES FOR ENVIRONMENT DETECTION FOR THE VISION IMPAIRED

Status: Final Rejection (§103)
Filed: Sep 20, 2022
Examiner: JACKSON, DANIELLE
Art Unit: 3636
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: Andiee's, LLC
OA Round: 2 (Final)

Grant Probability: 65% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 4m
Grant Probability With Interview: 92%

Examiner Intelligence

Career Allow Rate: 65%, above average (574 granted / 878 resolved; +13.4% vs TC avg)
Interview Lift: +26.5% in resolved cases with interview
Avg Prosecution: 2y 4m
Currently Pending: 19
Total Applications: 897 (across all art units)

Statute-Specific Performance

§101: 0.1% (-39.9% vs TC avg)
§103: 39.5% (-0.5% vs TC avg)
§102: 23.6% (-16.4% vs TC avg)
§112: 30.8% (-9.2% vs TC avg)
Based on career data from 878 resolved cases; Tech Center averages are estimates.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claim 14 is objected to because of the following informalities: Claim 14 is an exact copy of claim 13 and is therefore redundant. Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-9, 11, and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Plikynas et al. (US-2023/0050825 A1) in view of Kim et al. (KR-102103405 B1).

Claim 1: Plikynas et al. discloses a system comprising: a server computer (700) including a processor (800); a device including an optical sensor (120, 150); and a cane (250), wherein the device causes the haptic controller to provide one or more notifications to a user. Plikynas et al. teaches a haptic controller (530) but lacks the haptic controller being on the cane. Kim et al. teaches a system (1000) comprising: a device including an optical sensor (180) and a cane (200) including a haptic controller (140), wherein the device causes the haptic controller in the cane to provide one or more notifications to a user ("The current GPS location information of the smart stick 1000 is collected through the location information acquisition unit 110, and the guide information output unit 140 is provided on the handle unit, and provides the user with at least one of braille, vibration, and sound"; English translation of the specification). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Plikynas et al. to include a haptic controller on the cane itself, as suggested by Kim et al., so that the user would have multiple ways to receive the notifications, or an alternative way to receive notifications depending on their preference.

Claim 2: Plikynas et al. teaches an artificial intelligence core (800) and an artificial intelligence data memory storage (900).

Claim 3: Plikynas et al. teaches images from the optical sensor being compared against data in the artificial intelligence data memory storage to identify objects detected by the optical sensor (paragraph 69).

Claim 4: Plikynas et al. teaches the artificial intelligence core comparing the images from the optical sensor against data in the artificial intelligence data memory storage to identify objects detected by the optical sensor (paragraph 136).

Claim 5: As discussed above, in the combination of Plikynas et al. modified by the teaching of Kim et al. to have a haptic controller on the cane itself, it would also have been obvious to have the cane be wirelessly connected to the device, so that the device and the cane could communicate without the use of wires, which can be snagged on objects as the user walks with the cane. Additionally, Plikynas et al. already teaches the use of wireless connections between components elsewhere in the system (paragraphs 123, 128, 134, 143, and 153).

Claims 6-8: The combination of Plikynas et al. and Kim et al. teaches the haptic controller as including a braille printer, a vibration unit, or an audible warning (see Kim: "the guide information output unit 140 is provided on the handle unit, and provides the user with at least one of braille, vibration, and sound"; English translation of the specification).

Claim 9: Plikynas et al. teaches the optical sensor as being a camera (paragraphs 54, 67, 73, 91, 117-119; reference characters 120, 150).

Claim 11: Plikynas et al. teaches an artificial intelligence core which receives images from the optical sensor (paragraph 134).

Claims 15 and 16: Plikynas et al. teaches that the server processor identifies a location of the user based on data collected by the optical sensor and identifies a terrain associated with said location (paragraphs 64 and 183).

Claims 17 and 18: Plikynas et al. does not explicitly teach the server processor identifying a private space and, in response, turning off the optical sensor, wherein the private space is a location identified by the user of the device. However, it would have been obvious to include this function as a privacy feature so that the user would not be recorded in a space where they did not want or need to be recorded, without having to manually turn the optical sensor on/off themselves, which saves time and energy.

Claims 19 and 20: As best understood from the Applicant's disclosure, Plikynas et al. does identify objects within the user's personal space and outside of their personal space (as evidenced by FIG. 4, wherein objects not within the user's reach are captured by the device) and has the ability to notify a user about objects in their extended space (as discussed above).

Claims 10 and 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Plikynas et al. (US-2023/0050825 A1) in view of Kim et al. (KR-102103405 B1) as applied to claims 1 and 11 above, and further in view of Higgins (US-2022/0280370 A1).

Claim 10: The combination of Plikynas et al. and Kim et al. is discussed above. Plikynas et al. teaches passive sensors (RGB camera, IMU, GPS, light detector, EMG; paragraph 67) and an active optical sensor (3D-ToF-IR camera; paragraph 67), but lacks a LIDAR optical sensor. Higgins teaches a system comprising a cane (102) including an optical sensor (312), wherein the optical sensor is a LIDAR optical sensor (paragraph 61). It would have been obvious to one of ordinary skill in the art before the effective filing date to modify the combination to include a LIDAR optical sensor, as taught by Higgins, as an alternative to the sensors disclosed by Plikynas et al., so that the user's surroundings could be detected more easily and accurately even in the event of limited visibility (smoke, steam, etc.).

Claims 12-14: Plikynas et al. teaches the optical sensors being interpreted by a neural network incorporated into the artificial intelligence core and also teaches multiple different kinds of neural networks (paragraphs 8, 29, 91, 156). Therefore, it would have been obvious to one of ordinary skill in the art to have a first neural network for identifying objects detected by one type of sensor, such as a LIDAR optical sensor as discussed above and taught by Higgins, and a second neural network for identifying objects detected by another type of sensor, such as a camera optical sensor (120, 150), to interpret the two different types of images being received by the artificial intelligence core.

Response to Arguments

Applicant's arguments filed 6/24/2025 have been fully considered, but they are not persuasive. Applicant argues that there is no reason why one of ordinary skill in the art would combine a hand-held "cane" with a "hands-free" navigation system (Remarks, page 11). However, it is noted that Plikynas et al. still teaches the use of a cane (250; see FIG. 2) and even has an element (230, FIG. 2) of the navigation system located on the cane itself (paragraphs 127, 133, 139, and 143). As such, adding a haptic controller to the cane that already exists would be obvious to one of ordinary skill in the art. Therefore, the Examiner maintains that the combination of Plikynas et al. in view of Kim et al. reads on claim 1 as it currently stands.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to DANIELLE JACKSON, whose telephone number is (571) 272-2268. The examiner can normally be reached M-F, 11 AM-7 PM EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, David Dunn, can be reached at (571) 272-6670. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/DNJ/ Examiner, Art Unit 3636
/DAVID R DUNN/ Supervisory Patent Examiner, Art Unit 3636

Prosecution Timeline

Sep 20, 2022
Application Filed
Mar 17, 2025
Non-Final Rejection — §103
Jun 24, 2025
Response Filed
Oct 03, 2025
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12595677: Portable Wind-Resistant Shade Structure
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12553253: Rooftop Tent Assembly
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12553255: Portable Shelters
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12546136: Canopy Engagement Device and Canopy with the Canopy Engagement Device
Granted Feb 10, 2026 (2y 5m to grant)
Patent 12546137: Portable Barrier and Associated Method of Use
Granted Feb 10, 2026 (2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 65%
With Interview: 92% (+26.5%)
Median Time to Grant: 2y 4m
PTA Risk: Moderate
Based on 878 resolved cases by this examiner. Grant probability derived from career allow rate.
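The headline figures are consistent with simple arithmetic on the examiner's career totals. A minimal sketch of that derivation, assuming (as the note above suggests) that grant probability is just the career allow rate and the interview figure adds the reported lift; the tool's actual model is not disclosed:

```python
# Hypothetical derivation of the dashboard's headline numbers from the
# career data shown above (574 granted / 878 resolved, +26.5% interview lift).

granted, resolved = 574, 878
allow_rate = granted / resolved          # career allow rate, ~0.654

interview_lift = 0.265                   # reported lift in cases with interview

grant_probability = round(allow_rate * 100)                   # -> 65
with_interview = round((allow_rate + interview_lift) * 100)   # -> 92

print(grant_probability, with_interview)
```

Under these assumptions the displayed 65% and 92% figures fall out directly; the "+13.4% vs TC avg" line would similarly imply a Tech Center average allow rate near 52%.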
