Prosecution Insights
Last updated: April 19, 2026
Application No. 18/474,563

SYSTEMS AND METHODS FOR UPDATING USER INTERFACES OF MARINE ELECTRONIC DEVICES WITH ACTIVITY-BASED OPTIMIZED SETTINGS

Status: Final Rejection §103
Filed: Sep 26, 2023
Examiner: SILVERMAN, SETH ADAM
Art Unit: 2172
Tech Center: 2100 — Computer Architecture & Software
Assignee: Navico Inc.
OA Round: 5 (Final)

Grant Probability: 73% (Favorable)
Expected OA Rounds: 6-7
Median Time to Grant: 2y 4m
Grant Probability with Interview: 88%

Examiner Intelligence

Career Allow Rate: 73%, above average (+17.8% vs TC avg; 327 granted / 449 resolved)
Interview Lift: +14.8%, a moderate lift (resolved cases with interview)
Avg Prosecution: 2y 4m (typical timeline)
Currently Pending: 47
Total Applications: 496 (career history, across all art units)

Statute-Specific Performance

§101: 8.9% (-31.1% vs TC avg)
§102: 20.1% (-19.9% vs TC avg)
§103: 58.5% (+18.5% vs TC avg)
§112: 9.4% (-30.6% vs TC avg)
Based on career data from 449 resolved cases; Tech Center averages are estimates.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. Claims 1-20 are canceled. Claims 21-30 and 41-50 are pending in this application.

Response to Arguments

Applicant's arguments filed 10/16/2025 have been fully considered but they are not persuasive. The applicant's arguments regarding Richardson are not compelling. The applicant has alleged that Richardson's Bluetooth connection is non-analogous art. However, it is well known in the art that many devices of the past decade and a half have been able to wirelessly connect with other devices. Car radios and airplane seat entertainment centers are just two examples of devices that can connect to Bluetooth devices. The examiner has made no attempt to claim that Richardson teaches watercraft device connectivity. Both of the other references already teach this watercraft limitation. Richardson is only being used for the universal application of wireless device connectivity. Because modern equipment such as cars, airplanes, tablets, laptops, and desktops are already well known to employ wireless connectivity, it is entirely reasonable to incorporate this technology into watercraft.

Claim Rejection Notes

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim(s) 21-30 and 41-50 are rejected under 35 U.S.C. 103 as being unpatentable over Kinoshita et al. (US 20160264227 A1, published 9/15/2016), in view of Richardson (US 20210073213 A1, published 3/11/2021), and in further view of Grace et al. (US 20160098865 A1, published 4/7/2016).

Claim 21. (Currently Amended): Kinoshita teaches a system for connecting a device to a network for a watercraft (a watercraft with which it is easier to arrange a network system of devices [Kinoshita, 0006]), the system comprising: a screen (display device 8 [Kinoshita, 0130]); a processor; and a memory (central controller 7 includes a computing device 71 such as a CPU or other computing device, a memory 72, a storage device 73 [Kinoshita, 0092]) including computer executable instructions, the computer executable instructions configured to, when executed by the processor, cause the processor to: detect a device proximate the watercraft (the fish finder screen 68a displays fish school information indicating a position of a fish school in the water below the watercraft [Kinoshita, 0130]), the device being electronically connectable to the network for the watercraft (each of these devices includes a controller and the controllers of related devices are connected to one another to define a network system of devices inside the watercraft [Kinoshita, 0004]); present, on the screen, a 
movable representation of the device with an image representing the watercraft (the display device 8 displays an image captured by the imaging device 60, or the display device 8 displays a moving image captured by the imaging device 60. The display device 8 displays the moving image captured by the imaging device 60 in real time [Kinoshita, 0095]. As shown in FIG. 19, the central controller 7 displays a navigation screen 68b on the display device 8 based on a current position detection signal from the GNSS receiver 62. The navigation screen 68b includes a map indicating the current position of the watercraft 1 [Kinoshita, 0130]); receive user input moving the movable representation of the device to a desired position on the image representing the watercraft (when launching and docking, an operator operates the joystick to pilot the watercraft [Kinoshita, 0034]. The navigation screen 68b includes a map indicating the current position of the watercraft 1. Operating buttons 68c for the fish finder function and the navigation function are displayed on the display device 8 as software keys [Kinoshita, 0130]; Examiner's Note: the operator can adjust the display, and can change position of the boat to change the displayed output); determine the desired position (the GNSS receiver 62 is a receiver for a GPS or other GNSS (global navigation satellite system) and measures a current position of the watercraft 1 [Kinoshita, 0091]. The position information 69c of the watercraft 1 is acquired by the GNSS receiver 62 [Kinoshita, 0131]).

Kinoshita does not teach a system for connecting an unconnected device to a network; detect an unconnected device, the unconnected device being electronically connectable to the network.

However, Richardson teaches a system for connecting an unconnected device to a network; detect an unconnected device, the unconnected device being electronically connectable to the network (in FIG. 5A, the initiator's device has found a local connection 1412 (e.g. 
by way of Bluetooth or WIFI) to the recipient's device as well as other local connections 1413/1414 to other nearby devices and has selected the local connection 1412 to the recipient's device. Note that in this embodiment, after selecting the local connection 1412 to the recipient's device, the initiator selects the reason for connecting 1420/1422/1424 [Richardson, 0073]).

Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the watercraft proximate object finding and displaying invention of Kinoshita to include the selection of an unconnected device to be connected to a network device feature of Richardson. One would have been motivated to make this modification in view of the many devices that allow for connection via Bluetooth or another standard in accordance with a user selecting said device to be connected.

The combination of Kinoshita and Richardson does not teach determine a name for the device; and store the name and the desired position for the device in the memory, such that the device becomes a connected device. 
However, Grace teaches determine a name for the device; and store the name and the desired position for the device in the memory (the first computing device obtaining and storing current aquatic effort data, wherein the current aquatic effort data includes a current position identifier that facilitates indexing of current aquatic effort data obtained at the current position, wherein the current aquatic effort data further includes at least one member of a group consisting of: a quantity indicator of current fish caught at the current position, a fish type indicator that indicates a type of fish currently caught at the current position, a fish size indicator that indicates a size of fish currently caught at the current position [Grace, Claim 21]), such that the device becomes a connected device (network overview 100 may include internet 102 and one or more watercrafts 104, 106 and 108 [Grace, 0026, FIG. 1]).

Therefore, it would have been obvious to a person of ordinary skill in the art, before the effective filing date of the claimed invention, to modify the watercraft proximate object finding and displaying invention of the combination of Kinoshita and Richardson to include the identification and storage of an object feature of Grace. One would have been motivated to make this modification because any type of recording of historical aquatic data is typically manual and labor intensive. Current aquatic data suffers from many of the same frustrations as historical aquatic data. Furthermore, any type of success during an aquatic effort is usually the result of the experience of the angler or scientist and/or trial and error [Grace, 0001]. It would thus benefit the user to have a reliable system to determine the type of object, and where said object was discovered, that can then be stored in memory, to assist in finding said object again in the future.

Claims 29 and 30, having the same deficiencies as claim 21, are likewise rejected.

Claim 22. 
(Currently Amended): The combination of Kinoshita, Richardson, and Grace teaches the system of claim 21. Kinoshita further teaches wherein the processor is further configured to present a device icon as the movable representation of the device, and present the unconnected device icon overtop the image representing the watercraft in a way that depicts a detected current position of the unconnected device ([Kinoshita, 0130, FIG. 19]; Examiner's Note: as illustrated).

Claim 23. (Currently Amended): The combination of Kinoshita, Richardson, and Grace teaches the system of claim 21. Kinoshita further teaches wherein the user input includes dragging the movable representation of the unconnected device to the desired position on the image representing the watercraft (the direction lever 94 and the rotation lever 95 are both lever shaped operating icons. The operator can drag the direction lever 94 and move it in any of the forward, reverse, left, and right directions [Kinoshita, 01121]). Claims 41 and 47, having the same deficiencies as claim 23, are likewise rejected.

Claim 24. (Currently Amended): The combination of Kinoshita, Richardson, and Grace teaches the system of claim 21. The combination further teaches wherein the processor is further configured to receive one or more image types from the unconnected device and update the memory to include the one or more image types for presentation on the screen (the display device 8 displays the moving image captured by the imaging device 60 in real time [Kinoshita, 0095]. The first computing device obtaining and storing current aquatic effort data [Grace, Claim 21]). Claims 42 and 48, having the same deficiencies as claim 24, are likewise rejected.

Claim 25. (Currently Amended): The combination of Kinoshita, Richardson, and Grace teaches the system of claim 24. 
Kinoshita further teaches wherein each of the one or more image types from the unconnected device provides at least one of chart data, sonar data, radar data, or vessel data (the sonar 53 emits sound waves into the water surrounding the watercraft body 2 and measures the positions of objects in the water [Kinoshita, 0089]). Claims 43 and 49, having the same deficiencies as claim 25, are likewise rejected.

Claim 26. (Currently Amended): The combination of Kinoshita, Richardson, and Grace teaches the system of claim 21. Kinoshita further teaches wherein the processor is further configured to receive data from the unconnected device and store imagery that is representative of the unconnected device based on the received data (the still images and moving images captured by the imaging device 60 are stored in the storage device 73 as digital data [Kinoshita, 0096]). Claim 44, having the same deficiencies as claim 26, is likewise rejected.

Claim 27. (Currently Amended): The combination of Kinoshita, Richardson, and Grace teaches the system of claim 26. Kinoshita further teaches wherein the movable representation of the unconnected device is the stored imagery (the still images and moving images captured by the imaging device 60 are stored in the storage device 73 as digital data [Kinoshita, 0096]). Claim 45, having the same deficiencies as claim 27, is likewise rejected.

Claim 28. (Currently Amended): The combination of Kinoshita, Richardson, and Grace teaches the system of claim 21. 
Kinoshita further teaches wherein the processor is further configured to: detect an unconnected second device; present, on the screen, a second movable representation of the second unconnected device with the image representing the watercraft; receive second user input moving the second movable representation of the second unconnected device to a second desired position on the watercraft; determine the second desired position; determine a second name for the second unconnected device; and store the second name and the second desired position for the second unconnected device in the memory (Examiner's Note: this claim is describing repeating the independent claim. As is well known, a computer program can be run over and over again. The embodiment of Kinoshita can display objects in real time, as they swim up to the camera or sensors). Claims 46 and 50, having the same deficiencies as claim 25, are likewise rejected.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SETH A SILVERMAN whose telephone number is (571)272-9783. The examiner can normally be reached Mon-Thur, 8AM-4PM MST. 
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Adam Queler, can be reached at (571)272-4140. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Seth A Silverman/
Primary Examiner, Art Unit 2145

Prosecution Timeline

Sep 26, 2023: Application Filed
Aug 02, 2024: Non-Final Rejection — §103
Sep 06, 2024: Response Filed
Nov 07, 2024: Non-Final Rejection — §103
Jan 07, 2025: Interview Requested
Jan 16, 2025: Applicant Interview (Telephonic)
Jan 16, 2025: Examiner Interview Summary
Feb 05, 2025: Response Filed
Mar 07, 2025: Final Rejection — §103
May 12, 2025: Response after Non-Final Action
Jun 06, 2025: Request for Continued Examination
Jun 11, 2025: Response after Non-Final Action
Jul 16, 2025: Non-Final Rejection — §103
Oct 16, 2025: Response Filed
Nov 26, 2025: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12587581: SYSTEMS, METHODS, AND MEDIA FOR CAUSING AN ACTION TO BE PERFORMED ON A USER DEVICE (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579201: INFORMATION PROCESSING SYSTEM (granted Mar 17, 2026; 2y 5m to grant)
Patent 12578200: NAVIGATIONAL USER INTERFACES (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572269: PERFORMING A CONTROL OPERATION BASED ON MULTIPLE TOUCH POINTS (granted Mar 10, 2026; 2y 5m to grant)
Patent 12572261: SPATIAL NAVIGATION AND CREATION INTERFACE (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 6-7
Grant Probability: 73%
With Interview: 88% (+14.8%)
Median Time to Grant: 2y 4m
PTA Risk: High

Based on 449 resolved cases by this examiner; grant probability derived from career allow rate.
