Prosecution Insights
Last updated: April 19, 2026
Application No. 18/314,375

SYSTEMS AND METHODS FOR NAVIGATING INTERACTIVE ELEMENTS OF AN APPLICATION

Status: Final Rejection (§103)
Filed: May 09, 2023
Examiner: NGUYEN, KENNY
Art Unit: 2171
Tech Center: 2100 — Computer Architecture & Software
Assignee: Sb22 Inc.
OA Round: 4 (Final)
Grant Probability: 49% (Moderate)
Expected OA Rounds: 5-6
Time to Grant: 3y 1m
Grant Probability With Interview: 97%

Examiner Intelligence

Career Allow Rate: 49% (88 granted / 178 resolved; -5.6% vs TC avg)
Interview Lift: +47.6% for resolved cases with interview (strong)
Typical Timeline: 3y 1m average prosecution; 32 applications currently pending
Career History: 210 total applications across all art units

Statute-Specific Performance

§101: 6.7% (-33.3% vs TC avg)
§103: 51.6% (+11.6% vs TC avg)
§102: 18.2% (-21.8% vs TC avg)
§112: 19.1% (-20.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 178 resolved cases.
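The headline figures on this page follow from simple arithmetic on the examiner's career data. A minimal sketch, using the inputs shown above (88 granted of 178 resolved, 97% allow rate with interview); the rounding conventions are assumptions:

```python
# Reproduce the dashboard arithmetic from the examiner stats above.
granted, resolved = 88, 178

# Career allow rate: share of resolved cases that were granted.
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")   # → 49.4%

# Interview lift: allow rate with an interview minus the career rate.
allow_rate_with_interview = 0.97                # from the dashboard
lift = allow_rate_with_interview - allow_rate
print(f"Interview lift: {lift:+.1%}")           # → +47.6%
```

This matches the dashboard's rounded 49% grant probability and the +47.6% interview lift quoted in the projections.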

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is made final. Claims 1, 8-10, and 17-20 are pending in the case. Claims 1, 10, and 19 are independent claims. Claims 2-7 and 11-16 have been canceled.

Priority

Acknowledgement is made of Applicant’s claim for domestic benefit of provisional application 63/364,386 filed 05/09/2022.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 8, 10, 17, 19, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Dakss et al. (US 11785280 B1), in view of Eckman et al. (US 10515516 B1).

Regarding claim 1, Dakss teaches a system (FIGS. 1A-B and Col. 20, line 52 to Col. 21, line 3) comprising: at least one processor (FIG. 1A and Col. 20, line 52 to Col.
21, line 19: personal digital device 102 must contain at least one processor to execute client software 103); and memory storing instructions that, when executed by the at least one processor, cause the system to perform a set of operations (FIG. 1A and Col. 20, line 52 to Col. 21, line 19: personal digital device 102 must contain memory to store instructions/client software 103 that, when executed by the processor, causes a set of operations to be performed), the set of operations comprising: establishing a secure connection between a device application and a server (FIG. 1A and Col. 20, line 55 to Col. 21, line 41: a secure connection is established between a device 102, having a device application/client software 103, and a server/interactive services platform 100. Client software 103 may be “a standalone application or as a software library or software development kit that has been integrated with an operator software application 104 where the operator is a provider of interactive services to users such as an online sports betting service”; Col. 27, lines 54-58: “The client software 103 uses a communications network such as the Internet to connect to the interactive services platform 100 using the core real-time API service 105 which starts a client communications session.” Thus, a secure connection is established); receiving a selection of a focus mode for the application (FIG. 2A and Col. 29, line 22 to Col. 30, line 17: as seen in screens 200 to 206, a selection of a focus mode for the application is received when the user initiates either microphone icon 202 or interactive button 204); determining an event card to display in focus mode based on one or more of an application history and one or more user preferences (FIG. 2A and Col. 29, line 22 to Col. 30, line 40: as seen in screen 212, an event card/bet offer 218 is determined to be displayed in focus mode. 
This determination is “based on the user's past behaviors and attributes”, which indicate one or more of an application history and one or more user preferences; For additional details regarding application history and user preferences, see FIG. 6A and Col. 52, line 22 to Col. 54, line 10; FIG. 6B and Col. 55, line 56 to Col. 56, line 3; FIG. 10A and Col. 62, line 57 to Col. 63, line 6); displaying the event card with all available interactive elements for an event on the event card (FIG. 2B and Col. 29, line 64 to Col. 30, line 40: event card, or first widget, corresponds to bet offer 218 that is displayed with all available interactive elements for the event, including “buttons or sliders where the user can interact with the offer 222”); receiving a gesture input (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: a gesture input is received. For example, the user may perform a horizontal swipe to display bet offer 228. Note that “it will be appreciated to one skilled in the art that there are many possible embodiments that could be practiced with many possible user interface variations, such as an interface where instead of swiping left or right the user can swipe up or down to reveal another recommended bet offer, where a similar outcome can be achieved”); modifying the displayed event card based on the received gesture input (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: the displayed event card is modified based on the received gesture input. For example, if the user performs a horizontal swipe, the displayed event card/bet offer 218 transitions to bet offer 228), wherein modifying the displayed event card comprises: determining a type of the received gesture input (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: a gesture input is received and determined. For example, the user may perform a horizontal swipe to display bet offer 228. 
Note that “it will be appreciated to one skilled in the art that there are many possible embodiments that could be practiced with many possible user interface variations, such as an interface where instead of swiping left or right the user can swipe up or down to reveal another recommended bet offer, where a similar outcome can be achieved”); when the gesture is a vertical swipe (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: the user can swipe vertically/scroll to see additional recommended offers to a prior carousel 226, as supported in Col. 30, lines 21-24, FIG. 11 and Col. 66, lines 47-51. Prior carousel 226 displays a new event card from a different event card category): transitioning to a different event card category (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: the user can swipe vertically/scroll to see additional recommended offers to a prior carousel 226, as supported in Col. 30, lines 21-24, FIG. 11 and Col. 66, lines 47-51. Prior carousel 226/different event card category includes a first event card); and displaying a first event card from the different event category as the displayed event card (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: the user can swipe vertically/scroll to see additional recommended offers to a prior carousel 226, as supported in Col. 30, lines 21-24, FIG. 11 and Col. 66, lines 47-51. The first event card “Offer 6” is from a different event card category/prior carousel 226 as the displayed event card, which belongs to event category corresponding to carousel 216); when the gesture is a horizontal swipe (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: a gesture input is received. For example, the user may perform a horizontal swipe to display bet offer 228): displaying a second event card from a same event category as the event card as the displayed event card (FIG. 2B and Col. 30, line 32 to Col. 31, line 30: the displayed event card is modified based on the received gesture input. 
For example, if the user performs a horizontal swipe, the displayed event card/bet offer 218 transitions to a second event card/bet offer 228 from the same event card category/carousel 216 as the displayed event card); and selecting an interactive element and a bet amount on the displayed event card via a tap of the interactive element (FIGS. 2D-E and Col. 33, lines 6-62: an interactive element and a bet amount is selected via a tap of the interactive element. For example, the user selects an “Edit Bet” button 275 which allows selection of a new bet amount represented by alternate bet option 279); receiving bet confirmation of the interactive element via a confirming input (FIGS. 2D-E and Col. 33, lines 6-62: for example, a confirming input corresponds to the user selecting “Place Bet” button 290 as seen in screen 289, which allows bet confirmation to be received. “When the user presses the final “Place Bet” button 290 the transaction is completed by having the order placed through the operator application 104 as shown in overlay 291 in screen 292. This is accomplished due to the client software 106 operating as a component integrated within the operator application 104 as a software component, SDK or set of software libraries used by the operator application 104.”); and displaying an acknowledgement of the received bet (FIGS. 2D-E and Col. 33, lines 6-62: “When the user presses the final “Place Bet” button 290 the transaction is completed by having the order placed through the operator application 104 as shown in overlay 291 in screen 292. This is accomplished due to the client software 106 operating as a component integrated within the operator application 104 as a software component, SDK or set of software libraries used by the operator application 104.” The displayed overlay 291 acknowledges the received bet). 
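The gesture-to-navigation mapping recited in claim 1, as the rejection walks through it above, can be sketched in code. This is only an illustrative sketch of the claim language, not an implementation disclosed by Dakss or Eckman; the names `EventCard`, `handle_gesture`, and the `categories` mapping are hypothetical.

```python
from dataclasses import dataclass

@dataclass(eq=False)
class EventCard:
    category: str
    elements: list      # interactive elements (buttons, sliders, ...)
    scroll: int = 0     # index of the first visible interactive element

def handle_gesture(card, gesture, categories):
    """Return the event card to display after a gesture, per claim 1.

    `categories` maps a category name to an ordered list of EventCards.
    """
    if gesture == "vertical_swipe":
        # Transition to a different event card category and display
        # the first event card from that category.
        other = next(c for c in categories if c != card.category)
        return categories[other][0]
    if gesture == "horizontal_swipe":
        # Display a second event card from the same event category.
        same = categories[card.category]
        return same[(same.index(card) + 1) % len(same)]
    if gesture == "drag":
        # Continue displaying the same card; scroll its interactive
        # elements so at least one new element comes into view.
        card.scroll += 1
        return card
    return card
```

Note that the examiner's combination turns on exactly this dispatch: Dakss's first embodiment swaps the two swipe branches, and Eckman supplies the drag branch.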
Although Dakss teaches transitioning to a different event card category; and displaying a first event card from the different event card category as the displayed event card, Dakss’ first embodiment does not explicitly teach doing so when the gesture is a horizontal swipe (Dakss’ first embodiment teaches doing so based on a vertical swipe). Similarly, although Dakss teaches displaying a second event card from a same event card category as the displayed event card, Dakss’ first embodiment does not explicitly teach doing so when the gesture is a vertical swipe (Dakss’ first embodiment teaches doing so based on a horizontal swipe). However, Dakss teaches, “In view of this disclosure, it will be appreciated to one skilled in the art that there are many possible embodiments that could be practiced with many possible user interface variations, such as an interface where instead of swiping left or right the user can swipe up or down to reveal another recommended bet offer, where a similar outcome can be achieved” (Col. 31, lines 24-30). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Dakss’ embodiments by interchanging the functions of the directional swipes and have wherein modifying the displayed event card further comprises: when the gesture is a horizontal swipe, transitioning to a different event card category; and displaying a first event card from the different event card category as the displayed event card, and when the gesture is a vertical swipe, displaying a second event card from a same event card category as the displayed event card. One of ordinary skill in the art would find it obvious to swap, or perform simple substitution of, the horizontal swiping with the vertical swiping, and vice versa, to produce analogous effects.
In this way, the user may swipe horizontally to access a different event card category and swipe vertically to access a new event card of the same event card category, which may personally be more intuitive or aesthetically functional for the user. Although Dakss teaches a user adjusting a slider that is part of interactive elements (FIG. 2D and Col. 33, lines 6-40: for example, user adjusts a slider to adjust alternate values to the bet offer), Dakss does not explicitly teach, when the gesture is a drag: continuing to display the event card as the displayed event card; and scrolling through interactive elements on the displayed event card, wherein scrolling through interactive elements causes display of at least one new interactive element. Eckman teaches when the gesture is a drag (FIGS. 12A-B and 13A-B and Col. 15, lines 14-22: “In particular, the user can define his or her wagered amount, select his or her team, and define the spread or line amount by graphically and dynamically moving the graphical slider or track bar element to the left or right”. For example, the user adjusts the point spread via interactive element/graphical slider on an event card): continuing to display the event card as the displayed event card (FIGS. 12A-B and 13A-B and Col. 15, lines 14-22: when the gesture is a drag, the event card continues to be displayed as seen in the transition from FIG. 12A to FIG. 12B); and scrolling through interactive elements on the displayed event card, wherein scrolling through interactive elements causes display of at least one new interactive element (FIGS. 3, 12A-B and 13A-B and Col. 15, lines 14-22: “In particular, the user can define his or her wagered amount, select his or her team, and define the spread or line amount by graphically and dynamically moving the graphical slider or track bar element to the left or right”. For example, the user adjusts the point spread via graphical slider on an event card. 
Sliding the graphical slider corresponds to scrolling through interactive elements, the interactive elements of which include at least the graphical slider itself and the values of the spread indicated to the right of the graphical slider, both of which are scrolled. Display of a new interactive element may include the text indicating the spread, the text of which appears in response to the scrolling, as seen in FIG. 12B). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified interacting with the slider as disclosed in Dakss by incorporating the teachings of Eckman and have when the gesture is a drag: continuing to display the event card as the displayed event card; and scrolling through interactive elements on the displayed event card, wherein scrolling through interactive elements causes display of at least one new interactive element. Doing so would preclude the user from being limited to choosing from a set number of options and, instead, grant the user the flexibility to efficiently select from a range of values by performing a drag gesture on a slider interactive element, for example. In this way, the user can more quickly and, with more granularity, adjust a betting amount that is optimal for the user. In addition, maintaining the display of the displayed event card prevents the user from losing access to important relevant information while making a bet. Additionally, continuing the display of the displayed event card may also prevent the user from mistakenly placing a bet on an unintended event. Moreover, display of a new interactive element may offer more context to the user to make an informed bet when performing the desired or intended scrolling input. Regarding claim 8, Dakss in view of Eckman teaches the system of claim 1. 
Dakss further teaches wherein the interactive element includes standardized bet amounts based on one or more of a user history, user preferences, and a user risk profile (FIG. 2A and Col. 29, line 22 to Col. 30, line 40: as seen in screen 212, an event card/bet offer 218 is determined to be displayed in focus mode. Interactive element 222 includes standardized bet amounts included in the bet offer 218. This determination of the bet offer is “based on the user's past behaviors and attributes”, which indicate one or more of an application history and one or more user preferences; For additional details regarding application history and user preferences, see FIG. 6A and Col. 52, line 22 to Col. 54, line 10; FIG. 6B and Col. 55, line 56 to Col. 56, line 3; FIG. 10A and Col. 62, line 57 to Col. 63, line 6).

Regarding claims 10 and 17, the claims recite a method comprising steps with corresponding limitations to the system of claims 1 and 8, respectively, and are therefore rejected on the same premises.

Regarding claims 19 and 20, the claims recite one or more non-transitory computer storage media including instructions, which when executed by a processor (FIG. 1A and Col. 20, line 52 to Col. 21, line 19: personal digital device 102 must contain memory to store instructions/client software 103 that, when executed by the processor, causes a set of operations to be performed), cause the processor to perform operations with corresponding limitations (claims 19 and 20 are broader as they do not recite a limitation corresponding to “establish a secure connection between a device [having] an application and a server” as included in claims 1 and 8) to the system of claims 1 and 8, respectively, and are therefore rejected on the same premises.

Claims 9 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Dakss et al. (US 11785280 B1), in view of Eckman et al. (US 10515516 B1), and in view of Hanajima et al. (US 2020/0038136 A1).
Regarding claim 9, Dakss in view of Eckman teaches the system of claim 1. Dakss in view of Eckman does not explicitly teach wherein a confirming input is one or more of a tap and hold gesture or a rotational input of a rotational element of the device. Hanajima teaches wherein a confirming input is one or more of a tap and hold gesture or a rotational input of a rotational element of the device (FIG. 14 and [0173]: “That is, a confirmation button 17 is displayed at the lower right of the screens B and D, and the screen turns to a next screen when an operation (for example, single click, double click, tap, double tap, and long press) of selecting and determining the confirmation button 17 is performed.”). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Dakss in view of Eckman by incorporating the teachings of Hanajima and have wherein a confirming input is one or more of a tap and hold gesture or a rotational input of a rotational element of the device. Doing so would encourage “an operator do careful operations and checking” ([0173]) by verifying that the displayed contents are correct before proceeding. In the context of betting, this would help prevent unintended betting inputs and protect the user from financial burden, as the confirming input of a tap and hold/long press gesture requires more deliberate force and time than a simple tap, which is more likely to be input accidentally.

Claim 18 recites a method with corresponding limitations to the system of claim 9 and is therefore rejected on the same premise.

Response to Arguments

Applicant's arguments filed 10/24/2025 have been fully considered but they are not persuasive. In Remarks, Applicant argues that, regarding independent claim 1, and similarly independent claims 10 and 19, Dakss does not teach the drag gesture and its function as claimed (p. 7 of Remarks).
To this end, Applicant also argues that Eckman does not teach the drag gesture and its function as claimed (p. 8 of Remarks). Applicant further argues that Eckman does not disclose displaying “at least one new interactive element” (p. 8 of Remarks).

The Examiner respectfully disagrees. Regarding the first argument, Dakss was not relied upon to teach the drag gesture. However, Dakss provides a foundation to supplement the combination of Dakss with Eckman to teach the claimed drag gesture. Dakss’ foundation involves disclosing a user adjusting a slider that is part of interactive elements (FIG. 2D and Col. 33, lines 6-40: for example, the user adjusts a slider to adjust alternate values to the bet offer). Moreover, both Dakss and Eckman disclose a drag gesture. Because Dakss does not explicitly teach, when the gesture is a drag: continuing to display the event card as the displayed event card; and scrolling through interactive elements on the displayed event card, Eckman’s teachings were incorporated. Eckman teaches when the gesture is a drag: continuing to display the event card as the displayed event card; and scrolling through interactive elements on the displayed event card, wherein scrolling through interactive elements causes display of at least one new interactive element (FIGS. 3, 12A-B and 13A-B and Col. 15, lines 14-22). Eckman specifically discloses, “In particular, the user can define his or her wagered amount, select his or her team, and define the spread or line amount by graphically and dynamically moving the graphical slider or track bar element to the left or right” (Col. 15, lines 14-22). For example, the user adjusts the point spread via dragging the graphical slider on an event card. When the gesture is a drag, the event card continues to be displayed, as seen in the transition from FIG. 12A to FIG. 12B.
Sliding the graphical slider corresponds to scrolling through interactive elements, the interactive elements of which include at least the graphical slider itself and the value of the spread indicated to the right of the graphical slider, both of which are scrolled. Display of a new interactive element may include the text indicating the spread, the text of which appears in response to the scrolling (or interaction), as seen in FIG. 12B. Without additional qualifying details, including details for the “interactive element”, the amended independent claims do not preclude the teachings of Dakss in view of Eckman.

The Examiner maintains that the rationale to modify Dakss with Eckman accordingly is sufficient and is herein restated: It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Dakss’ embodiments by interchanging the functions of the directional swipes and have wherein modifying the displayed event card further comprises: when the gesture is a horizontal swipe, transitioning to a different event card category; and displaying a first event card from the different event card category as the displayed event card, and when the gesture is a vertical swipe, displaying a second event card from a same event card category as the displayed event card. One of ordinary skill in the art would find it obvious to swap, or perform simple substitution of, the horizontal swiping with the vertical swiping, and vice versa, to produce analogous effects. In this way, the user may swipe horizontally to access a different event card category and swipe vertically to access a new event card of the same event card category, which may personally be more intuitive or aesthetically functional for the user.

In conclusion, Applicant’s arguments are unpersuasive. Independent claim 1, and similarly amended independent claims 10 and 19, are properly rejected under 35 U.S.C. 103 as being unpatentable over Dakss et al.
(US 11785280 B1), in view of Eckman et al. (US 10515516 B1). The dependent claims accordingly remain rejected.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNY NGUYEN whose telephone number is (571)272-4980. The examiner can normally be reached M-Th 7AM to 5PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KIEU D VU, can be reached at (571)272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KENNY NGUYEN/
Primary Examiner, Art Unit 2171

Prosecution Timeline

May 09, 2023: Application Filed
Dec 16, 2023: Non-Final Rejection — §103
Jun 24, 2024: Response Filed
Aug 10, 2024: Final Rejection — §103
Jan 15, 2025: Interview Requested
Feb 17, 2025: Request for Continued Examination
Feb 21, 2025: Response after Non-Final Action
Apr 19, 2025: Non-Final Rejection — §103
Oct 24, 2025: Response Filed
Nov 29, 2025: Final Rejection — §103 (current)
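The pendency implied by the timeline above can be checked directly: from the May 09, 2023 filing to the current final rejection on Nov 29, 2025 is roughly two and a half years. A minimal sketch; the year/month rounding convention is an assumption.

```python
from datetime import date

# Dates from the prosecution timeline above.
filed = date(2023, 5, 9)
current_final = date(2025, 11, 29)

# Whole months elapsed, rounding down to the last completed month.
months = (current_final.year - filed.year) * 12 + (current_final.month - filed.month)
if current_final.day < filed.day:
    months -= 1
print(f"Pendency to date: {months // 12}y {months % 12}m")   # → 2y 6m
```

Against this examiner's 3y 1m median time to grant, the case is already most of the way through a typical prosecution.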

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602452
FILTERING OF DYNAMIC OBJECTS FROM VEHICLE GENERATED MAP
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12579481
Fluid Machine, Fluid Machine Managing Method and Fluid Machine Managing System
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12578202
NAVIGATION PROCESSING METHOD AND APPARATUS
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12578847
SECURE SCREEN RENDERING WITH ACCESSIBILITY DATA
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12579456
COGNITIVE PLATFORM FOR DERIVING EFFORT METRIC FOR OPTIMIZING COGNITIVE TREATMENT
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 49%
With Interview: 97% (+47.6%)
Median Time to Grant: 3y 1m
PTA Risk: High
Based on 178 resolved cases by this examiner. Grant probability derived from career allow rate.
