Prosecution Insights
Last updated: April 19, 2026
Application No. 18/908,294

PAIRING USER HEADPHONES WITH A LOCATION-BASED AUTOMATED ASSISTANT

Non-Final OA §103
Filed: Oct 07, 2024
Examiner: SKHOUN, HICHAM
Art Unit: 2164
Tech Center: 2100 — Computer Architecture & Software
Assignee: Google LLC
OA Round: 3 (Non-Final)
Grant Probability: 77% (Favorable)
OA Rounds: 3-4
To Grant: 3y 1m
With Interview: 83%

Examiner Intelligence

Career Allow Rate: 77% — above average (266 granted / 344 resolved; +22.3% vs TC avg)
Interview Lift: +5.6% — moderate lift (resolved cases with interview)
Avg Prosecution: 3y 1m typical timeline (25 currently pending)
Total Applications: 369 across all art units (career history)
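The headline numbers in this card are simple ratios. As a sanity check, a minimal sketch using the figures above (the "vs TC avg" and interview-lift deltas are treated as percentage points, which is an assumption about how the tool reports them):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

# Figures from the card: 266 granted of 344 resolved, +5.6pp interview lift,
# +22.3pp above the Tech Center average (both deltas assumed to be points).
career = allow_rate(266, 344)        # ~77.3%, displayed as 77%
implied_tc_avg = career - 22.3       # ~55.0%
with_interview = career + 5.6        # ~82.9%, displayed as 83%

print(f"career allow rate:  {career:.1f}%")
print(f"implied TC average: {implied_tc_avg:.1f}%")
print(f"with interview:     {with_interview:.1f}%")
```

The 83% "With Interview" figure shown elsewhere on this page is consistent with simply adding the 5.6-point lift to the career rate and rounding.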

Statute-Specific Performance

§101: 13.6% (-26.4% vs TC avg)
§103: 41.0% (+1.0% vs TC avg)
§102: 27.2% (-12.8% vs TC avg)
§112: 8.1% (-31.9% vs TC avg)
Tech Center averages are estimates, based on career data from 344 resolved cases.
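Assuming each "vs TC avg" delta is in percentage points (an assumption; the underlying chart is not reproduced here), the Tech Center baseline implied by each row can be recovered from the figures above:

```python
# Per-statute rates from the card, and their reported deltas vs the
# Tech Center average (deltas assumed to be percentage points).
examiner = {"101": 13.6, "103": 41.0, "102": 27.2, "112": 8.1}
delta_vs_tc = {"101": -26.4, "103": 1.0, "102": -12.8, "112": -31.9}

# Implied TC average per statute: examiner rate minus reported delta.
implied_tc = {s: round(examiner[s] - delta_vs_tc[s], 1) for s in examiner}
print(implied_tc)  # every statute row implies the same ~40.0% baseline
```

Every row implies the same ~40% baseline, which suggests the tool compares each statute against a single Tech Center average rather than per-statute averages.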

Office Action

§103
Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

2. Claims 1-20 are presented for examination.

3. This Office action is in response to the RCE filed 02/05/2026.

4. Claims 1, 10 and 19 are independent claims.

5. The Office action is made Non-Final.

Examiner Note

6. The Examiner cites particular columns and line numbers in the references as applied to the claims below for the convenience of the Applicant(s). Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that, in preparing responses, the Applicant fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner.

Claim Rejections - 35 USC § 103

7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

8. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.

9. Claims 1-20 are rejected under 35 U.S.C. § 103 as being unpatentable over Stovezky et al. (US 11568146 B2), hereinafter Stovezky, in view of Nadig et al. (US 20230120966 A1), hereinafter Nadig.

10. Regarding claim 1, Stovezky teaches a method implemented by one or more processors, the method comprising:

determining that a location-based automated assistant is available to a user device (col 1, lines 50-67, "the location-based biasing modes available for a client device of a user at a given location of the client device include at least a first location-based biasing mode and a second location-based biasing mode."; col 6, lines 43-60, "The automated assistant 108, with prior permission from the user 102"; col 7, lines 19-41, "the automated assistant 108 can use voice identification, with prior permission from the user 102, in order to determine that the user 102 is the one who is interacting with the automated assistant 108."; col 16, lines 43-61, "the geographic characteristic engine 216 can generate data that indicates, with prior permission from the user, whether the user is navigating toward or away from an area in which they reside, such as their home city, town, and/or country."),

wherein the location-based automated assistant is available to the user device via a particular network and based on the user device and the location-based automated assistant both being connected to the particular network (Fig. 1; col 6, lines 43-60; col 14, lines 18-33, "The server device can host the automated assistant 204, and/or computing device 202 can transmit inputs received at one or more assistant interfaces 220 to the server device.");

determining that a user of the user device has authorized pairing of the location-based automated assistant to the user device (col 6, lines 43-60, "The automated assistant 108, with prior permission from the user 102"; col 7, lines 19-41, "the automated assistant 108 can use voice identification, with prior permission from the user 102, in order to determine that the user 102 is the one who is interacting with the automated assistant 108."; col 16, lines 43-61, "the geographic characteristic engine 216 can generate data that indicates, with prior permission from the user, whether the user is navigating toward or away from an area in which they reside, such as their home city, town, and/or country.");

pairing, responsive to determining that the user device has authorized pairing of the location-based automated assistant to the user device, the location-based automated assistant to the user device, wherein pairing the location-based automated assistant to the user device comprises providing the location-based automated assistant access to the user device via the particular network (col 6, lines 43-61, "FIG. 1A, FIG. 1B, and FIG. 1C illustrate a view 100, a view 120, and a view 150, respectively, of a user 102 interacting with an automated assistant that operates according to one or more location-based biasing modes. Specifically, FIG. 1A illustrates a view 100 of the user 102 interacting with an automated assistant 108 that is accessible via a computing device 104 that is located within a home of the user 102.");

receiving, at the user device, a query from the user, wherein the query includes a request (col 6, lines 43-61, "FIG. 1A, FIG. 1B, and FIG. 1C illustrate a query includes a request (a spoken utterance)");

determining, based on content of the query, that the request is directed towards the location-based automated assistant (col 21, lines 38-58, "a spoken utterance that is directed to the automated assistant."; col 23, lines 43-60, "The responsive output can be rendered by the computing device and/or the automated assistant, to which the spoken utterance was directed.");

providing, in response to determining that the request is directed towards the location-based automated assistant, the request to the location-based automated assistant (col 6, lines 43-61, "FIG. 1A, FIG. 1B, and FIG. 1C illustrate a query includes a request (a spoken utterance), and a response to the request"; col 21, lines 38-58, "a spoken utterance that is directed to the automated assistant."; col 23, lines 43-60, "The responsive output can be rendered by the computing device and/or the automated assistant, to which the spoken utterance was directed."); and

receiving, at the user device and from the location-based assistant, a response to the request (col 6, lines 43-61, "FIG. 1A, FIG. 1B, and FIG. 1C illustrate a query includes a request (a spoken utterance), and a response to the request from automated assistant"), wherein the response to the request is generated, by the location-based automated assistant, based on user information accessed by the location-based automated assistant (col 5, lines 39-60, "providing content that is prioritized over other content for rendering to the user when they are near a location of interest within an area."; col 8, lines 14-30, "The second geographic characteristic data 142 can optionally indicate, with prior permission from the user 102, that the user 102 is at a particular location 130 that is adjacent to a public transit station 132 within the area 122 (user information accessed by the location-based automated assistant)."; col 12, lines 29-52, "When the user 102 is located at the location of interest 124, the automated assistant 108 can bias the processing of inputs and/or outputs according to the location of the user 102 and the first location-based biasing mode (user information accessed by the location-based automated assistant)."; col 25, lines 32-51, "collect personal information about users…a user's geographic location may be generalized where geographic location information is obtained (such as to a city, ZIP code, or state level),").

Examiner notes and interpretation of the subject matter: A trust measure for a location-based automated assistant can be generated by assessing user confidence in the system's competence, which often dictates how much they monitor or rely on it. In specific implementations, such as for service locations, this involves creating "trust cards" that highlight safety measures (e.g., mask enforcement, safety protocols) with a title and description. A trust measure for a location-based automated assistant is an (often automated) system that determines the reliability of location data or user permissions based on verified, secure, and authenticated infrastructure. These measures often use "trust scores" to identify, validate, and manage user interactions with AI, determining whether to allow, pause, or request confirmation for actions. Location-based automated assistants and services are increasingly using trust measures—or "AI trust scores"—to determine when and how they can access, process, or share sensitive user data. This approach moves beyond static permissions to a dynamic, risk-based model where data access is granted only if the assistant meets a defined threshold of reliability and security.

Stovezky implicitly teaches wherein the location-based automated assistant is provided access to the user information based on a trust measure generated for the location-based automated assistant (Abstract; col 7, lines 19-41, "the automated assistant 108 can use voice identification, with prior permission from the user 102, in order to determine that the user 102 is the one who is interacting with the automated assistant 108. Additionally, the computing device 104 can generate first geographic characteristic data 106 indicating that the user 102 is interacting with the automated assistant 108 within their home."; col 17, lines 53-67, "When the automated assistant is operating according to the first location-based biasing mode, a score for a transcription that includes one or more terms associated with an area that the user is located can be biased and/or prioritized over other scores for other transcriptions that do not include one or more terms associated with the area. When the automated assistant is operating according to the second location-based biasing mode, a score for a transcription that includes one or more terms associated with a location of interest in which the user is located can be biased and/or prioritized over other scores for other transcriptions that do not include the one or more terms associated with the location of interest.").

However, Nadig explicitly teaches wherein the location-based automated assistant is provided access to the user information based on a trust measure generated for the location-based automated assistant ([0016]-[0017], "artificial intelligence such Alexa (the location-based automated assistant)"; [0114], "The system(s) 120 may determine threshold user authentication confidence score data that may represent a threshold user authentication confidence score required prior to providing user access to the sensitive data."; [0128], "The ML component 516 (the location-based automated assistant) may track the behavior of various users as a factor in determining a confidence level of the identity of the user. Thus, the ML component 516 may use historical data and/or usage patterns over time to increase or decrease a confidence level of an identity of a user."; [0146]).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate into Stovezky's system the concept, suggested in Nadig's system, that the user information shared with the location-based automated assistant is shared based on a trust measure associated with the location-based automated assistant, because both systems relate generally to intelligent automated assistants that determine privacy controls for output including sensitive data (Nadig).

11. Regarding claim 2, Stovezky and Nadig teach the invention as claimed in claim 1 above, and Stovezky further teaches wherein determining that the user has authorized pairing of the location-based automated assistant to the user device is based on determining that the user has previously been present at a location of the location-based automated assistant (col 2, lines 55-66, "determining the client device is in the location and the contextual condition(s) are present can cause the automated assistant to automatically transition to the second location-based biasing mode for the client device."; col 5, lines 14-18, "when the user has visited a particular area and/or a particular location of interest, such as a landmark, data that is used to render a responsive output can be processed according to a particular location-based biasing mode."; col 22, lines 39-60, "whether the user has ever visited the location of interest before, and/or any other property that can describe the context of the user relative to a location of interest."). Also, Napolitano teaches the limitation at ([0136], [0149], [0366], "User data including contacts, preferences, location, favorite media, and the like can be used to interpret voice commands and facilitate user interaction with the various devices discussed herein. This gathered data can include personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, home addresses, or any other identifying information."). Also, Nadig teaches the limitation at ([0128], [0146], [0148]).

12. Regarding claim 3, Stovezky and Nadig teach the invention as claimed in claim 1 above, and Stovezky further teaches wherein determining that the user has authorized pairing of the location-based automated assistant with the user device is based on identifying, from an application of the user device, authorization information (col 6, lines 43-60, "FIG. 1A illustrates a view 100 of the user 102 interacting with an automated assistant 108 that is accessible via a computing device 104 that is located within a home of the user 102."; col 7, lines 18-41, "the automated assistant 108 can access contextual data characterizing content being rendered at the television 110. Based on the user 102 being in their home, the automated assistant 108 can bypass operating according to a location-based biasing mode, and provide a responsive output to the user 102 based on the contextual data and the spoken utterance.").

13. Regarding claim 4, Stovezky and Nadig teach the invention as claimed in claim 3 above, and Stovezky further teaches wherein the application is a calendar application (col 5, lines 11-12, "This variety of different data can be associated with the user (e.g., calendar data, application data, message data, etc.), the spoken utterance, a given context, and/or any other related information."; col 12, lines 29-52, "The automated assistant 108 can bias the processing according to a greater corpus of documents and/or data, such as application data accessible via the computing device 140 (e.g., calendar data, messaging data, area-related location data, etc.)."). Also, Nadig teaches the limitation at ([0024], "the user input may be "Alexa, what is on my calendar today?" or "Give me details on my appointment today."").

14. Regarding claim 5, Stovezky and Nadig teach the invention as claimed in claim 3 above, and Nadig further teaches wherein the authorization information includes reservation information related to the location ([0057], "book a trip"; [0109], "ride booking").

15. Regarding claim 6, Stovezky and Nadig teach the invention as claimed in claim 1 above, and Nadig further teaches wherein the user information shared with the location-based automated assistant includes one or more user automated assistant preferences ([0064]-[0066], "The data of a profile may include preferences specific to the user"; [0068], [0071]).

16. Regarding claim 7, Stovezky and Nadig teach the invention as claimed in claim 1 above, and Nadig further teaches wherein the trust measure indicates particular information to provide to the location-based automated assistant as user information, and wherein the user information is selected based on the user information conforming to the particular information indicated by the trust measure ([0016]-[0017], "artificial intelligence such Alexa (the location-based automated assistant)"; [0114], "The system(s) 120 may determine threshold user authentication confidence score data that may represent a threshold user authentication confidence score required prior to providing user access to the sensitive data."; [0128], "The ML component 516 (the location-based automated assistant) may track the behavior of various users as a factor in determining a confidence level of the identity of the user. Thus, the ML component 516 may use historical data and/or usage patterns over time to increase or decrease a confidence level of an identity of a user."; [0146]).

17. Regarding claim 8, Stovezky and Nadig teach the invention as claimed in claim 1 above, and Stovezky further teaches wherein the user information includes application information that is from an application of the user device, and further comprising: providing, along with the request, the application information to the location-based automated assistant (col 5, lines 11-12, "This variety of different data can be associated with the user (e.g., calendar data, application data, message data, etc.), the spoken utterance, a given context, and/or any other related information."; col 6, lines 43-61; col 12, lines 29-52, "The automated assistant 108 can bias the processing according to a greater corpus of documents and/or data, such as application data accessible via the computing device 140 (e.g., calendar data, messaging data, area-related location data, etc.).").

18. Regarding claim 9, Stovezky and Nadig teach the invention as claimed in claim 8 above, and Stovezky further teaches wherein the application is a calendar application (col 5, lines 11-12, "This variety of different data can be associated with the user (e.g., calendar data, application data, message data, etc.), the spoken utterance, a given context, and/or any other related information."; col 12, lines 29-52, "The automated assistant 108 can bias the processing according to a greater corpus of documents and/or data, such as application data accessible via the computing device 140 (e.g., calendar data, messaging data, area-related location data, etc.)."). Also, Nadig teaches the limitation at ([0024], "the user input may be "Alexa, what is on my calendar today?" or "Give me details on my appointment today."").

19. Regarding claims 10-18, those claims recite a system that performs the method of claims 1-9, respectively, and are rejected under the same rationale.

20. Regarding claims 19 and 20, those claims recite a non-transitory computer readable storage medium configured to store instructions that, when executed by one or more processors, cause one or more of the processors to perform the method of claims 1 and 2, respectively, and are rejected under the same rationale.

Response to Amendments and Arguments

21. In the remarks received 02/05/2026, the Applicant's attorney respectfully submits that the cited portions of Napolitano fail to render obvious the above features of independent claim 1, as well as similar features of independent claims 10 and 19, at least as amended. For example, para. [0136] of Napolitano sets forth that "it can be determined whether a particular user device is authorized to control media on, for example, [a] television set-top box. A user device can be authorized based on a... trust determination". Then, "[i]n response to determining that a particular user device is authorized, attempts to control [the] television set-top box can be permitted". It appears that the Office Action has equated Napolitano's "television set-top box" with independent claim 1's "location-based automated assistant". However, the Applicant's attorney respectfully submits that Napolitano's "user device" that "can be authorized based on a... trust determination" to "control media on... [a] television set-top box" fails to teach or suggest that "the location-based automated assistant is provided access to the user information based on a trust measure generated for the location-based automated assistant." Put another way, Napolitano's "trust determination" is made for Napolitano's "user device", which fails to teach or suggest that the "trust measure" is "generated for the location-based automated assistant" as set forth in independent claim 1, as amended. For at least these reasons, the Applicant's attorney respectfully requests that the Office Action's § 103 rejection be withdrawn.

22. Applicant's 35 U.S.C. § 103 arguments on claims 1-20 have been fully considered but are moot in view of the new ground of rejection under 35 U.S.C. § 103, necessitated by Applicant's amendment, presented above.

CONCLUSION

23. The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure. Gruber et al. (US 20190214024 A1) discloses an intelligent automated assistant system that engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HICHAM SKHOUN, whose telephone number is (571) 272-9466. The examiner can normally be reached Mon-Fri, 10am-6:30pm. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Amy Ng, can be reached at (571) 270-1698. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HICHAM SKHOUN/
Primary Examiner, Art Unit 2164
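The threshold-gated data access that the rejection attributes to Nadig ([0114]: a required user-authentication confidence score before sensitive data is released) can be sketched roughly as follows. All names, types, and the 0.8 threshold are hypothetical illustrations, not drawn from either reference:

```python
from dataclasses import dataclass

@dataclass
class TrustMeasure:
    # Hypothetical confidence score in [0, 1] generated for the assistant;
    # stands in for Nadig's "user authentication confidence score" concept.
    score: float

def may_access_user_info(trust: TrustMeasure, threshold: float = 0.8) -> bool:
    """Release sensitive user information only above the trust threshold."""
    return trust.score >= threshold

print(may_access_user_info(TrustMeasure(0.9)))  # True: meets the 0.8 threshold
print(may_access_user_info(TrustMeasure(0.5)))  # False: below threshold
```

The point of contention in the remarks is not this mechanism itself but what the trust measure is generated *for*: the claim requires it to be generated for the assistant, while Napolitano's determination is made for a user device.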

Prosecution Timeline

Oct 07, 2024: Application Filed
Jun 27, 2025: Non-Final Rejection — §103
Oct 01, 2025: Response Filed
Oct 28, 2025: Final Rejection — §103
Dec 30, 2025: Examiner Interview Summary
Dec 30, 2025: Applicant Interview (Telephonic)
Dec 30, 2025: Response after Non-Final Action
Feb 05, 2026: Request for Continued Examination
Feb 15, 2026: Response after Non-Final Action
Feb 21, 2026: Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12591552
Distributed File System that Provides Scalability and Resiliency
Granted Mar 31, 2026 (2y 5m to grant)
Patent 12561304
DISTRIBUTABLE HASH FILTER FOR NONPROBABILISTIC SET INCLUSION
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12536141
DEFRAGMENTATION FOR LOG STRUCTURED MERGE TREE TO IMPROVE READ AND WRITE AMPLIFICATION
Granted Jan 27, 2026 (2y 5m to grant)
Patent 12511292
CLUSTER VIEWS FOR COMPUTE SCALE AND CACHE PRESERVATION
Granted Dec 30, 2025 (2y 5m to grant)
Patent 12481672
METRICS MANAGEMENT SYSTEM
Granted Nov 25, 2025 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
77%
Grant Probability
83%
With Interview (+5.6%)
3y 1m
Median Time to Grant
High
PTA Risk
Based on 344 resolved cases by this examiner. Grant probability derived from career allow rate.
