Prosecution Insights
Last updated: April 19, 2026
Application No. 18/774,705

SYSTEMS AND METHODS FOR PROTECTING SENSITIVE CONTENT DURING REMOTE SPACE SHARING

Final Rejection — §102, §103
Filed: Jul 16, 2024
Examiner: GADALLA, HANY S
Art Unit: 2493
Tech Center: 2400 — Computer Networks
Assignee: JPMorgan Chase Bank, N.A.
OA Round: 2 (Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 2y 10m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% (128 granted / 175 resolved; +15.1% vs TC avg; above average)
Interview Lift: +38.4% (allow rate in resolved cases with vs. without an interview; a strong lift)
Avg Prosecution: 2y 10m typical timeline (19 applications currently pending)
Total Applications: 194, across all art units

Statute-Specific Performance

§101: 9.0% (-31.0% vs TC avg)
§103: 52.8% (+12.8% vs TC avg)
§102: 14.3% (-25.7% vs TC avg)
§112: 17.4% (-22.6% vs TC avg)
Tech Center averages are estimates. Based on career data from 175 resolved cases.

Office Action

Rejections under §102 and §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

The present Office action is responsive to communications received on 02/03/2026.

Status of Claims

Claims 1, 8, and 15 were amended. Claims 1-20 are pending.

Response to Arguments

Regarding the first argument: the claim is broad in reciting that the anchor comprises, rather than is, a radio frequency device. The examiner found paragraphs in the prior art disclosing that the anchors comprise interaction with radio frequency devices, and does not believe the amendment is sufficient to overcome the prior art.

Regarding the second argument: the wording "streaming policy for the area" need not appear explicitly; it can be clearly understood from the cited paragraph that there is a rule/policy associated with an area.

Regarding the third argument: the claim recites implementing a control signal that goes from the headset to a program. Because the prior art teaches a headset location being sent to implement rule/policy changes, it discloses the same concept.

Therefore, the arguments are not found to be persuasive.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-4, 6-11, 13-18, and 20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Hartmann et al. (US 20250208607 A1), hereinafter referred to as Hartmann.

With respect to claim 1, Hartmann discloses:

A method, comprising: receiving, by a policy control computer program executed by an electronic device (Hartmann Fig. 1A, program logic unit 106) and from a headset computer program executed by a headset (Hartmann Fig. 1A, VR headset unit 104), an anchor identifier for an anchor in response to the headset being proximate to the anchor; (Hartmann ¶31: "Each spatial anchor relates a coordinate system associated with the augmented reality glasses for the display of the data to a spatial coordinate system associated with the spatial coordinates of one or more of the plurality of equipment stored in the database system. The program logic is configured to display the data of equipment that is located in a proximity of the up-to-date spatial position of the user via the augmented reality glasses at predefined relative positions to the spatial anchors." Additionally, Fig. 7 illustrates that each anchor has a unique QR code [identifier]. In other words, based on the headset's close proximity to the anchor and its interaction with the anchor's QR code identifier, the program logic unit identifies the VR headset location from predefined positions relative to the spatial anchors.)

wherein the anchor comprises a radio frequency device; (Hartmann ¶156: "The interconnection of the network may be formed by means of physically hard wiring, optical and/or wireless radio-frequency").

identifying, by the policy control computer program, an area in which the headset is located by retrieving the area mapped to the anchor identifier in a mapping of a plurality of anchor identifiers to a plurality of areas; (Hartmann ¶31: "The program logic is configured to display the data of equipment that is located in a proximity of the up-to-date spatial position of the user via the augmented reality glasses at predefined relative positions to the spatial anchors." In other words, the logic unit 106 identifies, based on the stored location policies, the predefined area of the headset relative to the plurality of anchors.)

identifying, by the policy control computer program, a streaming policy for the area, wherein the streaming policy restricts audio and/or video streaming from the area, allows audio and/or video streaming from the area, or prohibits audio and/or video streaming from the area; (Hartmann ¶31: "Each spatial anchor relates a coordinate system associated with the augmented reality glasses for the display of the data to a spatial coordinate system associated with the spatial coordinates of one or more of the plurality of equipment stored in the database system." Additionally, Hartmann ¶55-56 disclose user subscription to a streaming service [audio and/or video] wherein "the subscription of the user to the service is performed by the program logic automatically, e.g., upon a successful authentication of the user at the AR system and/or upon the user approaching the production system closer than a user-defined or predefined maximum distance of e.g., less than 10 m, or less than 5 m, or less than 2 m, or less than 1 m … the database system is configured to provide, in response to a query (e.g. a conventional SQL query) of the executable program logic, selectively the data of equipment that are proximate to the up-to-date spatial position of the user wearing the augmented reality glasses.")

sending, by the policy control computer program, a control signal to the headset computer program based on the streaming policy; (Hartmann ¶56: "the database system is configured to provide, in response to a query (e.g. a conventional SQL query) of the executable program logic, selectively the data of equipment that are proximate to the up-to-date spatial position of the user wearing the augmented reality glasses." ¶58 also discloses updating streaming subscription policies based on user headset location.)

and implementing, by the headset computer program, the control signal to control streaming of audio and/or video from the headset to the policy control computer program. (Hartmann ¶48 discloses that different users have different policies based on their location, with controls implemented on their headset displays. Additionally, Hartmann ¶57: "the database system is configured to provide, via the subscription-based streaming service, selectively the data of equipment that are located proximate to the up-to-date spatial position of the user wearing the augmented reality glasses." That is, a signal is sent from the headset to identify its location, upon which the rule/policy determines the response.)

With respect to claim 2, Hartmann discloses: The method of claim 1, wherein the headset computer program implements the control signal by preventing audio and/or video streaming from the headset. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot are stored in the database, this event may trigger the displaying or hiding [preventing] of a virtual object in the AR glasses of a local user.")

With respect to claim 3, Hartmann discloses: The method of claim 1, wherein the headset computer program implements the control signal by redacting or obfuscating content in audio and/or video streamed from the headset. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot are stored in the database, this event may trigger the displaying or hiding [redacting or obfuscating content] of a virtual object in the AR glasses of a local user.")

With respect to claim 4, Hartmann discloses: The method of claim 3, wherein the headset computer redacts or obfuscates the content based on a mesh of the area. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot [based on a mesh of the area] are stored in the database, this event may trigger the displaying or hiding [redacting or obfuscating content] of a virtual object in the AR glasses of a local user.")

With respect to claim 6, Hartmann discloses: The method of claim 1, wherein the headset computer program receives the anchor identifier by ultrawideband communication, by Bluetooth, or by Wi-Fi. (Hartmann ¶143 discloses data exchanged via Bluetooth or Wi-Fi.)

With respect to claim 7, Hartmann discloses: The method of claim 1, wherein the policy control computer program further receives spatial data from the headset computer program, and identifies the area based on the anchor identifier and the spatial data. (Hartmann ¶145: "The anchors are then stored in the database system for making the anchors accessible to other users, e.g., remote users. A virtual reality application configured to create a virtual reality for the remote user can be configured to read the anchor(s) created by the user wearing the AR glasses and to create and display some virtual objects at a defined distance and orientation relative to the anchor. The augmented reality application of the user wearing the AR glasses may do the same. Thereby, the spatial anchor may allow the two users to share the same visual experience.")

With respect to claim 8, Hartmann discloses:

A method, comprising: receiving, by a policy control computer program executed by an electronic device (Hartmann Fig. 1A, program logic unit 106) and from an anchor in one of a plurality of areas (Hartmann Fig. 7, plurality of anchors identified by QR code in different countries [areas], in communication with positioning system 108 in Fig. 1), a headset identifier for a headset (Hartmann Fig. 1A, VR headset unit 104) from an anchor in response to the headset being proximate to the anchor, (Hartmann ¶258: "the executable program logic receives an up-to-date spatial position of a user wearing the augmented reality glasses from a positioning system 108.")

wherein the anchor comprises a radio frequency device; (Hartmann ¶156: "The interconnection of the network may be formed by means of physically hard wiring, optical and/or wireless radio-frequency").
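The claim 1 flow mapped above (anchor identifier, to area lookup, to streaming policy, to control signal) can be sketched in a few lines. This is an illustrative sketch only: the table contents, policy names, and function are hypothetical and do not come from the application or from Hartmann.

```python
# Hypothetical sketch of the claimed policy-control flow. All identifiers,
# area names, and policy values below are illustrative assumptions.

ANCHOR_TO_AREA = {
    "anchor-42": "trading-floor",
    "anchor-07": "cafeteria",
}

AREA_POLICY = {
    "trading-floor": "prohibit",   # no audio/video streaming from this area
    "cafeteria": "allow",          # unrestricted streaming
}
DEFAULT_POLICY = "restrict"        # e.g. redact/obfuscate sensitive content

def control_signal_for(anchor_id: str) -> str:
    """Map an anchor identifier reported by the headset to a control signal."""
    area = ANCHOR_TO_AREA.get(anchor_id)
    if area is None:
        return DEFAULT_POLICY      # unknown anchor: fail closed to "restrict"
    return AREA_POLICY.get(area, DEFAULT_POLICY)

print(control_signal_for("anchor-42"))  # prohibit
print(control_signal_for("anchor-99"))  # restrict
```

The fail-closed default mirrors the claimed three-way policy (allow / restrict / prohibit): an unmapped anchor falls back to the restrictive middle option rather than allowing streaming.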
identifying, by the policy control computer program, a streaming policy for the area, wherein the streaming policy restricts audio and/or video streaming from the area, allows audio and/or video streaming from the area, or prohibits audio and/or video streaming from the area; (Hartmann ¶31: "Each spatial anchor relates a coordinate system associated with the augmented reality glasses for the display of the data to a spatial coordinate system associated with the spatial coordinates of one or more of the plurality of equipment stored in the database system." Additionally, Hartmann ¶55-56 disclose user subscription to a streaming service [audio and/or video] wherein "the subscription of the user to the service is performed by the program logic automatically, e.g., upon a successful authentication of the user at the AR system and/or upon the user approaching the production system closer than a user-defined or predefined maximum distance of e.g., less than 10 m, or less than 5 m, or less than 2 m, or less than 1 m … the database system is configured to provide, in response to a query (e.g. a conventional SQL query) of the executable program logic, selectively the data of equipment that are proximate to the up-to-date spatial position of the user wearing the augmented reality glasses.")

sending, by the policy control computer program, a control signal to a headset computer executed by a headset associated with the headset identifier based on the streaming policy; (Hartmann ¶56: "the database system is configured to provide, in response to a query (e.g. a conventional SQL query) of the executable program logic, selectively the data of equipment that are proximate to the up-to-date spatial position of the user wearing the augmented reality glasses." ¶58 also discloses updating streaming subscription policies based on user headset location.)

and implementing, by the headset computer program, the control signal to control streaming of audio and/or video from the headset to the policy control computer program. (Hartmann ¶48 discloses that different users have different policies based on their location, with controls implemented on their headset displays. Additionally, Hartmann ¶57: "the database system is configured to provide, via the subscription-based streaming service, selectively the data of equipment that are located proximate to the up-to-date spatial position of the user wearing the augmented reality glasses." That is, a signal is sent from the headset to identify its location, upon which the rule/policy determines the response.)

With respect to claim 9, Hartmann discloses: The method of claim 8, wherein the headset computer program implements the control signal by preventing audio and/or video streaming from the headset. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot are stored in the database, this event may trigger the displaying or hiding [preventing] of a virtual object in the AR glasses of a local user.")

With respect to claim 10, Hartmann discloses: The method of claim 8, wherein the headset computer program implements the control signal by redacting or obfuscating content in audio and/or video streamed from the headset. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot are stored in the database, this event may trigger the displaying or hiding [redacting or obfuscating content] of a virtual object in the AR glasses of a local user.")

With respect to claim 11, Hartmann discloses: The method of claim 10, wherein the headset computer redacts or obfuscates the content based on a mesh of the area. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot [based on a mesh of the area] are stored in the database, this event may trigger the displaying or hiding [redacting or obfuscating content] of a virtual object in the AR glasses of a local user.")

With respect to claim 13, Hartmann discloses: The method of claim 8, wherein the anchor receives the headset identifier by ultrawideband communication, by Bluetooth, or by Wi-Fi. (Hartmann ¶143 discloses data exchanged via Bluetooth or Wi-Fi.)

With respect to claim 14, Hartmann discloses: The method of claim 8, wherein the policy control computer program further receives spatial data from the headset computer program, and identifies the area based on the anchor identifier and the spatial data. (Hartmann ¶145: "The anchors are then stored in the database system for making the anchors accessible to other users, e.g., remote users. A virtual reality application configured to create a virtual reality for the remote user can be configured to read the anchor(s) created by the user wearing the AR glasses and to create and display some virtual objects at a defined distance and orientation relative to the anchor. The augmented reality application of the user wearing the AR glasses may do the same. Thereby, the spatial anchor may allow the two users to share the same visual experience.")

With respect to claim 15, Hartmann discloses:

A system, comprising: a headset executing a headset computer program; (Hartmann Fig. 1A, VR headset unit 104)

a plurality of anchors, each anchor associated with an area and transmitting an anchor identifier, (Hartmann ¶31: "Each spatial anchor relates a coordinate system associated with the augmented reality glasses for the display of the data to a spatial coordinate system associated with the spatial coordinates of one or more of the plurality of equipment stored in the database system. The program logic is configured to display the data of equipment that is located in a proximity of the up-to-date spatial position of the user via the augmented reality glasses at predefined relative positions to the spatial anchors." Additionally, Fig. 7 illustrates that each anchor has a unique QR code [identifier]. In other words, based on the headset's close proximity to the anchor and its interaction with the anchor's QR code identifier, the program logic unit identifies the VR headset location from predefined positions relative to the spatial anchors.)

wherein the anchor comprises a radio frequency device; (Hartmann ¶156: "The interconnection of the network may be formed by means of physically hard wiring, optical and/or wireless radio-frequency").

an electronic device executing a policy control computer program; (Hartmann Fig. 1A, program logic unit 106)

and a database storing a streaming policy for each of the plurality of areas, wherein each of the streaming policies allows, restricts, or prohibits audio and/or video streaming from a respective area; (Hartmann Fig. 1A, DB system 102, and ¶31: "Each spatial anchor relates a coordinate system associated with the augmented reality glasses for the display of the data to a spatial coordinate system associated with the spatial coordinates of one or more of the plurality of equipment stored in the database system." Additionally, Hartmann ¶55-56 disclose user subscription to a streaming service [audio and/or video] wherein "the subscription of the user to the service is performed by the program logic automatically, e.g., upon a successful authentication of the user at the AR system and/or upon the user approaching the production system closer than a user-defined or predefined maximum distance of e.g., less than 10 m, or less than 5 m, or less than 2 m, or less than 1 m … the database system is configured to provide, in response to a query (e.g. a conventional SQL query) of the executable program logic, selectively the data of equipment that are proximate to the up-to-date spatial position of the user wearing the augmented reality glasses.")

wherein the policy control computer program is configured to receive an anchor identifier from the headset computer program, wherein the headset is proximate to the anchor, to identify a streaming policy for the area, (Hartmann ¶31: "Each spatial anchor relates a coordinate system associated with the augmented reality glasses for the display of the data to a spatial coordinate system associated with the spatial coordinates of one or more of the plurality of equipment stored in the database system. The program logic is configured to display the data of equipment that is located in a proximity of the up-to-date spatial position of the user via the augmented reality glasses at predefined relative positions to the spatial anchors." Additionally, Fig. 7 illustrates that each anchor has a unique QR code [identifier]. In other words, based on the headset's close proximity to the anchor and its interaction with the anchor's QR code identifier, the program logic unit identifies the VR headset location from predefined positions relative to the spatial anchors.)

wherein the streaming policy restricts audio and/or video streaming from the area, allows audio and/or video streaming from the area, or prohibits audio and/or video streaming from the area, to send a control signal to the headset computer program based on the streaming policy; (Hartmann ¶31: "The program logic is configured to display the data of equipment that is located in a proximity of the up-to-date spatial position of the user via the augmented reality glasses at predefined relative positions to the spatial anchors." In other words, the logic unit 106 identifies, based on the stored location policies, the predefined area of the headset relative to the plurality of anchors, and implements the streaming policy as mapped above.)

and the headset computer program is configured to implement the control signal to control streaming of audio and/or video from the headset to the policy control computer program. (Hartmann ¶48 discloses that different users have different policies based on their location, with controls implemented on their headset displays. Additionally, Hartmann ¶57: "the database system is configured to provide, via the subscription-based streaming service, selectively the data of equipment that are located proximate to the up-to-date spatial position of the user wearing the augmented reality glasses." That is, a signal is sent from the headset to identify its location, upon which the rule/policy determines the response.)

With respect to claim 16, Hartmann discloses: The system of claim 15, wherein the headset computer program is configured to implement the control signal by preventing audio and/or video streaming from the headset. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot are stored in the database, this event may trigger the displaying or hiding [preventing] of a virtual object in the AR glasses of a local user.")

With respect to claim 17, Hartmann discloses: The system of claim 15, wherein the headset computer program is configured to implement the control signal by redacting or obfuscating content in audio and/or video streamed from the headset. (Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot are stored in the database, this event may trigger the displaying or hiding [redacting or obfuscating content] of a virtual object in the AR glasses of a local user.")

With respect to claim 18, Hartmann discloses: The system of claim 17, wherein the headset computer is configured to redact or obfuscate the content based on a mesh of the area.
(Hartmann ¶218: "If the robot changes position and the updated coordinates of the robot [based on a mesh of the area] are stored in the database, this event may trigger the displaying or hiding [redacting or obfuscating content] of a virtual object in the AR glasses of a local user.")

With respect to claim 20, Hartmann discloses: The system of claim 15, wherein the policy control computer program is configured to receive spatial data from the headset computer program, and to identify the area based on the anchor. (Hartmann ¶145: "The anchors are then stored in the database system for making the anchors accessible to other users, e.g., remote users. A virtual reality application configured to create a virtual reality for the remote user can be configured to read the anchor(s) created by the user wearing the AR glasses and to create and display some virtual objects at a defined distance and orientation relative to the anchor. The augmented reality application of the user wearing the AR glasses may do the same. Thereby, the spatial anchor may allow the two users to share the same visual experience.")

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 5, 12, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Hartmann as applied to claims 1-4, 6-11, 13-18, and 20 above, and further in view of Cheng et al. (US 20050125673 A1), hereinafter referred to as Cheng.

With respect to claim 5: Hartmann discloses the method of claim 4, but does not explicitly disclose wherein the mesh identifies objects that may contain sensitive content, and the headset computer program redacts or obfuscates the objects. However, Cheng, in an analogous art, discloses: wherein the mesh identifies objects that may contain sensitive content, and the headset computer program redacts or obfuscates the objects. (Cheng ¶15-16 and ¶20 disclose obfuscating sensitive data based on user location.) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mesh disclosed by Hartmann so that the mesh identifies objects that may contain sensitive content and the headset computer program redacts or obfuscates the objects, as disclosed by Cheng, to prevent leaking of sensitive data based on user/device location (see Cheng Abstract and ¶20).

With respect to claim 12: Hartmann discloses the method of claim 11, but does not explicitly disclose wherein the mesh identifies objects that may contain sensitive content, and the headset computer program redacts or obfuscates the objects. However, Cheng, in an analogous art, discloses: wherein the mesh identifies objects that may contain sensitive content, and the headset computer program redacts or obfuscates the objects. (Cheng ¶15-16 and ¶20 disclose obfuscating sensitive data based on user location.) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mesh disclosed by Hartmann so that the mesh identifies objects that may contain sensitive content and the headset computer program redacts or obfuscates the objects, as disclosed by Cheng, to prevent leaking of sensitive data based on user/device location (see Cheng Abstract and ¶20).

With respect to claim 19: Hartmann discloses the system of claim 18, but does not explicitly disclose wherein the mesh identifies objects that may contain sensitive content, and the headset computer program redacts or obfuscates the objects. However, Cheng, in an analogous art, discloses: wherein the mesh identifies objects that may contain sensitive content, and the headset computer program redacts or obfuscates the objects. (Cheng ¶15-16 and ¶20 disclose obfuscating sensitive data based on user location.) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the mesh disclosed by Hartmann so that the mesh identifies objects that may contain sensitive content and the headset computer program redacts or obfuscates the objects, as disclosed by Cheng, to prevent leaking of sensitive data based on user/device location (see Cheng Abstract and ¶20).

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HANY S GADALLA, whose telephone number is (571) 272-2322. The examiner can normally be reached Monday through Friday, 8:00 AM to 4:00 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Carl Colin, can be reached at (571) 272-3862. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/HANY S. GADALLA/
Primary Examiner, Art Unit 2493
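The mesh-based redaction at issue in claims 5, 12, and 19 (a spatial mesh flags regions that may contain sensitive content, and flagged regions are obfuscated before streaming) can be sketched as follows. The frame and mesh representations are hypothetical illustrations; this is not the applicant's, Hartmann's, or Cheng's implementation.

```python
# Illustrative sketch of mesh-based redaction. A "mesh" here is assumed to be
# a list of axis-aligned regions, each tagged as sensitive or not; sensitive
# regions are blanked in the outgoing frame before it is streamed.

def redact(frame, mesh):
    """Zero out pixels inside any mesh region labeled sensitive.

    frame: 2-D list of pixel values.
    mesh: list of (x0, y0, x1, y1, sensitive) tuples, half-open coordinates.
    """
    for x0, y0, x1, y1, sensitive in mesh:
        if not sensitive:
            continue
        for y in range(y0, y1):
            for x in range(x0, x1):
                frame[y][x] = 0   # obfuscate: replace content with black
    return frame

frame = [[1] * 4 for _ in range(4)]
mesh = [(1, 1, 3, 3, True),       # e.g. a whiteboard region: sensitive
        (0, 0, 1, 1, False)]      # e.g. a doorway region: not sensitive
redact(frame, mesh)
print(frame[1])                   # [1, 0, 0, 1]
```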

Prosecution Timeline

Jul 16, 2024
Application Filed
Oct 30, 2025
Non-Final Rejection — §102, §103
Feb 03, 2026
Response Filed
Mar 26, 2026
Final Rejection — §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12598083
ELECTRONIC DEVICE TRACKING OR VERIFICATION
2y 5m to grant; granted Apr 07, 2026
Patent 12587366
SYSTEM AND METHOD FOR GENERATING CRYPTOGRAPHIC SIGNATURE FOR ARTIFICIAL INTELLIGENT GENERATED CONTENT
2y 5m to grant; granted Mar 24, 2026
Patent 12572639
GENERATIVE ARTIFICIAL INTELLIGENCE FOR VALIDATION OF A HUMAN USER
2y 5m to grant; granted Mar 10, 2026
Patent 12566828
MINIMIZING DATA EXPOSURE IN API RESPONSES
2y 5m to grant; granted Mar 03, 2026
Patent 12531745
CONTENT TRANSMISSION PROTECTION METHOD AND RELATED DEVICE THEREOF
2y 5m to grant; granted Jan 20, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 73%
With Interview: 99% (+38.4%)
Median Time to Grant: 2y 10m
PTA Risk: Moderate
Based on 175 resolved cases by this examiner. Grant probability derived from career allow rate.
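The headline numbers appear to derive from simple arithmetic on the examiner's career data. A sketch, assuming the tool computes the allow rate as grants over resolved cases and caps the interview-adjusted figure at 99% (the cap is an assumption about the tool's arithmetic, not a documented formula):

```python
# Sketch of how this page's metrics may be derived from the examiner's
# career data: 128 allowances out of 175 resolved cases.

granted, resolved = 128, 175

allow_rate = granted / resolved
print(f"{allow_rate:.0%}")               # 73% career allow rate

interview_lift = 0.384                   # +38.4% reported interview lift
with_interview = min(allow_rate * (1 + interview_lift), 0.99)
print(f"{with_interview:.0%}")           # 99% (capped) with interview
```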
