Prosecution Insights
Last updated: April 19, 2026
Application No. 18/570,545

FUNCTION ALLOCATION CONTROL APPARATUS, FUNCTION ALLOCATION CONTROL METHOD AND PROGRAM

Final Rejection — §103, §112
Filed: Dec 14, 2023
Examiner: BINCZAK, BRANDON MICHAEL
Art Unit: 2437
Tech Center: 2400 — Computer Networks
Assignee: NTT, Inc.
OA Round: 2 (Final)

Grant Probability: 38% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 2y 11m
With Interview: 74%

Examiner Intelligence

Career Allow Rate: 38% (23 granted / 60 resolved; -19.7% vs TC avg)
Interview Lift: +36.1% (74% with an interview vs 38% without, across resolved cases with an interview)
Avg Prosecution: 2y 11m typical timeline (34 applications currently pending)
Total Applications: 94 across all art units (career history)
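The headline numbers above are simple ratios over the examiner's resolved cases. The following is a minimal sketch of that arithmetic, assuming the dashboard rounds a 23/60 career rate to 38% and computes interview lift as the with-interview allow rate minus the baseline (the function name and the back-derived TC average are illustrative, not taken from the product):

```python
# Sketch of the assumed dashboard arithmetic (illustrative only).

def allow_rate(granted, resolved):
    """Career allow rate as a percentage."""
    return 100.0 * granted / resolved

career = allow_rate(23, 60)      # ~38.3%, displayed as 38%
tc_average = career + 19.7       # back-derived from the "-19.7% vs TC avg" delta
interview_lift = 74 - 38         # displayed +36.1% reflects unrounded inputs
```

The displayed "+36.1%" is percentage points, not a relative increase; on unrounded inputs the subtraction lands at 36.1 rather than the 36 shown here.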

Statute-Specific Performance

§101: 9.0% (-31.0% vs TC avg)
§103: 54.7% (+14.7% vs TC avg)
§102: 9.9% (-30.1% vs TC avg)
§112: 26.0% (-14.0% vs TC avg)
Tech Center averages are estimates. Based on career data from 60 resolved cases.

Office Action

§103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 12/14/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Response to Arguments

Applicant's arguments, see page 10, filed 1/5/2026, with respect to the objections to the drawings have been fully considered and are persuasive. The associated objections to the drawings have been withdrawn.

Applicant's arguments, see page 10, filed 1/5/2026, with respect to the rejection of claims 2-6 and 9-20 under 35 USC 112(b) have been fully considered, but they are not persuasive. Regarding the argument: "… Herein, the claims have been amended to address the indefiniteness issues identified in the Office Action." Examiner notes that the amendments do not reconcile the issues to which the previous rejections were directed. Further details can be found in the section below titled "Claim Rejections - 35 USC § 112." These rejections are maintained.

Applicant's arguments, see pages 11 and 12, filed 1/5/2026, with respect to the rejection of claims 1-20 under 35 USC 103 have been fully considered and are persuasive. Therefore, the rejection has been withdrawn. However, upon further consideration, a new ground of rejection is made in view of ARONOWITZ et al (Doc ID US 20180034859 A1), GREWAL et al (Doc ID US 20210279311 A1), and BULYGIN et al (Doc ID US 20200074086 A1).

Claim Rejections - 35 USC § 112

The following is a quotation of the first paragraph of 35 U.S.C. 112(a):

(a) IN GENERAL.—The specification shall contain a written description of the invention, and of the manner and process of making and using it, in such full, clear, concise, and exact terms as to enable any person skilled in the art to which it pertains, or with which it is most nearly connected, to make and use the same, and shall set forth the best mode contemplated by the inventor or joint inventor of carrying out the invention.

Claims 1-20 are rejected under 35 U.S.C. 112(a) as failing to comply with the written description requirement. The claims contain subject matter which was not described in the specification in such a way as to reasonably convey to one skilled in the relevant art that the inventor or a joint inventor, at the time the application was filed, had possession of the claimed invention.

Regarding claims 1, 7, and 8:

Claim 1 recites, "… the security verification function is altered based on the trust score …" Claims 7 and 8 recite similar language. This limitation lacks sufficient written description in the original disclosure, and thus constitutes new matter. While the specification discusses changing the allocation of security verification functions in paragraph [0031], inter alia, it is silent regarding any material which could reasonably be interpreted as "altering" a security verification function. "Altering a function" cannot be interpreted as "changing an allocation of functions." This rejection can be overcome by amending the claims such that they recite only subject matter which is explicitly supported by the original disclosure.

Claim 1 also recites, "… when the each entity is an apparatus, the security verification comprises binary analysis, firmware analysis, communication verification and behavior verification …" Claims 7 and 8 recite similar language. This limitation lacks sufficient written description in the original disclosure, and thus constitutes new matter.
The specification at paragraph [0015] recites, "Next, when the entity is an apparatus, security verification for the apparatus is performed by, for example, integrity verification by a static verification method such as binary analysis or firmware analysis, or a dynamic verification method such as communication verification by network scan or software behavior verification by vulnerability scan. Security verification for the apparatus is realized by combining the static verification method and the dynamic verification method." (emphasis added). The specification is explicit that either binary analysis or firmware analysis may be combined with either communication verification or software behavior verification. The amended claim limitation recites combining all four methods to achieve the security verification. For the purposes of this examination and in the interest of compact prosecution, this limitation will be examined based on the supported subject matter in the specification.

Regarding claims 2-6 and 9-20:

These claims are dependent on one or more rejected claims, and thus inherit those rejections. This rejection could be overcome by overcoming the rejections of any claims upon which these claims depend, or by amending the claims such that they are no longer dependent on any rejected claim.

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

Claims 2-6 and 9-20 are rejected under 35 U.S.C. 112(b) as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor regards as the invention.

Regarding claims 2, 3, 5, 9-11, 13, 15-17, and 19:

The claims each recite "extracting a candidate". The claims are indefinite because it is unclear what "extracting" means in this context.
The common use of the term in the art refers to retrieval of data from a source. It is unclear from where these "candidates" are extracted, and what their extraction entails.

Regarding claims 4, 12, and 18:

Claim 4 recites, "… information indicating a resource actually used by each entity in the security verification function …". The claim goes on to recite, "… determining … a reduction amount of resources …". Claims 12 and 18 recite similar language. Where the first part of the claim (and also the independent claim from which it depends) indicates a singular resource used for security verification (which one skilled in the art may assume to be a type of credential, or a unit of authentication hardware), the later part of the claim indicates a reduction in an amount of resources. This makes the term "resources" ambiguous as to what it actually refers to. This rejection can be overcome by amending the claims such that the metes and bounds of the claims, vis-à-vis the resources, are made clear.

The claims also recite, "a reduction amount of resources to reduce the security verification functions or to reduce a verification frequency." It is unclear what action is being described. The independent claim from which these claims depend previously recites the allocation of a single "security verification function" to each "entity." It is unclear how these functions are meant to be reduced (by removing paired entities and functions, by reducing each function's complexity, etc.). It is further unclear what the "reduction amount of resources" refers to in the context of "reduce a verification frequency," as it is unclear what resources would affect the frequency of verifications through their reduction.

Regarding claims 6, 14, and 20:

Claim 6 recites, "… the resource information includes information indicating a standard resource used in each security verification function …". Claims 14 and 20 recite similar language.
Claim 1, from which these claims depend (and similarly claims 7 and 8), recites, "the resource information indicating a resource used for achieving the security verification function allocated to each entity …". It is unclear whether this "standard resource" is somehow different from the resource already identified. If so, it is unclear in what way it differs, and whether it is meant to supersede or be considered in addition to the original "resource" indicated in the "resource information."

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2, 7, 8, 10, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over ROSE et al (Doc ID US 20200293638 A1), and further in view of ARONOWITZ et al (Doc ID US 20180034859 A1), GREWAL et al (Doc ID US 20210279311 A1), and BULYGIN et al (Doc ID US 20200074086 A1).

Regarding claim 1:

ROSE teaches:

A function allocation control device for controlling a security verification system that executes a security verification function, the function allocation control device comprising a processor configured to execute operations comprising ([0003] "... a server device includes a processor device and a non-transitory computer readable medium. The non-transitory computer-readable medium includes a data store and a control access module. ... The control access module is executable by the processor device ...");

calculating a trust score, the trust score indicating a security level of each entity, in which the trust score is calculated based on verification result information, and the verification result information indicating a result of verifying security of each entity ([0040] "… the server device determines a confidence score based on the ... attributes for historical login attempts, in block 310." and [0041] "In block 312, the server device determines a level of access to an application based on the confidence score.");

allocating a security verification function to each entity on the basis of the calculated trust score …, wherein the security verification function is altered based on the trust score ([0044] "In block 314, the server device modifies the application functions and permissions based on the level of access.");

the resource information indicating a resource used for achieving the security verification function allocated to each entity ([0044] "... permissions may be based on both the confidence score and a user profile. For example, user 1 and user 2 may both have confidence score of 75 ..., user 1 may need to provide additional authentication if ... atypical behavior, whereas user 2 may not be required to go through additional authentication ..."); and

ARONOWITZ teaches the following limitations not taught by ROSE:

allocating a security verification function to each entity on the basis of the calculated trust score and resource information, wherein the security verification function is altered based on the trust score ([0054] "… select among a plurality of different challenges based on ... challenge selection factors, ... to achieve the required confidence level in the user's identity to authorize the transaction to proceed." and [0056] "Challenge selection factors ... include, for example, ... computational cost factors, network cost factors, ... user trust level factors …"). Examiner notes that while ROSE does teach allocating security functions, it does so based only on the "trust score." ARONOWITZ is provided to teach allocating functions based on both a trust level and resource cost.

the resource information indicating a resource used for achieving the security verification function allocated to each entity ([0060] "Computational cost challenge selection factors may include, for example, … battery power consumption, processing power required, memory required, et cetera.");

transmitting information of the allocated security verification function to the security verification system configured to execute the allocated security verification function ([0084] "… The server also records the one or more authentication methods in a storage device of the server ... (step 608). Then, the server sends the one or more authentication methods to the client device via a network …").

Calculating a score based on an entity attempting access, and allocating security features based on that score, are known techniques in the art, as demonstrated by ROSE. Further, allocating security functions based on a trust level and resource cost, and forwarding the allocated functions in the system for execution, are known techniques in the art, as demonstrated by ARONOWITZ. It would have been obvious to a person having ordinary skill in the art (PHOSITA) before the effective filing date of the claimed invention to modify the trust score and security function allocation of ROSE with the allocation criteria and staging of ARONOWITZ, with the motivation to select the security functions which are not only appropriate to the risk of the requesting entity, but also most efficiently executed by the system.
GREWAL teaches the following limitations not taught by the combination of ROSE and ARONOWITZ:

wherein when the each entity is a user, the security verification comprises knowledge authentication, ownership authentication, and biometric authentication ([0013] "... authenticated user 199 may be ... a user who operates device 100 with permission of an owner of device 100 .... This authentication technique may rely on, for example, a password known to authenticated user 199 ..., a biometric unique to authenticated user 199 ..., a security device held by authenticated user 199, and so forth." and [0038] "... multiple authentication techniques may be selected at action 310."); and

Limiting the verification of a user's identity to the combination of passwords, biometrics, and ownership is a known technique in the art, as demonstrated by GREWAL. It would have been obvious to a PHOSITA before the effective filing date of the claimed invention to modify the dynamic security verification function allocation of ROSE and ARONOWITZ with the user identity verification of GREWAL, with the motivation to limit the accepted identity verification to a specific combination so that all users are given a trust score based on the same criteria.

BULYGIN teaches the following limitations not taught by the combination of ROSE, ARONOWITZ, and GREWAL:

when the each entity is an apparatus, the security verification comprises binary analysis, firmware analysis, communication verification and behavior verification ([0015] "... performing a risk analysis of all monitored host devices in a targeted system, including performing an analysis of firmware versions, hardware components and configurations, known vulnerabilities in hardware and firmware of the host devices .... ... performing runtime behavior monitoring, including analyzing the behavior of the host devices, firmware, and operating systems ...").

Limiting the verification of a device to the combination of firmware analysis and software behavior monitoring is a known technique in the art, as demonstrated by BULYGIN. It would have been obvious to a PHOSITA before the effective filing date of the claimed invention to modify the dynamic security verification function allocation of ROSE, ARONOWITZ, and GREWAL with the device verification of BULYGIN, with the motivation to limit device verification to a specific combination of factors so that all devices are given a trust score based on the same criteria.

Regarding claim 2:

The combination of ROSE, ARONOWITZ, GREWAL, and BULYGIN teaches:

The function allocation control device according to claim 1, wherein the allocating further comprises:

extracting a candidate for changing the number of security verification functions to be allocated or changing a verification schedule, on the basis of the calculated trust score (ROSE [0034] "… The control access module 212 can also determine the level of authentication that should be applied to a user access attempt, where increased levels of authentication may be required during higher risk scenarios …");

determining whether or not to change allocation of the security verification function or whether or not to change the verification schedule, for the extracted entity on the basis of the resource information (ROSE [0025] "… The database server 104 can record ... a user profile for the server 102 to use in determining if any anomalous behavior occurs during further login attempts."); and

determining, based on the determining to change, a more specific content of the change (ROSE [0025] "… For example, ... the confidence score of the successful login, as determined by the server 102, may be of a lower score than what would typically be given to the user.").
Regarding claims 7, 8, 10, and 16:

These claims are rejected with the same justification, mutatis mutandis, as their counterpart claims 1 and 2 above.

Claims 3-6, 9, 11-15, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over ROSE et al (Doc ID US 20200293638 A1), ARONOWITZ et al (Doc ID US 20180034859 A1), GREWAL et al (Doc ID US 20210279311 A1), and BULYGIN et al (Doc ID US 20200074086 A1) as applied to claims 2, 10, and 16 above, and further in view of GUPTA et al (Doc ID US 20160110528 A1).

Regarding claim 3:

The combination of ROSE, ARONOWITZ, GREWAL, and BULYGIN teaches: The function allocation control device according to claim 2.

GUPTA teaches the following limitations not taught by the combination of ROSE, ARONOWITZ, GREWAL, and BULYGIN:

wherein the allocating further comprises extracting a candidate indicating high trust for changing the verification schedule either to reduce the number of security verification functions to be allocated or to reduce a verification execution frequency ([0050] "… the computing device may evaluate a reduced number of authentication factors when the user confidence value is high, and evaluate a greater number of authentication factors when the user confidence value is low …").

Evaluating whether to reduce the amount of authentication required for a trusted entity is a known technique in the art, as demonstrated by GUPTA. It would have been obvious to a PHOSITA before the effective filing date of the claimed invention to modify the trust calculation and authentication selection of ROSE, ARONOWITZ, GREWAL, and BULYGIN with the reduction of authentication requirements of GUPTA, with the motivation to streamline the authentication process. It is obvious to look to systems which base their authentication requirements on trust levels when it is desired to reduce the burden of authenticating for those entities in the system with a high amount of trust.
Regarding claim 4:

The combination of ROSE, ARONOWITZ, GREWAL, BULYGIN, and GUPTA teaches:

The function allocation control device according to claim 3, wherein the resource information includes information indicating a resource actually used by each entity in the security verification function (ROSE [0044] "... permissions may be based on both the confidence score and a user profile. For example, user 1 and user 2 may both have confidence score of 75 ..., user 1 may need to provide additional authentication if ... atypical behavior, whereas user 2 may not be required to go through additional authentication ..."), and

the allocating further comprises determining, on the basis of the resource information, a reduction amount of resources to reduce the security verification functions or to reduce a verification frequency (GUPTA [0050] "… the computing device may evaluate a reduced number of authentication factors when the user confidence value is high, and evaluate a greater number of authentication factors when the user confidence value is low …").

Evaluating whether to reduce the amount of authentication required for a trusted entity is a known technique in the art, as demonstrated by GUPTA. It would have been obvious to a PHOSITA before the effective filing date of the claimed invention to modify the trust calculation and authentication selection of ROSE, ARONOWITZ, GREWAL, BULYGIN, and GUPTA with the reduction of authentication requirements of GUPTA, with the motivation to streamline the authentication process. It is obvious to look to systems which base their authentication requirements on trust levels when it is desired to reduce the burden of authenticating for those entities in the system with a high amount of trust.
Regarding claim 5:

The combination of ROSE, ARONOWITZ, GREWAL, and BULYGIN teaches: The function allocation control device according to claim 2.

GUPTA teaches the following limitations not taught by the combination of ROSE, ARONOWITZ, GREWAL, and BULYGIN:

wherein the allocating further comprises extracting a candidate indicating low trust for increasing the number of security verification functions to be allocated ([0050] "… the computing device may evaluate a reduced number of authentication factors when the user confidence value is high, and evaluate a greater number of authentication factors when the user confidence value is low …").

Evaluating whether to increase the amount of authentication required for an untrusted entity is a known technique in the art, as demonstrated by GUPTA. It would have been obvious to a PHOSITA before the effective filing date of the claimed invention to modify the trust calculation and authentication selection of ROSE, ARONOWITZ, GREWAL, and BULYGIN with the reduction of authentication requirements of GUPTA, with the motivation to streamline the authentication process. It is obvious to look to systems which base their authentication requirements on trust levels when it is desired to reduce the burden of authenticating for those entities in the system with a high amount of trust, while simultaneously increasing the burden for those entities with a low amount of trust.
Regarding claim 6:

The combination of ROSE, ARONOWITZ, GREWAL, BULYGIN, and GUPTA teaches:

The function allocation control device according to claim 5, wherein the resource information includes information indicating a standard resource used in each security verification function (GUPTA [0038] "A computing device may implement … multiple independent operation sets for testing/evaluating different types of authentication factors …"), and

the allocating further comprises determining, on the basis of the resource information, an increase amount of resources when a security verification function is added (GUPTA [0123] "In block 510, ... determine which of the authentication factors should be evaluated and select operation sets based on the determined number of authentication factors and so that the selected operation sets evaluate two or more of a knowledge-type authentication factor, a possession-type authentication factor, and an inherence-type authentication factor.").

Utilizing a known set of authentication factors and selecting from them based on selected security functions are known techniques in the art, as demonstrated by GUPTA. It would have been obvious to a PHOSITA before the effective filing date of the claimed invention to modify the trust calculation and authentication selection of ROSE, ARONOWITZ, GREWAL, BULYGIN, and GUPTA with the reduction of authentication requirements of GUPTA, with the motivation to ensure that a standard set of factors is selected from for authenticating an entity. It is obvious to choose from a standard list so that factors can be deterministically assigned based on an input value such as a trust score.

Regarding claims 9, 11-15, and 17-20:

These claims are rejected with the same justification, mutatis mutandis, as their counterpart claims 3-6 above.

Conclusion

Applicant's amendment necessitated the new grounds of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a).
Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRANDON BINCZAK, whose telephone number is (703) 756-4528. The examiner can normally be reached M-F 0800-1700. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Alexander Lagor, can be reached at (571) 270-5143. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.
For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/BB/
Examiner, Art Unit 2437

/BENJAMIN E LANIER/
Primary Examiner, Art Unit 2437
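For orientation, the claim-1 scheme discussed in the §103 rejection (score an entity from its verification history, then pick verification functions from trust and resource cost, relaxing checks for trusted entities) can be sketched in a few lines. This is a hypothetical illustration only: the function names, score threshold, and resource costs below are invented for exposition and appear nowhere in the application or the cited references.

```python
# Hypothetical sketch of trust-score-driven allocation of security
# verification functions (illustrative only; not the claimed embodiment).

def trust_score(results):
    """Fraction of past verifications passed, scaled to 0-100.
    `results` is a list of booleans (True = verification passed)."""
    if not results:
        return 0.0
    return 100.0 * sum(results) / len(results)

# Candidate functions with illustrative resource costs (arbitrary units).
USER_FUNCTIONS = {"knowledge_auth": 1, "ownership_auth": 2, "biometric_auth": 3}
APPARATUS_FUNCTIONS = {"binary_analysis": 4, "firmware_analysis": 3,
                       "communication_verification": 2, "behavior_verification": 2}

def allocate(entity_type, score, budget):
    """Pick verification functions for one entity.

    High-trust entities get only the cheapest function (a GUPTA-style
    reduction of authentication burden); low-trust entities get every
    function that fits within the resource budget.
    """
    pool = USER_FUNCTIONS if entity_type == "user" else APPARATUS_FUNCTIONS
    ordered = sorted(pool, key=pool.get)      # cheapest first
    if score >= 80:                           # illustrative trust threshold
        return ordered[:1]
    chosen, spent = [], 0
    for fn in ordered:                        # greedily fill the budget
        if spent + pool[fn] <= budget:
            chosen.append(fn)
            spent += pool[fn]
    return chosen

score = trust_score([True, True, False, True])   # 75.0
funcs = allocate("user", score, budget=4)        # two cheapest checks fit
```

The sketch only illustrates the shape of the dispute: the application ties allocation to both the trust score and resource information, which is the combination the rejection maps onto ROSE plus ARONOWITZ.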

Prosecution Timeline

Dec 14, 2023
Application Filed
Aug 28, 2025
Non-Final Rejection — §103, §112
Nov 13, 2025
Interview Requested
Dec 05, 2025
Examiner Interview Summary
Dec 05, 2025
Applicant Interview (Telephonic)
Jan 05, 2026
Response Filed
Feb 09, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by the same examiner involving similar technology

Patent 12470534
PARTIAL POOL CREDENTIALLING AUTHENTICATION SYSTEM
Granted Nov 11, 2025 (2y 5m to grant)

Patent 12452224
IMAGE DISPLAY DEVICE AND SYSTEM, AND OPERATION METHOD FOR SAME
Granted Oct 21, 2025 (2y 5m to grant)

Patent 12425867
REGISTRATION AND SECURITY ENHANCEMENTS FOR A WTRU WITH MULTIPLE USIMS
Granted Sep 23, 2025 (2y 5m to grant)

Patent 12417283
IOT ADAPTIVE THREAT PREVENTION
Granted Sep 16, 2025 (2y 5m to grant)

Patent 12411919
Shared Assistant Profiles Verified Via Speaker Identification
Granted Sep 09, 2025 (2y 5m to grant)
Based on the examiner's 5 most recent grants.
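The "Xy Ym to grant" figures above are calendar differences between filing and grant dates, rounded down to whole months. A minimal sketch of that computation (the filing dates below are illustrative, not taken from these patents):

```python
from datetime import date

def pendency(filed, granted):
    """Calendar gap between two dates as an 'Xy Ym' string,
    counting only fully completed months."""
    months = (granted.year - filed.year) * 12 + (granted.month - filed.month)
    if granted.day < filed.day:   # current month not yet completed
        months -= 1
    return f"{months // 12}y {months % 12}m"

label = pendency(date(2023, 6, 5), date(2025, 11, 11))   # "2y 5m"
```

The same arithmetic yields the dashboard's "2y 11m" median-time-to-grant style labels for any filed/granted pair.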

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 38%
With Interview: 74% (+36.1%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate

Based on 60 resolved cases by this examiner. Grant probability derived from career allow rate.
