Prosecution Insights
Last updated: April 19, 2026
Application No. 18/720,691

TESTING OF AN ON-DEVICE MACHINE LEARNING TOOL

Non-Final OA: §102, §103, §112
Filed: Jun 17, 2024
Examiner: LEE, PHILIP C
Art Unit: 2454
Tech Center: 2400 — Computer Networks
Assignee: Koninklijke Philips N.V.
OA Round: 1 (Non-Final)
Grant Probability: 78% (Favorable)
OA Rounds: 1-2
To Grant: 2y 9m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 78% (237 granted / 306 resolved; +19.5% vs TC avg — above average)
Interview Lift: +18.7% (allow rate among resolved cases with an interview vs. without)
Avg Prosecution: 2y 9m typical timeline; 18 applications currently pending
Total Applications: 324 across all art units

Statute-Specific Performance

§101: 6.7% (-33.3% vs TC avg)
§103: 46.1% (+6.1% vs TC avg)
§102: 24.1% (-15.9% vs TC avg)
§112: 16.8% (-23.2% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 306 resolved cases

Office Action

§102 §103 §112
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Election/Restrictions

Applicant's election without traverse of Group I, claims 1-12 and 16-20, in the reply filed on 1/9/26 is acknowledged. Claims 1-12 and 16-20 have been examined. Claim 13 has been withdrawn and claims 14 and 15 have been cancelled.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 2, 6, 17 and 19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor, or for pre-AIA the applicant, regards as the invention.

The following terms lack proper antecedent basis: "the input information" (claims 6 and 19).

Claim language in the following claims is not clearly understood: in claim 2, lines 3 and 5, it is unclear what "it" is referring to; likewise in claim 17, line 3, it is unclear what "it" is referring to.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-5, 7-9, 11-12, 16, 17 and 19 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Kovacs et al., WO 2022/186817 (hereinafter Kovacs).

As per claim 1, Kovacs teaches the invention as claimed, comprising: a training input generator circuit ([31]-[36]), wherein the training input generator circuit is arranged to design training input information ([31]-[36][133], e.g., designing training input information (e.g., models, configurations, input test signals, reference outputs, test definition, model weights)), wherein the training input information is arranged to provide an expected output of a device in response to test input information ([31]-[36], e.g., the training input information provides a corresponding reference output of a device in response to test input); a test circuit ([36]), wherein the test circuit is arranged to apply the training input information and the test input information to the device ([36], e.g., running a test sequence with test input), wherein the test circuit is arranged to obtain an output information ([36], e.g., obtaining output), wherein the output information is generated by the device in response to the test input information ([36], e.g., output is generated in response to the test input); and a model evaluator circuit, wherein the model evaluator circuit is arranged to compare the output information with the expected output so as to evaluate an on-device machine learning model ([36][56], e.g., comparing the output of the ML model with the reference output).

As per claim 2, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the test circuit is arranged to apply the training input information by transmitting it to the device ([31][41][55][56][69][81], e.g., transmitting the training input information to the UE), wherein the test circuit is arranged to obtain the output information by receiving it via a transceiver circuit ([52][69][81], e.g., receiving output from the UE).

As per claim 3, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the apparatus is arranged to deploy a policy, wherein the policy is triggered by at least one predetermined condition, wherein the predetermined condition determines which devices require testing ([3][4], e.g., testing is triggered by advertised/supported ML assistance capabilities, which determine that the UE requires testing).

As per claim 4, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches a test input design system ([31]-[36]), wherein the test input design system is arranged to design hardware-level inputs ([31]-[36], e.g., designing input received via hardware (e.g., a transceiver)), wherein the hardware-level inputs trigger certain states of the on-device machine learning model so as to cause the device to produce an output as the output information ([36]).

As per claim 5, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the test input information corresponds to at least a portion of the training input information ([31]-[36]), wherein the model evaluator circuit is arranged to evaluate an accuracy of a response of the on-board machine learning model ([56], e.g., evaluating whether an output is correct by validating against the reference output).

As per claim 7, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the apparatus is arranged to render the on-device machine learning model susceptible to testing by applying a model pre-training using a mixed data vocabulary ([30], e.g., applying ML model pre-training using model parameters (e.g., initialization, hyperparameters), configuration, input test signal, reference output, test definition, model weight), wherein the model pre-training comprises network-accessible parameters mixed with true training data for an intended function of the on-device machine learning model ([31]-[36], e.g., the ML model pre-training comprises parameters mixed with configuration, input test signal, reference output, test definition, model weight).

As per claim 8, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches comprising a radio frequency control algorithm, wherein the radio frequency control algorithm is arranged to control the on-device machine learning model by using a test transceiver as networking hardware of the test circuit ([31]-[36][69][81], e.g., transmitting and receiving testing radio frequency input via a hardware RF transceiver), wherein the radio frequency control algorithm is arranged to alter at least one transmission characteristic of transmissions of the test transceiver ([31]-[36][69][81], e.g., altering RF transmission/reception of the transceiver).

As per claim 9, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches comprising: a test timing algorithm, wherein the test timing algorithm is arranged to determine when the device requires testing ([3][4], e.g., the advertised/supported ML assistance capabilities determine that the UE requires testing); and a status database, wherein the status database is arranged to store results of model tests and policies regarding actions to be taken for failed tests ([31][36][53], e.g., storing feedback of the test and policies regarding actions (e.g., declare fail, timeout, or invalid) to be taken for failed tests).

As per claim 11, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the apparatus is arranged to distribute the test input information or a command triggering local testing in a unicast, multicast, or broadcast channel (fig. 5A; [36][83]).

As per claim 12, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the training input information is designed based on input information ([31]-[36]), wherein the input information is derived from known usage of the on-device machine learning model ([33]), wherein the input information is based on a type of the on-device machine learning model ([30]).

As per claim 16, Kovacs teaches the invention as claimed, comprising: designing training input information based on input information and a type of the on-device machine learning model ([30]-[36], e.g., designing training input information based on input information (e.g., models, configurations, input test signals, reference outputs, test definition, model weights) and a type of ML model (e.g., a full model for an inference-only capability, or a model for ML training and inference capabilities)), wherein the input information is derived from known usage of an on-device machine learning model ([33]), wherein the input information is arranged to provide an expected output of the device generated in response to test input information ([31]-[36], e.g., the training input information provides a corresponding reference output of a device in response to test input); applying the input information and the test input information to the device ([36], e.g., running a test sequence with test input); obtaining an output information, wherein the output information is generated by the device in response to the test input information ([36], e.g., obtaining output generated in response to the test input); and comparing the obtained output information with the expected output so as to evaluate the on-device machine learning model ([36][56], e.g., comparing the output of the ML model with the reference output).

As per claim 17, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the test circuit is arranged to apply the training input information by transmitting it to the device ([31][41][55][56][69][81], e.g., transmitting the training input information to the UE), wherein the test circuit is arranged to apply the training input information by interfacing a hardware sensing unit on the device ([31]-[36][69][81], e.g., applying training input information (e.g., models, configurations, input test signals, reference outputs, test definition, model weights) by interfacing an antenna), wherein the test circuit is arranged to obtain the output information by analyzing an output of the hardware sensing unit on the device ([52][69][81], e.g., receiving output from the UE by analyzing an output via an antenna of the device).

As per claim 19, Kovacs teaches the invention as claimed in claim 1 above. Kovacs further teaches wherein the training input generator circuit is arranged to design modifications to a portion of known model inputs ([33][34]), wherein the portion of known model inputs uses the input information as the test input information to test by checking whether the same outputs are obtained as during original training, within a given range ([36][38]).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Kovacs in view of Official Notice.

As per claim 6, Kovacs teaches the invention as claimed in claim 1 above. Although Kovacs teaches comprising an external input apparatus, wherein the external input apparatus is arranged to store the test input information, the training input information and the input information ([31]-[36][39], e.g., the external input apparatus is arranged to store models, configurations, input test signals, reference outputs, test definition, and model weights to be downloaded/provided to the UE), wherein the test input information is associated with expected responses of at least one network device ([31]-[36], e.g., the corresponding reference output of a device in response to test input), Kovacs is silent in regard to the external input apparatus being a database. Official Notice is taken that the concept of a database is well known and accepted in the art. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to include a database, because doing so would allow data to be remotely accessed and retrieved by connected network devices.

Claims 10 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Kovacs in view of Carvalho et al., U.S. Patent Application Publication 2019/0318099 (hereinafter Carvalho).

As per claim 10, Kovacs teaches the invention as claimed in claim 1 above. Kovacs is silent in regard to the existence of one or more backdoor activations. Carvalho teaches wherein the training input generator circuit is arranged to design modifications to a portion of known model inputs ([27][31]), wherein the portion of known model inputs causes the existence of one or more backdoor activations in the on-device machine learning model ([19][21][29]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Carvalho's teaching into Kovacs's system in order to detect whether an ML model in Kovacs's system has a backdoor security vulnerability and take appropriate actions, thus improving the security of the ML model in Kovacs's system ([26]).

As per claim 18, Kovacs teaches the invention as claimed in claim 1 above. Kovacs is silent in regard to data tagging. Carvalho teaches wherein the training input generator circuit is arranged to design modifications to a portion of known model inputs ([52]), wherein the portion of known model inputs applies data tagging by adding tagged data to a training set to allow for statistical identification of training input information ([32][52][56][57]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Carvalho's teaching into Kovacs's system in order to detect whether an ML model in Kovacs's system has a backdoor security vulnerability and take appropriate actions, thus improving the security of the ML model in Kovacs's system ([26]).

Claim 20 is rejected under 35 U.S.C. 103 as being unpatentable over Kovacs in view of Sinn et al., U.S. Patent Application Publication 2021/0312336 (hereinafter Sinn).

As per claim 20, Kovacs teaches the invention as claimed in claim 1 above. Although Kovacs teaches wherein the training input generator circuit is arranged to design modifications to a portion of known model inputs ([33][34]), wherein the portion of known model inputs is for training the on-device machine learning model ([33][34][36][38]), Kovacs is silent in regard to federated learning. Sinn teaches applying federated learning for training a machine learning model ([16][17][75]). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate Sinn's teaching into Kovacs's system in order to use federated learning in Kovacs's system to provide optimized machine learning model features ([15]).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Philip Lee, whose telephone number is (571) 272-3967. The examiner can normally be reached 6a-3p M-F. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Glenton Burgess, can be reached at 571-272-3949. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/PHILIP C LEE/
Primary Examiner, Art Unit 2454

Prosecution Timeline

Jun 17, 2024
Application Filed
Feb 19, 2026
Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603820
SYSTEM AND METHOD FOR CELLULAR NETWORK PREDICTION MODEL ANALYSIS
2y 5m to grant · Granted Apr 14, 2026
Patent 12596794
SYSTEMS AND METHODS FOR ADAPTIVE ACTION WITH DISTRIBUTED ENFORCEMENT POINTS
2y 5m to grant · Granted Apr 07, 2026
Patent 12598243
Service Request and Response Handling
2y 5m to grant · Granted Apr 07, 2026
Patent 12580971
ASSIGNING AGENTS TO COMMUNICATION SESSIONS BASED ON LANGUAGE PREFERENCES IN MOBILE APPLICATIONS
2y 5m to grant · Granted Mar 17, 2026
Patent 12580825
APPARATUS, METHOD, AND COMPUTER PROGRAM
2y 5m to grant · Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 78%
With Interview: 96% (+18.7%)
Median Time to Grant: 2y 9m
PTA Risk: Low
Based on 306 resolved cases by this examiner. Grant probability derived from career allow rate.
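
The headline projections are consistent with simple arithmetic on the career data shown above. The sketch below reproduces them under an assumption the tool does not state: that grant probability equals the raw career allow rate and that the with-interview figure is the base rate plus the +18.7% lift. The displayed 78% suggests the tool rounds or applies additional weighting to the raw 237/306 ≈ 77.5%.

```python
# Sketch: reproducing the dashboard's projections from the examiner's
# career data. Assumption (not stated by the tool): grant probability
# = career allow rate, and with-interview = base rate + interview lift.
granted, resolved = 237, 306
interview_lift = 0.187  # +18.7% allow-rate lift with interview

allow_rate = granted / resolved               # raw career allow rate
with_interview = allow_rate + interview_lift  # simple additive model

print(f"Career allow rate: {allow_rate:.1%}")      # ~77.5% (shown as 78%)
print(f"With interview:    {with_interview:.1%}")  # ~96.2% (shown as 96%)
```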
