Prosecution Insights
Last updated: April 19, 2026
Application No. 18/784,799

LEVERAGING PARTIALLY OBSERVABLE INFRASTRUCTURE FOR DATASET BUILDING

Non-Final OA (§103, §112)
Filed: Jul 25, 2024
Examiner: WILLOUGHBY, ALICIA M
Art Unit: 2156
Tech Center: 2100 (Computer Architecture & Software)
Assignee: DELL PRODUCTS, L.P.
OA Round: 1 (Non-Final)
Grant Probability: 53% (Moderate)
Expected OA Rounds: 1-2
Time to Grant: 3y 10m
With Interview: 79%

Examiner Intelligence

Grants 53% of resolved cases.

Career Allow Rate: 53% (257 granted / 481 resolved; -1.6% vs TC avg)
Interview Lift: +25.8% (strong lift in resolved cases with an interview)
Avg Prosecution: 3y 10m (typical timeline; 31 applications currently pending)
Total Applications: 512 (across all art units)
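The headline numbers above are simple arithmetic on the stated counts and can be checked directly. A minimal sketch (figures taken from this page; the helper name is illustrative):

```python
# Reproduce the examiner's headline metrics from the counts stated above.
# 257 granted / 481 resolved and the +25.8-point interview lift come from
# the page; the helper name is illustrative.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100 * granted / resolved

rate = allow_rate(257, 481)
print(f"Career allow rate: {rate:.1f}%")         # ~53.4%, displayed as 53%

with_interview = rate + 25.8                     # stated interview lift
print(f"With interview: {with_interview:.0f}%")  # ~79%
```

The 79% "With Interview" figure is simply the career allow rate plus the stated lift, rounded.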

Statute-Specific Performance

§101: 17.0% (-23.0% vs TC avg)
§103: 47.1% (+7.1% vs TC avg)
§102: 14.8% (-25.2% vs TC avg)
§112: 13.9% (-26.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 481 resolved cases.
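Each statute's delta implies a Tech Center average (avg = rate - delta), and checking that the implied average is consistent across rows is a quick sanity test on the table. A small sketch using the figures above:

```python
# Back out the implied Tech Center average from each statute's allow
# rate and its stated delta (avg = rate - delta). Figures from the table.
stats = {
    "§101": (17.0, -23.0),
    "§103": (47.1, +7.1),
    "§102": (14.8, -25.2),
    "§112": (13.9, -26.1),
}
for statute, (rate, delta) in stats.items():
    print(f"{statute}: examiner {rate}% vs implied TC avg {rate - delta:.1f}%")
# Every row implies the same TC average: 40.0%.
```

That all four rows back out to the same 40.0% average suggests the deltas were computed against a single Tech Center baseline.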

Office Action

Rejections: §103, §112
DETAILED ACTION

This non-final rejection is responsive to the communication filed July 25, 2024. Claims 1-20 are pending in this application.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on July 25, 2024 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 9 and 19 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 9 and 19 recite "receiving m node features from the edge nodes in the set of edge nodes, and m<<M, where M is a total number of nodes in the network." However, it is unclear what is meant by "m<<M". For example, it is unclear whether the symbol "<<" means less than or something else. Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-20 are rejected under 35 U.S.C. 103 as being unpatentable over Fu et al. (US 2024/0273364 A1) (‘Fu’) in view of Murai et al. (“Selective Harvesting over Networks” – from IDS) (‘Murai’).

With respect to claims 1 and 11, Fu teaches a method and a non-transitory storage medium having stored therein instructions that are executable by one or more hardware processors to perform operations comprising: receiving respective sets of node features (i.e., application scenario features) from each edge node in a set of edge nodes of a network (i.e., edge devices in a system) (paragraphs 71-72); identifying those edge nodes in the set of edge nodes that contain datapoints corresponding to a specified class of edge nodes (i.e., edge devices in a class group) (paragraphs 9 and 78-80); using the datapoints to train a model (i.e., training a neural network) (paragraphs 8, 84, 97, and 106); and after the model has been trained, applying the model to the network (i.e., deploying the trained neural network; also using K adjacent edge devices for iterative training) (paragraphs 115-116 and 120).
Fu does not explicitly teach training and applying a selective harvesting (SH) model, wherein the applying is constrained by a budget of k queries; collecting datapoints from edge nodes in the specified class that were identified by the applying of the SH model to the network; when a threshold number of the edge nodes in the specified class has been identified by application of the SH model to the network, collecting respective data points and features from each of those edge nodes of the specified class; and building a final dataset that comprises the edge nodes of the specified class, and their associated data points and features, that were identified by application of the SH model to the network.

Murai teaches training (page 2, section 1, paragraphs 3-4; page 5, section 3; page 7, section 3.2, paragraph 1) and applying a selective harvesting (SH) model, wherein the applying is constrained by a budget of k queries (pages 8-9, section 4; page 14, Algorithm 1 along with the explanation below it; page 16, section 6.2, paragraphs 1-3); collecting datapoints from nodes (labels, attributes, and connections of nodes) in the specified class (i.e., target node type) that were identified by the applying of the SH model to the network (abstract; page 2, section 1, paragraphs 2 and 4; page 5, paragraphs 1-4); when a threshold number of the nodes in the specified class has been identified by application of the SH model to the network, collecting respective data points and features from each of those nodes of the specified class (page 14, Algorithm 1 along with the explanation below it; page 16, section 6.2, paragraphs 1-3; page 17); and building a final dataset that comprises the nodes of the specified class, and their associated data points and features, that were identified by application of the SH model to the network (targets found, which include node attributes) (page 2, section 1, paragraph 2; page 5, Table 1 and paragraphs 1 and 4; pages 16-17, section 6.2).
It would have been obvious to a person having ordinary skill in the art prior to the effective filing date of the invention to have modified the model of Fu to be a selective harvesting model as taught by Murai, to enable discovery of the largest number of target (edge) nodes given a fixed budget and a network that is not fully observed (Murai, abstract and conclusion), thereby improving the incremental training set of edge device data in Fu.

With respect to claims 2 and 12, Fu in view of Murai teaches wherein, when the threshold number of the edge nodes of the specified class has not been reached, retraining the SH model with new data, and applying the retrained SH model to the network until the threshold number of edge nodes of the specified class has been reached (Murai, page 14, Algorithm 1 along with the explanation below it; page 16, section 6.2, paragraphs 1-3; page 17).

With respect to claims 3 and 13, Fu in view of Murai teaches wherein the budget of k queries specifies a number of times that the network will be queried to identify edge nodes in the specified class (Fu teaches edge nodes in the specified class, paragraphs 9 and 78-80) (Murai, abstract; page 2, section 1, paragraphs 1-2; page 8, section 4, paragraph 1; page 14, Algorithm 1).

With respect to claims 4 and 14, Fu in view of Murai teaches wherein applying the SH model to the network comprises applying the SH model to less than the entire network (Murai, abstract; page 2, section 1, paragraphs 2-3; page 23, section 9).

With respect to claims 5 and 15, Fu in view of Murai teaches wherein, for purposes of applying the SH model to the network, the network is modeled as a partially observed graph (Murai, abstract; page 2, section 1, paragraphs 2-3; page 23, section 9).

With respect to claims 6 and 16, Fu in view of Murai teaches wherein the edge nodes in the final dataset all share a common domain (i.e., nodes of a target class) (Murai, page 21, section 7, paragraph 1).
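The rejection maps the claims onto a budget-constrained selective harvesting loop: query a partially observed network one node at a time, spend at most k queries, and stop early once a threshold number of target-class nodes has been found. A minimal Python sketch of that general loop, under assumed toy data and naming, with a random pick standing in for the trained SH model (this is not the claimed method or Murai's D3TS):

```python
import random

def selective_harvest(graph, seed, k, is_target, threshold):
    """Greedy harvesting on a partially observed graph.

    Spends at most k queries; each query reveals one border node's class
    and its neighbors. Stops early once `threshold` target-class nodes
    have been found. A random pick stands in for the trained SH model.
    """
    observed = {seed}              # nodes already queried
    border = set(graph[seed])      # seen as neighbors, not yet queried
    found = []                     # target-class nodes discovered so far
    for _ in range(k):             # budget of k queries
        if not border or len(found) >= threshold:
            break
        node = random.choice(sorted(border))   # model's pick (placeholder)
        border.remove(node)
        observed.add(node)
        if is_target(node):
            found.append(node)
        border |= set(graph[node]) - observed  # partial graph grows
    return found

# Toy adjacency list; names starting with "t" are the target class.
g = {"s": ["t1", "a"], "t1": ["t2", "a"], "a": ["b"], "t2": ["b"], "b": []}
print(selective_harvest(g, "s", k=4, is_target=lambda n: n.startswith("t"),
                        threshold=2))          # → ['t1', 't2']
```

The loop only ever sees the border of what it has queried, which is the "partially observed graph" framing the rejection cites; the budget and threshold checks correspond to the k-query constraint and the early-stop condition in the claims.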
With respect to claims 7 and 17, Fu in view of Murai teaches wherein the edge nodes in the final dataset are discovered without requiring application of the SH model to the entire network (Murai, abstract; page 2, section 1, paragraphs 2-3; page 23, section 9).

With respect to claims 8 and 18, Fu in view of Murai teaches wherein the SH model comprises a D3TS algorithm (Murai, abstract; pages 12-14, sections 5, 5.1, and 5.2; page 16, section 6.2).

With respect to claims 9 and 19, Fu in view of Murai teaches wherein receiving respective sets of node features comprises receiving m node features from the edge nodes in the set of edge nodes, and m<<M, where M is a total number of nodes in the network (Fu, paragraphs 71-72; Murai, page 2, section 1, paragraphs 2-3; page 5, paragraph 4; page 15, Tables 3 and 4; page 16, paragraph 2).

With respect to claims 10 and 20, Fu in view of Murai teaches wherein the respective sets of node features each comprise one or more representative datapoints collected by the node from which the set of node features was received (Fu, paragraphs 69 and 71-72; Murai, page 2, section 1, paragraphs 2-3; page 5, paragraph 4; page 15, Tables 3 and 4; page 16, paragraph 2).

Contact Information

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ALICIA M WILLOUGHBY, whose telephone number is (571) 272-5599. The examiner can normally be reached 9-5:30 EST, M-F.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Ajay Bhatia, can be reached at 571-272-3906. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ALICIA M WILLOUGHBY/
Primary Examiner, Art Unit 2156

Prosecution Timeline

Jul 25, 2024
Application Filed
Jan 23, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12572525: SYSTEMS AND METHODS FOR ENHANCED CLOUD-BASED RULES CONFLICT CHECKING WITH DATA VALIDATION (granted Mar 10, 2026; 2y 5m to grant)
Patent 12566752: MATCHING AND MERGING USING METADATA CONFIGURATION BASED ON AN N-LAYER MODEL (granted Mar 03, 2026; 2y 5m to grant)
Patent 12530340: Query Processor (granted Jan 20, 2026; 2y 5m to grant)
Patent 12511181: RECOMMENDATION SYSTEM, CONFIGURATION METHOD THEREFOR, AND RECOMMENDATION METHOD (granted Dec 30, 2025; 2y 5m to grant)
Patent 12505082: METHOD OF PROCESSING DATA IN A DATABASE (granted Dec 23, 2025; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 53%
With Interview: 79% (+25.8%)
Median Time to Grant: 3y 10m
PTA Risk: Low
Based on 481 resolved cases by this examiner. Grant probability derived from career allow rate.
