Prosecution Insights
Last updated: April 19, 2026
Application No. 18/755,967

SYSTEM AND METHOD FOR MANAGEMENT OF POST DATA MIGRATION

Non-Final OA — §103, §112
Filed: Jun 27, 2024
Examiner: LE, JESSICA N
Art Unit: 2169
Tech Center: 2100 — Computer Architecture & Software
Assignee: Onix Networking Corp.
OA Round: 1 (Non-Final)
Grant Probability: 73% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 3y 11m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 73% — above average (366 granted / 504 resolved; +17.6% vs TC avg)
Interview Lift: +28.6% — strong, measured across resolved cases with an interview
Typical Timeline: 3y 11m average prosecution; 21 applications currently pending
Career History: 525 total applications across all art units

Statute-Specific Performance

§101: 18.0% (-22.0% vs TC avg)
§103: 48.8% (+8.8% vs TC avg)
§102: 12.8% (-27.2% vs TC avg)
§112: 12.2% (-27.8% vs TC avg)
Black line = Tech Center average estimate. Based on career data from 504 resolved cases.
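As a consistency check on the figures above (an illustrative sketch, not part of the dashboard's actual pipeline): subtracting each statute's "vs TC avg" delta from its allowance rate should recover the Tech Center baseline, and all four statutes imply the same estimate.

```python
# Illustrative check: statute allowance rate minus its "vs TC avg" delta
# should recover the Tech Center baseline estimate (the black line).
rates = {
    "101": (18.0, -22.0),
    "103": (48.8, +8.8),
    "102": (12.8, -27.2),
    "112": (12.2, -27.8),
}
baselines = {s: round(rate - delta, 1) for s, (rate, delta) in rates.items()}
print(baselines)  # every statute implies the same 40.0% baseline
```

All four deltas are consistent with a single ~40% Tech Center baseline, which suggests the chart compares each statute against one overall TC average rather than per-statute averages.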

Office Action

DETAILED ACTION

This communication is responsive to the instant application filed on 06/27/2024. Claims 1 and 19-20 are independent claims. Claims 1-20 are pending in this application.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on 06/27/2024 has been considered and recorded. The submission is in compliance with the provisions of 37 CFR §1.97. See form PTO-1449, signed and attached hereto.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites the limitations "the chatbot sub module" in line 15, "the documentation and reporting submodule" in line 18, "the improvement sub module" in line 22, and "the optimization sub module" in line 30. There is insufficient antecedent basis for these limitations in the claim.

Claim 1 further recites the phrase "such as" in line 32, which renders the claim indefinite because it is unclear whether the limitations following the phrase are part of the claimed invention. See MPEP § 2173.05(d).
A similar rejection also applies to claim 19 at line 19 (e.g., "such as") and claim 20 at line 21 (e.g., "such as").

Claim 2 recites the limitation "the error analysis sub module" in line 3. There is insufficient antecedent basis for this limitation in the claim.

Claims 3-18 are also rejected due to their dependency on claim 1.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-3, 5-11, and 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over Denyer et al., US Patent No. 10,896,160 B2 (hereinafter "Denyer"), in view of Bedadala et al., US Pub. No. 2021/0133234 A1 (hereinafter "Bedadala"), and further in view of Sharma et al., US Pub. No. 2023/0401138 A1 (hereinafter "Sharma").

Regarding claim 1, Denyer teaches: a computer-implemented system for management of post data migration (Col. 21, lines 22-26: e.g., "The migration planning API 38 may check whether the computing node Sn is in backups, monitoring, and/or connecting with a similar amount of system in its communication profile before and post migration.") comprising: a hardware processor (Col. 9, lines 32-38: e.g., "processor"); and a memory coupled to the hardware processor (Col. 9, lines 32-38: e.g., "memory, diskspace, processor"), wherein the memory comprises a set of program instructions in the form of a processing subsystem, configured to be executed by the hardware processor, wherein the processing subsystem is hosted on a server and configured to execute on a network to control bidirectional communications among a plurality of modules (Fig. 1; and again in Col. 9, lines 32-38; Col. 27, lines 49-54), comprising: a data acquisition module configured to receive a plurality of inputs as a result of post data migration (Figs. 4-8 show the interface (API) as the data acquisition module configured to receive the plurality of inputs; and Col. 7, lines 33-36: "The GUI enables inputting of credentials into the collector node 30 to validate the computing nodes Sn, as identified at step 206. In a manual version, an end-user inputs the credentials through the GUI"), wherein the plurality of inputs comprises infrastructure inventory, application logs, application configuration, database inventory, database logs, security and compliance report and backup configuration report (Fig. 1 shows the database logs at element 36 and application logs at element 53; Figs. 4-8 show at least the configuration report; Col. 20, lines 30-39: e.g., "logging incidents, …"; Col. 24, lines 7-15: "For infrastructure services, the filter applies to backup servers, domain controller traffic, anti-virus servers, inventory management application, security information and event management servers, threat detection software, and the like.
For Link Local IP Addresses, certain servers have a 169.254 Address and these connections need to be filtered out as they will extraneous data points. For self-traffic, some processes establish connections to themselves, which should be filtered out."; and further in Col. 29, lines 35-63).

Denyer does not explicitly teach: "a chatbot module operatively coupled to the data acquisition module wherein the chatbot sub module is configured to provide on-demand support to a user during and post the data migration; a documentation and reporting module operatively coupled to the chatbot module wherein the documentation and reporting submodule is configured to generate a document and report of the data migration using an artificial intelligence model; an improvement module operatively coupled to the documentation and reporting module wherein the improvement sub module is configured to perform a continuous improvement loop after each stage of the data migration to analyze the efficiency of the data migration and provide corrective actions; a fine-tuning module operatively coupled to the improvement module wherein the fine-tuning module is configured to fine-tune the process for subsequent migration waves; and an optimization module operatively coupled to the fine-tuning module wherein the optimization sub module is configured to constantly analyze a cloud platform environment to identify areas for improvement, such as right-sizing instances, optimizing storage, and adjusting configurations based on usage patterns thereby managing post data migration."

In the same field of endeavor (i.e., data processing), Bedadala teaches: a chatbot module operatively coupled to the data acquisition module, wherein the chatbot sub module is configured to provide on-demand support to a user during and post the data migration (Figs. 6-7 show the chatbot module, via a graphical user interface as an application program interface (API), operatively coupled to the data acquisition module/software program; and pars. [0029] and [0240] via "initiated on demand", and [0265]); a documentation and reporting module operatively coupled to the chatbot module, wherein the documentation and reporting submodule is configured to generate a document and report of the data migration using an artificial intelligence model (par. [0030]: "generate a status report for currently pending or recently completed jobs for the database management system"; pars. [0022-23] teach an artificial intelligence (AI) model, and [0304-306]; further, pars. [0099-113] for reporting and reports/documents, and [0116], e.g., "management database 146 may comprise data needed to kick off secondary copy operations (e.g., storage policies, schedule policies, etc.), status and reporting information about completed jobs (e.g., status and error reports on yesterday's backup jobs), and additional information sufficient to enable restore and disaster recovery operations (e.g., media agent associations, location indexing, content indexing, etc.).", and [0121]: "The user interfaces described herein may provide improved human-computer interactions, allowing for significant cognitive and ergonomic efficiencies and advantages over previous systems, including reduced mental workloads, improved decision-making, and the like. User interface 158 may operate in a single integrated view or console (not shown).
The console may support a reporting capability for generating a variety of reports, which may be tailored to a particular aspect of information management", and [0201 and 283-286]); and an improvement module operatively coupled to the documentation and reporting module, wherein the improvement sub module is configured to perform a continuous improvement loop after each stage of the data migration to analyze the efficiency of the data migration and provide corrective actions (par. [0083] teaches analyzing the efficiency of the data/metadata migration; par. [0147]: "These operations can generally include (i) data movement operations, (ii) processing and data manipulation operations, and (iii) analysis, reporting, and management operations"; pars. [0190-193] further teach the data analysis and improvement after each stage; and par. [0304], e.g., "Supervised machine learning algorithms analyze a 'known' training dataset, and then produces an inferred function to predict output values. The learning algorithm can also compare its output with the correct, intended output and find errors to modify the model accordingly").

Accordingly, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the instant application to combine the teachings of the cited references, because the teachings of Bedadala would have provided Denyer with the above-indicated limitations, motivating a skilled artisan to generate a document and report of the data migration using an artificial intelligence model for improving and analyzing the data migration and providing corrective actions (Bedadala: Figs. 4-7; and pars. [0026, 131, and 306]).
Denyer and Bedadala do not explicitly teach: "a fine-tuning module operatively coupled to the improvement module wherein the fine-tuning module is configured to fine-tune the process for subsequent migration waves; and an optimization module operatively coupled to the fine-tuning module wherein the optimization sub module is configured to constantly analyze a cloud platform environment to identify areas for improvement, such as right-sizing instances, optimizing storage, and adjusting configurations based on usage patterns thereby managing post data migration."

In the same field of endeavor (i.e., data processing), Sharma teaches: a fine-tuning module operatively coupled to the improvement module, wherein the fine-tuning module is configured to fine-tune the process for subsequent migration waves (par. [0004], e.g., "Workload migration in case of datacenter consolidation and evacuation is a cumbersome process that involves multiple stages, for example, viz., identifying candidate VMs, putting together subset of these VMs into one or more groups based on some business criteria and eventually scheduling this wave of migrations in a way that VM groups are migrated to the target in a certain order", teaches the wave of migration involving multiple stages, interpreted as the fine-tuned process(es); par. [0065] for "a migration wave"; and pars. [0113-114] for teaching the fine-tuned process); and an optimization module operatively coupled to the fine-tuning module (par. [0004], as explained above for the fine-tuning process, and par. [0081] via an optimal module, interpreted as the optimization module), wherein the optimization sub module is configured to constantly analyze a cloud platform environment to identify areas for improvement (par. [0038] via "Each public cloud computing environment includes an infrastructure platform 138…"), such as right-sizing instances, optimizing storage, and adjusting configurations based on usage patterns thereby managing post data migration (par. [0005], e.g., "the size of the VMs", and par. [0058]; pars. [0053], [0079], and [0084] teach changing/adjusting the configurations, which helps the migration based on usage patterns).

Accordingly, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the instant application to combine the teachings of the cited references, because the teachings of Sharma would have provided Denyer and Bedadala with the above-indicated limitations, motivating a skilled artisan to perform the wave of migration in the cloud computing platform environment to identify areas of VMs using the fine-tuned process(es) (Sharma: pars. [0042-45]).

Regarding claim 2, Denyer and Bedadala, in combination, teach: an error analysis module (Denyer: par. [0131], "any type of error correcting code") operatively coupled to the documentation and reporting module (Denyer: pars. [0203-204]), wherein the error analysis sub module is configured to: analyze errors that occurred during the data migration (Denyer: pars. [0131-132], e.g., "The migration processing API 34 may process the data 44 by error checking the data 44. For instance, the migration processing API 34 may utilize an error-checking algorithm to verify the data 44 received by the migration processing API 34 has not been damaged, corrupted, or otherwise subject to unwanted errors…"; and Bedadala: e.g., error reports, par. [0116]; par. [0196], e.g., "performing error tracking", and par. [0304]); and provide automated remediation plans based on the errors (Denyer: Fig. 2B, "Migration Planning" section, and pars. [0175 and 200]; and Bedadala: par. [0202] teaches the remediation plans).

Regarding claim 3, Denyer, Bedadala, and Sharma, in combination, teach: wherein the document and report of the migration (Denyer: pars. [0203-204]) comprises timelines, resource configurations and performance metrics (Denyer: pars. [0108-110] teach "uploads time" and "a specified period of time", which imply timelines, and [0112 and 152] as well for timelines; [0029] via "infrastructure", virtual resources, and technical resources associated with the enterprise, and [0186]; and [0036] via "migration planning is performed", interpreted as performance metrics, and [0054-57]; Bedadala: pars. [0078, 83-84] for timelines, and [0197-198, and 212] teach resource usage and system resource configuration(s); and Sharma: pars. [0044 and 84] teach timelines, and Abstract, e.g., "migration metrics" = performance metrics; see further par. [0071]).

Regarding claim 5, Denyer and Bedadala, in combination, teach: an on-demand chat interface configured to display the said document and report (Denyer: Figs. 4-8 via GUIs; and Bedadala: Figs. 7-10 for the on-demand chat interface displaying the document/report).

Regarding claim 6, Bedadala teaches: wherein the chatbot module (Figs. 7-10 for the chatbot module) is configured to constantly provide answers to queries and documentation and assist the user with a plurality of tasks (Abstract, e.g., "The data agent is trained with a corpus of technical documents and rules to determine the intent or keywords for answering the query", and par. [0023]).

Regarding claim 7, Bedadala teaches: wherein the document and report are stored in an integrated database (pars. [0032-33], e.g., "In other implementations, the disclosed technology can integrate a voice-recognition device with database management software." and "Detailed descriptions and examples of systems and methods according to one or more illustrative embodiments of the disclosed technology can be found in the section titled Natural Language Integrated with Database Management, and also in FIGS. 3-6 herein…", and [0201]).

Regarding claim 8, Denyer and Sharma, in combination, teach: a verification module operatively coupled to the optimization module, wherein the verification module is configured to test one or more migrated components based on functionality (Denyer: pars. [0131-132] for the verification algorithm module/software, and checking = testing; and Sharma: par. [0113]: "In order to fine tune the hyperparameters for the algorithm and test the migration prediction on real-world data, migrations were run with different sets of VMs under different scenarios, to bring variation in data. Also, dumps were used in a privacy focused manner to collect data. On these data, different machine learning algorithms, such as decision tree, linear regression, random forest, were tested. Based on these tests, random forest was found to be best performing with default parameters. The parameters were later fine-tuned using the collected data").

Regarding claim 9, Denyer teaches: wherein the verification module is configured to verify success or failure of the data migration (Denyer: see details in pars. [0131-132]).

Regarding claim 10, Denyer teaches: wherein the optimization module is configured to constantly review the data migration at pre-determined time intervals (Denyer: par. [0076] teaches the pre-determined time intervals, e.g., "different jobs (Sn Data Collection) running at different intervals. For example, dependency data 44b may run at an interval of every 5 minutes, but operating system data collection is every 6 hours.
Dependency data 44b and Metadata 44a may both be JSON files—all data collected from Sn is serialized into JSON and processed by the Migration Processing API as such.").

Regarding claim 11, Denyer and Bedadala, in combination, teach: wherein the optimization module is configured to make one or more changes based on the review (Denyer: Figs. 4-8 and par. [0204]; and Bedadala: Figs. 6-7 and 9-10, and par. [0210] for changes based on review(s); and par. [0253]: "review status or other status within a workflow (e.g., reviewed or un-reviewed); modification times or types of modifications; and/or any other data attributes in any combination, without limitation").

Regarding claim 13, Denyer and Bedadala teach: wherein the artificial intelligence model is configured with generative artificial intelligence (Denyer: par. [0148]: "The techniques described herein, however, may utilize machine learning and artificial intelligence to execute predictive analytics based also on past data to determine criticality parameters"; and Bedadala: par. [0022]: "Artificial intelligence (AI) often includes a computer configured to understand a natural language such as English. If a computer can understand and respond to a human in a natural language, the computer provides utility and an intimate level of human interaction that humans appreciate. Accordingly, there is significant interest in developing software and hardware to effectively communicate with humans in a natural language…").

Regarding claim 14, Bedadala teaches: wherein the artificial intelligence model is a Large Language Model (pars. [0024-25] teach the large language model).

Regarding claim 15, Bedadala and Sharma teach: wherein the artificial intelligence model is trained with data from past data migration waves to make precise predictions (Bedadala: par. [0022] via artificial intelligence (AI), and par. [0197] for precise predictions; and Sharma: pars. [0108-112] via the migration wave, the predictor data/metadata fetcher = training, and the prediction handling).

Regarding claim 16, Bedadala teaches: an integrated database to store inventory and logs of the data migrations (Figs. 1A-1E via integrated database(s) for storing data/metadata/information; [0103] via "inventory/capacity"; par. [0112], e.g., "log files" stored in database(s); and [0235], "event logs").

Regarding claim 17, Bedadala teaches: wherein the integrated database is one of a structured query language database, a non-structured query language database and a sequential database (par. [0026]: "(e.g., Structure Query Language or 'SQL')").

Regarding claim 18, Denyer and Bedadala, in combination, teach: wherein the plurality of inputs are received through a user interface (Denyer: GUIs in Figs. 4-8; and Bedadala: Figs. 6-7 and 9-10 via graphical user interfaces (GUIs) for inputs).

Claims 19-20 are rejected based on the analysis of claim 1 above, and therefore the claims are rejected on that basis.

Allowable Subject Matter

Claims 4 and 12, in combination as considered a whole, are objected to as being dependent upon the rejected base claim 1, but would be allowable if rewritten in independent form including all of the limitations of the base claim 1 and the intervening claim 3.

Prior Art

The prior art made of record on form PTO-892 and not relied upon is considered pertinent to applicant's disclosure. Applicant is required under 37 C.F.R. § 1.111(c) to consider these references fully when responding to this action. It is noted that any citation to specific pages, columns, lines, or figures in the prior art references, and any interpretation of the references, should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. See In re Heck, 699 F.2d 1331, 1332-33, 216 USPQ 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 USPQ 275, 277 (CCPA 1968)); Merck & Co. v. Biocraft Laboratories, 874 F.2d 804, 10 USPQ2d 1843 (Fed. Cir.), cert. denied, 493 U.S. 975 (1989).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to Jessica N. Le, whose telephone number is (571) 270-1009. The examiner can normally be reached M-F, 9:30 am - 5:30 pm (EST).

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, SHERIEF BADAWI, can be reached at (571) 272-9782. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Jessica N Le/
Examiner, Art Unit 2169

/MD I UDDIN/
Primary Examiner, Art Unit 2169

Prosecution Timeline

Jun 27, 2024
Application Filed
Mar 11, 2026
Non-Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12585711: SYSTEMS AND METHODS FOR WEB SCRAPING (granted Mar 24, 2026; 2y 5m to grant)
Patent 12554704: STALE DATA RECOGNITION (granted Feb 17, 2026; 2y 5m to grant)
Patent 12475100: USING AD-HOC STORED PROCEDURES FOR ONLINE TRANSACTION PROCESSING (granted Nov 18, 2025; 2y 5m to grant)
Patent 12450225: DYNAMICALLY LIMITING THE SCOPE OF SPREADSHEET RECALCULATIONS (granted Oct 21, 2025; 2y 5m to grant)
Patent 12393604: SYSTEMS AND METHODS FOR PREVENTING DATABASE DEADLOCKS DURING SYNCHRONIZATION (granted Aug 19, 2025; 2y 5m to grant)
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 73%
With Interview: 99% (+28.6%)
Median Time to Grant: 3y 11m
PTA Risk: Low
Based on 504 resolved cases by this examiner. Grant probability derived from career allow rate.
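A minimal sketch of where the headline figure comes from, assuming (as the note above states) that the dashboard's grant probability is simply the examiner's career allow rate, i.e., granted cases divided by resolved cases:

```python
# Illustrative only: derive the headline grant probability from the
# examiner's career record (366 granted out of 504 resolved cases).
granted, resolved = 366, 504
grant_probability = granted / resolved
print(f"{grant_probability:.1%}")  # 72.6%, displayed rounded as 73%
```

The interview-adjusted 99% figure is not reproducible from these two numbers alone; it presumably draws on the interview-lift statistic over a different case subset.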
