Prosecution Insights
Last updated: April 19, 2026
Application No. 18/342,547

SYSTEM, METHOD, AND APPARATUS FOR DATA QUERY USING NETWORK DEVICE

Non-Final OA — §101, §103
Filed: Jun 27, 2023
Examiner: ROSTAMI, MOHAMMAD S
Art Unit: 2154
Tech Center: 2100 — Computer Architecture & Software
Assignee: Huawei Technologies Co., Ltd.
OA Round: 5 (Non-Final)
Grant Probability: 67% (Favorable)
Predicted OA Rounds: 5-6
Estimated Time to Grant: 3y 10m
Grant Probability with Interview: 93%

Examiner Intelligence

Career Allowance Rate: 67% — above average (425 granted / 635 resolved; +11.9% vs TC avg)
Interview Lift: +26.3% (resolved cases with interview vs. without)
Typical Timeline: 3y 10m average prosecution; 37 applications currently pending
Career History: 672 total applications across all art units

Statute-Specific Performance

§101: 21.3% (-18.7% vs TC avg)
§103: 54.9% (+14.9% vs TC avg)
§102: 9.7% (-30.3% vs TC avg)
§112: 4.4% (-35.6% vs TC avg)
Tech Center averages are estimates • Based on career data from 635 resolved cases

Office Action

Rejections: §101, §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Continued Examination Under 37 CFR 1.114

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 2/12/2026 has been entered.

Status of Claims

Claims 1, 2, and 4-19 are pending, of which claims 1, 10, and 14 are in independent form. Claims 1, 2, and 4-19 are rejected under 35 U.S.C. 101 (abstract idea). Claims 1, 2, and 4-19 are rejected under 35 U.S.C. 103.

Response to Arguments

Applicant's arguments with respect to claims 1, 2, and 4-19 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Regarding the arguments presented for the 35 USC 101 rejection, the examiner specifies that the amendments and arguments fail to overcome the 35 USC 101 rejection. More specifically:

Step 2A, Prong One (Judicial Exception)

The claims are directed to a computer-implemented method/system for data processing. The recited steps are essentially:
• Receiving a query request;
• Generating an execution plan based on the query request;
• Generating tasks based on the execution plan;
• Determining offloadable vs. unoffloadable tasks;
• Determining a device that should execute the offloadable task;
• Sending an instruction so the selected device executes the task;
• Executing remaining tasks on another processor.
These limitations collectively amount to:
• Collecting and processing a request;
• Breaking work into tasks;
• Classifying tasks based on rules or criteria;
• Assigning tasks to available resources;
• Executing tasks and coordinating their completion.

The claims describe planning and allocating work among available resources based on rules. This maps directly onto recognized abstract categories:
• Mental Process: evaluating tasks and determining where they should be executed.
• Organizing and Presenting Information: task allocation and work distribution.
• Information Analysis and Decision Making: determining execution plans and task assignment.

Therefore, the examiner specifies that dividing work into tasks, evaluating tasks based on criteria, assigning tasks to available resources, and coordinating execution of tasks are abstract concepts that can be performed mentally or by generic technology.

Step 2A, Prong Two (Practical Application)

The claims do not integrate the abstract idea into a practical application. The claims merely recite generic components: a central node; a network device; a working node; processors; execution instructions. These elements merely implement the abstract idea using generic computing components. The claims do not:
• Improve network architecture;
• Improve query execution technology;
• Improve processor operation;
• Introduce a new data structure;
• Introduce a new scheduling mechanism.

The components merely execute the abstract task-allocation logic. The recited components perform their generic, ordinary, expected functions, which is considered insufficient and mere automation of an abstract idea. The claims merely recite functional software components performing generic tasks. Therefore, the claims do not integrate the abstract idea into a practical application.
Applicant's Argument: Applicant argues, on pages 12-14 of the Remarks, that the prior art of record does not teach "determine an offloadable task and an unoffloadable task in the plurality of tasks" and "the working node includes a processor configured to execute the unoffloadable task".

Examiner's Response: Examiner respectfully disagrees; the combination of Yu and Campbell clearly teaches an unoffloadable task, and that the working node includes a processor configured to execute the unoffloadable task ("The one or more tasks making up this offloadable functionality of application 320 may be referred to herein as 'workloads' or 'edge compute tasks,' and may include any computing operations, instructions, or other computational work associated with application 320 and offloadable to an edge compute node such as one of nodes 204. For example, such edge compute tasks may include applications, lambda functions, services, or other suitable workloads, and may be offloaded (e.g., by necessity or for convenience) to UE device 206 while other tasks are performed locally by onboard resources of UE device 206, thereby enabling, facilitating, and/or enhancing the functionality of application 320." ¶ [0046]. "The dynamic workload of each node 204 may be represented by real-time or near-real-time data representative of how busy each node 204 is in relation to its total capacity. For example, workload data may include statistical data indicative of how many tasks 318 are being handled by each node 204, how many tasks 318 could theoretically be handled by these nodes 204, how much latency is being measured for different tasks 318 being performed for different UE devices 206 offloading tasks to different nodes 204, how much latency is expected for these tasks 318, and so forth." ¶ [0051].)

These sections clearly teach an unoffloadable task and working nodes; more specifically, the examiner specifies that the tasks that are performed locally by onboard resources of the User Equipment (UE) are considered unoffloadable tasks, and tasks that are offloaded based on latency are offloadable tasks that are processed by a different node/cloud. The examiner specifies that cloud in connection to offloading is described in ¶ [0045].

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 2, and 4-19 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. The claims recite using network devices to determine when to offload tasks. With respect to Step 1 of the patent subject matter eligibility analysis, the claims are directed to a process, machine, manufacture, or composition of matter. Independent claim 1 is a system claim including a processor. Independent claim 10 is directed to a method, which is a process. Independent claim 14 is directed to a display interface. All other claims depend on claims 1, 10, and 14. As such, claims 1, 2, and 4-19 are directed to a statutory category.

Regarding claims 1, 10 and 14: With respect to Step 2A, Prong One, the claims recite an abstract idea, law of nature, or natural phenomenon. Specifically, the following limitations recite mathematical concepts and/or mental processes and/or certain methods of organizing human activity. The claims are directed to a computer-implemented method/system for data processing.
The recited steps are essentially:
• Receiving a query request;
• Generating an execution plan based on the query request;
• Generating tasks based on the execution plan;
• Determining offloadable vs. unoffloadable tasks;
• Determining a device that should execute the offloadable task;
• Sending an instruction so the selected device executes the task;
• Executing remaining tasks on another processor.

These limitations collectively amount to:
• Collecting and processing a request;
• Breaking work into tasks;
• Classifying tasks based on rules or criteria;
• Assigning tasks to available resources;
• Executing tasks and coordinating their completion.

The claims describe planning and allocating work among available resources based on rules. This maps directly onto recognized abstract categories:
• Mental Process: evaluating tasks and determining where they should be executed.
• Organizing and Presenting Information: task allocation and work distribution.
• Information Analysis and Decision Making: determining execution plans and task assignment.

Therefore, the examiner specifies that dividing work into tasks, evaluating tasks based on criteria, assigning tasks to available resources, and coordinating execution of tasks are abstract concepts that can be performed mentally or by generic technology.

With respect to Step 2A, Prong Two, the claims do not recite additional elements that integrate the judicial exception into a practical application. The following limitations are considered "additional elements," and explanation is given as to why these additional elements do not integrate the judicial exception into a practical application. The claims merely recite generic components: a central node; a network device; a working node; processors; execution instructions. These elements merely implement the abstract idea using generic computing components.
The claims do not:
• Improve network architecture;
• Improve query execution technology;
• Improve processor operation;
• Introduce a new data structure;
• Introduce a new scheduling mechanism.

The components merely execute the abstract task-allocation logic. The recited components perform their generic, ordinary, expected functions, which is considered insufficient and mere automation of an abstract idea. The claims merely recite functional software components performing generic tasks. Therefore, the claims do not integrate the abstract idea into a practical application.

With respect to Step 2B, the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The additional limitations are directed to a computer readable storage medium, computer, memory, and processor, at a very high level of generality and without imposing meaningful limitations on the scope of the claim. The additional elements are: generic processors, nodes, a network device, and sending instructions. All steps are well-understood, routine, conventional computer components performing conventional functions. Such generic, high-level, and nominal involvement of a computer or computer-based elements for carrying out the invention merely serves to tie the abstract idea to a particular technological environment, which is not enough to render the claims patent-eligible, as noted at pg. 74624 of Federal Register/Vol. 79, No. 241, citing Alice, which in turn cites Mayo. See, e.g., Alice Corp. Pty. Ltd. v. CLS Bank Int'l, 134 S. Ct. 2347, 2359-60, 110 USPQ2d 1976, 1984 (2014). See also OIP Techs. v. Amazon.com, 788 F.3d 1359, 1364, 115 USPQ2d 1090, 1093-94 (Fed. Cir. 2015) ("Just as Diehr could not save the claims in Alice, which were directed to 'implement[ing] the abstract idea of intermediated settlement on a generic computer', it cannot save OIP's claims directed to implementing the abstract idea of price optimization on a generic computer.") (citations omitted). See also Affinity Labs of Texas LLC v. DirecTV LLC, 838 F.3d 1253, 1257-1258 (Fed. Cir. 2016) (mere recitation of a GUI does not make a claim patent-eligible); Intellectual Ventures I LLC v. Capital One Bank, 792 F.3d 1363, 1370 (Fed. Cir. 2015) ("the interactive interface limitation is a generic computer element").

The additional elements are broadly applied to the abstract idea at a high level of generality ("similar to how the recitation of the computer in the claims in Alice amounted to mere instructions to apply the abstract idea of intermediated settlement on a generic computer," as explained in MPEP § 2106.05(f)) and they operate in a well-understood, routine, and conventional manner. MPEP § 2106.05(d)(II) sets forth the following: The courts have recognized the following computer functions as well-understood, routine, and conventional functions when they are claimed in a merely generic manner (e.g., at a high level of generality) or as insignificant extra-solution activity:
• Receiving or transmitting data over a network, e.g., using the Internet to gather data, Symantec ...; TLI Communications LLC v. AV Auto. LLC ...; OIP Techs., Inc. v. Amazon.com, Inc. ...; buySAFE, Inc. v. Google, Inc. ...;
• Performing repetitive calculations, Flook ...; Bancorp Services v. Sun Life ...;
• Electronic recordkeeping, Alice Corp. ...; Ultramercial ...;
• Storing and retrieving information in memory, Versata Dev. Group, Inc. v. SAP Am., Inc. ...;
• Electronically scanning or extracting data from a physical document, Content Extraction and Transmission, LLC v. Wells Fargo Bank ...; and
• A web browser's back and forward button functionality, Internet Patents Corp. v. Active Network, Inc. ....

Courts have held computer-implemented processes not to be significantly more than an abstract idea (and thus ineligible) where the claim as a whole amounts to nothing more than generic computer functions merely used to implement an abstract idea, such as an idea that could be done by a human analog (i.e., by hand or by merely thinking).

In addition, when taken as an ordered combination, the ordered combination adds nothing that is not already present when the elements are taken individually. There is no indication that the combination of elements integrates the abstract idea into a practical application. Their collective functions merely provide conventional computer implementation. Therefore, when viewed as a whole, these additional claim elements do not provide meaningful limitations to transform the abstract idea into a practical application, nor does the ordered combination amount to significantly more than the abstract idea itself.

The dependent claims have been fully considered as well; however, similar to the findings for the claims above, these claims are similarly directed to the "Mental Processes" grouping of abstract ideas set forth in the 2019 PEG, without integrating it into a practical application and with, at most, a general purpose computer that serves to tie the idea to a particular technological environment, which does not add significantly more to the claims. The ordered combination of elements in the dependent claims (including the limitations inherited from the parent claims) adds nothing that is not already present when the elements are taken individually. There is no indication that the combination of elements improves the functioning of a computer or improves any other technology. Their collective functions merely provide conventional computer implementation. Accordingly, the subject matter encompassed by the dependent claims fails to amount to significantly more than the abstract idea.
Therefore, independent claims 1, 10 and 14 are rejected under 35 U.S.C. 101. Dependent claims 2, 4-9, 11-13 and 15-19 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The judicial exception is not integrated into a practical application, and the claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. The eligibility analysis in support of these findings is provided below, in accordance with the "2019 Revised Patent Subject Matter Eligibility Guidance" (published on 1/7/2019 in Fed. Register, Vol. 84, No. 4 at pgs. 50-57, hereinafter the "2019 PEG").

Step 1. Analyzed above.

Step 2A. In accordance with Step 2A, Prong One of the 2019 PEG, it is noted that the independent claims from which the dependent claims depend recite abstract ideas falling within the mental processes grouping of abstract ideas set forth in the 2019 PEG, as detailed above. The examiner is of the position that dependent claims 2, 8-9 and 17 also recite abstract ideas falling within the mental processes grouping, as follows:

"2. (Previously Presented) The system according to claim 1, wherein being configured to determine, for each task in the plurality of tasks an execution device includes being configured to: search for an offloadable task in the plurality of tasks, and determine that an execution device of the offloadable task is the network device, wherein the offloadable task is a preset task that is to be offloaded to the network device for execution", as drafted, recites a mental process based on evaluations and judgments (see MPEP 2106.04(a)(2) III A, examples of claims that recite mental processes, Example 1);

"8. (Original) The system according to claim 2, wherein after receiving a data packet and based on determining that the data packet comprises an identifier of the offloadable task executed by the network device, the network device is configured to execute the offloadable task based on the data packet.", as drafted, recites a mental process based on evaluations and judgments (see MPEP 2106.04(a)(2) III A, Example 1);

"17. (Previously Presented) The interface according to claim 14, wherein the determination of whether the task is an offloadable task or an unoffloadable task includes a determination of whether the task includes an unoffloadable operator", as drafted, recites a mental process based on evaluations and judgments (see MPEP 2106.04(a)(2) III A, Example 1).

With respect to Step 2A, Prong Two of the 2019 PEG, dependent claims 3-7 and 15-16 recite the additional elements identified below:

"3. (Original) The system according to claim 2, wherein the central node is configured to send a setting instruction of the offloadable task to the network device; and the network device is configured to set the offloadable task based on the setting instruction", as drafted, recites insignificant extra-solution activity (see MPEP 2106.05(g), Mere Data Gathering, Example iv: obtaining a setting instruction is pre-solution activity prior to execution of a task);

"4. (Original) The system according to claim 1, wherein the network device is a network interface card of the working node or a forwarding device, and the forwarding device comprises a switch and a router.", as drafted, recites generic computer components performing generic computer functions (see MPEP 2106.05(f)(2), examples where the courts have found the additional elements to be mere instructions to apply an exception because they do no more than merely invoke computers or machinery as a tool to perform an existing process, Example iii);

"5. (Original) The system according to claim 4, wherein the forwarding device comprises a data port and a control port, and the central node is configured to send a setting instruction of an offloadable task whose execution device is the forwarding device to the forwarding device through the control port, and send a setting instruction of a task whose execution device is the network interface card or the working node to the forwarding device through the data port; and the forwarding device is configured to: set the offloadable task based on the setting instruction received from the control port, and forward the setting instruction received from the data port.", as drafted, recites generic computer components performing generic computer functions (see MPEP 2106.05(f)(2), Example iii) and insignificant extra-solution activity (see MPEP 2106.05(g), Mere Data Gathering, Example iv: obtaining a setting instruction is pre-solution activity prior to execution of a task);

"6. (Original) The system according to claim 4, wherein based on the network device that executes the offloadable task being the network interface card of the working node, the central node is configured to send the setting instruction of the offloadable task to the working node; and the working node is configured to set the offloadable task on the network interface card of the working node based on the setting instruction.", as drafted, recites insignificant extra-solution activity (see MPEP 2106.05(g), Mere Data Gathering, Example iv);

"7. (Original) The system according to claim 6, wherein the setting instruction of the offloadable task comprises an offloadable flag; and the working node is configured to set the offloadable task on the network interface card of the working node after receiving the setting instruction and based on determining that the setting instruction comprises the offloadable flag", as drafted, recites insignificant extra-solution activity (see MPEP 2106.05(g), Mere Data Gathering, Example iv);

"15. (Original) The interface according to claim 14, wherein the query command input area, the task display area, and the execution device display area are displayed on a same interface", as drafted, recites generic computer components performing generic computer functions (see MPEP 2106.05(f)(2), Example iii); and

"16. (Original) The interface according to claim 14, wherein the query command input area, the task display area, and the execution device display area are displayed on different interfaces", as drafted, recites generic computer components performing generic computer functions (see MPEP 2106.05(f)(2), Example iii).

Dependent claims 11-13 and 18-19 are similar in scope to dependent claims 2-9 and 15-17 addressed above. The additional elements identified above fail to integrate the abstract idea into a practical application because they amount to generic computer components performing generic computer functions and insignificant extra-solution activity (see MPEP 2106.05(g), which lists three considerations when determining whether additional elements are insignificant extra-solution activity).

Step 2B. Similar to the analysis under Step 2A, Prong Two, because the additional elements of the dependent claims amount to generic computer components performing generic computer functions and insignificant extra-solution activity, the additional elements do not add significantly more to the judicial exception such that the claims as a whole would be patent eligible. Therefore, dependent claims 2-9, 11-13 and 15-19 are rejected under 35 U.S.C. 101.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103, which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 2 and 4-19 are rejected under 35 U.S.C. 103 as being unpatentable over Yu, Yifan (US 20200036808 A1) [Yu] in view of Campbell, Kirk et al. (US 20200351336 A1) [Campbell], further in view of Fender, Pit et al. (US 20200394191 A1) [Fender].

Regarding claim 1, Yu discloses a data query system comprising a central node, a working node, and a network device, wherein the central node is connected to the working node through the network device; the central node includes a processor (Fig. 2) configured to: generate, based on a query request, a task for executing the query request (Fig. 5; requested execution of a function implies a task); determine an offloadable task, wherein the offloadable task is indicated by an offloadable operator included in the offloadable task (Par. 0020: "According to the illustrated example, web requests for offloading are transmitted with a destination identifying the example web server 106, but include a parameter identifying the requests as offloading requests. For example, according to the illustrated example, the web requests include a uniform resource indicator (URI) QUERY parameter indicating offloading"); send a setting instruction of the offloadable task to the network device, wherein the setting instruction is used to set, on the network device, the offloadable task to be executed by the network device; and wherein the network device includes a processor configured to execute the offloadable task according to the setting instruction (Fig. 5; input data reads on instructions).

Yu, however, does not explicitly disclose a plurality of tasks; determining a network device for executing the offloadable task; an unoffloadable task; or that the working node includes a processor configured to execute the unoffloadable task. However, Campbell teaches a plurality of tasks (Fig. 3); determining a network device for executing the offloadable task (Abstract, Par. 0026; a node selection policy); an unoffloadable task; and the working node including a processor configured to execute the unoffloadable task ("The one or more tasks making up this offloadable functionality of application 320 may be referred to herein as 'workloads' or 'edge compute tasks,' and may include any computing operations, instructions, or other computational work associated with application 320 and offloadable to an edge compute node such as one of nodes 204. For example, such edge compute tasks may include applications, lambda functions, services, or other suitable workloads, and may be offloaded (e.g., by necessity or for convenience) to UE device 206 while other tasks are performed locally by onboard resources of UE device 206, thereby enabling, facilitating, and/or enhancing the functionality of application 320." ¶ [0046]. "The dynamic workload of each node 204 may be represented by real-time or near-real-time data representative of how busy each node 204 is in relation to its total capacity. For example, workload data may include statistical data indicative of how many tasks 318 are being handled by each node 204, how many tasks 318 could theoretically be handled by these nodes 204, how much latency is being measured for different tasks 318 being performed for different UE devices 206 offloading tasks to different nodes 204, how much latency is expected for these tasks 318, and so forth." ¶ [0051].) These sections clearly teach an unoffloadable task and working nodes; more specifically, the examiner specifies that the tasks performed locally by onboard resources of the User Equipment (UE) are considered unoffloadable tasks, and tasks that are offloaded based on latency are offloadable tasks that are processed by a different node/cloud. The examiner specifies that cloud in connection to offloading is described in ¶ [0045].

Both the Yu reference and the Campbell reference are in the same field of endeavor of offloading computing processes. Before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to combine the performing of an offloading task in Yu with the node selection policy as taught in Campbell to match the computing resources with the complexity of a task (see Campbell, Abstract). Accordingly, Yu in view of Campbell teaches (Campbell, Fig. 3) an offloadable task in the plurality of tasks.

However, neither Yu nor Campbell explicitly discloses an execution plan, or generating based on the execution plan. Fender discloses an execution plan, and generating based on the execution plan ("RDBMS 110 may use the parse tree, as decorated by semantic analysis, for query planning, optimization, and logic generation. RDBMS 110 may use the semantic analysis to generate multiple alternate query plans and/or structurally transform a query plan, such as execution plan 120, according to optimization heuristics as discussed later herein. Execution plan 120 specifies a combination of data operators whose application may fulfill database statement 150." ¶ [0047]-[0049]. "In an embodiment, execution plan 120 is a logical tree (not shown) of operation nodes, and multiple nodes and/or at least one subtree can be marked for offloading. For example, a compound expression (not shown) may be partially or totally offloaded in a same access request to offload engine 160. In an embodiment, operations from different lexical clauses of database statement 150 may be offloaded. For example, a filtration clause and a sorting clause may be offloaded together." ¶ [0067]). It would have been obvious to one of ordinary skill in the art at the time of filing of the present invention to combine the teachings of the cited references because Fender's system would have allowed Yu and Campbell to facilitate an execution plan and generation based on the execution plan. The motivation to combine is apparent in the Yu and Campbell references, because there is a need to improve database scalability and workload distribution.

Regarding claim 2, Yu in view of Campbell in view of Fender discloses wherein being configured to determine an offloadable task includes being configured to: search for an offloadable task in the plurality of tasks, and determine that an execution device of the offloadable task is the network device, wherein the offloadable task is a preset task that is to be offloaded to the network device for execution (the rejection rationale of claim 1 is applicable; selecting a task based on the offloading indicator reads on the search for an offloadable task, and an offloadable task reads on a preset task).

Regarding claim 4, Yu in view of Campbell in view of Fender discloses wherein the network device is a network interface card of the working node or a forwarding device, and the forwarding device comprises a switch and a router (Yu: paragraph [0072]).
Regarding claim 5, Yu in view of Campbell in view of Fender discloses wherein the forwarding device comprises a data port and a control port, and the central node is configured to: send a setting instruction of an offloadable task whose execution device is the forwarding device to the forwarding device through the control port, and send a setting instruction of a task whose execution device is the network interface card or the working node to the forwarding device through the data port; and the forwarding device is configured to: set the offloadable task based on the setting instruction received from the control port, and forward the setting instruction received from the data port (the examiner notes, first, that the forwarding device in parent claim 4 is optional; second, a data port and a control port are disclosed by their respective functionalities discussed in the rejection of the parent claims).

Regarding claim 6, Yu in view of Campbell in view of Fender discloses wherein, based on the network device that executes the offloadable task being the network interface card of the working node, the central node is configured to send the setting instruction of the offloadable task to the working node; and the working node is configured to set the offloadable task on the network interface card of the working node based on the setting instruction (the rejection rationale of the parent claims is applicable).

Regarding claim 7, all of the particulars of claims 1, 4 and 6 have been addressed above.
Additionally, Tian as modified with Campbell discloses wherein the setting instruction of the offloadable task comprises the offloadable flag; and the working node is configured to set the offloadable task on the network interface card of the working node after receiving the setting instruction and based on determining that the setting instruction comprises the offloadable flag (the rejection rationale for the offloadable operator in the parent claims is applicable to an offloadable flag; examiner notes that the instant specification uses the terms "offloadable operator" and "offloadable flag" interchangeably).

Regarding claim 8, Yu in view of Campbell in view of Fender discloses wherein, after receiving a data packet and based on determining that the data packet comprises an identifier of the offloadable task executed by the network device, the network device is configured to execute the offloadable task based on the data packet (Yu, Abstract; a web request reads on a data packet).

Regarding claim 9, Yu in view of Campbell in view of Fender discloses wherein, based on being configured to send the setting instruction of the offloadable task to the network device, the central node is configured to send the setting instruction of the offloadable task to the network device after determining the offloadable task and based on determining that the offloadable task meets an offloading policy corresponding to the offloadable task (Yu, ¶ [0066]; policy data includes an offloading policy).

Regarding claims 10-13, the rejection rationale of claims 1, 4, 5 and 8 is applicable.

Regarding claim 14, the rejection rationale of claim 1 is applicable. Moreover, Campbell in ¶ [0096] discloses a GUI, and the claimed GUI components are disclosed by their respective functionality. Examiner also notes that the determination of whether a task is offloadable or unoffloadable is a binary one; accordingly, the offloadable operator/flag provides an indication of both.
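The behavior claim 8 recites, a network device that executes a pre-set offloadable task when an incoming data packet carries that task's identifier, can be sketched as follows. The class and method names here are hypothetical illustrations, not from the application or the cited references.

```python
class NetworkDevice:
    """Hypothetical model of the claimed network device: it holds tasks
    installed via setting instructions, keyed by task identifier, and
    executes a task only when a packet carries that task's identifier."""

    def __init__(self):
        self.tasks = {}  # task_id -> callable installed by a setting instruction

    def set_task(self, task_id, fn):
        """Apply a setting instruction: install an offloadable task."""
        self.tasks[task_id] = fn

    def on_packet(self, packet):
        """Execute the matching task if the packet identifies one it
        has been set up with; otherwise simply forward the payload."""
        task_id = packet.get("task_id")
        if task_id in self.tasks:
            return ("executed", self.tasks[task_id](packet["payload"]))
        return ("forwarded", packet["payload"])

nic = NetworkDevice()
nic.set_task("filter-gt-10", lambda rows: [r for r in rows if r > 10])
result = nic.on_packet({"task_id": "filter-gt-10", "payload": [5, 12, 40]})
```

Keying installed tasks by identifier is what lets unrelated traffic pass through untouched: a packet with no matching identifier is forwarded rather than processed.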
Regarding claims 15 and 16, the rejection rationale of claim 14 is applicable. Examiner notes that it is a design choice whether to place GUI elements on the same interface or not.

Regarding claims 17, 18 and 19, Yu in view of Campbell teaches that the unoffloadable operator is defined in a protocol (Yu, ¶ [0020]; usage of a URI as an offloadable indicator reads on a protocol defining the indicator).

Conclusion

The examiner requests that, in response to this Office action, support be shown for language added to any original claims on amendment and for any new claims. That is, indicate support for newly added claim language by specifically pointing to the page(s) and line number(s) in the specification and/or drawing figure(s). This will assist the examiner in prosecuting the application.

When responding to this Office action, Applicant is advised to clearly point out the patentable novelty which he or she thinks the claims present, in view of the state of the art disclosed by the references cited or the objections made. He or she must also show how the amendments avoid such references or objections. See 37 CFR 1.111(c).

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MOHAMMAD S ROSTAMI, whose telephone number is (571) 270-1980. The examiner can normally be reached Mon-Fri from 9 a.m. to 5 p.m. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Boris Gorney, can be reached at (571) 270-5626. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center.
Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

3/7/2026
/MOHAMMAD S ROSTAMI/
Primary Examiner, Art Unit 2154

Prosecution Timeline

Jun 27, 2023: Application Filed
Jun 15, 2024: Non-Final Rejection — §101, §103
Aug 09, 2024: Response Filed
Nov 20, 2024: Final Rejection — §101, §103
Feb 14, 2025: Response after Non-Final Action
Mar 25, 2025: Request for Continued Examination
Mar 30, 2025: Response after Non-Final Action
May 28, 2025: Non-Final Rejection — §101, §103
Aug 29, 2025: Response Filed
Nov 12, 2025: Final Rejection — §101, §103
Feb 12, 2026: Request for Continued Examination
Feb 24, 2026: Response after Non-Final Action
Mar 07, 2026: Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596705: CHANGE CONTROL AND VERSION MANAGEMENT OF DATA
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12579127: DETECTING LABELS OF A DATA CATALOG INCORRECTLY ASSIGNED TO DATA SET FIELDS
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12561392: RELATIVE FUZZINESS FOR FAST REDUCTION OF FALSE POSITIVES AND FALSE NEGATIVES IN COMPUTATIONAL TEXT SEARCHES
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12561360: INFORMATION PROCESSING APPARATUS, INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD, AND NON-TRANSITORY RECORDING MEDIUM
Granted Feb 24, 2026 (2y 5m to grant)
Patent 12561312: DISTRIBUTED STREAM-BASED ACID TRANSACTIONS
Granted Feb 24, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 67%
With Interview: 93% (+26.3%)
Median Time to Grant: 3y 10m
PTA Risk: High
Based on 635 resolved cases by this examiner. Grant probability derived from career allow rate.
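The displayed figures are internally consistent if the interview lift is read as additive percentage points on top of the career allow rate (an assumption about how these statistics compose, not something the report states):

```python
# Career allow rate from the examiner's record: 425 grants / 635 resolved.
granted, resolved = 425, 635
allow_rate = 100 * granted / resolved   # about 66.9%, shown as 67%

# Interview lift shown by the report, in percentage points.
interview_lift = 26.3

print(round(allow_rate))                  # 67
print(round(allow_rate + interview_lift)) # 93, the "With Interview" figure
```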
