DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
1. This communication is in response to claims 1-20 filed on 08/16/2024.
Claim Objections
Claims 8 and 18 are objected to because of the following informalities:
2. Claims 8 and 18 recite, “the updated probing strategy based a change in one or more path performance metrics”. It is submitted that the claims should be amended to include “on” to recite, “based on a change” in order to be grammatically correct.
Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
3. Claims 1, 3-5, 7, 8, 10, 11, 13-15, 17, 18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Cantwell et al. (US 2018/0013657) in view of Arama et al. (US 7,447,622).
Regarding claim 1, Cantwell teaches a method, comprising:
receiving, at a device (intelligent physical layer switch device 200 of FIG. 2) and via a user interface (device 242 of FIG. 2), an instruction to automate probing strategy formation for an agent in a computer network (Test automation refers to the use of specialized test hardware and software systems that are driven by a test script to dynamically control the configuration and execution of the test on the network being tested, [0017]; the embedded controller 210 running one or more test control applications receives a request from the client device 242 operatively coupled to the integrated network switch 200 (as shown in FIG. 2). In an embodiment, the request received from the client device 242 specifies a particular test script to be executed by the embedded controller 210 and/or specifies criteria (i.e. specific configuration) for testing the network traffic flow activity data, [0067]);
obtaining, by the device, network telemetry from the computer network (the test engine 218 is enabled to collect statistics (e.g., packet counts, byte counts, error counts, utilization, etc.) at least on a subset of the first plurality of ports 202 and/or a subset of the second plurality of ports 204. In one embodiment, the test engine 218 may include a statistics collector 222 adapted to receive a packet from the packet header parser 220 and determine, based on the packet, what type of event is occurring, [0049]);
selecting, by the device, between a path tracing mode (test scenarios are run using exemplary software application simulators 128. Trace data including network response and load information is collected during the test scenarios, [0034]; test packets generated by the data traffic generator 304 described below with packet trace information contained in one or more log files to determine if the network 101 performed consistently with the applicable protocols and performance requirements, [0060]; One aspect of the various embodiments is to produce complete “end to end” packet trace information, [0065]; Packet trace information may be used to reconstruct transmitted voice/video streams in order to examine the behavior of the network 101, [0065]) and a metric collection mode (The impairment tool 124 is configured to generate and inject impairments into the network testing system 100. Throughout this description the term impairment will be used to indicate any type of abnormal operation which may be injected into the testing system to determine the network's response to the abnormal operation, [0032]; the application simulator 128 may be used to simulate the performance of an application by actually generating messages and packets based on a model of the traffic that an application is expected to generate in use, and collecting performance statistics based on these generated test messages and packets, [0033]) to form a probing strategy for the agent, based on the instruction (a plurality of desired test configurations may be defined externally from the test environment 200, stored within the test environment 200 and selectively loaded for carrying out one or more tests on the network, device or system under test, [0036]; a plurality of configuration images of integrated configurations can be stored in the FPGA image storage 232, [0057]; search the FPGA image storage 232 to find a configuration image associated with the requested test script and/or to find an image that satisfies the requested test criteria. When the embedded controller 210 identifies a desirable set (suite) of tests to be performed, the embedded controller 210 retrieves the corresponding configuration image from the FPGA image storage 232, and loads the retrieved configuration image on the test tool engine 218, [0067]); and
instructing, by the device, the agent in the computer network to send one or more packets via the computer network according to the probing strategy (the embedded controller 210 sends a request to the data traffic generator module 304 to either randomly generate or specifically generate flows of information (i.e., traffic) based on the test script being executed, [0068]; routing of the generated network data and/or impairments data using the one or more transmission ports specified in the test template by sending corresponding control data, [0070]).
However, Cantwell does not explicitly disclose using the network telemetry to select between modes.
Arama teaches selecting, by a device and using network telemetry, between a first mode and a second mode (the filter profile engine 402 may also use feedback from the diagnostics module 206 and the executive test engine 216 to create one or more filter profiles, column 6 lines 60-62; The progressive testing module 504 may also alter and/or vary the progression, sequence, and combination of filters in a filter profile being used for testing, depending on performance feedback. For example, if an electronic device 108, 110, 112 fails a lightweight packet drop test, then the progressive testing module 504 might skip burst and jitter simulations. If the electronic device 108, 110, 112 performs well during a packet dropping simulation, however, the progressive testing module 504 might continue the packet dropping simulation and add a packet reordering "test" or simulation on top of it, column 7 lines 45-55).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to alter testing based on performance feedback in the system/method of Cantwell as suggested by Arama to improve on-the-fly network and traffic analysis. One would be motivated to combine these teachings because using real-time network conditions to adaptively switch and modify testing modes would enable efficient adjustments to better simulate and evaluate realistic network conditions.
Regarding claim 3, Cantwell teaches the method as in claim 1, wherein the device selects the metric collection mode and the agent sends the one or more packets along a path in the computer network to collect at least one of: a round trip time metric, a delay metric, a loss metric, or a jitter metric (the test data generator 122 may generate traffic with accumulated jitter, [0031]; the test engine 218 may include a statistics collector 222 adapted to receive a packet from the packet header parser 220 and determine, based on the packet, what type of event is occurring. Exemplary event information may include whether the packet is enqueued, dequeued, dropped, includes an error, etc, [0049]; Network performance can be evaluated so that impairments such as packet loss or erasure and delay jitter can be corrected, [0065]).
Regarding claim 4, Cantwell teaches the method as in claim 1, wherein the network telemetry indicates at least one of: a probing error, a probe response rate, a round trip time, a packet loss, a path length, or a gap in a packet trace (the network analysis tools 126 may search for errors in data streams to help diagnose various network issues uncovered in the test environment 100, [0034]; the test engine 218 is enabled to collect statistics (e.g., packet counts, byte counts, error counts, utilization, etc.) at least on a subset of the first plurality of ports 202 and/or a subset of the second plurality of ports 204, [0049]; the statistics collector 222 may track, for example, how many bytes have been enqueued, dequeued, dropped, etc. from a particular source, [0049]; Network performance can be evaluated so that impairments such as packet loss or erasure and delay jitter can be corrected, [0065]).
Regarding claim 5, Cantwell teaches the method as in claim 1, further comprising:
forming, by the device, an updated probing strategy by updating the probing strategy to swap between the path tracing mode and the metric collection mode to select whichever mode was not selected when forming the probing strategy for the agent (In response to determining that the test script contains additional tests (decision block 410, “yes” branch), the embedded controller 210 may determine if a different configuration image, such as an image containing a different software module, needs to be retrieved from the FPGA image storage 232 and may repeat the above-described steps 402-408 for the next test to be executed, [0071]); and
instructing, by the device, the agent to send one or more packets via the computer network according to the updated probing strategy (the embedded controller 210 sends a request to the data traffic generator module 304 to either randomly generate or specifically generate flows of information (i.e., traffic) based on the test script being executed, [0068]; the embedded controller 210 may change to “smart” or “stats” mode of operations, if needed, search the FPGA image storage 232 to find a configuration image associated with the requested test script and/or to find an image that satisfies the requested test criteria, [0067]).
Regarding claim 7, Cantwell does not explicitly disclose the method as in claim 5, wherein the device forms the updated probing strategy in response to the network telemetry indicating a network configuration change for a host of the agent.
Arama teaches wherein the device forms an updated probing strategy in response to the network telemetry indicating a network configuration change for a host of an agent (The progressive testing module 504 may also alter and/or vary the progression, sequence, and combination of filters in a filter profile being used for testing, depending on performance feedback. For example, if an electronic device 108, 110, 112 fails a lightweight packet drop test, then the progressive testing module 504 might skip burst and jitter simulations. If the electronic device 108, 110, 112 performs well during a packet dropping simulation, however, the progressive testing module 504 might continue the packet dropping simulation and add a packet reordering "test" or simulation on top of it, column 7 lines 44-55; gauges the ability of the electronic device to make efficient use of available network conditions. The evaluation also allows modification of the current network condition simulation(s) and network connectivity simulation(s) or creation of a new test, column 11 lines 66-67 – column 12 lines 1-4).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to alter testing based on performance feedback in the system/method of Cantwell as suggested by Arama to improve on-the-fly network and traffic analysis. One would be motivated to combine these teachings because using real-time network conditions to adaptively switch and modify testing modes would enable efficient adjustments to better simulate and evaluate realistic network conditions.
Regarding claim 8, Cantwell does not explicitly disclose the method as in claim 5, wherein the device forms the updated probing strategy based a change in one or more path performance metrics collected by the agent using the one or more packets.
Arama teaches wherein the device forms an updated probing strategy based a change in one or more path performance metrics collected by an agent using one or more packets (The executive test engine 216 in turn may modify a simulation based on the information, which can be sensed again by the performance sniffers 708, in a continuing cycle. The failure detector 712 can point out when an electronic device 108, 110, 112 is not responding properly and/or when a particular network simulation is no longer worth pursuing, column 10 lines 22-27; At block 1010, the data traffic is evaluated during the application of the filters. At block 1012, if the test criteria are fulfilled, then the method 1000 branches to block 1014 and ends. At block 1012, if the test criteria are not fulfilled, then the method 1000 branches to block 1016 and the method 1000 repeats with a reselection of filters based on the evaluation, column 12 lines 19-27).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to alter testing based on performance feedback in the system/method of Cantwell as suggested by Arama to improve on-the-fly network and traffic analysis. One would be motivated to combine these teachings because using real-time network conditions to adaptively switch and modify testing modes would enable efficient adjustments to better simulate and evaluate realistic network conditions.
Regarding claim 10, Cantwell teaches the method as in claim 1, wherein the agent is hosted by an endpoint in the computer network (consolidate the above described automated test tools 122-128 into the physical layer switch device 120, [0035]).
The apparatus of claim 11 and the tangible, non-transitory, computer readable medium of claim 20 comprise limitations equivalent to those of method claim 1, and therefore are rejected in view of the same rationale.
The apparatus of claim 13 comprises limitations equivalent to those of method claim 3, and therefore is rejected in view of the same rationale.
The apparatus of claim 14 comprises limitations equivalent to those of method claim 4, and therefore is rejected in view of the same rationale.
The apparatus of claim 15 comprises limitations equivalent to those of method claim 5, and therefore is rejected in view of the same rationale.
The apparatus of claim 17 comprises limitations equivalent to those of method claim 7, and therefore is rejected in view of the same rationale.
The apparatus of claim 18 comprises limitations equivalent to those of method claim 8, and therefore is rejected in view of the same rationale.
4. Claims 2, 9, 12, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Cantwell-Arama in view of Cao et al. (US 2003/0161265).
Regarding claim 2, Cantwell teaches the method as in claim 1, wherein the device selects the path tracing mode and the one or more packets comprise path tracing packets to identify trace information along a path in the computer network (test packets generated by the data traffic generator 304 described below with packet trace information contained in one or more log files to determine if the network 101 performed consistently with the applicable protocols and performance requirements, [0060]; produce complete “end to end” packet trace information along with user-defined performance metrics. Packet trace information may be used to reconstruct transmitted voice/video streams in order to examine the behavior of the network 101, [0065]).
However, Cantwell-Arama do not explicitly disclose path tracing packets identifying hops along a path in the computer network.
Cao teaches wherein a device selects a path tracing mode and one or more packets comprise path tracing packets to identify hops along a path in a computer network (Processing by the intermediate node NMMs may involve writing the stored network performance conditions into the tracer packets, [0013]; When the tracer packets travel back to respective end devices, respective end device NMMs operating therein may decipher the information accumulated in the respective tracer packets, [0014]; Each of the segments may be used to store network service information, or network condition information, supplied by the intermediate node NMMs 32 and the gateway NMM 34, respectively, as the tracer packet travels through the first heterogeneous network 14, [0075]; upon initiation of network service probing, the outgoing datastream including the tracer packet travels over the first heterogeneous network 14 through at least one intermediate node 20. Typically, the datastream will make several hops between intermediate nodes 20 prior to reaching the gateway 22, [0099]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to accumulate information from intermediate nodes along a path in the system/method of Cantwell-Arama as suggested by Cao in order to determine network conditions experienced by each hop of a probe tracer packet. One would be motivated to combine these teachings to efficiently identify where on a network certain performance errors or conditions are occurring.
Regarding claim 9, Cantwell-Arama do not explicitly disclose the method as in claim 5, wherein the device forms the updated probing strategy based on an indication that a path via which the agent sent the one or more packets has changed.
Cao teaches wherein a device forms an updated probing strategy based on an indication that a path via which the agent sent one or more packets has changed (The intermediate node NMM 32 operating on the base station may inform a probing end device 18 of the over-crowded condition via network service information, [0057]; the traffic Monitoring component 54 monitors for sudden changes in network traffic characteristics. Sudden changes may include, for example, sudden decrease in bandwidth, increases in transmission delay or any other operational characteristics of the network architecture 12. Upon identification of sudden changes, the traffic Monitoring component 54 may notify the Event Generator component 66, which will generate an event. The event may enable the probing Trigger component 64 to trigger generation of at least one tracer packet by the packet Generator component 62. In one embodiment, the nature of the event may be used to determine the number of tracer packets generated and deployed, [0087]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to analyze data collected from intermediate nodes in the system/method of Cantwell-Arama as suggested by Cao in order to determine network conditions experienced by different hops along a path. One would be motivated to combine these teachings to more efficiently identify where on a network certain performance errors or conditions are occurring.
The apparatus of claim 12 comprises limitations equivalent to those of method claim 2, and therefore is rejected in view of the same rationale.
The apparatus of claim 19 comprises limitations equivalent to those of method claim 9, and therefore is rejected in view of the same rationale.
5. Claims 6 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Cantwell-Arama in view of Buriano et al. (US 2018/0351829).
Regarding claim 6, Cantwell-Arama do not explicitly disclose the method as in claim 5, wherein the device forms the updated probing strategy based in part on an amount of time that has elapsed since it formed the probing strategy.
Buriano teaches wherein a device forms an updated probing strategy based in part on an amount of time that has elapsed since it formed a probing strategy (The test scheduler unit 153 assigns to each agent unit 130(i) the typologies of the tests to be carried out, and the number of test repetitions to be carried out during corresponding time slots of a test time period. For example, the test time period may correspond to a day, [0115]; configure the agent units 130(i) by generating scripts which the agent units 130(i) are capable of automatically running starting from a predefined starting time and for a predefined duration, [0124]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to predefine testing times and durations in the system/method of Cantwell-Arama as suggested by Buriano in order to schedule and order network monitoring tests configured to be carried out by agents. One would be motivated to combine these teachings to efficiently run multiple network testing scripts in an organized sequence to emulate user behavior.
The apparatus of claim 16 comprises limitations equivalent to those of method claim 6, and therefore is rejected in view of the same rationale.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Nader et al. US 7,342,897 – transmitting tasks to a plurality of task types for probing a network.
Lloyd et al. US 7,363,367 – obtaining statistics to analyze delay jitter, and loss of a network path.
Scholte US 7,768,930 – requesting one of a plurality of tests to be performed on a packet switching network.
Prakash US 2006/0083168 – selecting a first or second mode to monitor packets and determine jitter.
Topaltzas et al. US 2009/0124250 – user selection of one of a plurality of communication tests to determine network throughput, error rate, latency, jitter, and others.
Park et al. US 2014/0119221 – setting different test conditions for test packets to measure delay, jitter, and throughput.
Fablet et al. US 2016/0028646 – requesting a server to enter a probing mode to push data in order to estimate network bandwidth.
Regev et al. US 2016/0134864 – instructions to switch between network monitoring modes.
Ye et al. US 2021/0306881 – user flexibility to invoke simulation modes for monitoring network performance.
Filsfils et al. US 2022/0173992 – optimizing network path tracing and delay measurement techniques.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MADHU WOOLCOCK whose telephone number is (571)270-3629. The examiner can normally be reached Tuesday, Thursday 9-6 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chris Parry can be reached at 571-272-8328. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
MADHU WOOLCOCK
Examiner
Art Unit 2451
/MADHU WOOLCOCK/Primary Examiner, Art Unit 2451