Prosecution Insights
Last updated: April 19, 2026
Application No. 18/690,855

NETWORK MANAGEMENT APPARATUS, NETWORK MANAGEMENT METHOD, AND VIDEO IMAGE DISTRIBUTION SYSTEM

Non-Final OA: §101, §102, §103
Filed: Mar 11, 2024
Examiner: BROUGHTON, KATHLEEN M
Art Unit: 2661
Tech Center: 2600 — Communications
Assignee: NEC Corporation
OA Round: 1 (Non-Final)
Grant Probability: 83% (Favorable)
OA Rounds: 1-2
To Grant: 2y 7m
With Interview: 92%

Examiner Intelligence

Career Allow Rate: 83% (219 granted / 263 resolved; +21.3% vs TC avg; above average)
Interview Lift: +8.3% (moderate lift, based on resolved cases with interview)
Typical Timeline: 2y 7m avg prosecution (34 currently pending)
Career History: 297 total applications across all art units
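The headline figures above can be reproduced from the raw counts. A minimal sketch: the 219/263 career totals and the +8.3-point lift are taken from this page, while the arithmetic combining them is illustrative only.

```python
# Career totals shown above: 219 granted out of 263 resolved cases.
granted, resolved = 219, 263
career_allow_rate = granted / resolved               # ~0.833, displayed as "83%"

# The page reports a +8.3-point lift for cases with an examiner interview.
interview_lift = 0.083
with_interview = career_allow_rate + interview_lift  # ~0.916, displayed as "92%"

print(f"Career allow rate: {career_allow_rate:.0%}")
print(f"With interview:    {with_interview:.0%}")
```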

Statute-Specific Performance

§101: 10.9% (-29.1% vs TC avg)
§102: 24.1% (-15.9% vs TC avg)
§103: 51.2% (+11.2% vs TC avg)
§112: 11.4% (-28.6% vs TC avg)
Compared against Tech Center average estimates • Based on career data from 263 resolved cases

Office Action

§101 §102 §103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

A Preliminary Amendment was made 03/11/2024 to amend the specification, drawings, and claims. The claim amendments include claims 1-4, 6, 12-16, 18-20 of pending claims 1-20.

Information Disclosure Statement

The information disclosure statement (IDS) submitted on March is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement has been considered by the examiner.

Claim Objections

Claims 1-4, 6, 13, 16, 19-20 are objected to because of the following informalities: after the phrase “execute the instructions to,” the amended punctuation is either missing or appears as a semicolon, when it should consistently be a colon. Please verify and update the punctuation as needed. Appropriate correction is required.

Applicant is advised that should claims 1-6 be found allowable, claims 13-17, 20 will be objected to under 37 CFR 1.75 as being substantial duplicates thereof. Claims 13-17, 20 are duplicates of claims 1-6, with the only variation in the recited preamble (claims 1-6 each recite “A (or The) network management apparatus,” whereas claims 13-17, 20 recite “A (or The) video image distribution system”). No weight is given to the preamble in this situation that would distinguish the claims. When two claims in an application are duplicates, or else are so close in content that they both cover the same thing despite a slight difference in wording, it is proper after allowing one claim to object to the other as being a substantial duplicate of the allowed claim. See MPEP § 608.01(m).

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination.
– An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;

(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and

(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) use a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function, and the generic placeholder is not preceded by a structural modifier. Such claim limitations are, in claim 7:

“acquisition step”: performed by the acquisition unit 111 of Fig. 1, with method S301 of Fig. 2, described to correspond to a “quality storage unit” 101 of Fig. 8, and detailed in the specification at ¶ [0014], [0021], [0045]-[0049];

“first calculation step”: performed by the first calculation unit 112 of Fig. 1, with method S302 of Fig. 2, described to correspond to calculation units 102, 103, 104 of Fig. 8, and detailed in the specification at ¶ [0015], [0022], [0045], [0047], [0050]-[0054];

“second calculation step”: performed by the second calculation unit 113 of Fig. 1, with method S303 of Fig. 2, described to correspond to calculation units 105, 106 of Fig. 8, and detailed in the specification at ¶ [0016], [0021], [0046], [0056]-[0067].

Under 35 U.S.C. § 112(f), the broadest reasonable interpretations of these claims each incorporate particular detailed computer processing operations that are considered an improvement upon existing technological processes and are therefore statutorily eligible. See Enfish, LLC v. Microsoft Corp., 822 F.3d 1327, 1336-37, 118 USPQ2d 1684, 1689-90 (Fed. Cir. 2016) and MPEP § 2106(II).

Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, each is being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.

If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-6, 13-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
Claim 1 recites a network management apparatus comprising: at least one memory storing instructions, and at least one processor configured to execute the instructions (generic computer devices to gather and analyze data associated with abstract ideas using mathematical concepts; see MPEP § 2106.04(a)) to: acquire, in order to recognize a target object from a video image, a required quality indicating quality of the video image required by a video image distribution apparatus configured to distribute the video image (considered insignificant pre-solution data gathering activity to identify statistical requirements; see MPEP § 2106.05(g)); calculate an analysis performance for analyzing the video image based on the acquired required quality, the analysis performance enabling the target object to be recognized (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)); and calculate a parameter related to the distribution of the video image based on the calculated analysis performance (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)).

Claim 2 recites the network management apparatus according to claim 1 (as described above), wherein the at least one processor is further configured to execute the instructions to: acquire a video image quality of the video image (considered insignificant pre-solution data gathering activity to identify statistical requirements; see MPEP § 2106.05(g)), and calculate the analysis performance based on the acquired required quality and the acquired video image quality (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)).

Claim 3 recites the network management apparatus according to claim 2 (as described above), wherein the at least one processor is further configured to execute the instructions to acquire a required recognition rate indicating a recognition rate required for the video image and a required amount of delay indicating an upper limit of an amount of delay required for the video image as the required quality, and acquires a frame rate of the video image as the video image quality (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)), and the recognition rate is a rate at which the target object is recognized in at least one video image frame among a plurality of video image frames (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)).

Claim 4 recites the network management apparatus according to claim 3 (as described above), wherein the at least one processor is further configured to execute the instructions to: calculate the number of frames in the required delay indicating the number of frames of the video image frame generated within the required amount of delay based on the acquired required amount of delay and the acquired frame rate (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)), calculate a first function indicating a relationship between a recall rate and the number of frames of the video image frame when the required recognition rate is the acquired required recognition rate (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)), calculate a required recall rate indicating a recall rate required for one video image frame as the analysis performance based on the calculated number of frames in the required delay and the calculated first function (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)), and calculate a required video image bit rate indicating a video image bit rate required for the video image as the parameter based on the calculated required recall rate and a second function indicating a relationship between a recall rate and a video image bit rate (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)).

Claim 5 recites the network management apparatus according to claim 4 (as described above), wherein the second function is calculated by taking a variation in the recognition rate into account (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)).

Claim 6 recites the network management apparatus according to claim 1 (as described above), wherein the at least one processor is further configured to execute the instructions to acquire the required quality in accordance with the target object (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)).

Claims 13-17, 20 recite a video image distribution system comprising duplicate claim limitations to claims 1-6 (discussed above in the claim objections).

Claim 18 recites the video image distribution system according to claim 16 (as described above), wherein the at least one processor is further configured to execute the instructions to: encode the video image based on the calculated required video image bit rate (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)); and distribute the encoded video image encoded by the encoder through a network (considered insignificant post-solution activity that is routine and well known in the art; see MPEP § 2106.05(g)).
Claim 19 recites the video image distribution system according to claim 16 (as described above), wherein the at least one processor is further configured to execute the instructions to set a guaranteed band for the network used for the distribution of the video image based on the calculated required video image bit rate (mathematical concept to determine a set of features; see MPEP § 2106.04(a)(2)(I)).

The claimed invention is directed to an abstract idea without significantly more. The claims recite mathematical concepts as outlined above and described in MPEP § 2106.04(a)(2)(I), with select limitations directed to extra-solution activity under MPEP § 2106.05(g). MPEP § 2106.04(a)(2)(I) states:

It is important to note that a mathematical concept need not be expressed in mathematical symbols, because "[w]ords used in a claim operating on data to solve a problem can serve the same purpose as a formula." In re Grams, 888 F.2d 835, 837 and n.1, 12 USPQ2d 1824, 1826 and n.1 (Fed. Cir. 1989). See, e.g., SAP America, Inc. v. InvestPic, LLC, 898 F.3d 1161, 1163, 127 USPQ2d 1597, 1599 (Fed. Cir. 2018) (holding that claims to a ‘‘series of mathematical calculations based on selected information’’ are directed to abstract ideas); Digitech Image Techs., LLC v. Elecs. for Imaging, Inc., 758 F.3d 1344, 1350, 111 USPQ2d 1717, 1721 (Fed. Cir. 2014) (holding that claims to a ‘‘process of organizing information through mathematical correlations’’ are directed to an abstract idea); and Bancorp Servs., LLC v. Sun Life Assurance Co. of Can. (U.S.), 687 F.3d 1266, 1280, 103 USPQ2d 1425, 1434 (Fed. Cir. 2012) (identifying the concept of ‘‘managing a stable value protected life insurance policy by performing calculations and manipulating the results’’ as an abstract idea).

Therefore, if a claim limitation, under its broadest reasonable interpretation, covers performance of the limitation based on mathematical concepts but for the recitation of generic computer components, then it falls within the “Mathematical Concepts” grouping of abstract ideas. Accordingly, these claims each recite an abstract idea.

This judicial exception is not integrated into a practical application. The computer components are recited at a high level of generality (i.e., generic computer components: memory, processor, and instructions, including generic models for performing a general function of calculating quality metrics from image data, described with a high level of generality as automating a mathematical operation), such that they amount to no more than mere instructions to apply the exception using a generic computer component. Accordingly, the computer components do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea. Therefore, the aforementioned claims are directed to abstract ideas.

The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception. As discussed above with respect to integration of the abstract idea into a practical application, the additional elements of generic placeholder-related computer components (the memory storing instructions executed on a processor to perform mathematical calculations to determine mathematical relationships) amount to no more than mere instructions to apply the exception using a generic computer component. Mere instructions to apply an exception using a generic computer component cannot provide an inventive concept. The claims are not patent eligible.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-2, 6, 13-14, 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Ozawa (US 2017/0353753).

Regarding Claim 1, Ozawa teaches a network management apparatus (server 300 (computer 700) with communication interface 305 (705); Fig. 1, 3, 8 and ¶ [0086]-[0087]) comprising: at least one memory storing instructions (program stored on memory 702; Fig. 8 and ¶ [0086]), and at least one processor configured to execute the instructions (CPU 701 executing the program stored on memory 702; Fig. 8 and ¶ [0086], [0091]) to: acquire, in order to recognize a target object from a video image (video images of object 100 are captured by cameras 200A-D and the server 300 acquires the associated image data; Fig. 1 and ¶ [0029], [0072]), a required quality indicating quality of the video image required by a video image distribution apparatus configured to distribute the video image (the client apparatus 400 may contain a restriction range of the allowable viewpoint range based on the feature information of a media presentation description (MPD; ¶ [0021]), so it will serve a viewing purpose; Fig. 1, 3, 4, 5A and ¶ [0033]-[0035], [0052], [0076]-[0077]); calculate an analysis performance for analyzing the video image based on the acquired required quality, the analysis performance enabling the target object to be recognized (the client apparatus 400 analyzes the selected image segment transmitted as MPD data from apparatus 300, including the proper bit rate or resolution, which allows for viewing of the object 100; Fig. 1, 3, 4, 5A and ¶ [0033]-[0035], [0053]-[0056], [0077]); and calculate a parameter related to the distribution of the video image based on the calculated analysis performance (the image segment representation data is used to calculate adaptation rate data, representing different bit rates or resolutions of image segments based on the MPD (an additional parameter is the CPU utilization rate); Fig. 1, 3, 4, 5A and ¶ [0034], [0055]-[0056], [0078]).

Regarding Claim 2, Ozawa teaches the network management apparatus according to claim 1 (as described above), wherein the at least one processor is further configured to execute the instructions (CPU 701 executing the program stored on memory 702; Fig. 8 and ¶ [0086], [0091]) to: acquire a video image quality of the video image (the video image segment quality data, such as resolution or bit rate, is acquired as feature information of the image capture information; ¶ [0031], [0051]), and calculate the analysis performance based on the acquired required quality and the acquired video image quality (the bit rate or resolution is calculated from the MPD data to determine the segment having the proper bit rate or resolution; ¶ [0033]-[0034], [0053]).

Regarding Claim 6, Ozawa teaches the network management apparatus according to claim 1 (as described above), wherein the at least one processor is further configured to execute the instructions to acquire the required quality in accordance with the target object (the target object is the virtual viewpoint used as the analyzed MPD data; ¶ [0024], [0027]-[0028], [0033]-[0034]).
Regarding Claim 13, Ozawa teaches a video image distribution system (server 300 (computer 700) with communication interface 305 (705); Fig. 1, 3, 8 and ¶ [0086]-[0087]) comprising claim limitations identical to claim 1 (as discussed above).

Regarding Claim 14, Ozawa teaches the video image distribution system according to claim 13 (as described above), wherein the further limitations claimed are identical to claim 2 (as discussed above).

Regarding Claim 20, Ozawa teaches the video image distribution system according to claim 13 (as described above), wherein the further limitations claimed are identical to claim 6 (as discussed above).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 3-5, 15-19 are rejected under 35 U.S.C. 103 as being unpatentable over Ozawa (US 2017/0353753) in view of Otsuka et al. (JP 2021/057768, with foreign priority claimed by US 2022/0337851; US ’851 is cited below as the translation).

Regarding Claim 3, Ozawa teaches the network management apparatus according to claim 2 (as described above), including the at least one processor configured to execute the instructions (CPU 701 executing the program stored on memory 702; Fig. 8 and ¶ [0086], [0091]).
Ozawa does not teach to acquire a required recognition rate indicating a recognition rate required for the video image and a required amount of delay indicating an upper limit of an amount of delay required for the video image as the required quality, and acquires a frame rate of the video image as the video image quality, and the recognition rate is a rate at which the target object is recognized in at least one video image frame among a plurality of video image frames.

Otsuka et al. is analogous art pertinent to the technological problem addressed in this application and teaches to acquire a required recognition rate indicating a recognition rate required for the video image and a required amount of delay indicating an upper limit of an amount of delay required for the video image as the required quality (a delay of frame rate is produced from compression coding of visual data to a frame buffer for transfer and decompression to display (recognition rate) by the display device; Fig. 5-7 and ¶ [0097]-[0099]), and acquires a frame rate of the video image as the video image quality (the frame rate is based on a displayable state of the decoding-decompressing of the transferred image data; Fig. 5-7 and ¶ [0100]), and the recognition rate is a rate at which the target object is recognized in at least one video image frame among a plurality of video image frames (the display of the image data is based on the synchronization of the vertical and horizontal signals within an allowable range of the display; Fig. 5-7 and ¶ [0101]-[0102]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the current application to combine the teachings of Ozawa with those of Otsuka et al., including to acquire a required recognition rate indicating a recognition rate required for the video image and a required amount of delay indicating an upper limit of an amount of delay required for the video image as the required quality, to acquire a frame rate of the video image as the video image quality, with the recognition rate being a rate at which the target object is recognized in at least one video image frame among a plurality of video image frames. By determining the recognition processing rate of transmitted image data, a delay in transmission may be calculated to determine coding size, thereby providing a means to adapt the transfer of data and improve the efficiency of transfer and the quality of the image data displayed, as recognized by Otsuka et al. (¶ [0061], [0064]).

Regarding Claim 4, Ozawa in view of Otsuka et al. teaches the network management apparatus according to claim 3 (as described above), wherein the at least one processor is further configured to execute the instructions (Otsuka et al.: server 400 contains processors (CPU 402, GPU 404, encoder 408) and control section 412 to monitor/control data; Fig. 3, 5 and ¶ [0062]-[0066]) to: calculate the number of frames in the required delay indicating the number of frames of the video image frame generated within the required amount of delay based on the acquired required amount of delay and the acquired frame rate (Otsuka et al.: a decoding-decompression processing time can be determined, influencing the delay of data from compression to display based on the frame data size; Fig. 7 and ¶ [0098]-[0100], [0133]), calculate a first function indicating a relationship between a recall rate and the number of frames of the video image frame when the required recognition rate is the acquired required recognition rate (Otsuka et al.: a history of formation times of a predetermined number of transmitted partial images can be determined; Fig. 6, 7 and ¶ [0106], [0133]), calculate a required recall rate indicating a recall rate required for one video image frame as the analysis performance based on the calculated number of frames in the required delay and the calculated first function (Otsuka et al.: the data acquisition status specifying section 248 can acquire the formation time of the corresponding partial image from the formation time transmitted, to determine if there is an increase in elapsed processing time; Fig. 6, 7 and ¶ [0133]), and calculate a required video image bit rate indicating a video image bit rate required for the video image as the parameter based on the calculated required recall rate and a second function indicating a relationship between a recall rate and a video image bit rate (Otsuka et al.: the image compression size (bit rate; ¶ [0478]-[0481]) can be determined for the given display formation, which may change based on the processing time and transmission time; Fig. 6, 7 and ¶ [0134]-[0137]).

Regarding Claim 5, Ozawa in view of Otsuka et al. teaches the network management apparatus according to claim 4 (as described above), wherein the second function is calculated by taking a variation in the recognition rate into account (Otsuka et al.: the transfer of image data is influenced by the data size, with processing variation influencing the acquisition; Fig. 7; ¶ [0103]-[0104]).

Regarding Claim 15, Ozawa teaches the video image distribution system according to claim 14 (as described above), wherein the further limitations claimed are identical to claim 3 (as discussed above).

Regarding Claim 16, Ozawa teaches the video image distribution system according to claim 15 (as described above), wherein the further limitations claimed are identical to claim 4 (as discussed above).
Regarding Claim 17, Ozawa teaches the video image distribution system according to claim 16 (as described above), wherein the further limitations claimed are identical to claim 5 (as discussed above).

Regarding Claim 18, Ozawa in view of Otsuka et al. teaches the video image distribution system according to claim 16 (as described above), wherein the at least one processor is further configured to execute the instructions (Otsuka et al.: server 400 contains processors (CPU 402, GPU 404, encoder 408) and control section 412 to monitor/control data; Fig. 3, 5 and ¶ [0062]-[0066]) to: encode the video image based on the calculated required video image bit rate (the video encoder 408 encodes the image data (bit rate; ¶ [0478]-[0481]) based on the control section 412; Fig. 3 and ¶ [0063]-[0066]); and distribute the encoded video image encoded by the encoder through a network (the encoded image data is output from interface 416 to the image processing apparatus 200 interface 202; Fig. 3 and ¶ [0067]).

Regarding Claim 19, Ozawa in view of Otsuka et al. teaches the video image distribution system according to claim 16 (as described above), wherein the at least one processor is further configured to execute the instructions (Otsuka et al.: server 400 contains processors (CPU 402, GPU 404, encoder 408) and control section 412 to monitor/control data; Fig. 3, 5 and ¶ [0062]-[0066]) to set a guaranteed band for the network used for the distribution of the video image based on the calculated required video image bit rate (Otsuka et al.: encoding units 442, 444 use a minimum resolution (a guaranteed band based on size for distribution to maintain quality) for encoding; ¶ [0267], [0270]).

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Guo et al. (US 2020/0288143) teaches a video encoding method and system, including encoding code rate control (bit allocation and bit control, applied to an IPPP format) used to generate a compressed code stream of a first picture, where motion estimation (influencing image distribution quality) is applied to determine the bit limits transferred.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to KATHLEEN M BROUGHTON, whose telephone number is (571) 270-7380. The examiner can normally be reached Monday-Friday, 8:00-5:00.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, John Villecco, can be reached at (571) 272-7319. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/KATHLEEN M BROUGHTON/
Primary Examiner, Art Unit 2661
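The claim 4 calculation chain discussed in the rejections above (frames within a delay budget, a first function linking recognition rate to frame count and per-frame recall, then a second function from recall to bit rate) can be illustrated numerically. The sketch below assumes an independence model for the first function and a made-up monotone curve for the second; neither comes from the application or the cited art, and the specific numbers are hypothetical.

```python
import math

def frames_in_delay(delay_s: float, fps: float) -> int:
    """Number of video frames generated within the required amount of delay."""
    return max(1, math.floor(delay_s * fps))

def required_recall(recognition_rate: float, n_frames: int) -> float:
    """Per-frame recall r so the object is recognized in at least one of n
    frames: R = 1 - (1 - r)^n, hence r = 1 - (1 - R)^(1/n).
    (Assumes independent frames; an illustrative model, not the
    application's actual first function.)"""
    return 1 - (1 - recognition_rate) ** (1 / n_frames)

def required_bitrate_kbps(recall: float) -> float:
    """Hypothetical second function mapping required recall to bit rate."""
    return 500 + 4000 * recall ** 2  # made-up monotone curve

n = frames_in_delay(delay_s=0.5, fps=30)                 # 15 frames in budget
r = required_recall(recognition_rate=0.99, n_frames=n)   # ~0.264 per frame
kbps = required_bitrate_kbps(r)
```

With a 0.5 s delay budget at 30 fps, a 99% recognition rate over the window needs only about 26% per-frame recall, which is why the claimed apparatus can pick a lower distribution bit rate than a single-frame requirement would suggest.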

Prosecution Timeline

Mar 11, 2024
Application Filed
Feb 07, 2026
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602915
FEATURE FUSION FOR NEAR FIELD AND FAR FIELD IMAGES FOR VEHICLE APPLICATIONS
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12597233
SYSTEM AND METHOD FOR TRAINING A MACHINE LEARNING MODEL
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12586203
IMAGE CUTTING METHOD AND APPARATUS, COMPUTER DEVICE, AND STORAGE MEDIUM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12567227
METHOD AND SYSTEM FOR UNSUPERVISED DEEP REPRESENTATION LEARNING BASED ON IMAGE TRANSLATION
Granted Mar 03, 2026 (2y 5m to grant)
Patent 12565240
METHOD AND SYSTEM FOR GRAPH NEURAL NETWORK BASED PEDESTRIAN ACTION PREDICTION IN AUTONOMOUS DRIVING SYSTEMS
Granted Mar 03, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 83%
With Interview: 92% (+8.3%)
Median Time to Grant: 2y 7m
PTA Risk: Low
Based on 263 resolved cases by this examiner. Grant probability derived from career allow rate.
