Prosecution Insights
Last updated: April 19, 2026
Application No. 18/527,490

3D OBJECT RECOGNITION USING 3D CONVOLUTIONAL NEURAL NETWORK WITH DEPTH BASED MULTI-SCALE FILTERS

Non-Final OA: §101, §103, §112, §DP
Filed
Dec 04, 2023
Examiner
VARNDELL, ROSS E
Art Unit
2674
Tech Center
2600 — Communications
Assignee
Intel Corporation
OA Round
1 (Non-Final)
85%
Grant Probability
Favorable
1-2
OA Rounds
2y 4m
To Grant
98%
With Interview

Examiner Intelligence

Grants 85% — above average
85%
Career Allow Rate
520 granted / 615 resolved
+22.6% vs TC avg
+13.0%
Interview Lift
Moderate lift; based on resolved cases with interview
Typical timeline
2y 4m
Avg Prosecution
28 currently pending
Career history
643
Total Applications
across all art units

Statute-Specific Performance

§101
6.3%
-33.7% vs TC avg
§103
66.9%
+26.9% vs TC avg
§102
6.4%
-33.6% vs TC avg
§112
10.7%
-29.3% vs TC avg
Black line = Tech Center average estimate • Based on career data from 615 resolved cases

Office Action

§101 §103 §112 §DP
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 32, 39, and 45 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claims 32, 39, and 45 recite concatenating "the first feature vector and the second feature map" (emphasis added). The prior limitations generate "a second feature vector from the second feature map," so the concatenation step should reference "the second feature vector," not "the second feature map." This appears to be a drafting inconsistency: the claim generates a second feature vector for this purpose but then concatenates a feature map. Because it is unclear whether the concatenation step is intended to operate on the second feature vector or the second feature map, which are distinct claim elements, the claim is not reasonably certain in its scope.
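As an aside on the §112(b) issue above: a feature map and a feature vector have different shapes, so a concatenation step that names the wrong one is genuinely ambiguous, not merely cosmetic. The NumPy sketch below illustrates this with hypothetical shapes (the array sizes and the pooling reduction are illustrative assumptions, not taken from the application).

```python
import numpy as np

# Hypothetical shapes for illustration only.
second_feature_map = np.zeros((8, 8, 16))   # H x W x C output of a conv layer
first_feature_vector = np.zeros(128)        # 1-D vector, e.g. after pooling + flatten

# "Generating a second feature vector from the second feature map"
# (here via global average pooling, one common reduction):
second_feature_vector = second_feature_map.mean(axis=(0, 1))   # shape (16,)

# Concatenating vector with vector is well defined:
ok = np.concatenate([first_feature_vector, second_feature_vector])  # shape (144,)

# Concatenating a 1-D vector with a 3-D map, as the claim literally recites,
# is not: np.concatenate requires operands with the same number of dimensions.
try:
    np.concatenate([first_feature_vector, second_feature_map])
    mismatch = False
except ValueError:
    mismatch = True  # the shape ambiguity the rejection identifies
```

In other words, the claim as written names an operation whose operands are dimensionally incompatible unless the reader silently substitutes "second feature vector," which is the indefiniteness point.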
Double Patenting

The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

Claims 26-45 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-21 of U.S. Patent No. 11,880,770. Although the claims at issue are not identical, they are not patentably distinct from each other because the independent claims of the instant application are broader versions of the patented claims.
The omitted limitations required by the patented claims – 3D filters, applied to a 3D image segment, that the filters comprise 3D cells of the same spatial size with the first filter comprising more cells than the second, and that the second feature map bypasses the second convolutional layer – are such that the embodiment encompassed by the patented claims would fall within the scope of the instant application's claims, and the broader claims of the instant application would have been obvious over the narrower patented claims in view of the common disclosure.

Claim Rejections - 35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 26-45 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. The claims recite mathematical operations of applying convolutional filter matrices of different sizes to an input array, generating feature map arrays, routing feature maps through or bypassing additional layers, and concatenating feature vectors to generate an output prediction. These operations constitute mathematical concepts under Alice Corp. v. CLS Bank Int'l, 573 U.S. 208 (2014), and Mayo Collaborative Servs. v. Prometheus Labs., Inc., 566 U.S. 66 (2012). This judicial exception is not integrated into a practical application because, under the framework of Recentive Analytics, Inc. v. Fox Corp., No. 2022-2027 (Fed. Cir. 2024), defining a novel neural network architecture or training approach that improves ML performance, without more, does not constitute a practical application or technical improvement sufficient to confer eligibility.
The claim(s) does/do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements (processor, CRM, convolutional layers, pooling) are well-understood, routine, and conventional components that do not add an inventive concept beyond the abstract idea.

Step 1 - Statutory Category

Claims 26-32: Process. Claims 33-39: Machine (CRM). Claims 40-45: Machine. All claims satisfy Step 1.

Step 2A, Prong 1 - Judicial Exception

Independent claims 26, 33, and 40 all recite:

- providing a 3D input to parallel convolutional layers with filters of different spatial sizes;
- generating feature maps from each filter;
- routing larger-filter feature maps through additional convolutional layers, while smaller-filter feature maps bypass those layers;
- concatenating the resulting feature vectors at a fully connected layer; and
- generating an object recognition output (prediction).

The core operations are mathematical in nature – applying weighted filter matrices to input arrays, generating feature map arrays, and concatenating vectors. These are mathematical concepts (mathematical operations/relationships). Under Alice/Mayo Step 2A Prong 1, the claims recite an abstract idea. Under Recentive Analytics, Inc. v. Fox Corp. (Fed. Cir. 2024), training or applying a neural network, even with specific architectural choices, can constitute an abstract mathematical concept when the claims do not recite a specific technological improvement beyond the ML operations themselves.

Step 2A Prong 1: YES, the claims recite an abstract idea (mathematical operations/mathematical relationships comprising CNN filter application and feature map generation).

Step 2A, Prong 2 - Practical Application

This is where the claims have the most substance. The key question is: does the specific multi-scale, depth-aware filter architecture with bypass routing constitute a technical improvement to computer/CNN technology, or merely an application of mathematical operations?
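The recited data flow can be sketched in a few lines of NumPy. This is a toy 2-D, single-channel model for brevity (the claims recite 3-D inputs and filters); all sizes, weights, and names here are illustrative assumptions, not the application's actual architecture.

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive single-channel 2-D 'valid' convolution, enough for illustration."""
    kh, kw = k.shape
    H, W = x.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 16))        # stand-in for the 3D input segment

f1 = rng.standard_normal((5, 5))         # first filter: larger spatial size
f2 = rng.standard_normal((3, 3))         # second filter: smaller spatial size

m1 = conv2d_valid(x, f1)                 # first feature map  (12 x 12)
m2 = conv2d_valid(x, f2)                 # second feature map (14 x 14)

# Larger-filter path passes through an additional convolutional layer...
f3 = rng.standard_normal((3, 3))
m3 = conv2d_valid(m1, f3)                # third feature map (10 x 10)

# ...while the smaller-filter map bypasses it. Both are flattened to
# feature vectors and concatenated ahead of a fully connected layer.
features = np.concatenate([m3.ravel(), m2.ravel()])

W_fc = rng.standard_normal((10, features.size))
logits = W_fc @ features                 # object-recognition prediction scores
pred = int(np.argmax(logits))
```

The sketch makes the Prong 2 tension concrete: every step is an array operation (filtering, flattening, concatenation, matrix multiply), which is why the examiner characterizes the independent claims as mathematical absent a physical anchoring limitation.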
The independent claims (26, 33, 40) do not recite the fixed physical cell size – that appears only in the specification and in the context of dependent claims 31, 38, and 44 (cells of "the same spatial size," but not a fixed physical dimension). The independent claims are written at a high level of architectural generality: the "different spatial size" for filters is a mathematical relationship, not a physical constraint. Under Recentive Analytics, defining a novel CNN architecture to improve ML performance, without more, may still be abstract. On the other hand, the claims recite a specific technical implementation – parallel multi-scale filter paths with bypass routing – that the specification characterizes as a technical improvement to CNN performance. This is closer to the eligible claims in PTO Example 47 (specific technical implementation improving computer functioning) than to the ineligible "train generic ML on domain data" pattern. However, because the independent claims do not recite the fixed physical spatial cell size that most strongly anchors the invention in physical 3D space, the independent claims present a meaningful § 101 rejection risk under Step 2A Prong 2.

Step 2A Prong 2: NO for claims 26, 33, 40. The claims may be more defensible than a generic Recentive-style rejection, but the absence of the physical cell-size constraint in the independent claims weakens the Prong 2 argument.

Step 2B - Inventive Concept

The bypass routing architecture (small-filter feature maps skipping final convolutional layers) is not disclosed in the specification as well-understood, routine, or conventional; the specification presents it as a novel architectural contribution. Standard convolutional layers, pooling, and FC layers are individually conventional (WURC), but the specific combination – parallel filters of different spatial sizes with differential bypass paths and concatenation – may not be.
Step 2B: NO, given that multi-scale CNN architectures (e.g., inception modules) were known. Evidence of WURC supports a Step 2B rejection.

Claims | Result | Analysis
26, 33, 40 (independent) | Ineligible | See above
27, 34, 41 | Ineligible | Adds "different spatial size"/"same spatial size" relationship – mathematical
28, 35 | Ineligible | Adds "one or more other convolutional layers" – conventional layer type
29, 36, 42 | Ineligible | Adds third filter of different spatial size – mathematical
30, 37, 43 | Ineligible | Adds fourth feature map pathway
31, 38, 44 | Ineligible | Adds "plurality of cells having the same spatial size" – begins to recite a physical spatial constraint; eligible if the physical cell size (¶ [0028]) is added
32, 39, 45 | Ineligible | Concatenation step

Note: Applicant may overcome this rejection by amending the independent claims to recite the fixed physical spatial cell size (e.g., cells of a fixed physical resolution such as (0.1 m)³, per ¶ [0028]), which would more strongly anchor the mathematical operations to a specific physical 3D measurement framework and distinguish the claims from generic ML architectural claims.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art.
2.
Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Claims 26, 29-33, 36-40, and 42-45 are rejected under 35 U.S.C. 103 as being unpatentable over Szegedy et al. (Going Deeper with Convolutions – hereinafter "Szegedy" or "Inception" or "GoogLeNet") in view of Carreira et al. (US 2020/0125852 A1 – hereinafter "Carreira" or "DeepMind").

Claims 26, 33, and 40. An apparatus, comprising: a computer processor for executing computer program instructions; and one or more non-transitory computer-readable media storing computer program instructions executable by the computer processor to perform operations (Szegedy p. 6 "Although we used a CPU based implementation only, a rough estimate suggests that the GoogLeNet network could be trained to convergence using few high-end GPUs") comprising: providing an input to a first convolutional layer in a neural network, the first convolutional layer having a first filter (Szegedy p. 3 "current incarnations of the Inception architecture are restricted to filter sizes, and 1x1, 3x3, 5x5"; p. 4, Fig.
2(a) shows parallel 1x1, 3x3, and 5x5 convolutions), generating, in the first convolutional layer, a first feature map from the input and the first filter (Szegedy p. 3 "the suggested architecture is a combination of all those layers with their output filter banks concatenated into a single output vector"; p. 4, Fig. 2(a) shows each parallel convolution branch produces its own output feeding into "Filter concatenation." Szegedy does not use the term "feature map"; the paper uses the terms "output filter banks" and "output vector." The term "feature map" is standard art-recognized terminology for the output of a convolution layer, and the specification ¶21 defines feature maps as outputs of convolution filtering and pooling, which is what Szegedy's Inception modules produce.), providing the input to a second convolutional layer in the neural network, the second convolutional layer having a second filter, wherein the second filter has a different spatial size from the first filter (Szegedy p. 3 "current incarnations of the Inception architecture are restricted to filter sizes, and 1x1, 3x3, 5x5" – the 3x3 and 5x5 branches both receive the same input and have different spatial sizes from each other; p. 4, Fig. 2(a) shows each parallel convolution branch receives the same "Previous layer" input in parallel), generating, in the second convolutional layer, a second feature map from the input and the second filter (Szegedy, same as above – p. 4, Fig. 2(a) shows each parallel convolution branch generates its own output map at a different spatial size), generating, in one or more other layers arranged after the first convolutional layer in the neural network, a third feature map from the first feature map (Szegedy pp.
3-4: larger-filter outputs pass through subsequent Inception modules – "As these 'Inception modules' are stacked on top of each other, their output correlation statistics are bound to vary." The output of one Inception module feeds as the input to the next; Table 1: inception (3a) output feeds inception (3b), which feeds inception (4a), etc. – multiple layers arranged in sequence, each generating feature maps from the prior.), and generating an output of the neural network based on the third feature map and the second feature map, wherein the output of the neural network comprises a prediction made by the neural network from the input (Szegedy p. 4 "the suggested architecture is a combination of all those layers with their output filter banks concatenated into a single output vector forming the input of the next stage." Table 1: avg pool → dropout (40%) → linear → softmax with output size 1x1x1000 – this is the prediction output; p. 6 "The ILSVRC 2014 classification challenge involves the task of classifying the image into one of 1000 leaf-node categories"). Szegedy discloses all of the subject matter as described above except for specifically teaching "feature maps." However, Carreira, in the same field of endeavor, teaches feature maps (¶37 "feature maps from the upper layers of the 3D convolutional neural networks 110, 120 may be combined and provided to one or more further convolutional neural network layers implemented by combiner 130."). Therefore, it would have been obvious to one of ordinary skill in the art to combine Szegedy and Carreira before the effective filing date of the claimed invention.
The motivation for this combination of references would have been to apply the multi-scale parallel convolution architecture of Szegedy to 3D spatio-temporal data as taught by Carreira, since both references originate from Google and address convolutional neural network architectures for feature extraction, and Carreira demonstrates that 3D CNNs generate feature maps that can be combined in the same manner as their 2D counterparts.

Claims 29, 36, and 42. The combination of Szegedy and Carreira discloses the method of claim 26, further comprising: providing the input to a third convolutional layer in the neural network, the third convolutional layer having a third filter, wherein the third filter has a different spatial size from the first filter and different from the second filter; and generating, in the third convolutional layer, a third feature map from the input and the third filter (Szegedy p. 3: filter sizes 1x1, 3x3, and 5x5 produce feature maps of different spatial sizes).

Claims 30, 37, and 43. The combination of Szegedy and Carreira discloses the apparatus of claim 42, wherein the operations further comprise: generating, in one or more additional layers arranged after the third convolutional layer in the neural network, a fourth feature map from the third feature map, wherein the output of the neural network is generated further based on the fourth feature map (Szegedy Table 1: the output of inception (3a) contains three filter-size branches and feeds inception (3b), which in turn generates new feature maps, all of which ultimately contribute to the final softmax output. P. 4: "the next stage can abstract features from the different scales simultaneously" – each subsequent stage generates new feature maps from prior stage outputs, all contributing to the final prediction).

Claims 31, 38, and 44.
The combination of Szegedy and Carreira discloses the apparatus of claim 40, wherein the first filter and the second filter comprise a plurality of cells having the same spatial size, and the first filter comprises more cells than the second filter (Szegedy p. 3: filters of size 3x3 and 5x5 can be characterized as having cells (pixels) of the same spatial size, with the 5x5 filter having more cells (25) than the 3x3 filter (9). This is a geometric fact about filter definitions – a 5x5 filter has more cells of the same unit pixel size than a 3x3 filter.)

Claims 32, 39, and 45. The combination of Szegedy and Carreira discloses the apparatus of claim 40, wherein generating the output of the neural network comprises: generating a first feature vector from the third feature map; generating a second feature vector from the second feature map; and concatenating the first feature vector and the second feature map (Szegedy p. 4 "the suggested architecture is a combination of all those layers with their output filter banks concatenated into a single output vector forming the input of the next stage."; Fig. 2(a) "filter concatenation" block receives all parallel branch outputs and concatenates them; p. 5, Table 1 shows "avg pool → linear" receiving the combined feature representation).

Claims 27-28, 34-35, and 41 are rejected under 35 U.S.C. 103 as being unpatentable over Szegedy in view of Carreira, further in view of Xu et al. (US 2019/0130573 A1 – hereinafter "Xu").

Claims 27, 34, and 41. The combination of Szegedy and Carreira discloses the apparatus of claim 40, wherein the first feature map has a different spatial size from the second feature map, and (Szegedy p. 3: filter sizes 1x1, 3x3, and 5x5 produce feature maps of different spatial sizes – the 3x3 and 5x5 branches both receive the same input and have different spatial sizes from each other. In Szegedy, the "Filter concatenation" requires all branches to have the same spatial dimensions;
Szegedy achieves this through padding and stride choices, but not through an explicit teaching.). Szegedy discloses all of the subject matter as described above except for specifically teaching "the third feature map has a same spatial size as the second feature map." However, Xu, in the same field of endeavor, teaches the third feature map has a same spatial size as the second feature map (Xu ¶63 "Padding, which is the process of adding a border of '0's' to the activation map before being forwarded to the next convolution layer (and application of a different filter) can be performed to achieve the same size output image as input image. Padding is typically performed in CNNs to ensure that the image size does not shrink over every convolutional layer."). Therefore, it would have been obvious to one of ordinary skill in the art to combine Szegedy and Xu before the effective filing date of the claimed invention. The motivation for this combination of references would have been to apply the well-known padding technique of Xu to the parallel branches of Szegedy's Inception module to ensure the feature maps have the same spatial dimensions prior to concatenation, as padding for dimension preservation was a routine practice in convolutional neural networks at the time of filing.

Claims 28 and 35. The combination of Szegedy and Carreira discloses the method of claim 27, wherein the one or more other layers comprise one or more other convolutional layers (Szegedy Table 1: between inception (3a) and the fully connected layers there are inception (3b), (4a), (4b), (4c), (4d), (4e), (5a), and (5b) – all convolutional layers).

Conclusion

The prior art made of record and not relied upon, considered pertinent to applicant's disclosure, is listed on the PTO-892 form. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Ross Varndell, whose telephone number is (571) 270-1922. The examiner can normally be reached M-F, 9-5 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, O'Neal Mistry, can be reached at (313) 446-4912. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/Ross Varndell/
Primary Examiner, Art Unit 2674
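One technical note on the Xu combination in the Office Action above: the examiner's padding rationale is easy to verify. "Same" zero-padding, as Xu describes, keeps each branch's output at the input's spatial size, which is what makes concatenating parallel branch outputs well defined. The NumPy sketch below illustrates this with a toy 2-D, single-channel model; all sizes and names are illustrative assumptions.

```python
import numpy as np

def conv2d_same(x, k):
    """'Same' convolution: zero-pad so the output matches the input size
    (assumes odd-sized kernels, as in the 3x3 and 5x5 branches)."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))   # border of 0's, per Xu ¶63
    H, W = x.shape
    out = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 8))

b3 = conv2d_same(x, rng.standard_normal((3, 3)))   # 3x3 branch -> 8x8
b5 = conv2d_same(x, rng.standard_normal((5, 5)))   # 5x5 branch -> 8x8

# Equal spatial sizes allow Inception-style filter concatenation
# along a channel axis:
stacked = np.stack([b3, b5], axis=-1)              # 8 x 8 x 2
```

Without padding, the 3x3 and 5x5 branches would produce 6x6 and 4x4 outputs from the same 8x8 input and could not be stacked, which is the dimension-preservation point the rejection attributes to Xu.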

Prosecution Timeline

Dec 04, 2023
Application Filed
Mar 19, 2026
Non-Final Rejection — §101, §103, §112, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603810
System and Method for Communications Beam Recovery
2y 5m to grant Granted Apr 14, 2026
Patent 12597238
AUTOMATIC IMAGE VARIETY SIMULATION FOR IMPROVED DEEP LEARNING PERFORMANCE
2y 5m to grant Granted Apr 07, 2026
Patent 12582348
DEVICE AND METHOD FOR INSPECTING A HAIR SAMPLE
2y 5m to grant Granted Mar 24, 2026
Patent 12579441
SYSTEMS AND METHODS FOR IMAGE RECONSTRUCTION
2y 5m to grant Granted Mar 17, 2026
Patent 12579786
SYSTEM AND METHOD FOR PROPERTY TYPICALITY DETERMINATION
2y 5m to grant Granted Mar 17, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
85%
Grant Probability
98%
With Interview (+13.0%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 615 resolved cases by this examiner. Grant probability derived from career allow rate.
