Prosecution Insights
Last updated: April 19, 2026
Application No. 18/568,133

RECOGNITION DEVICE, TERMINAL APPARATUS, RECOGNIZER CONSTRUCTING APPARATUS, RECOGNIZER MODIFYING APPARATUS, CONSTRUCTION METHOD, AND MODIFICATION METHOD

Non-Final OA: §101, §102, §103
Filed: Dec 07, 2023
Examiner: ROSARIO, DENNIS
Art Unit: 2676
Tech Center: 2600 — Communications
Assignee: Kyocera Corporation
OA Round: 1 (Non-Final)
Grant Probability: 69% (Favorable)
OA Rounds: 1-2
To Grant: 3y 8m
With Interview: 98%

Examiner Intelligence

Career Allow Rate: 69% (above average; 385 granted / 557 resolved; +7.1% vs TC avg)
Interview Lift: +28.6% (strong), among resolved cases with interview
Avg Prosecution: 3y 8m (typical timeline); 34 applications currently pending
Career History: 591 total applications across all art units
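The headline figures above are internally consistent and can be checked from the raw counts. A minimal sketch (Python here is purely illustrative; the interview cohort sizes are not given in the report, so only the allow rate and the implied TC baseline are derived):

```python
# Career allow rate from the reported raw counts: 385 granted of 557 resolved.
granted, resolved = 385, 557
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")   # 69.1%, displayed as 69%

# The report lists this examiner at +7.1% versus the Tech Center average,
# so the implied TC-average allow rate is:
tc_avg = allow_rate - 0.071
print(f"Implied TC average: {tc_avg:.1%}")      # roughly 62.0%
```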

Statute-Specific Performance

§101: 16.5% (-23.5% vs TC avg)
§103: 40.3% (+0.3% vs TC avg)
§102: 24.6% (-15.4% vs TC avg)
§112: 13.6% (-26.4% vs TC avg)

Deltas are relative to the Tech Center average estimate • Based on career data from 557 resolved cases
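Each statute row pairs the examiner's rate with a delta against the Tech Center average, so the TC baseline can be back-computed from any row. A small sketch with the figures exactly as reported (illustrative only):

```python
# (examiner rate, delta vs TC average), in percent, as reported above.
stats = {
    "§101": (16.5, -23.5),
    "§103": (40.3, +0.3),
    "§102": (24.6, -15.4),
    "§112": (13.6, -26.4),
}

# Implied TC average per row: examiner rate minus delta.
for statute, (rate, delta) in stats.items():
    print(f"{statute}: implied TC average = {rate - delta:.1f}%")
# Every row recovers the same ~40.0% baseline, i.e. a single Tech Center
# average estimate underlies all four deltas.
```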

Office Action

§101 §102 §103
DETAILED ACTION

Summary of this action:

- Claims 14, 16, 6-13, and 15: objected to because of informalities (see Claim Objections below).
- Claims 1-16: not rejected under 35 U.S.C. 101; the claimed invention is directed to improving the computing field (streamlined analysis) in view of applicant’s disclosure of information processing at [0103].
- Claims 1 and 5: rejected under 35 U.S.C. 102(a)(1) and 102(a)(2) as anticipated by HIDETOSHI (JP 2003-279647 A), with SEARCH machine translation.
- Claim 2: rejected under 35 U.S.C. 103 over HIDETOSHI as applied in claim 1, further in view of Dai et al. (US 8,105,777 B1).
- Claim 3: rejected under 35 U.S.C. 103 over HIDETOSHI as applied in claim 1, further in view of FAUDEMAY (WO 99/40539 A1), with machine translation.
- Claim 4: rejected under 35 U.S.C. 103 over HIDETOSHI as applied in claim 1, further in view of MAMORU et al. (KR 10-2010-0100933 A), with SEARCH machine translation.
- Claims 14 and 16: rejected under 35 U.S.C. 103 over HIDETOSHI, ZHU et al. (CN 101964059 A), and JEAN-PIERRE et al. (FR 2703804 A1) as applied in claims 6, 12, and 15 below, further in view of MAMORU as applied in claim 4 above.
- Claims 6, 12, and 15: rejected under 35 U.S.C. 103 over HIDETOSHI as applied in claim 1, in view of ZHU, further in view of JEAN-PIERRE, with machine translations.
- Claim 7: rejected under 35 U.S.C. 103 over HIDETOSHI, ZHU, and JEAN-PIERRE as applied in claims 6, 12, and 15, further in view of MAMORU.
- Claim 8: rejected under 35 U.S.C. 103 over HIDETOSHI, ZHU, and JEAN-PIERRE as applied in claims 6, 12, and 15, further in view of Bequet et al. (US 2020/0133977 A1).
- Claim 9: rejected under 35 U.S.C. 103 over HIDETOSHI, ZHU, JEAN-PIERRE, and Bequet as applied in claim 8, further in view of MAMORU as applied in claim 4.
- Claims 10 and 11: rejected under 35 U.S.C. 103 over HIDETOSHI, ZHU, JEAN-PIERRE, and Bequet as applied in claim 8, further in view of MASAMITSU et al. (WO 2020/138479 A1), with machine translation.
- Claim 13: rejected under 35 U.S.C. 103 over HIDETOSHI, ZHU, and JEAN-PIERRE as applied in claims 6, 12, and 15, further in view of Brun et al. (US 2013/0230257 A1).

[image omitted: media_image1.png]

Claim Objections

Claims 14, 16, 6-13, and 15 are objected to because of the following informalities:

[image omitted: media_image2.png]

Claim 14 is objected to like claim 6. Claim 16 is objected to like claim 6. Claim 6’s last limitation is objected to for not having “plural indentations to further segregate subcombinations or related steps” (see (k) CLAIM OR CLAIMS below):

[image omitted: media_image3.png]

Thus, claims 7, 8, 9, 10, 11, 12, and 13 are objected to. Claim 15 is objected to like claim 6. Appropriate correction is required (see (k) CLAIM OR CLAIMS below).

Content of Specification

(a) TITLE OF THE INVENTION: See 37 CFR 1.72(a) and MPEP § 606. The title of the invention should be placed at the top of the first page of the specification unless the title is provided in an application data sheet. The title of the invention should be brief but technically accurate and descriptive, preferably from two to seven words. It may not contain more than 500 characters.

(b) CROSS-REFERENCES TO RELATED APPLICATIONS: See 37 CFR 1.78 and MPEP § 211 et seq.

(c) STATEMENT REGARDING FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT: See MPEP § 310.

(d) THE NAMES OF THE PARTIES TO A JOINT RESEARCH AGREEMENT: See 37 CFR 1.71(g).
(e) INCORPORATION-BY-REFERENCE OF MATERIAL SUBMITTED ON A READ-ONLY OPTICAL DISC, AS A TEXT FILE OR AN XML FILE VIA THE PATENT ELECTRONIC SYSTEM: The specification is required to include an incorporation-by-reference of electronic documents that are to become part of the permanent United States Patent and Trademark Office records in the file of a patent application. See 37 CFR 1.77(b)(5) and MPEP § 608.05. See also the Legal Framework for Patent Electronic System posted on the USPTO website (https://www.uspto.gov/sites/default/files/documents/2019LegalFrameworkPES.pdf) and MPEP § 502.05.

(f) STATEMENT REGARDING PRIOR DISCLOSURES BY THE INVENTOR OR A JOINT INVENTOR: See 35 U.S.C. 102(b) and 37 CFR 1.77.

(g) BACKGROUND OF THE INVENTION: See MPEP § 608.01(c). The specification should set forth the Background of the Invention in two parts: (1) Field of the Invention: A statement of the field of art to which the invention pertains. This statement may include a paraphrasing of the applicable U.S. patent classification definitions of the subject matter of the claimed invention. This item may also be titled “Technical Field.” (2) Description of the Related Art including information disclosed under 37 CFR 1.97 and 37 CFR 1.98: A description of the related art known to the applicant and including, if applicable, references to specific related art and problems involved in the prior art which are solved by the applicant’s invention. This item may also be titled “Background Art.”

(h) BRIEF SUMMARY OF THE INVENTION: See MPEP § 608.01(d). A brief summary or general statement of the invention as set forth in 37 CFR 1.73. The summary is separate and distinct from the abstract and is directed toward the invention rather than the disclosure as a whole. The summary may point out the advantages of the invention or how it solves problems previously existent in the prior art (and preferably indicated in the Background of the Invention). In chemical cases it should point out in general terms the utility of the invention. If possible, the nature and gist of the invention or the inventive concept should be set forth. Objects of the invention should be treated briefly and only to the extent that they contribute to an understanding of the invention.

(i) BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S): See MPEP § 608.01(f). A reference to and brief description of the drawing(s) as set forth in 37 CFR 1.74.

(j) DETAILED DESCRIPTION OF THE INVENTION: See MPEP § 608.01(g). A description of the preferred embodiment(s) of the invention as required in 37 CFR 1.71. The description should be as short and specific as is necessary to describe the invention adequately and accurately. Where elements or groups of elements, compounds, and processes, which are conventional and generally widely known in the field of the invention described, and their exact nature or type is not necessary for an understanding and use of the invention by a person skilled in the art, they should not be described in detail. However, where particularly complicated subject matter is involved or where the elements, compounds, or processes may not be commonly or widely known in the field, the specification should refer to another patent or readily available publication which adequately describes the subject matter.

(k) CLAIM OR CLAIMS: See 37 CFR 1.75 and MPEP § 608.01(m). The claim or claims must commence on a separate sheet or electronic page (37 CFR 1.52(b)(3)). Where a claim sets forth a plurality of elements or steps, each element or step of the claim should be separated by a line indentation. There may be plural indentations to further segregate subcombinations or related steps. See 37 CFR 1.75 and MPEP § 608.01(i)-(p).

(l) ABSTRACT OF THE DISCLOSURE: See 37 CFR 1.72(b) and MPEP § 608.01(b).
The abstract is a brief narrative of the disclosure as a whole, as concise as the disclosure permits, in a single paragraph preferably not exceeding 150 words, commencing on a separate sheet following the claims. In an international application which has entered the national stage (37 CFR 1.491(b)), the applicant need not submit an abstract commencing on a separate sheet if an abstract was published with the international application under PCT Article 21. The abstract that appears on the cover page of the pamphlet published by the International Bureau (IB) of the World Intellectual Property Organization (WIPO) is the abstract that will be used by the USPTO. See MPEP § 1893.03(e).

(m) SEQUENCE LISTING: See 37 CFR 1.821 - 1.825 and MPEP §§ 2421 - 2431. The requirement for a sequence listing applies to all sequences disclosed in a given application, whether the sequences are claimed or not. See MPEP § 2422.01.

I see claim 6’s (14, 15, 16’s) last limitation as this: until the temporary lowermost-layer classifier satisfies the predetermined condition, repeat determination of a certain criterion that enables classification of all target objects each determined to belong to a certain category classified by a classifier in a layer immediately above the temporary lowermost-layer classifier, determination of lower-order categories to which all the target objects respectively belong based on the certain (reference) criterion, and replacement of the temporary lowermost-layer classifier with an intermediate-layer classifier constructed based on an image of each of all the target objects belonging to the determined lower-order categories and the lower-order categories; and construction of a temporary lowermost-layer classifier that classifies all the target objects belonging to the respective categories into respective pieces of the identification information, based on the image and the identification information of each of all the target objects belonging to the 
respective categories classified by the intermediate-layer classifier.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph: (A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function; (B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and (C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. 
Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.

This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: “an acquisition unit configured to acquire” in claim 1; “an output device configured to report” in claim 5; “an acquisition unit configured to acquire” in claims 6 and 13; and “an acquisition unit configured to acquire” in claim 14. Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function (“acquire”; “report”), and equivalents thereof:

[images omitted: media_image4.png, media_image5.png]

If applicant does not intend to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitations recite sufficient structure to perform the claimed function so as to avoid them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

This application includes one or more claim limitations that use the word “means” or “step” but are nonetheless not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph because the claim limitations recite sufficient structure (“circuitry”), materials, or acts (“provide & acquire”) to entirely perform the recited function (“configured to function”; “configured to provide”). Such claim limitations are: “a controller configured to function” in claim 1; “a communication unit configured to provide…and acquire” (acts: provide and acquire) in claim 5:

[image omitted: media_image6.png]

“a controller configured to function” in claims 6, 9, 10, 11, 12, and 13; and “a controller configured to modify” in claim 14. Because these claim limitations are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are not being interpreted to cover only the corresponding structure, material, or acts described in the specification as performing the claimed function, and equivalents thereof. If applicant intends to have these limitations interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitations to remove the structure, materials, or acts that perform the claimed function; or (2) present a sufficient showing that the claim limitations do not recite sufficient structure, materials, or acts to perform the claimed function.

35 USC § 101

35 U.S.C. 101 reads as follows:

Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1-16 are not rejected under 35 U.S.C. 
101 because the claimed invention is directed to improving the computing field, not to an abstract idea without significantly more (streamlined analysis), in view of applicant’s disclosure of information processing at [0103]:

[image omitted: media_image7.png]

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. (a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1 and 5 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by HIDETOSHI (JP 2003-279647 A) with SEARCH machine translation. Claims 1 and 5 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by HIDETOSHI (JP 2003-279647 A) with SEARCH machine translation.

[image omitted: media_image8.png]

Re 1. (Original), Hidetoshi discloses A recognition device comprising: an acquisition unit (“having a radar wave”, pg. 7, last txt blk) configured to acquire an (“optical”, pg. 7, 4th txt blk) image (as for use on weather maps); and a controller (“in parallel”, pg. 12, 9th txt blk) configured to function as an object recognizer (via “A target identifying apparatus”, pg. 3, [0014]) that estimates a target object (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) appearing in the (radar-weather map) image by causing multiple classifiers hierarchized (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) in multiple layers (forming the “target/ target layer”, pg. 11, 4th txt blk) to classify the target object in order (from 1st to 3rd layers), wherein the multiple (distinguishing-classifying) classifiers comprise: an uppermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies the target object appearing in the (radar) image into any one of multiple (“classification node”, pg. 11, 4th txt blk) categories; and multiple lower-layer classifiers (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) each of which performs (node) (conceptual-)classification of a (“helicopter”-)category classified by an upper-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) into a lower-order (“reconnaissance helicopter”-)category, the lower-layer classifiers (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) comprise one or more lowermost-layer (conceptual-nodal) classifiers each of which classifies the target object (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) into the lower-order (“reconnaissance helicopter”-)category which is identification (“result”, pg. 3 [0014], last S) information of the target object, and a number (three: fig. 7) of layers from the uppermost-layer classifier to the one or more lowermost-layer classifiers is different (as shown in fig. 7) between at least two target objects (fig. 7: “Target Purpose” twice) having different (or “unique” having no equal) identification information (“to each model”, pg. 2, 5th txt blk) among (“simulated”, pg. 12, 9th txt blk) target objects (30) estimated (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) by the object recognizer (via “A target identifying apparatus”, pg. 3, [0014]) via:

[image omitted: media_image9.png]

Re 5. (Currently Amended), Hidetoshi discloses A terminal apparatus comprising: an image capturing (“storage”, pg. 9) unit (2); a communication unit (or “output” “computer”, pg. 5, 3rd txt blk) configured to provide an image generated (“17 is generated. That is…summarized”, pg. 10, 8th txt blk, via fig. 5:17: fig. 6B: integrated summarized/generated image) by the image capturing unit (2) to the recognition device according to claim 1 and acquire the identification information of a target (F-16) object appearing in the (weather-radar) image; and an output (computer) device configured to report (via transmission) the identification information (as reported):

[images omitted: media_image10.png, media_image11.png]

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows: 1. Determining the scope and contents of the prior art. 2. 
Ascertaining the differences between the prior art and the claims at issue. 3. Resolving the level of ordinary skill in the pertinent art. 4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over HIDETOSHI (JP 2003-279647 A) with SEARCH machine translation as applied in claim 1, further in view of Dai et al. (US 8,105,777 B1):

[image omitted: media_image12.png]

Re 2. (Original), Hidetoshi teaches The recognition device according to claim 1, wherein a number (as shown in fig. 7) of (airplane) categories (conceptually) classified by at least one or some of the multiple (conceptual) classifiers is (via looking to fig. 7) equal (via “is”) to or less than (relative to each other as shown in fig. 7) a first threshold (“predetermined”, pg. 3 [0014], 1st text blk) value. Hidetoshi does not teach “first threshold”. Dai teaches “first threshold”, c. 37, ll. 37-45. Since Hidetoshi teaches similarity, one of skill in the art of similarity can make Hidetoshi’s be as Dai’s, predictably recognizing the change “that results in the fewest misclassifications”, Dai, c. 37, ll. 37-45.

Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over HIDETOSHI (JP 2003-279647 A) with SEARCH machine translation as applied in claim 1, further in view of FAUDEMAY (WO 99/40539 A1) with machine translation:

[image omitted: media_image13.png]

Re 3. (Currently Amended), Hidetoshi teaches The recognition device according to claim 1 st txt blk: fig. 3:ST2: “Perform time series/integration processing on the monitoring records to generate unconfirmed target records”) in a classified category (via fig. 3:ST3: a classification step) of a (conceptual) feature quantity used for classification by at least one or some of the multiple (conceptual-node) classifiers is equal to or less than (relative to each other as shown in fig. 7) a second threshold (“predetermined”, pg. 3 [0014], 1st text blk) value (via:

[image omitted: media_image14.png]

Hidetoshi does not teach “second threshold”. FAUDEMAY teaches a second threshold (“with a neighboring region”, pg. 3, 8th txt blk, as shown in figure 1: the sky). Since Hidetoshi teaches similarity and airplanes, one of skill in the art of similarity and airplanes can make Hidetoshi’s be as FAUDEMAY’s (“airplane in the sky”, pg. 10, 5th txt blk), predictably recognizing the change “to segment the images into significant objects” “with significant semantic value”, pg. 3, 4th txt blk, providing significant semantic meaning (via a radar “image” “label”, FAUDEMAY, pg. 14, 9th txt blk) to different types of aircraft in radar images like a weather map:

[image omitted: media_image15.png]

Claim 4 is rejected under 35 U.S.C. 103 as being unpatentable over HIDETOSHI (JP 2003-279647 A) with SEARCH machine translation as applied in claim 1, further in view of MAMORU et al. (KR 10-2010-0100933 A) with SEARCH machine translation:

[image omitted: media_image16.png]

Re 4. (Currently Amended), Hidetoshi teaches The recognition device according to claim 1, wherein a correct answer (“improved” “identification”, pg. 2, 4th txt blk) rate (“of the characteristic values”, pg. 11, 2nd txt blk) of all target (purpose) objects (figs. 5,8: D5a,30) classified by at least one or some of the one or more lowermost-layer classifiers (i.e., six hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) is equal to or more than (relative to four upper conceptual classifications in fig. 7) a third threshold (“predetermined”, pg. 3 [0014], 1st text blk) value (via:

[images omitted: media_image17.png, media_image9.png]

Hidetoshi does not teach “correct answer…third threshold”. Mamoru teaches: correct answer (“rate from 100%”, pg. 4, 4th txt blk)…third threshold (“value of the classification probability”, pg. 16, last txt blk). Since Hidetoshi teaches an improved identification rate (represented in fig. 7 as target purpose) via classification nodes, one of skill in the art of classification can make Hidetoshi’s classification be as Mamoru’s, predictably recognizing the change “setting…a highly accurate classification result”, Mamoru, pg. 17, 5th txt blk, due to advancement of technology creating new airplanes and thus new document label-names (like F-16) of airplanes in addition to said “F15” and “B747”:

[image omitted: media_image18.png]

Claims 14 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over HIDETOSHI (JP 2003-279647 A) with SEARCH machine translation as applied in claim 1, in view of ZHU et al. (CN 101964059 A) with machine translation, further in view of JEAN-PIERRE et al. (FR 2703804 A1) with machine translation as applied in claims 6, 12, and 15 below, further in view of MAMORU et al. (KR 10-2010-0100933 A) with SEARCH machine translation as applied in claim 4 above:

[image omitted: media_image19.png]

Claim 14 is rejected like claim 6 (reproduced below) with the exception of “new… modify… new… new…new”:

Re 14. (Currently Amended), Hidetoshi of the combination of HIDETOSHI, ZHU, JEAN-PIERRE teaches A recognizer modifying apparatus comprising: An (optical-radar) acquisition unit configured to acquire at least an (optical) image and identification information (via said ID rate) of a new target (airplane) object; and a controller (in parallel) configured to modify the object recognizer (via “A target identifying apparatus”, pg. 
3, [0014]) in the recognition device according to claim 1 by using the new target (airplane) object, wherein the (parallel) controller is configured to: cause the identification information (via said ID rate) of the new target (airplane) object to be estimated (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) based on the (optical-radar) image by using the object recognizer (via “A target identifying apparatus”, pg. 3, [0014]); specify a lowermost-layer (F-15) classifier that has classified (node-clustered) the identification information (ID as a tree); replace a temporary lowermost-layer (F-15) classifier with the lowermost-layer (F-16) classifier, the temporary lowermost-layer (F-15) classifier being constructed based on the (Computer-Vision: CV) image and the identification (rate) information of each of all target (airplane) objects (in the tree) and the new target (airplane) object that are (node) classified by the lowermost-layer (F-17) classifier; fix the temporary lowermost-layer (B747) classifier as a lowermost-layer (B-747) classifier when the temporary lowermost-layer (F-15) classifier satisfies (“shape characteristics”, pg. 11, 2nd txt blk) a predetermined condition; and until the temporary lowermost-layer (F-17) classifier satisfies (“shape characteristics”, pg. 11, 2nd txt blk) the predetermined condition, repeat determination of a certain criterion (said “predetermined reference”, pg. 11, 2nd txt blk) that enables classification of all target objects each determined to belong to a certain category classified by a classifier in a layer immediately above the temporary lowermost-layer classifier, determination of (third) lower-order categories to which all the target objects respectively belong based on the certain criterion, and replacement of the temporary lowermost-layer classifier with an (second) intermediate-layer classifier constructed based on an image of each of all the target objects belonging to the determined lower-order categories and the lower-order categories; and construction of a temporary lowermost-layer (quality) classifier that classifies all the target objects belonging to the respective categories into respective pieces of the identification information, based on the image and the identification information of each of all the target objects belonging to the respective categories classified by the intermediate-layer classifier (via:

Re 6. (Original), HIDETOSHI teaches An information processing apparatus (as indicated by the information connections in fig. 7) that constructs an object recognizer (via “A target identifying apparatus”, pg. 3, [0014]) that estimates identification information (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) identifying a target object appearing in an image by causing multiple classifiers hierarchized (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) in multiple layers (forming the “target/ target layer”, pg. 11, 4th txt blk) to classify the target object in order (from 1st to 3rd layers), the recognizer constructing apparatus (as indicated by the information connections in fig. 7) comprising: an acquisition (“processor”, pg. 12, 9th txt blk) unit configured to acquire at least an image and identification information of each of multiple target objects; and a controller (“processor”, pg. 
12, 9th txt blk) configured to construct multiple classifiers (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk), based on the image and the identification (rate) information of each of the multiple (“simulated”, pg. 12, 9th txt blk) target objects (30), wherein the multiple (distinguishing-classifying) classifiers comprise: an uppermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies, based on the (optical-radar) image acquired by the acquisition unit, the (“simulated”, pg. 12, 9th txt blk) target object in the (optical-radar) image into a (“classification node”, pg. 11, 4th txt blk) category; and a lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies the (“simulated”, pg. 12, 9th txt blk) target object (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) belonging to a (“helicopter”-)category classified by an upper-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) into any piece (or parts as shown in fig. 7) of the identification (rate) information, and the controller (“processor”, pg. 12, 9th txt blk) is configured to: construct the uppermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that determines, based on an initial (“reference”, pg. 11, 2nd txt blk) criterion, categories to which the multiple (“simulated”, pg. 12, 9th txt blk) target objects (30) respectively belong, and that classifies the (“simulated”, pg. 12, 9th txt blk) target objects (30) to the determined categories; construct, based on the image and the identification (rate) information of each of all the (“simulated”, pg. 
12, 9th txt blk) target objects (30) belonging to the respective categories classified by the uppermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk), a temporary lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies all the (“simulated”, pg. 12, 9th txt blk) target objects belonging to the respective categories into respective pieces (or parts as shown in fig. 7) of the identification (rate) information; fix the temporary lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) as a lowermost-layer classifier when the temporary lowermost-layer classifier satisfies (“shape characteristics”, pg. 11, 2nd txt blk) a predetermined condition; and until the temporary lowermost-layer classifier (shape-feature-)satisfies the predetermined condition, repeat determination of a certain (said “predetermined reference”, pg. 11, 2nd txt blk) criterion that enables classification of all (airplane) target objects each determined to belong to a certain (node) category classified by a (quality) classifier in a (top) layer immediately above the temporary (third) lowermost-layer (quality) classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk), determination of (third) lower-order categories to which all the target (airplane) objects respectively belong based on the certain (reference) criterion, and replacement of the temporary lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) with an (second) intermediate-layer classifier (between the 1st and 2nd quality classifiers: fig. 
7) constructed based on an (airplane) image of each of all the target objects belonging to the determined lower-order (node) categories and the lower-order categories; and construction of a temporary lowermost-layer (quality) classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies all the target (airplane) objects (by quality-features) belonging to the respective categories into respective pieces (or parts as shown in fig. 7) of the identification (rate) information, based on the (airplane) image and the identification (rate) information of each of all the target (airplane) objects belonging to the respective (node) categories classified by the (second) intermediate-layer (quality-feature) classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk). Hidetoshi of the combination of HIDETOSHI, ZHU, JEAN-PIERRE does not teach the additional differences relative to claim 6 of: new (target object)… modify (the object recognizer)… new (target object)… new (target object)… new (target object). Mamoru teaches the additional difference of claim 14: new (“technology”, pg. 7, 4th txt blk; “terms”, pg. 13, last txt blk; “term…new group of the ‘New Technology Information’ category”, pg. 17, 5th txt blk) (target object)… modify (“By first setting a flag”, pg. 17, 5th txt blk) (the object recognizer)… new (technology) (target object)… new (category) (target object)… new (term) (target object). Similar to claim 4, since Hidetoshi teaches an improved identification rate (represented in fig. 7 as target purpose) via classification nodes, one of skill in the art of classification can make Hidetoshi’s classification be as Mamoru’s predictably recognizing the change “setting…a highly accurate classification result”, Mamoru, pg.
17, 5th txt blk, due to advancement of technology creating new airplanes and thus new document label-names (like F-16) of airplanes in addition to said “F15” and “B747”: [Image: media_image18.png] Claim 16 is rejected like claim 6 except for the additional differences of “new… modifying…new…modifying …new…new”: 16. (Currently Amended) A modification method comprising: acquiring at least an (Computer Vision) image and (improved-rate) identification information of a new target (airplane) object; and modifying the object recognizer (via “A target identifying apparatus”, pg. 3, [0014]) in the recognition device according to claim 1 by using the new target object, wherein the modifying of the (F-16) object recognizer (via “A target identifying apparatus”, pg. 3, [0014]) comprises: causing the identification (rate) information of the new target (F-16) object to be estimated (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) based on the (camera) image by using the object recognizer (via “A target identifying apparatus”, pg. 3, [0014]); specifying a lowermost-layer (F-15) classifier that has classified the identification (rate) information; replacing a temporary lowermost-layer (F-15) classifier with the lowermost-layer (F-16) classifier, the temporary lowermost-layer (F-15) classifier being constructed based on the (CV) image and the identification (rate) information of each of all target (tree) objects (fig. 7) and the new target (F-16) object that are (node) classified by the lowermost-layer (F-15) classifier; fixing the temporary lowermost-layer (quality) classifier as a lowermost-layer (F-14) classifier when the temporary lowermost-layer (F-15) classifier satisfies (“shape characteristics”, pg.
11, 2nd txt blk) a predetermined condition; and until the temporary lowermost-layer classifier satisfies the predetermined condition, repeating determination of a certain criterion that enables classification of all target objects each determined to belong to a certain category classified by a classifier in a layer immediately above the temporary lowermost-layer classifier, determination of lower-order categories to which all the target objects respectively belong based on the certain criterion, and replacement of the temporary lowermost-layer classifier with an intermediate-layer classifier constructed based on an image of each of all the target objects belonging to the determined lower-order categories and the lower-order categories; and construction of a temporary lowermost-layer classifier that classifies all the target objects belonging to the respective categories into respective pieces of the identification information, based on the image and the identification information of each of all the target objects belonging to the respective categories classified by the intermediate-layer classifier (via the rejection of claim 6, re-reproduced below: Re 6. (Original), HIDETOSHI teaches An information processing apparatus (as indicated by the information connections in fig. 7) that constructs an object recognizer (via “A target identifying apparatus”, pg. 3, [0014]) that estimates identification information (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) identifying a target object appearing in an image by causing multiple classifiers hierarchized (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) in multiple layers (forming the “target/ target layer”, pg. 11, 4th txt blk) to classify the target object in order (from 1st to 3rd layers), the recognizer constructing apparatus (as indicated by the information connections in fig. 7) comprising: an acquisition (“processor”, pg.
12, 9th txt blk) unit configured to acquire at least an image and identification information of each of multiple target objects; and a controller (“processor”, pg. 12, 9th txt blk) configured to construct multiple classifiers (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk), based on the image and the identification (rate) information of each of the multiple (“simulated”, pg. 12, 9th txt blk) target objects (30), wherein the multiple (distinguishing-classifying) classifiers comprise: an uppermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies, based on the (optical-radar) image acquired by the acquisition unit, the (“simulated”, pg. 12, 9th txt blk) target object in the (optical-radar) image into a (“classification node”, pg. 11, 4th txt blk) category; and a lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies the (“simulated”, pg. 12, 9th txt blk) target object (via “extracts, as a case system estimation result, target information”, pg. 4, 6th txt blk) belonging to a (“helicopter”-)category classified by an upper-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) into any piece (or parts as shown in fig. 7) of the identification (rate) information, and the controller (“processor”, pg. 12, 9th txt blk) is configured to: construct the uppermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that determines, based on an initial (“reference”, pg. 11, 2nd txt blk) criterion, categories to which the multiple (“simulated”, pg. 12, 9th txt blk) target objects (30) respectively belong, and that classifies the (“simulated”, pg. 12, 9th txt blk) target objects (30) to the determined categories; construct, based on the image and the identification (rate) information of each of all the (“simulated”, pg. 
12, 9th txt blk) target objects (30) belonging to the respective categories classified by the uppermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk), a temporary lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies all the (“simulated”, pg. 12, 9th txt blk) target objects belonging to the respective categories into respective pieces (or parts as shown in fig. 7) of the identification (rate) information; fix the temporary lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) as a lowermost-layer classifier when the temporary lowermost-layer classifier satisfies (“shape characteristics”, pg. 11, 2nd txt blk) a predetermined condition; and until the temporary lowermost-layer classifier (shape-feature-)satisfies the predetermined condition, repeat determination of a certain (said “predetermined reference”, pg. 11, 2nd txt blk) criterion that enables classification of all (airplane) target objects each determined to belong to a certain (node) category classified by a (quality) classifier in a (top) layer immediately above the temporary (third) lowermost-layer (quality) classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk), determination of (third) lower-order categories to which all the target (airplane) objects respectively belong based on the certain (reference) criterion, and replacement of the temporary lowermost-layer classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) with an (second) intermediate-layer classifier (between the 1st and 2nd quality classifiers: fig. 
7) constructed based on an (airplane) image of each of all the target objects belonging to the determined lower-order (node) categories and the lower-order categories; and construction of a temporary lowermost-layer (quality) classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk) that classifies all the target (airplane) objects (by quality-features) belonging to the respective categories into respective pieces (or parts as shown in fig. 7) of the identification (rate) information, based on the (airplane) image and the identification (rate) information of each of all the target (airplane) objects belonging to the respective (node) categories classified by the (second) intermediate-layer (quality-feature) classifier (i.e., hierarchized classifying via “hierarchized” “concepts”, pg. 4, 3rd txt blk). Hidetoshi of the combination of HIDETOSHI, ZHU, JEAN-PIERRE does not teach the additional differences in claim 16 relative to claim 6: “new (target object)… modifying (the object recognizer)… new (target object)… modifying (of the (F-16) object recognizer) … new (target object)… new (target object)”. Mamoru teaches the additional difference of claim 16: new (“technology”, pg. 7, 4th txt blk; “terms”, pg. 13, last txt blk; “term…new group of the ‘New Technology Information’ category”, pg. 17, 5th txt blk) (target object)… modifying (“By first setting a flag”, pg. 17, 5th txt blk) (the object recognizer)… new (target object)… modifying (“By first setting a flag”, pg. 17, 5th txt blk) (of the (F-16) object recognizer) … new (target object)… new (target object). Similar to claim 4, since Hidetoshi teaches an improved identification rate (represented in fig. 7 as target purpose) via classification nodes, one of skill in the art of classification can make Hidetoshi’s classification be as Mamoru’s predictably recognizing the change “setting…a highly accurate classification result”, Mamoru, pg.
17, 5th txt blk, due to advancement of technology creating new airplanes and thus new document label-names (like F-16) of airplanes in addition to said “F15” and “B747”: Claim(s) 6,12 and 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over HIDETOSHI (JP 2003-279647 A) with SEARCH machine translation as applied in claim 1 in view of ZHU et al. (CN 101964059 A) with machine translation further in view of JEAN-PIERRE et al. (FR 2703804 A1) with machine translation: [Image: media_image20.png] Claim 6 is rejected like claim 1: Re 6. (Original), HIDETOSHI teaches An information processing apparatus (as indicated by the information connections in fig. 7) that constructs an object recognizer (via “A target identifying apparatus”, pg. 3, [0014]) that estimates ide
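For orientation, claims 6, 14, and 16 all recite the same construction-and-refinement loop: assign objects to categories with an uppermost-layer classifier, build a temporary lowermost-layer classifier per category, fix it as the lowermost-layer classifier when a predetermined condition is satisfied, and otherwise determine a finer criterion, split the category into lower-order categories, and insert an intermediate-layer classifier. The toy Python sketch below illustrates that loop only; the wingspan/rotary features, the majority-vote classifier, the accuracy threshold standing in for the "predetermined condition", and all names are illustrative assumptions, not the applicant's or Hidetoshi's actual method.

```python
def majority_classifier(members):
    """Temporary lowermost-layer classifier: predicts the most common label."""
    labels = [label for _, label in members]
    top = max(set(labels), key=labels.count)
    return lambda feats: top

def accuracy(clf, members):
    return sum(clf(f) == y for f, y in members) / len(members)

def make_size_criterion(members):
    """A hypothetical 'certain criterion': split on median wingspan."""
    sizes = sorted(f[0] for f, _ in members)
    median = sizes[len(sizes) // 2]
    return lambda feats: "large" if feats[0] >= median else "small"

def build_layer(objects, criterion, min_accuracy=1.0, max_depth=3):
    """Recursively construct the hierarchy sketched in claim 6."""
    node = {"criterion": criterion, "children": {}}
    groups = {}
    for feats, label in objects:
        groups.setdefault(criterion(feats), []).append((feats, label))
    for cat, members in groups.items():
        clf = majority_classifier(members)  # temporary lowermost-layer classifier
        if accuracy(clf, members) >= min_accuracy or max_depth == 0:
            node["children"][cat] = ("leaf", clf)  # fixed as lowermost-layer classifier
        else:
            # Predetermined condition not met: replace the temporary classifier
            # with an intermediate-layer classifier built on a finer criterion.
            finer = make_size_criterion(members)
            node["children"][cat] = (
                "node", build_layer(members, finer, min_accuracy, max_depth - 1))
    return node

def predict(node, feats):
    kind, payload = node["children"][node["criterion"](feats)]
    return payload(feats) if kind == "leaf" else predict(payload, feats)

# Toy data: (wingspan_m, is_rotary) -> label
data = [
    ((13.0, 0), "F15"), ((13.2, 0), "F15"),
    ((60.0, 0), "B747"), ((64.0, 0), "B747"),
    ((16.0, 1), "helicopter"),
]
uppermost = lambda feats: "rotary" if feats[1] else "fixed-wing"
tree = build_layer(data, uppermost)
print(predict(tree, (12.5, 0)))  # prints: F15
```

The "fixed-wing" category fails the accuracy condition (it mixes F15 and B747), so the loop inserts an intermediate-layer classifier on a size criterion before fixing the leaves, mirroring the repeat-until-satisfied limitation.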

Prosecution Timeline

Dec 07, 2023
Application Filed
Nov 14, 2025
Non-Final Rejection — §101, §102, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586184
METHODS AND APPARATUS FOR ANALYZING PATHOLOGY PATTERNS OF WHOLE-SLIDE IMAGES BASED ON GRAPH DEEP LEARNING
2y 5m to grant Granted Mar 24, 2026
Patent 12585733
SYSTEMS AND METHODS OF SENSOR DATA FUSION
2y 5m to grant Granted Mar 24, 2026
Patent 12536786
IMAGE LOCALIZATION USING A DIGITAL TWIN REPRESENTATION OF AN ENVIRONMENT
2y 5m to grant Granted Jan 27, 2026
Patent 12518519
PREDICTOR CREATION DEVICE AND PREDICTOR CREATION METHOD
2y 5m to grant Granted Jan 06, 2026
Patent 12518404
SYSTEMS AND METHODS FOR MACHINE LEARNING BASED PHYSIOLOGICAL MOTION MEASUREMENT
2y 5m to grant Granted Jan 06, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
69%
Grant Probability
98%
With Interview (+28.6%)
3y 8m
Median Time to Grant
Low
PTA Risk
Based on 557 resolved cases by this examiner. Grant probability derived from career allow rate.
