DETAILED ACTION
Claims 1,4,5,6,7,8,9,23,24 and 10,13,14,15,16,17,18 and 19,21,22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more:
Claim(s) 1,2,4,6,7,9,24 and 10,11,13,15,16,18 and 19,21,22 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by NIELSEN et al. (US 2016/0379074 A1):
Claim(s) 5 and 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over NIELSEN et al. (US 2016/0379074 A1), as applied in claims 1,2,4,6,7,9 and 10,11,13,15,16,18 and 19,21,22, in view of Raitoharju et al. (Binomial Gaussian mixture filter):
Claim(s) 8 and 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over NIELSEN et al. (US 2016/0379074 A1), as applied in claims 1,2,4,6,7,9 and 10,11,13,15,16,18 and 19,21,22, in view of Correa et al. (Estimating Detection Statistics within a Bayes-Closed Multi-Object Filter):
Claim(s) 23 is/are rejected under 35 U.S.C. 103 as being unpatentable over NIELSEN et al. (US 2016/0379074 A1), as applied in claims 1,2,4,6,7,9 and 10,11,13,15,16,18 and 19,21,22, further in view of CHUN et al. (US 2019/0266756 A1):
Response to Amendment
The amendment was received 12/30/2025. Claims 1,2,4,5,6,7,8,9,23,24 and 10,11,13,14,15,16,17,18 and 19,21,22 are pending; claims 3, 12, and 20 are canceled:
[Embedded image: media_image1.png (1279 × 300, greyscale)]
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1,4,5,6,7,8,9,23,24 and 10,13,14,15,16,17,18 and 19,21,22 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more:
[Embedded image: media_image2.png (1281 × 454, greyscale)]
Step 0: establish broadest reasonable interpretation: shown in footnotes throughout this Office action
Step 1: Claim 1 is a method; claim 10 is a manufacture; claim 19 is a machine
Step 2A, prong 1:
The claim(s) (claim 1 representative) recite(s):
--receiving a…result…
a first probability satisfying a criterion, the first probability having been produced by updating a second probability…
the second probability not satisfying the criterion, the second probability… being based on a probability… an object category--:
1. (Currently Amended: Representative) A method comprising:
sending, from a first device performing a visual search on an object in a scene, a first image of the scene and a second image of the scene after the first image of the scene; and
receiving a result of the visual search generated from a second device in response to a first probability satisfying a criterion, the first probability having been produced by updating a second probability based on the second image of the scene in response to the second probability not satisfying the criterion, the second probability being based on a probability of the object in the first image belonging to an object category.
Step 2A, prong 2:
This judicial exception is not integrated into a practical application because the additional elements (not in bold in representative claim 1 above):
--sending, from a first device performing a visual search on an object in a scene, a first image of the scene and a second image of the scene after the first image of the scene; and…
of the visual search generated from a second device in response to…
based on the second image of the scene in response to…
of the object in the first image belonging to--.; and
claim 23’s “compression”
considered with the abstract:
--receiving a…result…
a first probability satisfying a criterion, the first probability having been produced by updating a second probability…
the second probability not satisfying the criterion, the second probability… being based on a probability… an object category—
is not improving the function of a computer, improving a technical field, or improving technology (or Digital Technology via applicant’s disclosure [0001]: “search”) in view of applicant’s disclosure, [0003]:
[Embedded image: media_image3.png (842 × 717, greyscale)]
In contrast, dependent claims 2 and 11: “threshold” reflect the improvement to searching in applicant’s specification’s [0003], above, when respectively considered with claims 1 and 10. Thus, limitations from [0003] are not read into claims 1,2 or 10,11.
Step 2B:
The claim(s) does/do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements:
--sending, from a first device performing a visual search on an object in a scene, a first image of the scene and a second image of the scene after the first image of the scene; and…
of the visual search generated from a second device in response to…
based on the second image of the scene in response to…
of the object in the first image belonging to--; and
claim 23’s “compression”
considered individually or with the abstract:
--receiving a…result…
a first probability satisfying a criterion, the first probability having been produced by updating a second probability…
the second probability not satisfying the criterion, the second probability… being based on a probability… an object category—
adheres to the conventional in view of applicant’s disclosure at:
[0002][0017][0018][0024][0030]:
[Embedded image: media_image4.png (747 × 689, greyscale)]
Example of a standard, usual, common, or customary compression:
[Embedded image: media_image5.png (472 × 754, greyscale)]
Standard, usual, common, customary example: “JPEG” need not be explained in applicant disclosure:
[Embedded image: media_image6.png (983 × 759, greyscale)]
Response to Arguments
Applicant's arguments filed 12/30/2025 have been fully considered but they are not persuasive:
Claim Rejection – 35 USC 103
Applicants state, pages 11-13, that Nielsen (US 2016/0379074 A1) does not teach:
“a probability of the object in the first image belonging to an object category”.
The examiner respectfully disagrees since Nielsen teaches:
a (“classification” [0536] 1st S) probability of the (blob) object in the first image belonging to an (or one) object category (or, for example, “Blob type 4” [0540]):
[Embedded image: media_image7.png (984 × 814, greyscale)]
Due to the amendment (12/30/2025) of claim 1, Gallagher (US 2007/0177805 A1) as applied under 35 USC 103 in the Office action of 09/30/2025 at page 25 is no longer required to teach the 35 USC 103 difference of “a search result” of claim 1.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1,2,4,6,7,9,24 and 10,11,13,15,16,18 and 19,21,22 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by NIELSEN et al. (US 2016/0379074 A1):
[Embedded image: media_image8.png (1279 × 379, greyscale)]
Re 1. (Currently Amended), Nielsen discloses A method comprising:
sending (via arrows in fig. 22), from a first (interconnected) device (or any one of “server computers 110” [0221] last S:fig. 1:110: containing inside: fig. 22:1002B: “Birds-eye view processing”) performing a visual search (in “view” “searchable” “IBTF records”, [0482] 2nd S: fig. 26:1030: “IBTF”, via “birds-eye-view processing submodule 1002B…search for data of object 112A”, [0484] 1st S) on an object in a scene (or “rooms” [0475] with exits and entrances: fig. 24), a first (899th via fig. 13: “900”) image (via fig. 25:104A,B) of the scene and a second (900th) image (via fig. 25:104A,B) of the scene after the first (899th) image of the scene (
[Embedded image: media_image9.png (761 × 857, greyscale)]
)
; and
receiving (via arrows in fig. 5B) a result (or any of Nielsen’s disclosed words ending in “ed”, such as “extracted blob track data”, [0484] 2nd S, wherein the “blob” of the result of “extracted blob track data” is understood to be found/searched) of the visual search (in “view” “searchable” “IBTF records”, [0482] 2nd S: fig. 26:1030: “IBTF”, via “birds-eye-view processing submodule 1002B…search for data of object 112A”, [0484] 1st S: fig. 25:11A: a blob-dot) generated from a second device (or computer 108, containing any other one of “server computers 110” [0221] last S:fig. 1:110: containing inside: fig. 22:1002B: “Birds-eye view processing”, relative to a camera 104: fig. 1) in (the fig. 5B:226: “Yes”) response (fig. 5B:234,236,237,240) to a first (via fig. 13: “900”) probability satisfying a (probability threshold) criterion (fig. 5B:226: “Yes”),
the first (via fig. 13: “900”) probability having been produced by updating (fig. 5B:236: “Updating FCC-tag associations”) a second (or 899th via fig. 13: “900”) probability based on the second (900th) image (represented as “BBTP” in equation (28) [0429]: “bounding box tracking point (BBTP) of an FCC in captured images” [0113]) of the (“background” [0315] 3rd S) scene in response (via fig. 5B:228,230,222,224 is in response to fig. 5B:226: “No”) to the second (or 899th via fig. 13: “900”) probability not satisfying (via fig. 5B:226: “No”) the criterion,
the second (or 899th via fig. 13: “900”) probability (via fig. 5B:226: “No”: “not be sufficient…probability of…less than 1” [0510]) being based on a (“classification” [0536] 1st S: fig. 26:1002A: “Camera view processing”) probability (and then followed by another type of probability--association-probability—via said fig. 5B:226: “No”: “not be sufficient… association… probability of…less than 1” [0510]: fig. 26:148: “Network Arbitrator”) of the (blob) object in the first (899th via fig. 13: “900”) image belonging to (comprised by said “probability”-“blobs”-“types”-“classification system” [0536] 1st S: fig. 26:1002A: “Camera view processing”) an object category (or a blobs-category comprised by a “blob”-“classification system” [0536] 1st S: fig. 26:1002A: “Camera view processing”).
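For illustration only, the claim 1 flow mapped above (a probability updated with each subsequent image until it satisfies a threshold criterion, after which a result is returned) can be sketched as follows; the odds-form Bayesian update rule, the evidence values, and the 0.9 threshold are illustrative assumptions, not Nielsen’s actual equations:

```python
# Illustrative sketch only: a classification probability is refined
# with each new image until it satisfies the threshold criterion.
# The odds-form Bayesian update and the 0.9 threshold are assumptions.

def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Update P(object belongs to category) given new-image evidence."""
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

def search_until_satisfied(p0, evidence, threshold=0.9):
    """Return (final probability, number of extra images consumed)."""
    p, used = p0, 0
    while p < threshold and used < len(evidence):
        p = bayes_update(p, evidence[used])  # second image, third image, ...
        used += 1
    return p, used

p, n = search_until_satisfied(0.5, [3.0, 3.0, 3.0])
```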
Re 2. (Previously Presented), Nielsen discloses The method as in claim 1, wherein the (network-threshold) criterion includes a probability (said computer vision (CV) match probability, [0442] last S) being greater (via fig. 5B:226: “>”) than or equal to a threshold.
Re 4. (Currently Amended), Nielsen discloses The method as in claim [[3]] 1, wherein, after receiving the second (n) image of the scene, the ([0016]), and
wherein updating the second probability (i.e., said fig. 5B:236: updating an associated blob-tag probability, [0326][0520]) includes:
multiplying (variable “p” with variable “p” over time “t” via equations (28) and (29)) the prior (pdf) distribution (at time “t-1”) by a current probability (pdf) distribution (at “t”), the current probability distribution (at “t”) representing a distribution of probabilities that
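For illustration only, the multiplicative update of claim 4 (prior distribution multiplied by the current probability distribution, then renormalized) follows the standard Bayesian product rule; the example distributions below are assumed values, not Nielsen’s:

```python
# Illustrative sketch only: posterior ∝ prior × current, renormalized.
# The two example distributions over three object categories are assumed.

prior = [0.5, 0.3, 0.2]    # prior distribution (time t-1)
current = [0.6, 0.3, 0.1]  # current probability distribution (time t)

unnormalized = [p * c for p, c in zip(prior, current)]
total = sum(unnormalized)
posterior = [u / total for u in unnormalized]
```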
Re 6. (Currently Amended), Nielsen discloses The method as in claim [[3]] 1, wherein, after receiving the second image of the scene, the second (normal) probability distribution is a prior (pdf) distribution (at time “t-1”), the prior (pdf) distribution (at time “t-1”) being based on a (input and output) value (of the probability density function) of a first (cardinal) parameter (or the first (of two) mentioned cardinal parameter as “unknown parameters” [0396] last S) and a value of a second (cardinal) parameter (or the second (of said two) mentioned cardinal parameter as “design parameters” [0402]),
wherein the method further comprises:
generating a current probability (density) distribution (at time “t”), the current probability distribution representing a distribution of probabilities that parameters of the current probability distribution have particular (input “x”/output “y”) values (of the probability density function, y=f(x,y)) given a probability that the object included in the second image of the scene belongs to the (common) object category, the current probability (density) distribution being based on (said input/output) a value (“x” or “y”) of a third (cardinal) parameter (or third mentioning of “parameter” as “settable decay parameter” [0414] 2nd S) and a value of a fourth (cardinal) parameter (or a fourth mentioning of “parameter” as “model parameters” [0414] last S), and wherein updating the second (CV) probability includes:
adding (via the probability-sum function of the probability density function) the (input “x”) value of the first (mentioned) parameter and the value of the third (mentioned) parameter and adding (via said sum function) the (“x” input) value of the second (mentioned) parameter and the value of the fourth (mentioned) parameter.
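For illustration only, the additive update of claim 6 (the value of the first parameter added to the value of the third, and the second to the fourth) matches a conjugate-style parameter update in which pseudo-counts add; the parameter values below are assumed:

```python
# Illustrative sketch only: the updated parameters are the sums of the
# prior's (first, second) and the current's (third, fourth) parameters,
# as in a conjugate update where pseudo-counts add. Values are assumed.

def add_params(first, second, third, fourth):
    return first + third, second + fourth

a, b = add_params(2.0, 3.0, 5.0, 1.0)
```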
Re 7. (Currently Amended), Nielsen discloses The method as in claim [[3]] 1, wherein, after receiving the second image of the scene, the
wherein updating the second probability includes:
in response to the object being determined as being included in the (common) object category, incrementing (or “increase” probability [0656]) the (“y” output) value of the first parameter and not incrementing (via lowering said probability) the (“y” output) value of the second parameter; and
in response to the object being determined as being included in the object category, incrementing the value of the second parameter and not incrementing the value of the first parameter (given that Nielsen teaches Markush alternative (A), the Markush element (A and B) is taught).
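For illustration only, the increment scheme of claim 7 (increment the first parameter when the object is classified into the category, otherwise increment the second) parallels success/failure pseudo-counting; the starting counts below are assumed:

```python
# Illustrative sketch only: increment the first parameter on an
# in-category classification, otherwise the second, as with
# success/failure pseudo-counts. Starting counts are assumed.

def update_counts(in_category: bool, first: int, second: int):
    if in_category:
        return first + 1, second   # in-category outcome
    return first, second + 1       # out-of-category outcome

f1, s1 = update_counts(True, 1, 1)
f2, s2 = update_counts(False, f1, s1)
```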
Re 9. (Currently Amended), Nielsen discloses The method as in claim 1, wherein the result (the color displayed) of the visual search comprises data (via Nielsen’s cameras of fig. 1:104) about the object not contained (since the movement is plotted data) in the first image of the scene, the result (the display) of the visual search comprising data (of said cameras) from at least one of the world wide web (via a “cloud” “database”, Nielsen [0347]) or a (said cloud) database.
Re 24. (New), Nielsen discloses The method as in claim 1, wherein the probability is a function of a (“maximum” [0430]) probability distribution (fig. 17B: “Maximum of pdf” “which becomes a ‘narrow spike’” [0432] last S: fig. 1: “computer vision processing block 146 determines a probability density function (PDF)” [0373]) over (non-max) probabilities (that are under the maximum-spike) of the object in the first image belonging to the object category.
Claim 10 (Currently Amended) is rejected similar to claim 1 with the difference being the claimed “a fine object class” taught by Nielsen as classifying finely textured & low-pass filtered blobs as “Blob type 1”, [0536], “finely textured…with a low pass spatial filter” [0624] (“finely textured…blob”: cumulative adjective: maps to Markush Alternative (A): “category” as modified by cumulative adjectives (“fine object”): fine-object category, discussed in the footnotes below):
Re 10 (Currently Amended), Nielsen discloses A computer program product comprising a nontransitory storage medium, the computer program product including code that, when executed by processing circuitry of a computer, causes the processing circuitry to perform a method, the method comprising:
receiving, from a first device (fig. 1:110: a search-server) performing a visual search (in “view” “searchable” “IBTF records”, [0482] 2nd S: fig. 26:1030: “IBTF”, via “birds-eye-view processing submodule 1002B…search for data of object 112A”, [0484] 1st S) on an object in a scene, a first image of the scene and a second image of the scene after the first image of the scene;
generating a first probability (fig. 5B:224: “Network arbitrator Component: Calculating FFC-tag association probabilities”) based on the first image of the scene, the first probability being based on a (foreground-feature-cluster -FFC-classification) probability (via fig. 26:1002A: “Camera view processing”) of the object belonging to a coarse object category (via said blob classification system: fig. 26:1002A: “Camera view processing”);
in response to determining that the first (classification-FFC) probability does not satisfy a first criterion (fig. 5B:226: “Network arbitrator Component: An FCC-tag association probability > threshold?”: “No”), updating (“constantly…as new images and/or tag measurements are made available to the system” [0014] 3rd S: fig. 5B: “Network arbitrator Component: Can tag devices provide further observations?”) the first (classification-FFC) probability based on the second (new) image of the scene to produce a second (updated-classification-FFC) probability;
after determining that the second (updated-classification-FFC) probability satisfies the first criterion (fig. 5B:226: “Network arbitrator Component: An FCC-tag association probability > threshold?”: “Yes”: fig. 26:148: “Network Arbitrator”), determining a likelihood (via figs. 5A,5B: triangle-connector “D”) that the (“fine…textured” [0624] 2nd S-blob) object belongs to a fine object category (via fine-texture “blob classification”-“classification probability” [0536] 1st S: fig. 26:1002A: “Camera view processing”: fig. 5A:206,208,210: “Computer vision processing block:…”); and
in response to determining that the likelihood (or “blob classification”-“classification probability” [0536] 1st S) that the (“fine…textured” [0624] 2nd S-blob: fig. 6C:276) object belongs to the fine object category (via said fine-blob classification) satisfies a second criterion (via said fig. 5B:226 for another FFC: fig. 6C:276,286: two FFCs {Foreground Feature Cluster}):
receiving (via “feedback to computer vision process” [0649] penult S: i.e., fig. 22:1002A: “Camera view processing” or via a file: fig. 26:1032: EBTF: External Blob Track File) a result (or any of Nielsen’s disclosed words ending in “ed”, such as “search…finds…extracted blob track data into an external blob track file (EBTF)” [0484] 1st,2nd Ss: figs. 26,29:1032: “EBTF”) of the visual search (in “view” “searchable” “IBTF records”, [0482] 2nd S: fig. 26:1030: “IBTF”, via “birds-eye-view processing submodule 1002B…search for data of object 112A”, [0484] 1st S) associated with the (“fine…textured” [0624] 2nd S-blob) object from a second (camera) device (comprising “the camera view bounding box” [0649] 2nd to last S: fig. 19:BBTP: Bounding Box Tracking Point); and
sending (back to itself, i.e., any of said cloud servers) the result of the visual search (in “view” “searchable” “IBTF records”, [0482] 2nd S: fig. 26:1030: “IBTF”, via “birds-eye-view processing submodule 1002B…search for data of object 112A”, [0484] 1st S) to the first device (said fig. 1:110 or fig. 2:108: a search-server each comprising or containing within said “feedback to computer vision process” [0649] penult S: i.e., fig. 22:1002A: “Camera view processing”).
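For illustration only, the two-stage flow of claim 10 (refine a coarse-category probability until a first criterion is satisfied, then test a fine-category likelihood against a second criterion) can be sketched as follows; the probabilities and thresholds are illustrative assumptions:

```python
# Illustrative sketch only: a coarse-category probability is refined
# until it satisfies a first criterion; then a fine-category likelihood
# is tested against a second criterion. Values and thresholds assumed.

def two_stage(coarse_ps, fine_likelihood, t1=0.8, t2=0.6):
    """coarse_ps: successive coarse-category probabilities per image."""
    for p in coarse_ps:
        if p >= t1:                       # first criterion satisfied
            return fine_likelihood >= t2  # test second criterion
    return False                          # first criterion never satisfied

result = two_stage([0.5, 0.7, 0.85], 0.9)
```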
Claim 11 is rejected similar to claim 2.
11. (Previously Presented) The computer program product as in claim 10, wherein the first criterion includes a probability being greater than or equal to a threshold.
Claim 13 is rejected similar to claim 4.
13. (Previously Presented) The computer program product as in claim [[12]] 10, wherein, after receiving the second image of the scene, the
wherein updating the first probability includes:
multiplying the prior distribution by a current probability distribution, the current probability distribution representing a distribution of probabilities that parameters of the current probability distribution have particular values given a probability that the object included in the second image of the scene belongs to the coarse object category.
Claim 15 is rejected similar to claim 6.
15. (Previously Presented) The computer program product as in claim [[12]] 10 , wherein, after receiving the second image of the scene, the
wherein the method further comprises:
generating a current probability distribution, the current probability distribution representing a distribution of probabilities that parameters of the current probability distribution have particular values given a probability that the object included in the second image of the scene belongs to the coarse object category, the current probability distribution being based on a value of a third parameter and a value of a fourth parameter, and
wherein updating the first probability includes:
adding the value of the first parameter and the value of the third parameter and adding the value of the second parameter and the value of the fourth parameter.
Claim 16 is rejected similar to claim 7.
16. (Previously Presented) The computer program product as in claim [[12]] 10 , wherein, after receiving the second image of the scene, the
wherein updating the first probability includes:
in response to the object included in the scene being classified as belonging to the coarse object category, incrementing the value of the first parameter and not incrementing the value of the second parameter; and
in response to the object included in the scene being classified as not belonging to the coarse object category, incrementing the value of the second parameter and not incrementing the value of the first parameter.
Claim 18 is rejected similar to claim 9.
18. (Currently Amended) The computer program product as in claim 10, wherein the result of the visual search comprises data about the object not contained in the first image of the scene, the result of the visual search comprising data from at least one of the world wide web or a database.
Claim 19 is rejected similar to claim 1.
19. (Currently Amended) An electronic apparatus, the electronic apparatus
comprising:
memory; and
processing circuitry coupled to the memory, the processing circuitry being configured to:
receive, from a first device performing a visual search on an object in a scene, a first image of the scene and a second image of the scene after the first image of the scene;
generate a first probability based on the first image of the scene, the first probability being based on a probability of the object in the first image belongs to an object category;
in response to determining that the first probability does not satisfy a criterion, update the first probability based on the second image of the scene to produce a second probability; and
after determining that the second probability satisfies the criterion:
receiving a result of the visual search associated with the object from a second device;
sending the result of the visual search to the first device
Claim 21 is rejected similar to claim 6.
21. (Currently Amended) The electronic apparatus as in claim [[20]] 19, wherein, after receiving the second image of the scene, the
wherein the processing circuitry is further configured to:
generate a current probability distribution, the current probability distribution representing a distribution of probabilities that parameters of the current probability distribution have particular values given a probability that the object included in the second image of the scene belongs to the object category, the current probability distribution being based on a value of a third parameter and a value of a fourth parameter, and
wherein the processing circuitry configured to update the first probability is further configured to:
add the value of the first parameter and the value of the third parameter and adding the value of the second parameter and the value of the fourth parameter.
Claim 22 is rejected similar to claim 7.
22. (Currently Amended) The electronic apparatus as in claim [[20]] 19, wherein, after receiving the second image of the scene, the
wherein the processing circuitry configured to update the first probability is further configured to:
in response to the object being classified as belonging to the object category, increment the value of the first parameter and not incrementing the value of the second parameter; and
in response to the object being classified as not belonging to the object category, increment the value of the second parameter and not incrementing the value of the first parameter.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 5 and 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over NIELSEN et al. (US 2016/0379074 A1), as applied in claims 1,2,4,6,7,9 and 10,11,13,15,16,18 and 19,21,22, in view of Raitoharju et al. (Binomial Gaussian mixture filter):
[Embedded image: media_image10.png (1279 × 567, greyscale)]
Re 5. (Original), Nielsen teaches The method as in claim 4, wherein the current probability distribution (or pdf at time “t”) is a binomial distribution. Nielsen does not teach “a binomial distribution”. Raitoharju teaches in page 4 the pdf (probability density/mass function) with binomial distribution C:
[Embedded image: media_image11.png (1211 × 970, greyscale)]
Since Nielsen suggests other pdfs by suggesting alternative names for the pdf in [0424], such as the name probability mass function (pmf), one of skill in the art of pdfs (i.e., pmfs) can make Nielsen’s pdf be as Raitoharju’s and predictably recognize or look forward to the change resulting in an accurate pdf and thus accurate object tracking in the presence of noise (Raitoharju: pg. 15, lcol, section 6 Conclusion, last S & page 1, section 1 Introduction, 1st S).
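For reference, the binomial distribution relied on from Raitoharju has the probability mass function C(n, k)·p^k·(1−p)^(n−k); a minimal sketch:

```python
from math import comb

# Minimal binomial probability mass function:
# P(K = k) = C(n, k) * p**k * (1 - p)**(n - k)
def binom_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p**k * (1.0 - p)**(n - k)

# The pmf over k = 0..n sums to 1.
total = sum(binom_pmf(k, 4, 0.5) for k in range(5))
```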
Claim 14 is rejected similar to claim 5.
14. (Original) The computer program product as in claim 13, wherein the current probability distribution is a binomial distribution.
Claim(s) 8 and 17 is/are rejected under 35 U.S.C. 103 as being unpatentable over NIELSEN et al. (US 2016/0379074 A1), as applied in claims 1,2,4,6,7,9 and 10,11,13,15,16,18 and 19,21,22, in view of Correa et al. (Estimating Detection Statistics within a Bayes-Closed Multi-Object Filter):
[Embedded image: media_image12.png (1279 × 567, greyscale)]
Re 8. (Currently Amended), Nielsen teaches The method as in claim [[3]] 1, wherein the is a beta distribution[[s]].
Nielsen does not teach “beta distributions”; however, Correa teaches a function “q” as a function of “Beta distribution” (a,b) (2nd pg, rcol, 2nd from last S) with a normalized distribution probability density function f(x) (pdf f(x): 2nd pg, rcol, 1st full para, 1st S below eqn (4) & normal distribution pdf f(x): 7th pg, rcol, 1st full S) via the 7th page, lcol:
[Embedded image: media_image13.png (1532 × 950, greyscale)]
Since Nielsen suggests other pdfs by suggesting alternative names for the pdf in [0424], such as the name probability mass function (pmf), one of skill in the art of pdfs (i.e., pmfs) can make Nielsen’s pdf location “P” of equations (26), (27), (28), (29) in Nielsen’s pages 23, 24 or the zero-mean location pdf-“P” normal distribution of Nielsen’s equation (17) in Nielsen’s page 22 be as Correa’s Beta-pdf equation “q” and predictably recognize or look forward to the change “correctly” (Correa, last page, lcol, 1st para, last S) detecting targets “required for effective” (Correa, Abstract, 1st S) tracking.
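For reference, the Beta(a, b) distribution relied on from Correa has density proportional to x^(a−1)·(1−x)^(b−1); a minimal sketch of its normalized pdf:

```python
from math import gamma

# Minimal Beta(a, b) probability density function:
# f(x) = Γ(a+b) / (Γ(a)Γ(b)) * x**(a-1) * (1-x)**(b-1)
def beta_pdf(x: float, a: float, b: float) -> float:
    norm = gamma(a + b) / (gamma(a) * gamma(b))
    return norm * x ** (a - 1) * (1.0 - x) ** (b - 1)

val = beta_pdf(0.5, 2.0, 2.0)  # Beta(2, 2) peaks at x = 0.5
```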
Claim 17 is rejected similar to claim 8.
17. (Currently Amended) The computer program product as in claim [[12]] 10, wherein the is a beta distribution[[s]].
Claim(s) 23 is/are rejected under 35 U.S.C. 103 as being unpatentable over NIELSEN et al. (US 2016/0379074 A1), as applied in claims 1,2,4,6,7,9 and 10,11,13,15,16,18 and 19,21,22, further in view of CHUN et al. (US 2019/0266756 A1):
[Embedded image: media_image14.png (1279 × 567, greyscale)]
Re 23. (New), NIELSEN teaches The method as in claim 1, further comprising:
capturing an image of the scene at a first time (“thus captured at a known time instant” [0226] 1st S);
performing a compression operation (“of the network arbitrator component 148 and the tag arbitrator component 152” [0318] 1st S) on the image of the scene to produce the first image (via “a first imaging device” [0230] 1st S);
capturing another image (via “a second, neighboring imaging device” [0230] 3rd S) of the scene at a second time after the first time (“for the next frame time (i.e., the time instant the next image frame to be captured) based on information of the current and historical image frames” [0649] 3rd S); and
performing the compression operation (“of the network arbitrator component 148 and the tag arbitrator component 152” [0318] 1st S) on the another image (via “a second, neighboring imaging device” [0230] 3rd S) of the scene, data (“also called tag measurements or tag observations, and establishes communication with the computer cloud 108” [0234] 2nd S) lost in the first image (via “a first imaging device” [0230] 1st S) due to the compression operation (“of the network arbitrator component 148 and the tag arbitrator component 152” [0318] 1st S) being (figs. 39A,39B are as shown) different (being different is understood given a first image in one time and second image in another time) from data (“also called tag measurements or tag observations, and establishes communication with the computer cloud 108” [0234] 2nd S) lost in the second image (via “a second, neighboring imaging device” [0230] 3rd S) due to the compression operation (“of the network arbitrator component 148 and the tag arbitrator component 152” [0318] 1st S).
Nielsen does not teach the difference of claim 23 of:
a compression (operation)…
the compression (operation)…
(data) lost (in the first image)…
the compression (operation)…
(data) lost (in the second image).
CHUN teaches the difference of claim 23:
a compression (comprising “the compression property information corresponding to the first image” [0063] 3rd S: fig. 4:212: “ENCODER”) (operation)…
the compression (comprising “the compression property information corresponding to the prior image” [0063] 4th S: fig. 4:212: “ENCODER”) (operation)…
(data) lost (“generated while the prior image is compressed” [0063] last S) (in the first image)…
the compression (comprising “a predetermined compression property without referring to the compression property information corresponding to the prior image” [0063] 5th S: fig. 4:212: “ENCODER”) (operation)…
(data) lost (of which “the encoder 212 may generate compression loss data for the first image and second image and identify the compression property information of each of the first image and the second image based on the compression loss data.” [0064] 1st S: fig. 12:1210: “COMPRESS KTH BLOCK OF CURRENT FRAME TO GENERATE FIRST COMPRESSION LOSS DATA CORRESPONDING TO KTH BLOCK”: Fig. 18: “N frame”) (in the second image).
Since Nielsen teaches frames, one of skill in the art can make Nielsen’s be as CHUN’s, seeing in the change a quality pre-processed (CHUN fig. 2:220: ISP: embedded Image Signal Processors) compressed (CHUN fig. 2:212: “ENCODER”) image (CHUN fig. 4:220: “ISP”: right-side “N frame”: fig. 3:3221: “small RAW”) that can be transmitted (as indicated in CHUN’s arrows in figs. 2,3) faster than “transferring raw images” (CHUN [0006] 1st S: fig. 3:322: “RAW”, non-ISP pre-processed and uncompressed), speeding the transmission of data such as text or visual images, or minimizing the memory resources needed to store such data.
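For illustration only, the claim 23 point that a lossy compression operation discards different data from different images can be shown with a simple quantization example; the pixel values and quantization step below are assumed:

```python
# Illustrative sketch only: the same lossy (quantization) operation
# discards different data from different source images.
# Pixel values and the quantization step are assumed.

STEP = 16  # assumed quantization step

def compress(pixels):
    return [v // STEP * STEP for v in pixels]

first = [13, 200, 77]   # assumed first captured image
second = [45, 198, 90]  # assumed second captured image

loss_first = [v - c for v, c in zip(first, compress(first))]
loss_second = [v - c for v, c in zip(second, compress(second))]
```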
Conclusion
The prior art “nearest to the subject matter defined in the claims” (MPEP 707.05) made of record and not relied upon is considered pertinent to applicant's disclosure.
The following table lists several references that are relevant to the subject matter claimed and disclosed in this Application. The references are not relied on by the Examiner, but are provided to assist the Applicant in responding to this Office action.
Citation: Matsugu (US 2002/0038294 A1)
Relevance: Matsugu teaches, [0436] 2nd S:
“the probability of existence of an object that belongs to the category”
as the closest to the claimed “a probability of the object in the first image belonging to an object category” of claim 1.
Citation: Desai et al. (US 11,176,423 B2)
Relevance: Desai teaches, c. 13, ll. 10-15:
“the generic machine learning model outputs a prediction vector represented by the probabilities {p.sub.1, p.sub.2, . . . p.sub.S} of that object image belonging to S different recognizable classes”
as the closest to the claimed “a probability of the object in the first image belonging to an object category” of claim 1.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DENNIS ROSARIO whose telephone number is (571)272-7397. The examiner can normally be reached Monday-Friday, 9AM-5PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Henok Shiferaw can be reached at 571-272-4637. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DENNIS ROSARIO/Examiner, Art Unit 2676
/Henok Shiferaw/Supervisory Patent Examiner, Art Unit 2676
1 search: the act of searching; careful examination or investigation. (Dictionary.com)
2 search: Digital Technology. the act or process of electronically retrieving data, web pages, database records, or other information from files, databases, etc., as in A search of the article turned up two references to my company. (Dictionary.com)
3 search: the act of searching; careful examination or investigation. (Dictionary.com)
4 search: Digital Technology. the act or process of electronically retrieving data, web pages, database records, or other information from files, databases, etc., as in A search of the article turned up two references to my company. (Dictionary.com)
5 search: Digital Technology. the act or process of electronically retrieving data, web pages, database records, or other information from files, databases, etc., as in A search of the article turned up two references to my company. (Dictionary.com)
6 search: the act of searching; careful examination or investigation. (Dictionary.com)
7 search: Digital Technology. the act or process of electronically retrieving data, web pages, database records, or other information from files, databases, etc., as in A search of the article turned up two references to my company. (Dictionary.com)
8 example: a pattern or model, as of something to be imitated or avoided, wherein model is defined: . a standard or example for imitation or comparison, where standard is defined: usual, common, or customary. (Dictionary.com)
9 JPEG: Scientific
A. Short for Joint Photographic Experts Group.
B. A standard algorithm for the compression of digital images, making it easier to store and transmit them.
C. A digital image that has been compressed using this algorithm. (Dictionary.com)
10 CLAIM SCOPE: “probability” is not modified by the participle “belonging” under the broadest reasonable interpretation, where SCOPE is defined: Linguistics, Logic. the range of words or elements of an expression over which a modifier (e.g., a patent examiner) or operator (e.g., me) has control. (Dictionary.com)
11 CLAIM SCOPE: “object” is modified by the adjective “belonging” under the broadest reasonable interpretation
12 Regarding “first” of “first device” via applicant’s disclosure:
[0085] In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other implementations are within the scope of the following claims.
In the following some examples are described.
13 There is no result of “performing”; thus, “performing” is more directed to the action of “perform” than the result itself (performed).
14 search: the act of searching; careful examination or investigation.(Dictionary.com): there is no explicit result of searching
15 “
16 BROAD CLAIM LANGUAGE: something that ensues from an action, policy, course of events, etc; outcome; consequence (Dictionary.com)
17 -ed: a suffix forming the past participle of weak verbs (he had crossed the river ), and of participial adjectives indicating a condition or quality resulting from the action of the verb (inflated balloons ). (Dictionary.com)
18 of: (used to indicate possession, connection, or association). (Dictionary.com)
19 CLAIM SCOPE: generated: adjective (of what?: generated-“result”? generated-“search”?), wherein SCOPE is defined: Linguistics, Logic. the range of words or elements of an expression over which a modifier (e.g., a patent examiner) or operator (or a human-reader of this) has control. (Dictionary.com)
20 CLAIM SCOPE: I (the modifier or operator) pick the adjective “generated” to modify the claimed noun “result”: “generated” – “result” under the broadest reasonable interpretation of claim 1.
21 Regarding “second” of “second device” via applicant’s disclosure:
[0085] In addition, the logic flows depicted in the figures do not require the particular order shown, or sequential order, to achieve desirable results. In addition, other steps may be provided, or steps may be eliminated, from the described flows, and other components may be added to, or removed from, the described systems. Accordingly, other implementations are within the scope of the following claims.
In the following some examples are described.
22 probability: statistics a measure or estimate of the degree of confidence one may have in the occurrence of an event, measured on a scale from zero (impossibility) to one (certainty), wherein measure is defined: any standard of comparison, estimation, or judgment. (Dictionary.com)
23 criterion: a standard of judgment or criticism; a rule or principle for evaluating or testing something. (Dictionary.com)
24 The words “measure” and “criterion” are identities, wherein identity is defined: Logic. an assertion that two terms (“measure” & “criterion”) refer to the same thing (a standard of judgment). (Dictionary.com)
25 CLAIM SCOPE (partial-range scope-modification): “object” is modified by “belonging” under the broadest reasonable interpretation of claim 1.
26 “belonging” is a present participle (adjective) participating in the action of “being based”, wherein present participle is defined: a participial form of verbs (belong) used adjectivally when the action it describes is contemporaneous with that of the main verb (or “based” of “being based”) of a sentence (claim 1) and also used in the formation of certain compound tenses. In English this form ends in -ing Compare gerund (Dictionary.com)
27 CLAIM SCOPE: the clause “the second probability being based on a probability of the object in the first image” is the scope-range that can be adjectivally modified by the adjective(s) (object category) “belonging”: e.g., full-range scope modification: --the second object category belonging probability being based on an object category belonging probability of the object category belonging object in the first object category belonging image--: the full-range scope modification example is not the broadest reasonable interpretation of claim 1.
28 (blobs) classification: systematic (blobs) placement in (blobs) categories, wherein placement is defined: the act of placing or the state of being placed, wherein place is defined: to put or set in a particular or appropriate (blobs) place, wherein appropriate is defined: right or suitable; fitting, wherein right is defined: appropriate, suitable, fitting, or proper, wherein proper is defined: belonging to or characteristic of a (blobs) person or (blobs) thing, wherein (blobs) thing is defined: an (blobs) object, (blob) fact, (blob) affair, (blob) circumstance, or (blob) concept considered as being a separate (blob) entity, wherein in is defined: at a (blob) place where there is, wherein at is defined: towards, wherein towards is defined with regard to (Dictionary.com)
29 pdf: probability density function: statistics a function representing the relative distribution of frequency of a continuous random variable from which parameters such as its mean and variance can be derived and having the property that its integral from a to b is the probability that the variable lies in this interval. Its graph is the limiting case of a histogram as the amount of data increases and the class intervals decrease in size Also called density function Compare cumulative distribution function frequency distribution
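For context only (an illustrative equation, not evidence of record), the integral property recited in the definition above may be written in standard notation, where f is the probability density function of the continuous random variable X:

```latex
% Probability that X lies in the interval [a, b]:
P(a \le X \le b) = \int_{a}^{b} f(x)\, dx,
\qquad
% with the normalization property
\int_{-\infty}^{\infty} f(x)\, dx = 1
```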
30 delete “second”?
31 delete “second”?
32 Probability density function: Also called frequency function. a function of a discrete variable whose sum over a discrete set gives the probability of occurrence of a specified value.
33 delete “second”?
34 and: (used to connect [Markush] alternatives [A and B]) (Dictionary.com)
35 spike: an abrupt increase or rise, wherein rise is defined: elevation or increase in rank, fortune, influence, power, etc.., wherein elevation is defined: the height to which something is elevated or to which it rises, wherein elevated is defined: raised up, especially above the ground or above the normal level, wherein above is defined: in or to a higher place than; over. (Dictionary.com)
36 probability density function: Also called: density function. statistics a function representing the relative distribution of frequency of a continuous random variable from which parameters such as its mean and variance can be derived and having the property that its integral from a to b is the probability that the variable lies in this interval. Its graph is the limiting case of a histogram as the amount of data increases and the class intervals decrease in size Compare cumulative distribution function frequency distribution
37 CLAIM INTERPRETATION of “over”: see applicant’s fig. 5 (using the best of my ability of not reading limitations from applicant’s figure 5 into claim 24)
38 over: from one side to the other of; to the other side of; across. (Dictionary.com)
39 over: in excess of; more than. (Dictionary.com)
40 over: above in degree, quantity, etc.. (Dictionary.com)
41 object is CLAIM-SCOPE modified by the adjective “belonging” under the broadest reasonable interpretation of claim 24 consistent with applicant’s specification and drawings.
42 type: class, wherein class is defined: a number of persons or things regarded as forming a group by reason of common attributes, characteristics, qualities, or traits, wherein qualities is defined: character with respect to fineness, or grade of excellence, wherein fineness is defined: the state or quality of being fine. (Dictionary.com)
43 that :: at which : in which : on which : by which : with which : to which (merriam-webster.com)
44 Markush element of Markush alternatives follows: [(A)(B)(C)(D)]
45 and: (used to connect grammatically coordinate words, phrases, or clauses) along or together with; as well as; in addition to; besides; also; moreover. (Dictionary.com): regarding coordinate words see: Nordquist: “Coordinate Adjectives: Definition and Examples”
46 IMPLICIT [[[MARKUSH ELEMENT]]]: and: (used to connect alternatives): [[[(A) cumulative adjective: “fine-object”; (B) coordinate adjective #1: “fine”; (C) coordinate adjective #2: “object”; (D) compound phrase: “fine-object-category”]]] (Dictionary.com)
47 CLAIM SCOPE: “fine object”: applicant’s disclosure:
[0084] While certain features of the described implementations have been illustrated as described herein, many modifications, substitutions, changes and equivalents will now occur to those skilled in the art. It is, therefore, to be understood that the appended claims are intended to cover all such modifications and changes as fall within the scope of the implementations. It should be understood that they have been presented by way of example only, not limitation, and various changes in form
and details may be made. Any portion of the apparatus and/or methods described herein may be combined in any combination, except mutually exclusive combinations. The implementations described herein can include various combinations and/or sub-combinations of the functions, components and/or features of the different implementations described.--, wherein “scope” is defined: Linguistics, Logic. the range of words or elements of an expression (anywhere in applicant’s disclosure) over which a modifier (e.g., a patent examiner) or operator (or a human-reader of this) has control. (Dictioary.com)
48 “fine object” was originally interpreted as a cumulative adjective (“object” itself is further modified as being a fine; however, “fine” and “object” can each be a coordinate adjective (said coordinate words): fine-category/ object-category consistent with applicant’s use of “fine” and “object” in applicant’s disclosure, wherein coordinate is defined: Grammar. of the same rank in grammatical construction, as Jack and Jill in the phrase Jack and Jill, or got up and shook hands in the sentence He got up and shook hands. (Dictionary.com)
49 There are multiple ways to interpret “fine object category” consistent with applicant’s disclosure (Markush alternatives (A)(B)(C)(D) of a Markush element [(A)(B)(C)(D)]):
(A) “category” as modified by cumulative adjectives (“fine object”): fine-object category: the art currently reads on this Markush alternative (A);
(B) “category” as modified by a coordinate adjective number 1 (“fine”): fine-category;
(C) “category” as modified by a coordinate adjective number 2 (“object”): object-category;
(D) “category” being a truncated compound phrase: fine-object-category
50 Since Nielsen teaches the cumulative adjective Markush alternative (A), the Markush element [(A)(B)(C)(D)] is taught under the broadest reasonable interpretation of claim 10.
51 -ed: a suffix forming the past participle of weak verbs (he had crossed the river ), and of participial adjectives indicating a condition or quality resulting from the action of the verb (inflated balloons ).
52 camera: a device for capturing a photographic image or recording a video, using film or digital memory. (Dictionary.com)
53 -ed: a suffix forming the past participle of weak verbs (he had crossed the river ), and of participial adjectives indicating a condition or quality resulting from the action of the verb (inflated balloons ).
54 35 USC 101: IMPROPERLY READING LIMITATIONS from applicant’s disclosure into claim 19: “criterion” is the closest to the disclosed search: threshold (or the disclosed match): applicant’s fig. 4:408: “Is peak of updated beta distribution greater than threshold?”: The word “criterion” does not clearly reflect this disclosed search: i.e., applicant’s fig. 4:408: “Is peak of updated beta distribution greater than threshold?”, wherein threshold is defined (consistent with applicant’s disclosure of “machine learning engine” [0002]): psychol the strength at which a stimulus is just perceived Compare absolute threshold difference threshold, wherein perceived is defined: detected, discerned, or recognized, often without corroboration by others, wherein detect is defined: to discover the existence of, wherein discover is defined: to see, get knowledge of, learn of, find, or find out; gain sight or knowledge of (something previously unseen or unknown), wherein find is defined: to locate, attain, or obtain by search or effort. (Dictionary.com) Is “criterion” a threshold? Not clear that “criterion” is a psychology-“threshold”. Any other claims have “threshold”? Claims 2 and 11: claims 2 and 11 may be statutory under 35 USC 101.
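For context only (not evidence of record), the “peak of updated beta distribution” referenced in applicant’s fig. 4:408 may be written in standard statistical notation; the parameters α and β below are generic beta-distribution parameters, not values taken from applicant’s disclosure:

```latex
% Beta probability density on [0, 1]:
f(x;\alpha,\beta) = \frac{x^{\alpha-1}(1-x)^{\beta-1}}{B(\alpha,\beta)}
% Peak (mode), valid for \alpha > 1 and \beta > 1:
\hat{x} = \frac{\alpha - 1}{\alpha + \beta - 2}
% applicant's fig. 4:408 then asks whether \hat{x} > \text{threshold}
```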
55 compression: adjective: Computers. relating to the process of reducing the storage space required for data by changing its format. (Dictionary.com)
56 adjective
57 “being” (is) essentially means look at a figure (Dictionary.com)
58 adjective
59 compression: adjective: Computers. relating to the process of reducing the storage space required for data by changing its format. (Dictionary.com)
60 ellipses (…) represent claim limitations already taught
61 compression: adjective: Computers. relating to the process of reducing the storage space required for data by changing its format. (Dictionary.com)
62 (italics) represent claim limitations already taught
63 adjective
64 compression: adjective: Computers. relating to the process of reducing the storage space required for data by changing its format. (Dictionary.com)
65 adjective
66 adjective
67 compression: adjective: Computers. relating to the process of reducing the storage space required for data by changing its format. (Dictionary.com)
68 ellipses (…) represent claim limitations already taught
69 compression: adjective: Computers. relating to the process of reducing the storage space required for data by changing its format. (Dictionary.com)
70 (italics) represent claim limitations already taught
71 adjective
72 compression: adjective: Computers. relating to the process of reducing the storage space required for data by changing its format. (Dictionary.com)
73 adjective
74 adjective
75 compression: The re-encoding of data (usually the binary data used by computers) into a form that uses fewer bits of information than the original data. Compression is often used to speed the transmission of data such as text or visual images, or to minimize the memory resources needed to store such data. (Dictionary.com)
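For context only (illustrative arithmetic with assumed file sizes and link bandwidth, none of which are of record), the “speed the transmission” property recited in the definition above follows directly from transmission time being proportional to data size:

```latex
% Assume a 12 MB raw image, a 1.2 MB compressed image, and a 10 MB/s link:
t_{\text{raw}} = \frac{12\,\mathrm{MB}}{10\,\mathrm{MB/s}} = 1.2\,\mathrm{s},
\qquad
t_{\text{compressed}} = \frac{1.2\,\mathrm{MB}}{10\,\mathrm{MB/s}} = 0.12\,\mathrm{s}
```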