DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Terminal Disclaimer
The terminal disclaimer filed on 1/26/2026 disclaiming the terminal portion of any patent granted on this application which would extend beyond the expiration date of Patent No. 12,167,060 has been reviewed and is accepted. The terminal disclaimer has been recorded.
Response to Arguments
Applicant's arguments filed 1/26/2026 have been fully considered but they are not persuasive.
Applicant has amended the claims to recite “wherein the perceptual value comprises a vector representing the content item in a perceptual space”. The Examiner maintains that the claimed perceptual value is taught by the values presented in Figure 1, wherein table 26a and Paragraph 0084 teach determining class label and score vectors. Further, Paragraph 0085 clearly indicates that these two vectors represent the perceptual content in a video scene and a value between 0 and 1 indicating the probability that a particular shot includes content associated with a predefined scene class. Therefore, table 26a clearly represents a perceptual value comprising vectors that represent the content item (the video scene) in a perceptual space (the viewer seeing a mountain or sky). The rejection has been updated below.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 5-11 and 13-19 are rejected under 35 U.S.C. 103 as being unpatentable over Dunlop et al. (U.S. Patent Application Publication 2013/0259375) in view of Strauss et al. (U.S. Patent No. 10,616,255) in further view of Duffy et al. (U.S. Patent No. 11,496,286).
Referring to claim 1, Dunlop discloses receiving, by one or more processors, a set of content items from a device (see Paragraph 0082 for receiving a video frame within a plurality of received video frames).
Dunlop also discloses determining, by the one or more processors, an affinity value for each content item in the set of content items, wherein the affinity value represents a user preference for each content item (see Paragraph 0086 for determining a threshold value that is based on a system operator/user’s preference/affinity to classify scenes in a video content item).
Dunlop also discloses determining, by the one or more processors, a classification code for each content item in the set of content items based on a perceptual value, wherein the perceptual value comprises a vector representing the content item in a perceptual space (see Figure 1, table 26a and Paragraph 0084 for determining which class label, representing a group of content relating to mountains or indoor scenes, is to be used to create the affinity vector described below, based on the affinity value/threshold; further note table 26a in Figure 1 for the correlation between the class labels and scores 32, and further note the Examiner’s rebuttal above).
Dunlop also discloses creating, by the one or more processors, an affinity vector relating the affinity value to the classification code for each content item in the set of content items (see Paragraphs 0087-0088 and Figure 1 for creating an affinity vector (shot #1) in table 26b using the class labels and threshold value).
Dunlop also discloses transmitting, by the one or more processors, the affinity vector to a server (see Paragraph 0089 and Figure 1 for transmitting the affinity vectors in table 26b from server 10 to server 12).
Dunlop fails to teach obscuring, by the one or more processors, the affinity vector by performing one or more differential privacy operations.
Duffy discloses obscuring, by the one or more processors, the affinity vector by performing one or more differential privacy operations (see Column 3, Lines 25-28).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention, to modify the affinity vector creation and transmission process, as taught by Dunlop, by incorporating the differential privacy functionality, as taught by Duffy, for the purpose of using user data to generate data sets for use on a client device, while protecting user privacy (see Column 1, Lines 16-18 of Duffy).
Referring to claim 2, Dunlop discloses that the user preference is indicated by a user interaction with a respective content item of the set of content items, wherein the user interaction represents an indication of like or dislike of the respective content item (see Paragraph 0086 for determining a threshold value that is based on a system operator/user’s preference/affinity to classify scenes in a video content item).
Referring to claim 3, Dunlop discloses that the perceptual value represents a projection of each content item onto a lower-order perceptual space (see scores 32 in Figure 1, wherein the scores represent a projection of a content item (Indoor) onto a lower-order perceptual space (e.g., a score of 0.01, which is a lower-order score)).
Referring to claim 5, Dunlop discloses that creating the affinity vector further comprises generating a code-affinity data set that indicates an amount of user affinity for each of the set of content items having an associated classification code (see Figure 1 for the shots in table 26b, wherein each shot contains one or more classifications, such as sky, snow and forest, that read on a code-affinity data set indicating an amount of user affinity for each content item having an associated classification code).
Referring to claim 6, Dunlop discloses that the code-affinity data set includes a vector length of k, wherein k represents a number of individual classification codes (see Figure 1 for the shots in table 26b, which contain multiple classifications, wherein k is a length based on how many classifications are included in each shot).
Referring to claim 7, Dunlop also discloses creating, by the one or more processors, one or more content-type specific affinity vectors and combining, by the one or more processors, at least two of the one or more content-type specific affinity vectors to concatenate different content-types (see table 26b, wherein the classes indicate content-type specific affinity vectors and two or more of the content-type specific affinity vectors (Sky, Snow, Forest) are combined in shot #2).
Referring to claim 8, Duffy discloses adding, by the one or more processors, noise to the affinity vector to prevent recognition of individual content items (see Column 8, Lines 58-63).
Referring to claims 9-11 and 13-16, see the rejection of claims 1-3 and 5-8 respectively.
Referring to claims 17-19, see the rejection of claims 1-3, respectively.
Claims 4, 12 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Dunlop et al. (U.S. Patent Application Publication 2013/0259375) in view of Duffy et al. (U.S. Patent No. 11,496,286) in further view of Chaddha (U.S. Patent No. 6,404,923).
Referring to claim 4, Dunlop and Duffy disclose all of the limitations of claim 1, but fail to teach determining the classification code based on the perceptual value using a codebook.
Chaddha discloses determining a classification code based on a perceptual value using a codebook (see Column 3, Line 33 through Column 4, Line 17 and Column 5, Lines 6-10).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention, to modify the classification system, as taught by Dunlop and Duffy, using the codebook, as taught by Chaddha, for the purpose of providing a more efficient image classification technique (see Column 2, Lines 37-38 of Chaddha).
Referring to claims 12 and 20, see the rejection of claim 4.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JASON P SALCE whose telephone number is (571)272-7301. The examiner can normally be reached 5:30am-10:00pm M-F (Flex Schedule).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Nathan Flynn can be reached at 571-272-1915. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Jason Salce/Senior Examiner, Art Unit 2421
Jason P Salce
Senior Examiner
Art Unit 2421
February 23, 2026