DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
2. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
3. Claims 21-24, 28-31 and 35-38 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Parkard et al. (US Patent No. 10,433,030 B2).
In considering claim 21, Parkard et al. discloses all the claimed subject matter. Note:
1) the claimed receiving, by one or more processors, a highlight request and one or more user parameters from a user device corresponding to a user is met by step 422, in which user 250 requests a customized highlight sequence, and step 410, which obtains personal characteristics of user 250 and can take place after step 422 (Fig. 4A, col. 20, line 13 to col. 21, line 53);
2) the claimed in response to receiving the highlight request, analyzing, by the one or more processors, a video feed based on the one or more user parameters to identify one or more highlights is met by step 412, which determines the length of time available for the customized highlight sequence (Fig. 4A, col. 21, line 32 to col. 22, line 3);
3) the claimed wherein the analyzing includes: generating, by the one or more processors, a set of excitement levels for one or more events within each of the one or more highlights is met by step 413, which determines dynamic excitement levels for the selected events (Fig. 4A, col. 21, line 32 to col. 22, line 3);
4) the claimed assigning, by the one or more processors, a highlight length for each of the one or more highlights based on the set of excitement levels for the one or more events within each of the one or more highlights is met by the disclosure that, once occurrences have been selected 415, a determination is made 416 as to the start/end times for highlights that include the selected occurrences, and a change in excitement level may be used to determine suitable start/end points for the highlight (Fig. 4A, col. 22, line 4 to col. 23, line 46); and
5) the claimed storing, by the one or more processors, the one or more highlights on the user device is met by the highlights from event content 259, which is stored at client-based storage device 258 (Figs. 2A-2B, col. 14, line 13 to col. 15, line 35).
In considering claim 22, the claimed computer-implemented method further comprising: determining, by the one or more processors, one or more additional user parameters, wherein the determining is based on observed user behavior or one or more observed actions is met by the disclosure that personal characteristics can be used to determine what teams, sports, leagues, players, games, television programs, online content streams, other relevant digital assets, etc. user 250 may be interested in, so as to provide a customized highlight sequence according to such interests (Fig. 4A, col. 20, line 13 to col. 21, line 53).
In considering claim 23, the claimed wherein the observed user behavior or the one or more observed actions include one or more website visitation patterns, one or more television watching patterns, one or more music listening patterns, one or more online purchases, one or more previous highlight identification parameters, one or more previous highlights, or metadata viewed by the user is met by the tracking of behavior such as, for example, website visitation patterns, viewing patterns, purchasing patterns, movement/travel, communications (inbound and/or outbound), and/or the like (Fig. 4A, col. 20, line 13 to col. 21, line 53).
In considering claim 24, the claimed wherein the one or more user parameters include an event, a game, a team, an available time amount, or requested metadata is met by the identification of team, player, and sport affinity from profiles of user 250, such as from a profile created and maintained by a social network or other third-party source (Fig. 4A, col. 20, line 13 to col. 21, line 53).
Claims 28-31 are rejected for the same reason as discussed in claims 21-24, respectively.
Claims 35-38 are rejected for the same reason as discussed in claims 21-24, respectively.
Claim Rejections - 35 USC § 103
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. Claims 25-27, 32-34 and 39-40 are rejected under 35 U.S.C. 103 as being unpatentable over Parkard et al. (US Patent No. 10,433,030 B2) in view of Relyea et al. (US Patent No. 8,522,300 B2).
In considering claim 25, Parkard et al. discloses all the limitations of the instant invention as discussed in claim 21 above, except for the claimed wherein the analyzing the video feed based on the one or more user parameters to identify the one or more highlights includes: extracting, by the one or more processors, metadata corresponding to the one or more highlights. Relyea et al. teaches that feature extractor 130 may receive the media content (and any metadata provided by providers 110, 112, 114), perform operations to extract the relevant metadata, and populate fantasy highlights database 142 with event information (Fig. 2, col. 5, line 52 to col. 6, line 44). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the extracting unit as taught by Relyea et al. into Parkard et al.'s system in order to allow the user to view the video in a shorter time by watching only the scenes in which events appear.
In considering claim 26, the claimed computer-implemented method further comprising: compiling, by the one or more processors, a subset of the one or more highlights and the corresponding extracted metadata into a highlight show is met by feature extractor 130, which may receive the media content (and any metadata provided by providers 110, 112, 114), perform operations to extract the relevant metadata, and populate fantasy highlights database 142 with event information, and by highlights manager 172, which generates highlight messages for distribution to users (Fig. 2, col. 5, line 52 to col. 6, line 44 of Relyea et al.).
The motivation to combine the references has been discussed in claim 25 above.
In considering claim 27, the claimed computer-implemented method further comprising: outputting, by the one or more processors, the highlight show to a display of the user device is met by the disclosure that the highlight sequence is presented 418 to user 250, which is done by displaying a video (with accompanying audio) to user 250 containing the highlight sequence (Fig. 4A, col. 23, line 64 to col. 24, line 13 of Parkard et al.).
The motivation to combine the references has been discussed in claim 25 above.
Claims 32-34 are rejected for the same reason as discussed in claims 25-27, respectively.
Claims 39-40 are rejected for the same reason as discussed in claims 25-26, respectively.
Conclusion
6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Smith et al. (US Patent No. 10,602,235 B2) discloses video segment detection and replacement.
7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRANG U TRAN whose telephone number is (571)272-7358. The examiner can normally be reached M-F 10:00 AM - 6:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, JOHN W. MILLER can be reached on 571-272-7353. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
January 7, 2026
/TRANG U TRAN/Primary Examiner, Art Unit 2422