DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 02/04/2025 is being considered by the examiner.
Oath/Declaration
Applicant is advised to file an oath or declaration for the instant application.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-3, 8-9, 11-12, 14-15, and 20-22 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Xu et al. (US Patent Publication No. 2024/0205507, hereinafter "Xu").
Regarding Claim 1, Xu discloses a method for displaying a live stream page, comprising:
displaying a live stream page, the live stream page comprising a first area and a second area, wherein the first area is configured to display a live stream video [Figure 2];
displaying a target interaction component in the second area based on content of the live stream video displayed in the first area [Figure 2 & 3 first identifier 201, such as the virtual icon “Viewing angle”];
and in response to a triggering operation for the target interaction component, performing a component function corresponding to the target interaction component [0081; in response to the user clicking on the first identifier 201, such as the virtual icon “Viewing angle”, on the first video livestreaming page. The plurality of second identifiers 30 may be a viewing angle identifier 301, a viewing angle identifier 302, etc., namely, Viewing angle 1, Viewing angle 2, . . . , and Viewing angle N. The viewing angle identifiers such as Viewing angle 1, Viewing angle 2 to Viewing angle N correspond one by one to the above-mentioned plurality of livestreaming video streams with different viewing angles].
Regarding Claims 2, 14, and 22, Xu discloses a method, an electronic device and a non-transitory computer-readable storage medium wherein displaying the target interaction component in the second area based on the content of the live stream video displayed in the first area comprises:
obtaining live stream information, the live stream information representing the content of the live stream video currently displayed in the live stream page;
determining at least one target interaction component based on the live stream information;
displaying the at least one target interaction component in the second area [Figure 7 & 0111-0113].
Regarding Claims 3 and 15, Xu discloses a method and an electronic device, wherein the live stream information comprises live stream event information, the live stream event information represents a live stream event occurring in the live stream video and determining the at least one target interaction component based on the live stream information comprises:
determining the target interaction component based on a target live stream event corresponding to the live stream event information in the live stream information [0084; As shown in FIG. 4, the smartphone switches to the second livestreaming video stream corresponding to “Viewing angle 2” in response to the user clicking on the target identifier such as “Viewing angle 2” on the first video livestreaming page, thus implementing to switch to play the second video corresponding to the second livestreaming video stream in the first livestreaming room 202, i.e., switching from a video image of one viewing angle to a video image of another viewing angle, for example, switching from a video image of a main stage in the party to a close-up video image of a certain audience at the auditorium below the stage. The user can of course choose other viewing angle identifiers by clicking according to personal needs].
Regarding Claims 8 and 20, Xu discloses a method and an electronic device,
wherein, after determining the at least one target interaction component based on the live stream information, the method further comprises:
determining a fixation manner of the target interaction component based on the live stream information, wherein the fixation manner is a resident state or an adjustable state; and
in response to a component movement operation, adjusting a position of the target interaction component in the adjustable state within the second area [Figures 2-3; in response to the user clicking on the first identifier 201, such as the virtual icon “Viewing angle”, viewing angle identifiers 301, 302, etc., appear].
Regarding Claims 9 and 21, Xu discloses a method and an electronic device, wherein displaying the target interaction component in the second area based on the content of the live stream video comprises:
displaying a preview component in the second area based on the content of the live stream video [Figure 4]; and
in response to a triggering operation on the preview component, displaying the at least one target interaction component in the second area [0081].
Regarding Claim 11, Xu discloses an electronic device [Figure 10], comprising:
a processor, and a memory communicatively connected to the processor [Figure 10];
the memory storing computer-executable instructions [Figure 10];
the processor executing the computer-executable instructions stored in the memory to implement a method for displaying a live stream page comprising [0064]:
displaying a live stream page, the live stream page comprising a first area and a second area, wherein the first area is configured to display a live stream video [Figure 2];
displaying a target interaction component in the second area based on content of the live stream video displayed in the first area [Figure 2 & 3 first identifier 201, such as the virtual icon “Viewing angle”];
and in response to a triggering operation for the target interaction component, performing a component function corresponding to the target interaction component [0081; in response to the user clicking on the first identifier 201, such as the virtual icon “Viewing angle”, on the first video livestreaming page. The plurality of second identifiers 30 may be a viewing angle identifier 301, a viewing angle identifier 302, etc., namely, Viewing angle 1, Viewing angle 2, . . . , and Viewing angle N. The viewing angle identifiers such as Viewing angle 1, Viewing angle 2 to Viewing angle N correspond one by one to the above-mentioned plurality of livestreaming video streams with different viewing angles].
Regarding Claim 12, Xu discloses a non-transitory computer-readable storage medium, wherein the computer-readable storage medium stores computer-executable instructions, the computer-executable instructions, when executed by a processor, implement a method for displaying a live stream page comprising [0045]:
displaying a live stream page, the live stream page comprising a first area and a second area, wherein the first area is configured to display a live stream video [Figure 2];
displaying a target interaction component in the second area based on content of the live stream video displayed in the first area [Figure 2 & 3 first identifier 201, such as the virtual icon “Viewing angle”];
and in response to a triggering operation for the target interaction component, performing a component function corresponding to the target interaction component [0081; in response to the user clicking on the first identifier 201, such as the virtual icon “Viewing angle”, on the first video livestreaming page. The plurality of second identifiers 30 may be a viewing angle identifier 301, a viewing angle identifier 302, etc., namely, Viewing angle 1, Viewing angle 2, . . . , and Viewing angle N. The viewing angle identifiers such as Viewing angle 1, Viewing angle 2 to Viewing angle N correspond one by one to the above-mentioned plurality of livestreaming video streams with different viewing angles].
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 4 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Xu et al. (US Patent Publication No. 2024/0205507) in view of Anders et al. (US Patent Publication No. 2020/0128286, hereinafter "Anders").
Regarding Claims 4 and 16, Xu discloses the method and electronic device of claims 2 and 14, respectively; however, Xu fails to disclose determining a target number of times for triggering the target interaction event in the live streaming corresponding to the live stream page.
In an analogous art, Anders discloses a method and electronic device wherein the live stream information further comprises live streaming information, the live streaming information represents a number of times a target interaction event is triggered in the live streaming [Figure 2];
and determining the target interaction component based on the target live stream event corresponding to the live stream event information in the live stream information comprises:
determining, based on the live streaming information in the live stream information, a target number of times for triggering the target interaction event in the live streaming corresponding to the live stream page [0027; product search program 200 monitors a user interface of a mobile device and detects a user interaction of the user clicking a “like” button during a product review of a blogger on a social media broadcast occurring on a web browser. Accordingly, product search program 200 determines that a user interaction with the live streaming event is occurring];
determining the target interaction component based on the target number of times for triggering the target interaction event and the target live stream event corresponding to the live stream event information [0032; present an option to purchase the identified product within the captured portion of the live streaming event of audio/video source that the user liked].
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Xu and Anders in order to identify a first product that is depicted in the captured portion of the live stream that corresponds to the user interaction [Anders 0005].
Claims 5-7 and 17-19 are rejected under 35 U.S.C. 103 as being unpatentable over Xu et al. (US Patent Publication No. 2024/0205507) in view of Taylor et al. (US Patent No. 10,440,436, hereinafter "Taylor").
Regarding Claims 5 and 17, Xu discloses the method and electronic device of claims 2 and 14, respectively; however, Xu fails to disclose determining a priority of the respective target interaction component based on the live stream information.
In an analogous art, Taylor discloses a method and an electronic device wherein the method further comprises:
determining a priority of the respective target interaction component based on the live stream information [Figure 5 box 524];
and determining presentation information of the respective target interaction component based on the priority of the respective target interaction component, the presentation information representing a display position of the target interaction component in the second area, and/or a time sequence for displaying the target interaction component within a target time period [Col. 11, lines 50-65; time-based association of items].
Therefore, it would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, to combine the teachings of Xu and Taylor in order to synchronize interactive content with a live video stream [Taylor Abstract].
Regarding Claims 6 and 18, the combination of Xu and Taylor discloses a method and an electronic device wherein the live stream information comprises a streaming duration of the live stream video;
and determining the priority of the respective target interaction component based on the live stream information comprises:
determining the priority of the respective target interaction component based on the streaming duration [Taylor Figure 6B and Col. 14, lines 18-30; In box 618, the content access application 263 generates a timeline user interface showing the items featured in each video segment 242 of the live video stream 103. For purposes of the timeline user interface, the segments used in some instances may correspond to larger shopping segments that span multiple video segments 242].
Regarding Claims 7 and 19, the combination of Xu and Taylor discloses a method and an electronic device wherein after determining the at least one target interaction component based on the live stream information [Taylor Figure 7], the method further comprises:
determining a display mode of the target interaction component based on the live stream information, wherein the display mode is an embedded display or a suspension display [Taylor Col. 3 lines 5-25; The player interface 102 may include various player controls 106 that may allow a viewer to jump to an earlier point in the live video stream, pause the live video stream, stop the live video stream, adjust the volume of the live video stream, and so on. One or more graphical overlays 109 may be superimposed over a portion of the frame of the live video stream, where a selection of a graphical overlay 109 may cause an interactive action relative to one or more items to be performed];
displaying the at least one target interaction component in the second area, comprises:
displaying the respective target interaction component in the second area based on the display mode of the respective target interaction component [Taylor Figure 3D].
Relevant Prior Art
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Casassovici US Patent Publication No. 2022/0072419 – A method for providing a real-time interactive platform for live streams is provided, the method comprising steps: (a) providing a server having a hardware processor connected to a data repository, the server configured to host an overlay extension for a live stream via the Internet providing the real-time interactive platform for a plurality of users; (b) providing a data collector within an oracle, wherein the oracle is configured to interface the live stream to obtain statistics from the live stream; (c) generating timestamped events, via the data collector, wherein the events describe what is occurring on the live stream in real-time; and, (d) turning the events into gameplay interactions, via a game logic algorithm, for the plurality of users.
Chang et al. US Patent Publication No. 2024/0098331 - receiving downstream data including a first video stream and first content information, the first video stream being transmitted from a first device and the first content information describing a graphical object; receiving first interaction information indicating a first user interaction feedback to the graphical object at the first device, generating a first content texture including the graphical object, an appearance of the graphical object being determined using the first content information and the first user interaction feedback included in the first interaction information; generating a video frame including the first content texture and a second content texture, the second content texture being generated from the first video stream; displaying the video frame at the first device and a second device; receiving a second user interaction feedback to the video frame at the second device; and transmitting second interaction information indicating the user interaction feedback.
Cutaia et al. US Patent Publication No. 2024/0284017 - an interaction encoder for transmitting interactive live video streams, including a processor, and a memory, containing an interaction encoding application, where the interaction encoding application directs the processor to obtain media data comprising a video track, receive a request to generate an interaction object in the video track at a given timestamp of the video stream, generate interaction data based on the request, where the interaction data includes a data structure capable of being decoded in near-real time to produce the interaction object in the video track at the timestamp when played back, and transmit the interaction data to an interaction decoder.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SAHAR A RIAZ whose telephone number is (571)270-3005. The examiner can normally be reached M-F 9 am to 5 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Benjamin Bruckart, can be reached at 571-272-3982. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SAHAR AQIL RIAZ/Examiner, Art Unit 2424