DETAILED ACTION
1. This Office Action is sent in response to Applicant’s communication received on 01/09/2025 for application number 19/014,411. The Office hereby acknowledges receipt of the following items, which have been placed of record in the file: Specification, Drawings, Abstract, Oath/Declaration, and Claims.
Notice of Pre-AIA or AIA Status
2. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
3. Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed.
Information Disclosure Statement
4. The information disclosure statements (IDS) submitted on 01/09/2025 and 01/21/2025 are in accordance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Rejections - 35 USC § 103
5. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
6. Claims 1-4, 7-12, and 15-18 are rejected under 35 U.S.C. 103 as being unpatentable over Kitaya et al. [US Pub. No.: 2019/0208142 A1] in view of Suzuki [US Pub. No.: 2014/0270692 A1].
Re. Claim 1, Kitaya et al. [US Pub. No.: 2019/0208142 A1] discloses:
A display terminal [display unit 100 for displaying still images and moving images |0023, 0027, Fig. 1C] comprising circuitry configured to: receive an instruction to play back a wide-view image having a wide angle of view [the display is configured to play back wide-field-view images |0023, 0027-0029, 0035], the wide-view image being a moving image [the wide-field images to be displayed include still and moving images |0027] and having been recorded for distribution to a communication terminal [The communication unit 54 transmits and receives video and audio signals captured by the imaging units 22a and 22b and images recorded on the recording medium 90, |0031, 0042], in response to the instruction to play back the wide-view image [The mode change switch 60 switches an operation mode of the system control unit 50 to a moving image playback mode, |0035], control a display to display a predetermined-area image representing a predetermined area of the wide-view image [During playback a control unit can perform clip processing on moving images; for example, the range of a frame 901, which is a partial range in FIG. 9A (1), is enlarged and displayed on the display 205. |Abstract, Fig. 7, Figs. 9A-9C, 0029, 0137], based on point-of-view information for specifying the predetermined area of the wide-view image having been distributed and displayed at the communication terminal [The orientation detection unit 213 detects the orientation of the display control device 200 with respect to the direction of gravity, and the tilts of the orientation with respect to the yaw, roll, and pitch axes. Whether the display control device 200 is held landscape, held portrait, directed upward, directed downward, or obliquely oriented can be determined based on the orientation detected by the orientation detection unit 213. For example, the orientation of the display control device 200 also changes.
For VR display, the orientation detection unit 213 detects the change in the orientation of the display control device 200, and the CPU 201 performs VR display processing based on the change in orientation. The Examiner interprets “the orientation detection unit 213 detects the orientation of the display control device 20…” as equivalent to Applicant’s point-of-view information as described in paragraphs 0089 and 0151 of Applicant’s specification. |0028, 0054, 0063], receive a screen operation on the predetermined-area image being displayed [The terminal device 2 includes a touch panel as an input unit |0091; see also: the information processing system 1 accepts an input (touch input) on the screen (on the touch panel 12) where a panoramic video is displayed. In FIG. 4, a user can input a comment by handwriting directly on the screen where a panoramic video is displayed. |0108], the screen operation indicating a shift in virtual point of view in the wide-view image [a position detection unit for detecting a position at which an input has been made on a predetermined input surface |0091], and change display of the predetermined-area image [Figs. 9A-9C, the area of the displayed image is changed], to another predetermined-area image representing a predetermined area of the wide-view image based on another point-of-view information for specifying the shifted virtual point of view [Figs. 9A-9C, the area of the displayed image is changed; see also: FIG. 9A (2) illustrates a display example in a case where the range of a frame 901, which is a partial range in FIG. 9A (1), is enlarged and displayed on the display 205],
Kitaya does not distinctly disclose:
wherein the communication terminal includes a first communication terminal and a second communication terminal, the predetermined-area image includes: a first predetermined-area image representing a first predetermined area of the wide-view image being displayed at the first communication terminal; and a second predetermined-area image representing a second predetermined area of the wide-view image being displayed at the second communication terminal, and the circuitry is further configured to display the first predetermined-area image and the second predetermined-area image on a single screen.
However, in the same field of endeavor, Suzuki [US Pub. No.: 2014/0270692 A1] discloses:
wherein the communication terminal includes a first communication terminal [Fig. 2, terminal device 2 is a communication terminal; see also 0105] and a second communication terminal [where one panoramic video is played simultaneously on a plurality of display devices, the plurality of display devices is equivalent to a plurality of communication devices |0105], the predetermined-area image includes: a first predetermined-area image representing a first predetermined area of the wide-view image being displayed at the first communication terminal [While a panoramic video is played, the information processing system 1 accepts an input of a comment on the panoramic video displayed on the terminal device 2. FIG. 4 shows an example of an operation for a user to input a comment. In the present embodiment, the information processing system 1 accepts an input (touch input) on the screen (on the touch panel 12) where a panoramic video is displayed. |0108];
and a second predetermined-area image representing a second predetermined area of the wide-view image being displayed at the second communication terminal, and the circuitry is further configured to display the first predetermined-area image and the second predetermined-area image on a single screen [While a panoramic video is played, the information processing system 1 accepts an input of a comment on the panoramic video displayed on the terminal device 2. FIG. 4 shows an example of an operation for a user to input a comment. In the present embodiment, the information processing system 1 accepts an input (touch input) on the screen (on the touch panel 12) where a panoramic video is displayed. |0108].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the display terminal of Kitaya to incorporate the teachings of Suzuki and arrive at the claimed invention.
Re. Claim 2, Kitaya discloses:
The display terminal according to claim 1, wherein the circuitry is further configured to receive the point-of-view information from the communication terminal [The communication unit 54 is connected wirelessly or by a wired cable, and transmits and receives video signals and audio signals… wherein a detected orientation is included in a received video signal |Fig. 2B el 213, Fig.1C el 54, 0042].
Re. Claim 3, the rejection of claim 1 is incorporated herein.
Suzuki meets the claim limitations, as follows:
The display terminal according to claim 1, wherein the first predetermined-area image and the second predetermined-area image displayed on the single screen have a same elapsed time in playback of the moving image [In FIG. 24, a panoramic video 56 of a display range determined based on history information is displayed on the monitor 4, together with a panoramic video 55 of a display range such that the viewing direction of the virtual camera 22 is facing in the front direction.].
Re. Claim 4, the rejection of claim 1 is incorporated herein.
Suzuki meets the claim limitations, as follows:
The display terminal according to claim 1, wherein the circuitry is further configured to display one of the first predetermined-area image and the second predetermined-area image in a larger size than the other of the first predetermined-area image and the second predetermined-area image [In FIG. 24, a panoramic video 56 of a display range determined based on history information is displayed on the monitor 4, together with a panoramic video 55, wherein video is enlarged].
Re. Claim 7, Kitaya discloses:
The display terminal according to claim 1, wherein the wide-view image has a viewing angle in a wider range than a display range that is displayable on the display at a time [FIG. 9A (1), is enlarged and displayed on the display 205… The Examiner interprets an enlarged range displayed at a time as equivalent to a wider range displayed at a time, based on Applicant’s specification at paragraph 0407 |Fig. 9A, 0133].
Re. Claim 8, Kitaya discloses:
The display terminal according to claim 7, wherein the wide-view image includes a spherical image in equirectangular projection, an omnidirectional image, a hemispherical image, a three-dimensional panoramic image, a two-dimensional panoramic image, or a virtual reality (VR) image [VR images include an omnidirectional image (entire celestial sphere image) captured by an omnidirectional camera (entire celestial sphere camera), and a panoramic image having a video range (effective video range) wider than a display range that can be displayed on a display unit at a time |0027].
Re. Claim 9, Kitaya discloses:
A displaying [display unit 100 for displaying still images and moving images |0023, 0027, Fig. 1C] method comprising: receiving an instruction to play back a wide-view image having a wide angle of view [the display is configured to play back wide-field-view images |0023, 0027-0029, 0035], the wide-view image being a moving image [the wide-field images to be displayed include still and moving images |0027] and having been recorded for distribution to a communication terminal [The communication unit 54 transmits and receives video and audio signals captured by the imaging units 22a and 22b and images recorded on the recording medium 90, |0031, 0042];
in response to the instruction to play back the wide-view image [The mode change switch 60 switches an operation mode of the system control unit 50 to a moving image playback mode, |0035], displaying, on a display, a predetermined-area image representing a predetermined area of the wide-view image [During playback a control unit can perform clip processing on moving images; for example, the range of a frame 901, which is a partial range in FIG. 9A (1), is enlarged and displayed on the display 205. |Abstract, Fig. 7, Figs. 9A-9C, 0029, 0137], based on point-of-view information for specifying the predetermined area of the wide-view image having been distributed and displayed at the communication terminal [The orientation detection unit 213 detects the orientation of the display control device 200 with respect to the direction of gravity, and the tilts of the orientation with respect to the yaw, roll, and pitch axes. Whether the display control device 200 is held landscape, held portrait, directed upward, directed downward, or obliquely oriented can be determined based on the orientation detected by the orientation detection unit 213. For example, the orientation of the display control device 200 also changes. For VR display, the orientation detection unit 213 detects the change in the orientation of the display control device 200, and the CPU 201 performs VR display processing based on the change in orientation. The Examiner interprets “the orientation detection unit 213 detects the orientation of the display control device 20…” as equivalent to Applicant’s point-of-view information as described in paragraphs 0089 and 0151 of Applicant’s specification. |0028, 0054, 0063];
receiving a screen operation on the predetermined-area image being displayed [The terminal device 2 includes a touch panel as an input unit |0091; see also: the information processing system 1 accepts an input (touch input) on the screen (on the touch panel 12) where a panoramic video is displayed. In FIG. 4, a user can input a comment by handwriting directly on the screen where a panoramic video is displayed. |0108], the screen operation indicating a shift in virtual point of view in the wide-view image [a position detection unit for detecting a position at which an input has been made on a predetermined input surface |0091];
changing display of the predetermined-area image [Figs. 9A-9C, the area of the displayed image is changed], to another predetermined-area image representing a predetermined area of the wide-view image based on another point-of-view information for specifying the shifted virtual point of view [Figs. 9A-9C, the area of the displayed image is changed; see also: FIG. 9A (2) illustrates a display example in a case where the range of a frame 901, which is a partial range in FIG. 9A (1), is enlarged and displayed on the display 205],
Kitaya does not distinctly disclose:
wherein the communication terminal includes a first communication terminal and a second communication terminal, the predetermined-area image includes: a first predetermined-area image representing a first predetermined area of the wide-view image being displayed at the first communication terminal; and a second predetermined-area image representing a second predetermined area of the wide-view image being displayed at the second communication terminal, and the circuitry is further configured to display the first predetermined-area image and the second predetermined-area image on a single screen.
However, in the same field of endeavor, Suzuki [US Pub. No.: 2014/0270692 A1] discloses:
wherein the communication terminal includes a first communication terminal [Fig. 2, terminal device 2 is a communication terminal; see also 0105] and a second communication terminal [where one panoramic video is played simultaneously on a plurality of display devices, the plurality of display devices is equivalent to a plurality of communication devices |0105], the predetermined-area image includes: a first predetermined-area image representing a first predetermined area of the wide-view image being displayed at the first communication terminal [While a panoramic video is played, the information processing system 1 accepts an input of a comment on the panoramic video displayed on the terminal device 2. FIG. 4 shows an example of an operation for a user to input a comment. In the present embodiment, the information processing system 1 accepts an input (touch input) on the screen (on the touch panel 12) where a panoramic video is displayed. |0108];
and a second predetermined-area image representing a second predetermined area of the wide-view image being displayed at the second communication terminal, and the circuitry is further configured to display the first predetermined-area image and the second predetermined-area image on a single screen [While a panoramic video is played, the information processing system 1 accepts an input of a comment on the panoramic video displayed on the terminal device 2. FIG. 4 shows an example of an operation for a user to input a comment. In the present embodiment, the information processing system 1 accepts an input (touch input) on the screen (on the touch panel 12) where a panoramic video is displayed. |0108].
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the displaying method of Kitaya to incorporate the teachings of Suzuki and arrive at the claimed invention.
Re. Claim 10, Kitaya discloses:
The displaying method according to claim 9, further comprising receiving the point-of-view information from the communication terminal [The communication unit 54 is connected wirelessly or by a wired cable, and transmits and receives video signals and audio signals… wherein a detected orientation is included in a received video signal |Fig. 2B el 213, Fig.1C el 54, 0042].
Re. Claim 11, the rejection of claim 9 is incorporated herein.
Suzuki meets the claim limitations, as follows:
The displaying method according to claim 9, wherein the first predetermined-area image and the second predetermined-area image displayed on the single screen have a same elapsed time in playback of the moving image [In FIG. 24, a panoramic video 56 of a display range determined based on history information is displayed on the monitor 4, together with a panoramic video 55 of a display range such that the viewing direction of the virtual camera 22 is facing in the front direction.].
Re. Claim 12, the rejection of claim 9 is incorporated herein.
Suzuki meets the claim limitations, as follows:
The displaying method according to claim 9, further comprising displaying one of the first predetermined-area image and the second predetermined-area image in a larger size than the other of the first predetermined-area image and the second predetermined-area image [In FIG. 24, a panoramic video 56 of a display range determined based on history information is displayed on the monitor 4, together with a panoramic video 55, wherein the video is enlarged].
Re. Claim 15, Kitaya discloses:
This claim is interpreted and rejected for the same reasons set forth in claims 1 and 9, including a non-transitory recording medium storing a plurality of instructions which, when executed by one or more processors, cause the processors to perform a displaying method [VR display refers to a display method that can change the display range of a VR image, where a video image within the field-of-view range according to the orientation of the display device is displayed. |0028].
Re. Claim 16, Kitaya discloses:
The non-transitory recording medium according to claim 15, wherein the method further comprises receiving the point-of-view information from the communication terminal [The communication unit 54 is connected wirelessly or by a wired cable, and transmits and receives video signals and audio signals… wherein a detected orientation is included in a received video signal |Fig. 2B el 213, Fig.1C el 54, 0042].
Re. Claim 17, the rejection of claim 15 is incorporated herein.
Suzuki meets the claim limitations, as follows:
The non-transitory recording medium according to claim 15, wherein the first predetermined-area image and the second predetermined-area image displayed on the single screen have a same elapsed time in playback of the moving image [In FIG. 24, a panoramic video 56 of a display range determined based on history information is displayed on the monitor 4, together with a panoramic video 55 of a display range such that the viewing direction of the virtual camera 22 is facing in the front direction].
Re. Claim 18, the rejection of claim 15 is incorporated herein.
Suzuki meets the claim limitations, as follows:
The non-transitory recording medium according to claim 15, wherein the method further comprises displaying one of the first predetermined-area image and the second predetermined-area image in a larger size than the other of the first predetermined-area image and the second predetermined-area image [In FIG. 24, a panoramic video 56 of a display range determined based on history information is displayed on the monitor 4, together with a panoramic video 55, wherein video is enlarged].
Allowable Subject Matter
7. Claims 5-6, 13-14, and 19-20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
US 20200280671 A1: a terminal for mediating a communication between a data generation device that generates target data and a server that controls service content usable with the data generation device, including circuitry configured to acquire, from the data generation device, device identification information identifying the data generation device.
US 20110043663 A1: an image processing unit generates image data of a first image size from original image data of a portion corresponding to a first area within the imaging area. A magnification generation unit generates image data of a second image size from original image data of a portion corresponding to a second area as a partial area within the first area.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HOWARD D BROWN JR whose telephone number is (571)272-4371. The examiner can normally be reached Monday - Friday 7:30AM - 5:00PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sathyanarayanan Perungavoor, can be reached at 571-272-7455. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
HOWARD D. BROWN JR
Primary Examiner
Art Unit 2488
/HOWARD D BROWN JR/Examiner, Art Unit 2488