Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-3 and 6-11 are rejected under 35 U.S.C. 103 as being unpatentable over Hideki et al. (JP 2015-186148) in view of Page (US 2006/0073438).
As per claim 1, Hideki et al. discloses: A video distribution device 200 which distributes a video to a plurality of display control devices 300 { [page 6 of translation, sixth full paragraph] Further, in the multi-view image reproduction data transmission means, the control unit 101 of the multi-view image data management server 200 transmits the multi-view image data to the image reproduction terminals of other users other than the user of the image reproduction terminal 300 via the network 500.}, the video distribution device 200 comprising: a processor 101; and a memory 102 storing a program which, when executed by the processor 101 {figure 3}, causes the video distribution device 200 to acquire a plurality of videos generated by simultaneously performing capturing at different angles from a predetermined place using a plurality of imaging devices {figure 2 & [page 2 of translation, line 13] The multi-viewpoint camera 100a is configured by arranging a large number of cameras around an object to be imaged, and simultaneously images the object to be imaged from various angles. In the case of the present embodiment, as shown in FIG. 2, 24 cameras are arranged at equal angles (15° intervals) on the circumference so as to surround a subject to be photographed (center position in FIG. 2).}, distribute the plurality of videos as one set to the plurality of display control devices 300 { [page 6 of translation, second full paragraph] Accordingly, the control unit 101 of the multi-viewpoint image data management server 200 performs content distribution only to an information processing terminal having the same user group ID as the user of the image reproduction terminal 300.}, receive, from a first display control device, display state information indicating display states of the plurality of videos included in the one set {figure 5}, and transmit the display state information to a second display control device in response to a request from the second display control device different from the first display control device { [page 5 of translation, paragraph 4] Multi-view image data is transmitted to the playback terminal 300 (multi-view image data transmitting means).}, wherein the second display control device displays the plurality of videos included in the one set on a screen simultaneously, according to the display state information { [page 6 of translation, full paragraphs 6 & 7] Further, in the multi-view image reproduction data transmission means, the control unit 101 of the multi-view image data management server 200 transmits the multi-view image data to the image reproduction terminals of other users other than the user of the image reproduction terminal 300 via the network 500. And the camera work information may be transmitted. Thereby, another user can reproduce a multi-viewpoint image with the same camera work as the user of the image reproduction terminal 300, and can share content among users.}.
Regarding claim 1, Hideki et al. is silent as to: wherein the display arrangement information includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information in order to change display states of the plurality of videos in accordance with the position value included in the display arrangement information. With respect to claim 1, Page discloses: [0057] In a set of embodiments, the method 400 further comprises saving the position of the aimpoint (block 420). In some embodiments, the position of the aimpoint is saved in an aimpoint history log, perhaps along with a chronological identifier (which might be a time stamp, frame number, etc.), which can allow for analysis of aimpoint position over a period of time. The procedures at blocks 405-420 then might be repeated, thereby capturing a plurality of video frames and/or developing a historical record of aimpoint positions in each of the plurality of video frames.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to provide the video distribution device of Hideki et al. with display arrangement information that includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information, in order to display and change positions of the plurality of videos in accordance with the position value included in the display arrangement information, as taught by Page. The rationale is as follows: one of ordinary skill in the art would have been motivated to make this modification so as to improve “tracking and/or identification”. See [0071] of Page.
As per claim 2, Hideki et al. discloses: The video distribution device 200 according to claim 1, wherein the second display control device requests one of a plurality of display state information, and the requested display state information is transmitted to the second display control device. { [page 7 of translation, full paragraph 7] The control unit 301 of the image playback terminal 300 requests the multi-view image data management server 200 to receive multi-view image data, and after user authentication in the multi-view image data management server 200, the multi-view image data management server 200 receives the multi-view image data. Image data is received (multi-view image data receiving means), and the received multi-view image data is stored in the storage unit 302 (multi-view image data storage means). }
As per claim 3, Hideki et al. discloses: The video distribution device 200 according to claim 1, wherein, when the program is executed by the processor 101, the program further causes the video distribution device 200 to count a number of times of the request from the second display control device for each of a plurality of display state information { [the paragraph bridging pages 3 & 4 of translation] The control unit 101 of the control terminal 100b stores the generated multi-viewpoint image data in the storage unit 102 (multi-viewpoint image data storage unit). The total number of still images included in the stored multi-viewpoint image data is, for example, 24 (number of cameras) × 100 (number of time series) = 2400 if the number of time series is 100. When the multi-viewpoint camera 100a captures a still image (still picture), the number of time series is 1, so 24 (number of cameras) × 1 (number of time series) = 24.}.
As per claim 6, Hideki et al. discloses: A display control device comprising: a processor 101; and a memory 102 storing a program which, when executed by the processor 101 {figure 3}, causes the display control device to receive from a video distribution device 200 a plurality of videos as one set, the plurality of videos being generated by simultaneously performing capturing at different angles from a predetermined place using a plurality of imaging devices {figure 2 & [page 2 of translation, line 13] The multi-viewpoint camera 100a is configured by arranging a large number of cameras around an object to be imaged, and simultaneously images the object to be imaged from various angles. In the case of the present embodiment, as shown in FIG. 2, 24 cameras are arranged at equal angles (15° intervals) on the circumference so as to surround a subject to be photographed (center position in FIG. 2).}, perform control to display the plurality of videos on a screen simultaneously, change display states of the plurality of videos included in the one set in response to a user's operation, generate, for the plurality of videos included in the one set, display state information indicating the changed display states { [page 6 of translation, second full paragraph] Accordingly, the control unit 101 of the multi-viewpoint image data management server 200 performs content distribution only to an information processing terminal having the same user group ID as the user of the image reproduction terminal 300.}, and transmit the generated display state information to the video distribution device 200 { [page 5 of translation, paragraph 4] Multi-view image data is transmitted to the playback terminal 300 (multi-view image data transmitting means).}.
Regarding claim 6, Hideki et al. is silent as to: wherein the display arrangement information includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information in order to change display states of the plurality of videos in accordance with the position value included in the display arrangement information. With respect to claim 6, Page discloses: [0057] In a set of embodiments, the method 400 further comprises saving the position of the aimpoint (block 420). In some embodiments, the position of the aimpoint is saved in an aimpoint history log, perhaps along with a chronological identifier (which might be a time stamp, frame number, etc.), which can allow for analysis of aimpoint position over a period of time. The procedures at blocks 405-420 then might be repeated, thereby capturing a plurality of video frames and/or developing a historical record of aimpoint positions in each of the plurality of video frames.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to provide the video distribution device of Hideki et al. with display arrangement information that includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information, in order to display and change positions of the plurality of videos in accordance with the position value included in the display arrangement information, as taught by Page. The rationale is as follows: one of ordinary skill in the art would have been motivated to make this modification so as to improve “tracking and/or identification”. See [0071] of Page.
As per claim 7, Hideki et al. discloses: The display control device according to claim 6, wherein, when the program is executed by the processor 101, the program further causes the display control device to request the display state information indicating display states of the plurality of videos included in the one set { [page 6 of translation, second full paragraph] Accordingly, the control unit 101 of the multi-viewpoint image data management server 200 performs content distribution only to an information processing terminal having the same user group ID as the user of the image reproduction terminal 300.}, and acquire the requested display state information, and the control is performed to display the one set of the plurality of videos on the screen, according to the display state information. {figure 5}
As per claim 8, Hideki et al. discloses: A video distribution method for distributing a video to a plurality of display control devices 300, the video distribution method comprising: acquiring a plurality of videos generated by simultaneously performing capturing at different angles from a predetermined place using a plurality of imaging devices {figure 2 & [page 2 of translation, line 13] The multi-viewpoint camera 100a is configured by arranging a large number of cameras around an object to be imaged, and simultaneously images the object to be imaged from various angles. In the case of the present embodiment, as shown in FIG. 2, 24 cameras are arranged at equal angles (15° intervals) on the circumference so as to surround a subject to be photographed (center position in FIG. 2).}; distributing the plurality of videos as one set to the plurality of display control devices 300 { [page 6 of translation, second full paragraph] Accordingly, the control unit 101 of the multi-viewpoint image data management server 200 performs content distribution only to an information processing terminal having the same user group ID as the user of the image reproduction terminal 300.}; receiving, from a first display control device, display state information indicating display states of the plurality of videos included in the one set {figure 5}; and transmitting the display state information to a second display control device in response to a request from the second display control device different from the first display control device { [page 5 of translation, paragraph 4] Multi-view image data is transmitted to the playback terminal 300 (multi-view image data transmitting means).}, wherein the second display control device displays the plurality of videos included in the one set on a screen simultaneously, according to the display state information { [page 6 of translation, full paragraphs 6 & 7] Further, in the multi-view image reproduction data transmission means, the control unit 101 of the multi-view image data management server 200 transmits the multi-view image data to the image reproduction terminals of other users other than the user of the image reproduction terminal 300 via the network 500. And the camera work information may be transmitted. Thereby, another user can reproduce a multi-viewpoint image with the same camera work as the user of the image reproduction terminal 300, and can share content among users.}.
Regarding claim 8, Hideki et al. is silent as to: wherein the display arrangement information includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information in order to change display states of the plurality of videos in accordance with the position value included in the display arrangement information. With respect to claim 8, Page discloses: [0057] In a set of embodiments, the method 400 further comprises saving the position of the aimpoint (block 420). In some embodiments, the position of the aimpoint is saved in an aimpoint history log, perhaps along with a chronological identifier (which might be a time stamp, frame number, etc.), which can allow for analysis of aimpoint position over a period of time. The procedures at blocks 405-420 then might be repeated, thereby capturing a plurality of video frames and/or developing a historical record of aimpoint positions in each of the plurality of video frames.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to provide the video distribution device of Hideki et al. with display arrangement information that includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information, in order to display and change positions of the plurality of videos in accordance with the position value included in the display arrangement information, as taught by Page. The rationale is as follows: one of ordinary skill in the art would have been motivated to make this modification so as to improve “tracking and/or identification”. See [0071] of Page.
As per claim 9, Hideki et al. discloses: A display control method comprising: receiving from a video distribution device 200 a plurality of videos as one set, the plurality of videos being generated by simultaneously performing capturing at different angles from a predetermined place using a plurality of imaging devices {figure 2 & [page 2 of translation, line 13] The multi-viewpoint camera 100a is configured by arranging a large number of cameras around an object to be imaged, and simultaneously images the object to be imaged from various angles. In the case of the present embodiment, as shown in FIG. 2, 24 cameras are arranged at equal angles (15° intervals) on the circumference so as to surround a subject to be photographed (center position in FIG. 2).}; performing control to display the plurality of videos on a screen simultaneously { [page 6 of translation, second full paragraph] Accordingly, the control unit 101 of the multi-viewpoint image data management server 200 performs content distribution only to an information processing terminal having the same user group ID as the user of the image reproduction terminal 300.}; changing display states of the plurality of videos included in the one set in response to a user's operation {figure 5}; generating, for the plurality of videos included in the one set, display state information indicating the changed display states {figure 5}; and transmitting the generated display state information to the video distribution device 200 { [page 5 of translation, paragraph 4] Multi-view image data is transmitted to the playback terminal 300 (multi-view image data transmitting means).}.
Regarding claim 9, Hideki et al. is silent as to: wherein the display arrangement information includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information in order to change display states of the plurality of videos in accordance with the position value included in the display arrangement information. With respect to claim 9, Page discloses: [0057] In a set of embodiments, the method 400 further comprises saving the position of the aimpoint (block 420). In some embodiments, the position of the aimpoint is saved in an aimpoint history log, perhaps along with a chronological identifier (which might be a time stamp, frame number, etc.), which can allow for analysis of aimpoint position over a period of time. The procedures at blocks 405-420 then might be repeated, thereby capturing a plurality of video frames and/or developing a historical record of aimpoint positions in each of the plurality of video frames.
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to provide the video distribution device of Hideki et al. with display arrangement information that includes a plurality of timecode values for each video from the plurality of videos, each timecode value having an associated position value for the plurality of videos in the display arrangement information, in order to display and change positions of the plurality of videos in accordance with the position value included in the display arrangement information, as taught by Page. The rationale is as follows: one of ordinary skill in the art would have been motivated to make this modification so as to improve “tracking and/or identification”. See [0071] of Page.
As per claim 10, Hideki et al. discloses: A non-transitory computer readable medium that stores a program, wherein the program causes a computer to execute the video distribution method according to claim 8. { [page 2, last 3 paragraphs] The CPU calls a program stored in the storage unit 102, ROM, storage medium, or the like to a work memory area on the RAM and executes it, and executes all the processing of the control terminal 100b via the bus 108. The ROM is a non-volatile memory, and permanently stores programs such as a boot program and BIOS for the control terminal 100b, data, and the like. The RAM is a volatile memory, and temporarily stores a program, data, and the like loaded from the storage unit 102, ROM, storage medium, and the like, and includes a work area used by the control unit 101 for performing various processes. }
As per claim 11, Hideki et al. discloses: A non-transitory computer readable medium that stores a program, wherein the program causes a computer to execute the display control method according to claim 9. { [page 2, last 3 paragraphs] The CPU calls a program stored in the storage unit 102, ROM, storage medium, or the like to a work memory area on the RAM and executes it, and executes all the processing of the control terminal 100b via the bus 108. The ROM is a non-volatile memory, and permanently stores programs such as a boot program and BIOS for the control terminal 100b, data, and the like. The RAM is a volatile memory, and temporarily stores a program, data, and the like loaded from the storage unit 102, ROM, storage medium, and the like, and includes a work area used by the control unit 101 for performing various processes. }
Response to Arguments
Applicant's arguments filed August 18, 2025, have been fully considered, but they are not persuasive. Applicant asserts in the second through fourth full paragraphs on page 2 the following:
Page merely discloses a position of aimpoint is saved in an aimpoint history log along an identifier. The procedure may be repeated, thereby capturing a plurality of video frames and/or developing a historical record of aimpoint positions in each of the plurality of video frames. Page discloses nothing more than repeatedly saving positions in a log to capture video frames and to develop a record of the positions in each of the video frames. The claimed invention, on the other hand, recites display information having timecode value for each video of a plurality of videos. Each timecode value has an associated position value for the plurality of videos in the display information to change the display states of the plurality of videos according to the position value.
In the second and third paragraphs on page 2, applicant describes what Page discloses in [0057]. This disclosure of Page is not unlike what is claimed and described in the fourth paragraph on page 2. [0057] of Page states, “The procedures at blocks 405-420 then might be repeated, thereby capturing a plurality of video frames and/or developing a historical record of aimpoint positions in each of the plurality of video frames.” The claimed timecodes are the disclosed aimpoints, and the disclosed capturing of a plurality of frames is the claimed changing of the display states. Therefore, contrary to applicant’s assertion, Hideki as modified by Page does disclose the newly added limitations.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DAVID D DAVIS whose telephone number is (571) 272-7572. The examiner can normally be reached Monday - Friday, 8 a.m. - 4 p.m.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao can be reached on 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DAVID D DAVIS/Primary Examiner, Art Unit 2627
DDD