DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-20 are pending.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-16 are rejected under 35 U.S.C. 103 as being unpatentable over Dutta Choudhury et al. (US20220327718A1) in view of Monroe (US20050207487A1).
Regarding claims 1 and 12, Dutta Choudhury teaches an electronic device comprising:
a first image sensor configured to acquire image data;
(Dutta Choudhury, Fig. 2, “first sensor 210 may generate a first video stream that includes anchor frame 225”, [0041])
at least one object sensor configured to acquire motion information about an object comprised in the image data;
(Dutta Choudhury, Fig. 2, “second sensor 215 may generate a second video stream that includes motion frames 235”, [0041])
memory storing instructions; and at least one processor, wherein when the instructions are executed by the at least one processor, cause the electronic device to:
(Dutta Choudhury, Fig. 7, "The device 705 may include components for bi-directional voice and data communications including components for transmitting and receiving communications, such as a multimedia manager 720, an I/O controller 710, a memory 715, a processor 725", [0085]; " The apparatus may include a processor, memory coupled with the processor, and instructions stored in the memory. The instructions may be executable by the processor to cause the apparatus to ..."; hardware components of a processor, coupled memory, and executable instructions)
acquire image data at a first rate through the first image sensor;
(Dutta Choudhury, "capture from a first sensor of the device a first set of video frames at a first frame rate", [0006]; "first sensor 210 may capture one or more video frames at a first frame rate 220 (e.g., 24 fps, 30 fps)", [0041]; acquiring the first set of image data at a specific first frame rate)
acquire, through the at least one object sensor, second data on the object at a second rate higher than the first rate,
(Dutta Choudhury, "capture from a second sensor of the device a second set of video frames at a second frame rate different from the first frame rate", [0006]; "while second sensor 215 may capture one or more video frames at second frame rate 230 (e.g., 60 fps, 120 fps, 240 fps, 480 fps, 960 fps, etc.) that is different than the first frame rate.", [0041]; acquiring the second data (motion frames) at a second rate that is substantially higher than the first rate)
wherein the second data is acquired within an exposure period of the image data and being temporally synchronized with the image data,
(Dutta Choudhury, "frame capture system 200 depicts video image frames simultaneously captured by first sensor 210 and second sensor 215 within a given time period ... With an exposure time at or relatively near 0.033 seconds for anchor frame 225 and an exposure time at or relatively near 0.00833 seconds for each of the four frames of motion frames 235", [0042]; the second sensor's frames are temporally synchronized and captured within the exact exposure time window of the first sensor's anchor frame)
wherein the second data includes a plurality of motion data acquired during an exposure period of a single frame of the image data;
(Dutta Choudhury, "during a time period of 0.033 seconds, first sensor 210 may capture a single anchor frame 225, while second sensor 215 may capture four motion frames 235.", [0042]; acquiring a plurality of motion frames (four frames) during the single exposure period (0.033 seconds) of the anchor frame)
by correlating the acquired image data with at least one frame included in the acquired second data based on respective times at which the image data and the second data are acquired, acquire time-relationship information between the acquired image data and the second data,
(Dutta Choudhury, "each anchor frame (e.g., anchor frame 225) and each motion frame (e.g., motion frames 235) may include a timestamp that indicates when a given frame is captured.", [0049]; "device 205 may determine, based on the respective timestamps, whether anchor frame 225 occurs before, at the same time, or after a given frame of motion frames 235.", [0050]; correlating the frames using timestamps to establish the time-relationship information between the acquired image data and the motion data)
wherein the time-relationship information specifies temporal correspondence between the single frame and the plurality of motion data; and
(Dutta Choudhury, Figs. 2-3; "anchor frame captured at t0, first motion frame captured at a first timing offset relative to t0, second motion frame captured at a second timing offset relative to t0, etc.", [0049]; "a single anchor frame (e.g., first anchor frame 310) may be analyzed in relation to the multiple motion frames that correspond to that single anchor frame (e.g., first motion frame 305-a to Lth motion frame 305-L)", [0058]; timing offsets specify the temporal correspondence between one anchor frame (captured at t0) and each of the plurality of motion frames (captured at first timing offset, second timing offset, etc., relative to t0); the timing offsets inherently specify the temporal correspondence between the single anchor frame and the plurality of motion frames)
Dutta Choudhury does not expressly disclose, but Monroe teaches, the following limitation:
store a package file comprising the acquired image data, second data and the time-relationship information.
(Monroe, Figs. 1, 8b, 8c; "a multiplexer for merging the compressed full motion video image data signal and the compressed still frame image data signal into a single, combined image data signal", [claim 1]; "Upon detection of a trigger event, the system additionally captures, compresses and sends to the network compressed motion video information and a time stamp which indicates the exact time the trigger event occurred", [0088]; "Such timestamping is also useful for temporal correlation of archived events, as stored in the local storage 18 or on a network-based server", [0089]; "Local storage 18 is provided for storing the image signal prior to transmission", [0085]; storing a combined/package signal containing both still frame image data and full motion video data together via a multiplexer, along with timestamps; the combined data signal in Monroe (still image data + motion video data + time stamp) is analogous to a "package file comprising the acquired image data, second data and the time-relationship information.")
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the teachings of Monroe into the system or method of Dutta Choudhury in order to store the original image data, motion data, and time-relationship (timestamp) information together as a combined package/file, as Monroe demonstrates the well-known technique of combining multiple data streams with timestamps into a single stored or transmitted package for later retrieval and analysis.
Regarding claim 2, the combination of Dutta Choudhury and Monroe teaches its base claim.
The combination further teaches the electronic device of claim 1, wherein the at least one processor is configured to:
acquire a first frame comprising the image data, and
acquire a plurality of second frames comprising the second data for a time period to acquire the first frame.
(Dutta Choudhury, Fig. 2, “during a time period of 0.033 seconds, first sensor 210 may capture a single anchor frame 225, while second sensor 215 may capture four motion frames 235”, [0042])
Regarding claim 3, the combination of Dutta Choudhury and Monroe teaches its base claim.
The combination further teaches the electronic device of claim 2, wherein the second data comprises information usable to perform image processing on the image data.
(Dutta Choudhury, Fig. 2, “during a time period of 0.033 seconds, first sensor 210 may capture a single anchor frame 225, while second sensor 215 may capture four motion frames 235”, [0042])
Regarding claims 4, 10 and 13, the combination of Dutta Choudhury and Monroe teaches their respective base claims.
The combination further teaches the electronic device of claim 1, wherein the package file is transmitted to an external electronic device or a server.
(Dutta Choudhury, Monroe, see comments on claim 1)
Regarding claim 5, the combination of Dutta Choudhury and Monroe teaches its base claim.
The combination further teaches the electronic device of claim 1, wherein the package file comprises management information indicating a data format of the second data.
(Monroe, see comments on claim 17)
Regarding claims 6 and 16, the combination of Dutta Choudhury and Monroe teaches their respective base claims.
The combination further teaches the electronic device of claim 1, wherein the at least one processor performs image processing on the image data based on the package file.
(Dutta Choudhury, Monroe, see comments on claim 1; multimedia manager 420 (Dutta Choudhury, Fig. 4) may be in the server 110 and receives the muxed file (Monroe, Fig. 2, mux 15) for data processing)
Regarding claims 7 and 14, the combination of Dutta Choudhury and Monroe teaches their respective base claims.
The combination further teaches the electronic device of claim 1, wherein the package file is stored in the memory.
(Dutta Choudhury, Monroe, see comments on claim 1; Monroe, Fig. 2, the muxed file is typically stored in memory before it is transmitted)
Regarding claim 8, the combination of Dutta Choudhury and Monroe teaches its base claim.
The combination further teaches the electronic device of claim 1, wherein the at least one object sensor comprises a second image sensor distinct from the first image sensor.
(Dutta Choudhury, Monroe, see comments on claim 1; Dutta Choudhury, Fig. 2, second sensor 215 for the motion frames 235 is different from the first sensor for anchor frame 225)
Regarding claim 9, the combination of Dutta Choudhury and Monroe teaches its base claim.
The combination further teaches the electronic device of claim 8, wherein
the first image sensor acquires the image data at a first frame rate, and
the second image sensor acquires the motion information of the object at a second frame rate higher than the first frame rate.
(Dutta Choudhury, Monroe, see comments on claim 1)
Regarding claim 11, the combination of Dutta Choudhury and Monroe teaches its base claim.
The combination further teaches the electronic device of claim 10, wherein the at least one external device includes at least one of a tablet, a notebook PC, and a server of the same user.
(Dutta Choudhury, Monroe, see comments on claim 1; Dutta Choudhury, Fig. 1, device 105 and server 110 can be owned by the same user)
Regarding claim 15, the combination of Dutta Choudhury and Monroe teaches its base claim.
The combination further teaches the method of claim 12, further comprising:
acquiring image data at a first frame rate through the first image sensor; and
acquiring motion information of the object at a second frame rate higher than the first frame rate through the at least one object sensor, the at least one object sensor including a second image sensor distinct from the first image sensor.
(Dutta Choudhury, Monroe, see comments on claim 8)
Allowable Subject Matter
Claims 17-20 are allowed.
Response to Arguments
Applicant's arguments filed on 1/22/2026 with respect to one or more of the pending claims have been fully considered but they are not persuasive.
Regarding claims 1 and 12, Applicant argues in the remarks that the combination of the cited references fails to teach the newly amended limitations of the claims.
The Examiner respectfully disagrees. The rejections above have been updated to address Applicant's arguments; see the updated review comments for details.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JIANXUN YANG whose telephone number is (571)272-9874. The examiner can normally be reached on MON-FRI: 8AM-5PM Pacific Time.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Amandeep Saini can be reached on (571)272-3382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JIANXUN YANG/
Primary Examiner, Art Unit 2662
3/9/2026