DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Double Patenting
The double patenting rejection remains outstanding. Applicant, however, requests that the rejection be held in abeyance until allowable subject matter is indicated.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 2-11 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Hinkel et al. (US 2018/0048820) or Kim et al. (US 2018/0213144).
Regarding claims 2, 10 and 11, Hinkel or Kim teaches an electronic device, a medium and a method comprising: a display device; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
displaying, via the display device, live video captured by a capture device, wherein a focal plane of the live video is selected so that a first portion of the live video is displayed in-focus and a second portion of the live video is displayed out-of-focus (Hinkel: Fig. 9, display 902A, where 904A is in-focus while 906A, 908A and 910A are blurred (out-of-focus); Kim: Figs. 6A-6C, 7A-7E, 8A-8C);
while displaying the live video, detecting an input corresponding to a request to change the focal plane of the live video (Hinkel: "a user may be able to select a particular focal depth and view the image generated from data captured at or near the selected focal depth... Likewise, the user may specify that image data captured at some or all focal depths beyond the selected focal depth be rendered in focus, with the image data captured at focal depths up to the selected focal depth rendered out-of-focus," [0082-0086]; Kim: user selection of a different image having a different focus, [0165]); and
in response to detecting the input, changing the focal plane of the live video to a second focal plane based on one or more properties of the input so that the first portion of the live video is displayed out-of-focus and the second portion of the live video is displayed in-focus (Hinkel: [0082-0086]; Kim: Figs. 6A-6C, 7A-7E, 8A-8C).
Claim 3. The electronic device of claim 2, wherein the live video includes image data for two or more focal planes including the second focal plane, and wherein changing the focal plane of the live video includes selecting image data corresponding to the second focal plane. (See claim 2).
Claim 4. The electronic device of claim 2, wherein changing the focal plane of the live video includes sending a request to the capture device to change a focal setting for the live video to the second focal plane. (This occurs during an image capturing event, during a video call, or during real-time displaying; see Hinkel's claims 1 and 2; Kim: based on a specific configuration, each focus-changed image is associated with the other representative images, respectively, for which the focus is formed in the changed region based on the touch input, [0010], Figs. 8A-8C).
Claim 5. The electronic device of claim 2, wherein the one or more properties of the input includes a characteristic intensity of the input. (Kim: sensing unit 142 may be configured to sense the strength or duration time of a touch applied to the touch screen, [0081]).
Claim 6. The electronic device of claim 2, wherein the one or more properties of the input includes a duration of the input. (Kim: sensing unit 142 may be configured to sense the strength or duration time of a touch applied to the touch screen. [0081]).
Claim 7. The electronic device of claim 2, wherein the one or more properties of the input includes a location of the input on the display device. (Kim: the controller 180 displays an icon that receives a touch input to display another image, [0163, 0165, 0167, 0170]).
Claim 8. The electronic device of claim 2, wherein the one or more programs further include instructions for: in response to detecting the input, providing non-visual feedback in conjunction with changing the focal plane of the live video to the second focal plane based on the one or more properties of the input. (Kim: generate an output associated with… auditory sense or tactile sense,… on an audio output module 153, an alarm unit 154, a haptic module 155, and the like, [0059]).
Claim 9. The electronic device of claim 2, wherein the live video is associated with a live communication session, and wherein a subject of the live communication session is in the first portion of the live video. (This communication occurs during an image capturing event or during a video call in all references).
Response to Arguments
Applicant's arguments filed 11/26/25 have been fully considered but they are not persuasive.
Regarding Hinkel, Applicant argues that Hinkel fails to disclose or suggest the subject matter of claim 2, asserting that Hinkel generally discloses only "collecting different sets of image data at different focal distance…":
[Applicant's argument reproduced as an image: media_image1.png]
Examiner respectfully disagrees. Hinkel does teach live video communication: the recited devices ("a laptop or desktop computer, a tablet, a mobile phone or a smartphone, a smart television," [0130]) are simply computer systems, considered external devices, that capture images (Figs. 5 and 7; see also Figs. 6, 8-10) and communicate/share them with users of one or more other external devices, [0129].
More importantly, Hinkel teaches selecting the focal depth, as examiner specifically referenced in Fig. 9, where sets of image data are captured during an image capture event with changing focal distances in accordance with at least one embodiment. For example, in display 902A, 904A is in-focus while 906A, 908A and 910A are blurred (out-of-focus), [0077-0080].
Overall, par. [0081] describes the same operation: "By using information obtained by determining the areas of the image in focus at different focal planes, image processing may be performed, in some embodiments in conjunction with other image processing methods, such as the operations described in FIG. 5 and FIG. 7, to obtain a deblurred image, such as image 1002. For example, a flower 1004 may be deblurred using the image data 902A where the flower 904A was in focus. Likewise, a bird 1006 may be deblurred using the image data 902B where the bird 906B was in focus. Similarly, a deer 1008 may be deblurred using the image data 902C where the deer 908C was in focus, and a tree 1010 may be deblurred using the image data 902D where the tree 910D was most in focus. In such a manner, objects captured at all focal depths may be rendered in focus in the generated image."
Respectfully, examiner sustains the position that Hinkel teaches the currently claimed invention.
Regarding Kim, Applicant states/argues that:
[Applicant's argument reproduced as an image: media_image2.png]
Furthermore, since live video is not being displayed by Kim when the user changes the focus to a different figure, it follows that Kim also fails to disclose or suggest that changing the focus to a different figure involves changing the focal plane of live video that is being displayed, as required by independent claim 2.
Accordingly, Kim also fails to disclose or suggest, at least, "while displaying the live video, detecting an input corresponding to a request to change the focal plane of the live video; and in response to detecting the input, changing the focal plane of the live video" as recited in independent claim 2. As such, Kim fails to anticipate each and every element of independent claim 2, and the rejection under § 102 in view of Kim should be withdrawn.
Again, examiner respectfully disagrees. Kim does teach a live communication session: the user captures his or her own face and sends it to the other party during a video call, [0143], where multiple mobile terminals 100 (see Fig. 2, [0108-0112]) exchange data, including displaying a representative image among images.

As an example associated with the present invention, "the display unit may display a plurality of representative images acquired by the same control command, and the controller may convert one representative image to a focus-changed image associated with the representative image having a focus on a changed region based on the touch input, and the controller may control the display unit to convert each focus-changed image associated with the other representative images, respectively, for which the focus is formed in the changed region based on the touch input," [0010].

"FIG. 7A is a view illustrating the display unit 151 on which a representative image 520 containing a plurality of figures is displayed. For example, it corresponds to an image in which three figures are located at different locations, and the focus is adjusted on one figure. A figure for which the focus is well adjusted is illustrated as a bold line, a figure for which the focus is not adjusted and shown in a blurred manner as a dotted line, and a figure for which an intermediate level of focus is adjusted as a solid line," [0191].

Further, "a control method of changing the image or changing a focus region may not be necessarily limited to this. For example, when two figures are displayed on the image, the controller 180 may control the display unit 151 such that the focus is formed on the figure of a graphic image to which the touch input is applied, and the focus is not formed (focus-out) on another figure," [0211].

Finally, "FIGS. 8A through 8C are conceptual views for explaining a control method of changing focuses when a plurality of representative images are displayed at once."
Respectfully, examiner sustains the position that Kim teaches the currently claimed invention.
Regarding dependent claim 9, see the discussion above.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PHUNG-HOANG J. NGUYEN whose telephone number is (571) 270-1949. The examiner can normally be reached on a regular schedule, 6:00-3:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Duc Nguyen can be reached at 571-272-7503. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PHUNG-HOANG J NGUYEN/Primary Examiner, Art Unit 2691