Prosecution Insights
Last updated: April 19, 2026
Application No. 18/923,242

INFORMATION PROCESSING DEVICE, MEDIUM AND METHOD FOR USING A TOUCH SCREEN DISPLAY TO CAPTURE AT LEAST ONE IMAGE

Status: Non-Final Office Action (§102, §103, Double Patenting)
Filed: Oct 22, 2024
Examiner: CALDERON, CYNTHIA
Art Unit: 2639
Tech Center: 2600 (Communications)
Assignee: Sony Group Corporation
OA Round: 1 (Non-Final)

Predictions
Grant Probability: 77% (Favorable)
Expected OA Rounds: 1-2
Estimated Time to Grant: 2y 4m
Grant Probability With Interview: 96%

Examiner Intelligence

Career Allowance Rate: 77% (602 granted / 782 resolved), +15.0% vs Tech Center average
Interview Lift: +18.5% higher allowance rate on resolved cases with an interview
Average Prosecution Length: 2y 4m (17 applications currently pending)
Career Total: 799 applications across all art units

Statute-Specific Performance

§101: 4.9% (-35.1% vs TC avg)
§102: 30.7% (-9.3% vs TC avg)
§103: 42.1% (+2.1% vs TC avg)
§112: 11.9% (-28.1% vs TC avg)
Tech Center averages are estimates. Based on career data from 782 resolved cases.
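The headline examiner metrics above are simple ratios over resolved cases, and the card values are internally consistent. A minimal sketch of how they could be reproduced from the raw counts; the ~62% Tech Center 2600 average is an assumption implied by the +15.0% delta, not a figure stated in this report:

```python
def allowance_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def delta_vs_tc(rate: float, tc_avg: float) -> float:
    """Signed difference between an examiner's rate and the TC average."""
    return rate - tc_avg

# Counts from the examiner card: 602 granted out of 782 resolved.
rate = allowance_rate(602, 782)
print(f"{rate:.0f}%")                       # 77%
print(f"{delta_vs_tc(rate, 62.0):+.1f}%")   # +15.0% (assumed TC avg of 62%)
```

The statute-specific rows work the same way: each percentage is taken over the 782 resolved cases, with the delta computed against the Tech Center's estimated average for that statute.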

Office Action

Grounds of rejection: §102, §103, and nonstatutory double patenting.
DETAILED ACTION

Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Notice of Preliminary Amendment
2. The Examiner acknowledges the amended claims filed on 12/27/2024.
- Claim 1 has been amended.
- Claims 2-33 have been added.

Priority
3. Acknowledgment is made of applicant's claim for foreign priority based on an application filed in Japan on 08/12/2024. It is noted, however, that applicant has not filed a certified copy of the JP2014-164154 application as required by 37 CFR 1.55.

Information Disclosure Statement
4. The information disclosure statements (IDS) submitted on 11/24/2025 and 10/22/2024 are in compliance with the provisions of 37 CFR 1.97 and were considered by the examiner.

Claim Objections
5. Claim 1 is objected to because of the following informalities: Claim 1, line 2 recites "an image sensor". Similarly, claim 1, line 9 recites "control an image sensor". It is unclear whether the "image sensor" of line 9 refers to a second, different image sensor, or whether it is intended to refer back to the image sensor of line 2. Appropriate correction is required.

Double Patenting
6. The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the "right to exclude" granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).

A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).

The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.

The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used.
A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.

7. Claims 1-33 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 1-33 of Kasa et al. (U.S. Patent No. 12,200,346). Although the claims at issue are not identical, they are not patentably distinct from each other because they are both claiming substantially the same features. Note the following similarities between the application claims and the patent claims.

Claim chart: Instant Application No. 18/923,242 vs. US Patent 12,200,346 (each application claim is followed by the corresponding patent claim)

Application Claim 1: An information processing device comprising: an image sensor; a touch screen display, and control circuitry configured to: select a shoot mode from a plurality of shoot modes by user input operation to a touch screen display; control a touch screen display to display a touch graphical user interface (GUI) including a movable display object on a slider including a first position and a second position; control an image sensor to begin capturing one or more images in the selected shoot mode in response to detecting in a movement of a user finger from the first position on the touch screen display to the second position on the touch screen display, and control the touch screen display to display an animation moving the movable display object to the first position in response to detecting ceasing of a touching of the touch screen display.
Patent Claim 1: An information processing device comprising: control circuitry configured to: control a touch screen display to display a touch graphical user interface (GUI) including a movable display object on a slider including a first position, a second position, and a third position, each position associated with a different operation mode of the information processing device; controlling an image sensor to begin capturing one or more images in a continuous shoot mode in response to detecting a movement of a user finger from the first position on the touch screen display to the third position on the touch screen display, and controlling the touch screen display to display an animation moving the movable display object to the first position in response to detecting a ceasing of a touching of the touch screen display, wherein a display of the GUI other than the movable display object is configured to rotate when the touch screen display is switched between a vertical orientation and a horizontal orientation, but a direction of detecting the movement of the user finger from the first position to the second position and from the first position to the third position with respect to the touch screen display does not change when the GUI is switched between the vertical orientation and the horizontal orientation.

Application Claim 2: The information processing device of claim 1, wherein the control circuitry is further configured to operate a first operation in response to detecting a maintaining of a touching of the touch screen display at the first position for a predetermined amount of time.

Patent Claim 2: The information processing device of claim 1, wherein the control circuitry is further configured to operate a first operation in response to detecting a maintaining of a touching of the touch screen display at the first position for a predetermined amount of time.
Application Claim 3: The information processing device of claim 1, wherein the control circuitry is further configured to stop capturing the one or more images in a second imaging mode in response to detecting the ceasing of the touching of the touch screen display.

Patent Claim 3: The information processing device of claim 1, wherein the control circuitry is further configured to stop capturing the one or more images in a second imaging mode in response to detecting the ceasing of the touching of the touch screen display.

Application Claim 4: The information processing device of claim 1, wherein the control circuitry is further configured to end capturing the one or more images in a continuous shoot mode in response to detecting the ceasing of the touching of the touch screen display.

Patent Claim 4: The information processing device of claim 1, wherein the control circuitry is further configured to end capturing the one or more images in the continuous shoot mode in response to detecting the ceasing of the touching of the touch screen display.

Application Claim 5: The information processing device of claim 1, wherein the control circuitry is further configured to move the movable display object back toward the first position in response to detecting a ceasing of touching of the touch screen display.

Patent Claim 5: The information processing device of claim 1, wherein the control circuitry is further configured to move the movable display object back toward the first position in response to detecting a ceasing of touching of the touch screen display.

Application Claim 6: The information processing device of claim 1, wherein the movable display object is initially displayed near a home button of the touch GUI.

Patent Claim 6: The information processing device of claim 1, wherein the movable display object is initially displayed near a home button of the touch GUI.
Application Claim 7: The information processing device of claim 1, wherein the touch GUI includes a settings icon.

Patent Claim 7: The information processing device of claim 1, wherein the touch GUI includes a settings icon.

Application Claim 8: The information processing device of claim 1, wherein the touch GUI includes an icon for setting a camera imaging mode.

Patent Claim 8: The information processing device of claim 1, wherein the touch GUI includes an icon for setting a camera imaging mode.

Application Claim 9: The information processing device of claim 1, wherein the touch GUI displays an icon for changing a zoom magnification of the image sensor.

Patent Claim 9: The information processing device of claim 1, wherein the touch GUI displays an icon for changing a zoom magnification of the image sensor.

Application Claim 10: The information processing device of claim 1, wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Patent Claim 10: The information processing device of claim 1, wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Application Claim 11: The information processing device of claim 1, wherein the touch GUI includes an icon for setting a camera imaging mode and an icon for changing a zoom magnification of the image sensor, and wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.
Patent Claim 11: The information processing device of claim 1, wherein the touch GUI includes an icon for setting a camera imaging mode and an icon for changing a zoom magnification of the image sensor, and wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Application Claim 12: A non-transitory computer-readable medium storing instructions that, when executed by a processor, cause the processor to execute operations comprising: selecting a shoot mode from a plurality of shoot modes by user input operation to a touch screen display, displaying a touch graphical user interface (GUI) including a movable display object on a slider including a first position and second position, on a touch screen display; controlling an image sensor to begin capturing one or more images in the selected mode in response to detecting a movement of a user finger, in a first direction of operation of the movable display object with respect to the touch screen display, from the first position on the touch screen display to the second position on the touch screen display, and controlling the touch screen display to display an animation moving the movable display object to the first position in response to detecting a ceasing of a touching of the touch screen display.
Patent Claim 12: A non-transitory computer-readable medium storing instructions that, when executed by a processor, cause the processor to execute operations comprising: displaying a touch graphical user interface (GUI) including a movable display object on a slider including a first position, a second position, and a third position on a touch screen display, each position associated with a different operation mode of an information processing device; controlling an image sensor to begin capturing one or more images in a continuous shoot mode in response to detecting a movement of a user finger, in a first direction of operation of the movable display object with respect to the touch screen display, from the first position on the touch screen display to the third position on the touch screen display, and controlling the touch screen display to display an animation moving the movable display object to the first position in response to detecting a ceasing of a touching of the touch screen display, wherein a display of the GUI other than the movable display object is configured to rotate when the touch screen display is switched between a vertical orientation and a horizontal orientation, but a direction of detecting the movement of the user finger from the first position to the second position and from the first position to the third position with respect to the touch screen display does not change when the GUI is switched between the vertical orientation and the horizontal orientation.

Application Claim 13: The non-transitory computer-readable medium of claim 12, the operations further comprise: operating a first operation in response to detecting a maintaining of a touching of the touch screen display at the first position for a predetermined amount of time.
Patent Claim 13: The non-transitory computer-readable medium of claim 12, the operations further comprise: operating a first operation in response to detecting a maintaining of a touching of the touch screen display at the first position for a predetermined amount of time.

Application Claim 14: The non-transitory computer-readable medium of claim 12, the operations further comprise: stopping capture of the one or more images in a second imaging mode in response to detecting the ceasing of the touching of the touch screen display.

Patent Claim 14: The non-transitory computer-readable medium of claim 12, the operations further comprise: stopping capture of the one or more images in a second imaging mode in response to detecting the ceasing of the touching of the touch screen display.

Application Claim 15: The non-transitory computer-readable medium of claim 12, the operations further comprise: end capturing the one or more images in a continuous shoot mode in response to detecting the ceasing of the touching of the touch screen display.

Patent Claim 15: The non-transitory computer-readable medium of claim 12, the operations further comprise: end capturing the one or more images in the continuous shoot mode in response to detecting the ceasing of the touching of the touch screen display.

Application Claim 16: The non-transitory computer-readable medium of claim 12, the operations further comprise: moving the movable display object back toward the first position in response to detecting the ceasing of the touching the touch screen display.

Patent Claim 16: The non-transitory computer-readable medium of claim 12, the operations further comprise: moving the movable display object back toward the first position in response to detecting the ceasing of the touching the touch screen display.

Application Claim 17: The non-transitory computer-readable medium of claim 12, wherein the movable display object is initially displayed near a home button of the touch GUI.
Patent Claim 17: The non-transitory computer-readable medium of claim 12, wherein the movable display object is initially displayed near a home button of the touch GUI.

Application Claim 18: The non-transitory computer-readable medium of claim 12, wherein the touch GUI includes a settings icon.

Patent Claim 18: The non-transitory computer-readable medium of claim 12, wherein the touch GUI includes a settings icon.

Application Claim 19: The non-transitory computer-readable medium of claim 12, wherein the touch GUI includes an icon for setting a camera imaging mode.

Patent Claim 19: The non-transitory computer-readable medium of claim 12, wherein the touch GUI includes an icon for setting a camera imaging mode.

Application Claim 20: The non-transitory computer-readable medium of claim 12, wherein the touch GUI displays an icon for changing a zoom magnification of the image sensor.

Patent Claim 20: The non-transitory computer-readable medium of claim 12, wherein the touch GUI displays an icon for changing a zoom magnification of the image sensor.

Application Claim 21: The non-transitory computer-readable medium of claim 12, wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Patent Claim 21: The non-transitory computer-readable medium of claim 12, wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.
Application Claim 22: The non-transitory computer-readable medium of claim 12, wherein the touch GUI includes an icon for setting a camera imaging mode and an icon for changing a zoom magnification of the image sensor, and wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Patent Claim 22: The non-transitory computer-readable medium of claim 12, wherein the touch GUI includes an icon for setting a camera imaging mode and an icon for changing a zoom magnification of the image sensor, and wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Application Claim 23: An information processing method comprising: selecting a shoot mode from a plurality of shoot modes by user input operation to a touch screen display, displaying a touch graphical user interface (GUI) including a movable display object on a slider including a first position and a second position on a touch screen display; and, controlling an image sensor to begin capturing one or more images in the selected mode in response to detecting a first operation of a movement of a user finger from the first position on the touch screen display to the second position on the touch screen display, and controlling the touch screen display to display an animation moving the movable display object to the first position in response to detecting a second operation of removal of the user finger from the touch screen display.
Patent Claim 23: An information processing method comprising: displaying a touch graphical user interface (GUI) including a movable display object on a slider including a first position, a second position, and a third position on a touch screen display, each position associated with a different operation mode of an information processing device; and, instructing an image sensor to begin capturing one or more images in a continuous shoot mode in response to detecting a first operation of a movement of a user finger from the first position on the touch screen display to the third position on the touch screen display, and controlling the touch screen display to display an animation moving the movable display object to the first position in response to detecting a second operation of removal of the user finger from the touch screen display during capturing the one or more images in the continuous shoot mode, wherein a display of the GUI other than the movable display object is configured to rotate when the touch screen display is switched between a vertical orientation and a horizontal orientation, but a direction of detecting the first operation of the movement of the user finger from the first position to the second position and from the first position to the third position with respect to the touch screen display does not change when the GUI is switched between the vertical orientation and the horizontal orientation.

Application Claim 24: The information processing method of claim 23, further comprising: operating a first operation in response to detecting the operation of maintaining a touching of the touch screen display for a predetermined amount of time.

Patent Claim 24: The information processing method of claim 23, further comprising: operating a first operation in response to detecting the operation of maintaining a touching of the touch screen display for a predetermined amount of time.
Application Claim 25: The information processing method of claim 23, further comprising: stopping capture of the one or more images in a second imaging mode in response to detecting a ceasing of the touching the touch screen display.

Patent Claim 25: The information processing method of claim 23, further comprising: stopping capture of the one or more images in a second imaging mode in response to detecting a ceasing of the touching the touch screen display.

Application Claim 26: The information processing method of claim 23, further comprising: ending a continuous shoot mode in response to detecting a ceasing of touching of the touch screen display.

Patent Claim 26: The information processing method of claim 23, further comprising: ending the continuous shoot mode in response to detecting a ceasing of touching of the touch screen display.

Application Claim 27: The information processing method of claim 23, further comprising: moving the movable display object back toward the first position in response to detecting a ceasing of touching of the touch screen display.

Patent Claim 27: The information processing method of claim 23, further comprising: moving the movable display object back toward the first position in response to detecting a ceasing of touching of the touch screen display.

Application Claim 28: The information processing method of claim 23, wherein the movable display object is displayed initially near a home button of the touch GUI.

Patent Claim 28: The information processing method of claim 23, wherein the movable display object is displayed initially near a home button of the touch GUI.

Application Claim 29: The information processing method of claim 23, wherein the touch GUI includes a settings icon.

Patent Claim 29: The information processing method of claim 23, wherein the touch GUI includes a settings icon.

Application Claim 30: The information processing method of claim 23, wherein the touch GUI includes an icon for setting a camera imaging mode.
Patent Claim 30: The information processing method of claim 23, wherein the touch GUI includes an icon for setting a camera imaging mode.

Application Claim 31: The information processing method of claim 23, wherein the touch GUI displays an icon for changing a zoom magnification of the image sensor.

Patent Claim 31: The information processing method of claim 23, wherein the touch GUI displays an icon for changing a zoom magnification of the image sensor.

Application Claim 32: The information processing method of claim 23, wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Patent Claim 32: The information processing method of claim 23, wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Application Claim 33: The information processing method of claim 23, wherein the touch GUI includes an icon for setting a camera imaging mode and an icon for changing a zoom magnification of the image sensor, and wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Patent Claim 33: The information processing method of claim 23, wherein the touch GUI includes an icon for setting a camera imaging mode and an icon for changing a zoom magnification of the image sensor, and wherein the touch GUI is displayed on a first portion of the touch screen display, and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion.

Claim Rejections - 35 USC § 102

8. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

9. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

10. Claims 1-8, 10, 12-19, 21, 23-30 and 32 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Kinoshita (US-PGPUB 2014/0078371).

Regarding claim 1, Kinoshita discloses an information processing device (see figs. 1-3) comprising: an image sensor (Apparatus 2 with image sensor 23; see fig. 1 and paragraph 0029); a touch screen display (Touch panel display 51 is the display 105. Fig. 4A is a display example of an operation/display screen 310 of the touch panel display 51; see paragraphs 0047-0048), and control circuitry (CPU 101; see fig. 3) configured to (CPU 101 controls the various units in the display control apparatus 100; see paragraph 0038): select a shoot mode from a plurality of shoot modes by user input operation to a touch screen display (The imaging apparatus 2 setting information, including imaging setting information about the imaging mode, can be changed by operating the display control apparatus 100 using display 51; see paragraph 0062); control a touch screen display to display a touch graphical user interface (GUI) including a movable display object on a slider including a first position and a second position (Operation icon 314 for instructing SW1 is displayed on the operation/display screen 310. In addition, SW1 button 314 can be slid by displaying a guide line 315 indicating the slide direction of the SW1 button 314; see figs. 4A-B and paragraphs 0049-0050); control an image sensor to begin capturing one or more images in the selected shoot mode in response to detecting a movement of a user finger from the first position on the touch screen display to the second position on the touch screen display (After the SW1 button 314 has been touched, the SW1 button 314 can be slid in the direction of the SW2 button 313; see paragraph 0054.
The processing is performed so that continuous shooting is performed by continuing the imaging operation by the imaging apparatus 2 during the period that it is determined that the SW1 button is overlapping the SW2 button, without returning the SW1 button 314 to its original display position; see paragraph 0076), and control the touch screen display to display an animation moving the movable display object to the first position in response to detecting a ceasing of a touching of the touch screen display ("the processing may also be configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released…Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point"; see paragraph 0097. Therefore, when the release operation is detected (ceasing the touching), the animation is provided by allowing the graphical button 314 to return to its original display position on its own).

Regarding claim 2, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the control circuitry (CPU 101; see fig. 1 and paragraph 0038) is further configured to operate a first operation (Imaging preparation operation is executed, such as adjustment of an imaging setting, including autofocusing, automatic exposure control, and automatic white balance; see paragraphs 0053, 0031) in response to detecting a maintaining of a touching of the touch screen display at the first position for a predetermined amount of time (When the display control apparatus 100 (CPU 101) detects that the SW1 button 314 has been touched, the display control apparatus 100 transmits an imaging preparation start command to the imaging apparatus 2.
When the display control apparatus 100 detects that the touch has been released from the SW1 button 314, the display control apparatus 100 transmits an imaging preparation stop command to the imaging apparatus 2; see paragraph 0053. Therefore, while button 314 is touched and the touch is maintained, the imaging preparation is executed. If button 314 is slid towards button 313, then continuous shooting is executed; see paragraph 0076. If button 314 is slid towards a direction that is not capable of sliding, then the action is taken as it is being released and imaging preparation is also stopped; see paragraph 0054).

Regarding claim 3, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the control circuitry is further configured to stop capturing the one or more images in a second imaging mode in response to detecting the ceasing of the touching the touch screen display (In response to detection of the fact that the SW1 button is no longer overlapping the SW2 button, an imaging stop command is transmitted to the imaging apparatus 2, and an instruction to stop the continuous shooting is notified; see paragraph 0076. The processing may also be configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released. Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 4, Kinoshita discloses everything claimed as applied above (see claim 1).
In addition, Kinoshita discloses the control circuitry is further configured to end capturing the one or more images in a continuous shoot mode in response to detecting the ceasing of the touching of the touch screen display (In response to detection of the fact that the SW1 button is no longer overlapping the SW2 button, an imaging stop command is transmitted to the imaging apparatus 2, and an instruction to stop the continuous shooting is notified; see paragraph 0076. By displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 5, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the control circuitry is further configured to move the movable display object back toward the first position in response to detecting a ceasing of touching of the touch screen display (The processing is configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released. Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 6, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the movable display object is initially displayed near a home button of the touch GUI (After the SW1 button 314 has been touched, the SW1 button 314 can be slid in the direction of the SW2 button 313; see paragraph 0054.
Figure 4A illustrates button 314 at the initial first position before reaching and overlapping button 313, the initial position being situated at the bottom of the display, near the home button of the display device 100 (white circle at the bottom of the display); see fig. 4A).

Regarding claim 7, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the touch GUI includes a settings icon (Operation icon 312 is an operation member for setting the display area of the screen 311 between an enlarged and a reduced display; see paragraph 0051. In addition, the imaging apparatus 2 setting information can be changed by operating the display control apparatus 100; see paragraph 0062).

Regarding claim 8, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the touch GUI includes an icon for setting a camera imaging mode (The setting information is, for example, imaging setting information about the imaging mode setting status. If the imaging apparatus 2 setting information has been changed by operating the display control apparatus 100, an update is notified from the imaging apparatus 2 to the display control apparatus 100; see paragraphs 0062, 0040, 0047-0048).

Regarding claim 10, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the touch GUI is displayed on a first portion of the touch screen display (The bottom area of item 310 located below rectangle area 311; see figs. 4A-B), and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion (The live view image obtained from the imaging apparatus 2 is displayed on screen 311, the upper area of item 310. Live image region 311 is displayed above icon buttons 314 and 313; see figs. 4A-4B and paragraph 0048).
Regarding claim 12, Kinoshita discloses a non-transitory computer-readable medium (Memory 102; see paragraph 0038) storing instructions that, when executed by a processor (CPU 101; see fig. 3 and paragraphs 0037-0038), cause the processor to execute operations comprising: selecting a shoot mode from a plurality of shoot modes by user input operation to a touch screen display (The imaging apparatus 2 setting information including imaging setting information about the imaging mode can be changed by operating the display control apparatus 100 using display 51; see paragraph 0062); displaying a touch graphical user interface (GUI) (Display 105 displays an image and a graphical user interface (GUI) screen forming a GUI under the control of the CPU 101; see paragraph 0040. Touch panel display 51 is the display 105. Fig. 4A is a display example of an operation/display screen 310 of the touch panel display 51; see paragraphs 0047-0048) including a movable display object on a slider including a first position and a second position, on a touch screen display (Operation icon 314 for instructing SW1 is displayed on the operation/display screen 310. In addition, SW1 button 314 can be slid by displaying a guide line 315 indicating the slide direction of the SW1 button 314; see figs. 4A-B and paragraphs 0049-0050); controlling an image sensor to begin capturing one or more images in the selected mode in response to detecting a movement of a user finger, in a first direction of operation of the movable display object with respect to the touch screen display, from the first position on the touch screen display to the second position on the touch screen display (After the SW1 button 314 has been touched, the SW1 button 314 can be slid in the direction of the SW2 button 313; see paragraph 0054.
The processing is performed so that continuous shooting is performed by continuing the imaging operation by the imaging apparatus 2 during the period that it is determined that the SW1 button is overlapping the SW2 button, without returning the SW1 button 314 to its original display position; see paragraph 0076), and controlling the touch screen display to display an animation moving the movable display object to the first position in response to detecting a ceasing of a touching of the touch screen display (“the processing may also be configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released…Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point”; see paragraph 0097. Therefore, when the release operation is detected (ceasing of the touching), the animation is provided by allowing the graphical button 314 to return to its original display position on its own).

Regarding claim 13, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses the operations further comprise: operating a first operation (Imaging preparation operation is executed, such as adjustment of an imaging setting, including autofocusing, automatic exposure control, and automatic white balance; see paragraphs 0053, 0031) in response to detecting a maintaining of a touching of the touch screen display at the first position for a predetermined amount of time (When the display control apparatus 100 (CPU 101) detects that the SW1 button 314 has been touched, the display control apparatus 100 transmits an imaging preparation start command to the imaging apparatus 2.
When the display control apparatus 100 detects that the touch has been released from the SW1 button 314, the display control apparatus 100 transmits an imaging preparation stop command to the imaging apparatus 2; see paragraph 0053. Therefore, while button 314 is touched and the touch is maintained, the imaging preparation is executed. If button 314 is slid towards button 313, then continuous shooting is executed; see paragraph 0076. If button 314 is slid in a direction in which sliding is not possible, the action is treated as a release and imaging preparation is also stopped; see paragraph 0054).

Regarding claim 14, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses the operations further comprise: stopping capture of the one or more images in a second imaging mode in response to detecting the ceasing of the touching of the touch screen display (In response to detection of the fact that the SW1 button is no longer overlapping the SW2 button, an imaging stop command is transmitted to the imaging apparatus 2, and an instruction to stop the continuous shooting is notified; see paragraph 0076. The processing may also be configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released. Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 15, Kinoshita discloses everything claimed as applied above (see claim 12).
In addition, Kinoshita discloses ending capturing the one or more images in the continuous shoot mode in response to detecting the ceasing of touching of the touch screen display (In response to detection of the fact that the SW1 button is no longer overlapping the SW2 button, an imaging stop command is transmitted to the imaging apparatus 2, and an instruction to stop the continuous shooting is notified; see paragraph 0076. By displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 16, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses moving the movable display object back toward the first position in response to detecting the ceasing of the touching of the touch screen display (The processing is configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released. Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 17, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses the movable display object is initially displayed near a home button of the touch GUI (After the SW1 button 314 has been touched, the SW1 button 314 can be slid in the direction of the SW2 button 313; see paragraph 0054. Figure 4A illustrates button 314 at the initial first position before reaching and overlapping button 313, the initial position being situated at the bottom of the display, near the home button of the display device 100 (white circle at the bottom of the display); see fig. 4A).
Regarding claim 18, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses the touch GUI includes a settings icon (Operation icon 312 is an operation member for setting the display area of the screen 311 between an enlarged and a reduced display; see paragraph 0051. In addition, the imaging apparatus 2 setting information can be changed by operating the display control apparatus 100; see paragraph 0062).

Regarding claim 19, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses the touch GUI includes an icon for setting a camera imaging mode (The setting information is, for example, imaging setting information about the imaging mode setting status. If the imaging apparatus 2 setting information has been changed by operating the display control apparatus 100, an update is notified from the imaging apparatus 2 to the display control apparatus 100; see paragraphs 0062, 0040, 0047-0048).

Regarding claim 21, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses the touch GUI is displayed on a first portion of the touch screen display (The bottom area of item 310 located below rectangle area 311; see figs. 4A-B), and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion (The live view image obtained from the imaging apparatus 2 is displayed on screen 311, the upper area of item 310. Live image region 311 is displayed above icon buttons 314 and 313; see figs. 4A-4B and paragraph 0048).

Regarding claim 23, Kinoshita discloses an information processing method (see figs.
1-3, 6) comprising: selecting a shoot mode from a plurality of shoot modes by user input operation to a touch screen display (The imaging apparatus 2 setting information including imaging setting information about the imaging mode can be changed by operating the display control apparatus 100 using display 51; see paragraph 0062); displaying a touch graphical user interface (GUI) (Display 105 displays an image and a graphical user interface (GUI) screen forming a GUI under the control of the CPU 101; see paragraph 0040. Touch panel display 51 is the display 105. Fig. 4A is a display example of an operation/display screen 310 of the touch panel display 51; see paragraphs 0047-0048) including a movable display object on a slider including a first position and a second position on a touch screen display (Operation icon 314 for instructing SW1 is displayed on the operation/display screen 310. In addition, SW1 button 314 can be slid by displaying a guide line 315 indicating the slide direction of the SW1 button 314; see figs. 4A-B and paragraphs 0049-0050); and controlling an image sensor to begin capturing one or more images in the selected mode in response to detecting a first operation of a movement of a user finger from the first position on the touch screen display to the second position on the touch screen display (After the SW1 button 314 has been touched, the SW1 button 314 can be slid in the direction of the SW2 button 313; see paragraph 0054.
The processing is performed so that continuous shooting is performed by continuing the imaging operation by the imaging apparatus 2 during the period that it is determined that the SW1 button is overlapping the SW2 button, without returning the SW1 button 314 to its original display position; see paragraph 0076), and controlling the touch screen display to display an animation moving the movable display object to the first position in response to detecting a second operation of removal of the user finger from the touch screen display (“the processing may also be configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released…Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point”; see paragraph 0097. Therefore, when the release operation is detected (ceasing of the touching), the animation is provided by allowing the graphical button 314 to return to its original display position on its own).

Regarding claim 24, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses operating a first operation (Imaging preparation operation is executed, such as adjustment of an imaging setting, including autofocusing, automatic exposure control, and automatic white balance; see paragraphs 0053, 0031) in response to detecting the operation of maintaining a touching of the touch screen display for a predetermined amount of time (When the display control apparatus 100 (CPU 101) detects that the SW1 button 314 has been touched, the display control apparatus 100 transmits an imaging preparation start command to the imaging apparatus 2.
When the display control apparatus 100 detects that the touch has been released from the SW1 button 314, the display control apparatus 100 transmits an imaging preparation stop command to the imaging apparatus 2; see paragraph 0053. Therefore, while button 314 is touched and the touch is maintained, the imaging preparation is executed. If button 314 is slid towards button 313, then continuous shooting is executed; see paragraph 0076. If button 314 is slid in a direction in which sliding is not possible, the action is treated as a release and imaging preparation is also stopped; see paragraph 0054).

Regarding claim 25, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses stopping capture of the one or more images in a second imaging mode in response to detecting a ceasing of the touching of the touch screen display (In response to detection of the fact that the SW1 button is no longer overlapping the SW2 button, an imaging stop command is transmitted to the imaging apparatus 2, and an instruction to stop the continuous shooting is notified; see paragraph 0076. The processing may also be configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released. Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 26, Kinoshita discloses everything claimed as applied above (see claim 23).
In addition, Kinoshita discloses ending a continuous shoot mode in response to detecting a ceasing of touching of the touch screen display (In response to detection of the fact that the SW1 button is no longer overlapping the SW2 button, an imaging stop command is transmitted to the imaging apparatus 2, and an instruction to stop the continuous shooting is notified; see paragraph 0076. By displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 27, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses moving the movable display object back toward the first position in response to detecting a ceasing of touching of the touch screen display (The processing is configured so that the display position of the SW1 button is not returned to its original position, and is displayed in a state overlapping the SW2 button until the touch on the SW2 button is released. Thus, by displaying the SW1 button and the SW2 button in an overlapping state until the touch is released, the user can grasp what instructions have been made by a touch operation up until that point; see paragraph 0097).

Regarding claim 28, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses the movable display object is displayed initially near a home button of the touch GUI (After the SW1 button 314 has been touched, the SW1 button 314 can be slid in the direction of the SW2 button 313; see paragraph 0054. Figure 4A illustrates button 314 at the initial first position before reaching and overlapping button 313, the initial position being situated at the bottom of the display, near the home button of the display device 100 (white circle at the bottom of the display); see fig. 4A).
Regarding claim 29, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses the touch GUI includes a settings icon (Operation icon 312 is an operation member for setting the display area of the screen 311 between an enlarged and a reduced display; see paragraph 0051. In addition, the imaging apparatus 2 setting information can be changed by operating the display control apparatus 100; see paragraph 0062).

Regarding claim 30, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses the touch GUI includes an icon for setting a camera imaging mode (The setting information is, for example, imaging setting information about the imaging mode setting status. If the imaging apparatus 2 setting information has been changed by operating the display control apparatus 100, an update is notified from the imaging apparatus 2 to the display control apparatus 100; see paragraphs 0062, 0040, 0047-0048).

Regarding claim 32, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses the touch GUI is displayed on a first portion of the touch screen display (The bottom area of item 310 located below rectangle area 311; see figs. 4A-B), and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion (The live view image obtained from the imaging apparatus 2 is displayed on screen 311, the upper area of item 310. Live image region 311 is displayed above icon buttons 314 and 313; see figs. 4A-4B and paragraph 0048).

11. Claims 9, 11, 20, 22, 31 and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Kinoshita in view of Choi et al. (US-PGPUB 2014/0063313).

Regarding claim 9, Kinoshita discloses everything claimed as applied above (see claim 1).
However, Kinoshita fails to disclose the touch GUI displays an icon for changing a zoom magnification of the image sensor. On the other hand, Choi discloses the touch GUI displays an icon for changing a zoom magnification of the image sensor (In a camera mode, a slide button 237 is output to at least one side of the capturing button. The slide button 237 can be moved by a user's drag. Once the slide button 237 is dragged towards the capturing button 233, a zoom function is activated. If the slide button 237 is slid towards the plus icon, a zoom-in function is executed. On the other hand, if the slide button 237 is slid towards the minus icon, a zoom-out function is executed; see paragraphs 0152-0153 and fig. 7A). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kinoshita and Choi to provide the touch GUI displays an icon for changing a zoom magnification of the image sensor for the purpose of effectively capturing images in accordance with user preferences while allowing the user to easily control the captured field of view through the manipulation of a user-friendly interface.

Regarding claim 11, Kinoshita discloses everything claimed as applied above (see claim 1). In addition, Kinoshita discloses the touch GUI includes an icon for setting a camera imaging mode (The setting information is, for example, imaging setting information about the imaging mode setting status. If the imaging apparatus 2 setting information has been changed by operating the display control apparatus 100, an update is notified from the imaging apparatus 2 to the display control apparatus 100; see paragraphs 0062, 0040, 0047-0048) and wherein the touch GUI is displayed on a first portion of the touch screen display (The bottom area of item 310 located below rectangle area 311; see figs.
4A-B), and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion (The live view image obtained from the imaging apparatus 2 is displayed on screen 311, the upper area of item 310. Live image region 311 is displayed above icon buttons 314 and 313; see figs. 4A-4B and paragraph 0048). However, Kinoshita fails to disclose the touch GUI includes an icon for changing a zoom magnification of the image sensor. Nevertheless, Choi discloses the touch GUI includes an icon for changing a zoom magnification of the image sensor (In a camera mode, a slide button 237 is output to at least one side of the capturing button. The slide button 237 can be moved by a user's drag. Once the slide button 237 is dragged towards the capturing button 233, a zoom function is activated. If the slide button 237 is slid towards the plus icon, a zoom-in function is executed. On the other hand, if the slide button 237 is slid towards the minus icon, a zoom-out function is executed; see paragraphs 0152-0153 and fig. 7A). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kinoshita and Choi to provide the touch GUI includes an icon for changing a zoom magnification of the image sensor for the purpose of effectively capturing images in accordance with user preferences while allowing the user to easily control the captured field of view through the manipulation of a user-friendly interface.

Regarding claim 20, Kinoshita discloses everything claimed as applied above (see claim 12). However, Kinoshita fails to disclose the touch GUI displays an icon for changing a zoom magnification of the image sensor. On the other hand, Choi discloses the touch GUI displays an icon for changing a zoom magnification of the image sensor (In a camera mode, a slide button 237 is output to at least one side of the capturing button.
The slide button 237 can be moved by a user's drag. Once the slide button 237 is dragged towards the capturing button 233, a zoom function is activated. If the slide button 237 is slid towards the plus icon, a zoom-in function is executed. On the other hand, if the slide button 237 is slid towards the minus icon, a zoom-out function is executed; see paragraphs 0152-0153 and fig. 7A). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kinoshita and Choi to provide the touch GUI displays an icon for changing a zoom magnification of the image sensor for the purpose of effectively capturing images in accordance with user preferences while allowing the user to easily control the captured field of view through the manipulation of a user-friendly interface.

Regarding claim 22, Kinoshita discloses everything claimed as applied above (see claim 12). In addition, Kinoshita discloses the touch GUI includes an icon for setting a camera imaging mode (The setting information is, for example, imaging setting information about the imaging mode setting status. If the imaging apparatus 2 setting information has been changed by operating the display control apparatus 100, an update is notified from the imaging apparatus 2 to the display control apparatus 100; see paragraphs 0062, 0040, 0047-0048) and wherein the touch GUI is displayed on a first portion of the touch screen display (The bottom area of item 310 located below rectangle area 311; see figs. 4A-B), and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion (The live view image obtained from the imaging apparatus 2 is displayed on screen 311, the upper area of item 310. Live image region 311 is displayed above icon buttons 314 and 313; see figs. 4A-4B and paragraph 0048).
However, Kinoshita fails to disclose the touch GUI includes an icon for changing a zoom magnification of the image sensor. Nevertheless, Choi discloses the touch GUI includes an icon for changing a zoom magnification of the image sensor (In a camera mode, a slide button 237 is output to at least one side of the capturing button. The slide button 237 can be moved by a user's drag. Once the slide button 237 is dragged towards the capturing button 233, a zoom function is activated. If the slide button 237 is slid towards the plus icon, a zoom-in function is executed. On the other hand, if the slide button 237 is slid towards the minus icon, a zoom-out function is executed; see paragraphs 0152-0153). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kinoshita and Choi to provide the touch GUI includes an icon for changing a zoom magnification of the image sensor for the purpose of effectively capturing images in accordance with user preferences while allowing the user to easily control the captured field of view through the manipulation of a user-friendly interface.

Regarding claim 31, Kinoshita discloses everything claimed as applied above (see claim 23). However, Kinoshita fails to disclose the touch GUI displays an icon for changing a zoom magnification of the image sensor. On the other hand, Choi discloses the touch GUI displays an icon for changing a zoom magnification of the image sensor (In a camera mode, a slide button 237 is output to at least one side of the capturing button. The slide button 237 can be moved by a user's drag. Once the slide button 237 is dragged towards the capturing button 233, a zoom function is activated. If the slide button 237 is slid towards the plus icon, a zoom-in function is executed. On the other hand, if the slide button 237 is slid towards the minus icon, a zoom-out function is executed; see paragraphs 0152-0153 and fig.
7A). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kinoshita and Choi to provide the touch GUI displays an icon for changing a zoom magnification of the image sensor for the purpose of effectively capturing images in accordance with user preferences while allowing the user to easily control the captured field of view through the manipulation of a user-friendly interface.

Regarding claim 33, Kinoshita discloses everything claimed as applied above (see claim 23). In addition, Kinoshita discloses the touch GUI includes an icon for setting a camera imaging mode (The setting information is, for example, imaging setting information about the imaging mode setting status. If the imaging apparatus 2 setting information has been changed by operating the display control apparatus 100, an update is notified from the imaging apparatus 2 to the display control apparatus 100; see paragraphs 0062, 0040, 0047-0048) and wherein the touch GUI is displayed on a first portion of the touch screen display (The bottom area of item 310 located below rectangle area 311; see figs. 4A-B), and a through-image preview is shown on a second portion of the touch screen display, the second portion being distinct from the first portion (The live view image obtained from the imaging apparatus 2 is displayed on screen 311, the upper area of item 310. Live image region 311 is displayed above icon buttons 314 and 313; see figs. 4A-4B and paragraph 0048). However, Kinoshita fails to disclose the touch GUI includes an icon for changing a zoom magnification of the image sensor. Nevertheless, Choi discloses the touch GUI includes an icon for changing a zoom magnification of the image sensor (In a camera mode, a slide button 237 is output to at least one side of the capturing button. The slide button 237 can be moved by a user's drag.
Once the slide button 237 is dragged towards the capturing button 233, a zoom function is activated. If the slide button 237 is slid towards the plus icon, a zoom-in function is executed. On the other hand, if the slide button 237 is slid towards the minus icon, a zoom-out function is executed; see paragraphs 0152-0153 and fig. 7A). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kinoshita and Choi to provide the touch GUI includes an icon for changing a zoom magnification of the image sensor for the purpose of effectively capturing images in accordance with user preferences while allowing the user to easily control the captured field of view through the manipulation of a user-friendly interface.

Contact Information

12. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CYNTHIA CALDERON whose telephone number is (571)270-3580. The examiner can normally be reached M-F 9:00 AM-5:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, TWYLER HASKINS, can be reached at (571)272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /CYNTHIA CALDERON/Primary Examiner, Art Unit 2639 01/07/2026

Prosecution Timeline

Oct 22, 2024
Application Filed
Jan 07, 2026
Non-Final Rejection — §102, §103, §DP (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604088
IMAGE PICKUP APPARATUS CAPABLE OF CONTROLLING POWER SUPPLY, ITS CONTROL METHOD, AND STORAGE MEDIUM
2y 5m to grant Granted Apr 14, 2026
Patent 12604108
LIGHTFIELD CAMERA THAT CAN SIMULTANEOUSLY ACQUIRE 2D INFORMATION AND 3D SPATIAL INFORMATION FROM SAME DEPTH
2y 5m to grant Granted Apr 14, 2026
Patent 12598388
IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREOF
2y 5m to grant Granted Apr 07, 2026
Patent 12593120
METHOD FOR ACQUIRING A PHOTOGRAPHIC PORTRAIT OF AN INDIVIDUAL AND APPARATUS IMPLEMENTING THIS METHOD
2y 5m to grant Granted Mar 31, 2026
Patent 12587745
IMAGE CAPTURING APPARATUS AND CONTROL METHOD THEREOF
2y 5m to grant Granted Mar 24, 2026
Based on this examiner's 5 most recent grants in similar technology.

Prosecution Projections

1-2
Expected OA Rounds
77%
Grant Probability
96%
With Interview (+18.5%)
2y 4m
Median Time to Grant
Low
PTA Risk
Based on 782 resolved cases by this examiner. Grant probability derived from career allow rate.
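The projection figures above follow directly from the examiner statistics reported earlier: the 77% grant probability is the career allow rate (602 granted of 782 resolved), and the 96% with-interview figure is that base rate plus the observed +18.5 percentage-point interview lift, capped at 100%. A minimal sketch of this arithmetic, assuming these simple formulas (the report does not publish its actual model):

```python
# Hypothetical reconstruction of the dashboard's projection math.
# The inputs (602 granted / 782 resolved, +18.5 pt interview lift)
# come from the report above; the formulas are assumptions.

def grant_probability(granted: int, resolved: int) -> int:
    """Career allow rate, rounded to a whole percent."""
    return round(granted / resolved * 100)

def with_interview(base_pct: int, lift_pts: float) -> int:
    """Base rate plus interview lift in percentage points, capped at 100."""
    return min(100, round(base_pct + lift_pts))

base = grant_probability(602, 782)    # career allow rate
boosted = with_interview(base, 18.5)  # projected rate with an interview
print(base, boosted)
```

Under these assumptions the sketch reproduces the panel's 77% and 96% figures; a production model would likely also condition on art unit, rejection type, and claim scope.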
