Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Priority
Receipt is acknowledged of papers submitted under 35 U.S.C. 119(a)-(d), which papers have been placed of record in the file.
Information Disclosure Statement
The information disclosure statements submitted on 11/8/2024 and 5/22/2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1 and 8-17 are rejected under 35 U.S.C. 103 as being unpatentable over Kinoshita et al. (US 20220385829 A1) in view of Yoshikawa (US 20190289220 A1).
Regarding claim 1, Kinoshita teaches an electronic device (Figs. 1-5; 500) comprising:
a processor; and a memory storing a program which, when executed by the processor, causes the electronic device to (Fig. 5; para. 0069)
execute acquisition processing to acquire an image having a plurality of image areas respectively captured via a plurality of optical systems (300) by an imaging apparatus (100) (Figs. 1, 4);
execute display control processing to perform control so that a display image based on the acquired image is displayed (Figs. 11; para. 0136),
but fails to teach
perform control so that an item indicating a position in the display image is displayed; and
execute receiving processing to receive a user operation for changing a position of the item, wherein in the display control processing, control is performed so that the item is displayed in a first form in a case where the position of the item corresponds to a position in a predetermined area of the image, and control is performed so that the item is displayed in a second form in a case where the position of the item corresponds to a position outside the predetermined area.
However, in the same field of endeavor Yoshikawa teaches
perform control so that an item indicating a position in the display image is displayed (Figs. 3; paras. 0074-0083; “the user can visually confirm the focus guide 301 (the focus guides 304 to 307) and thereby confirm the focus detection position and the focus status at the focus detection position”); and
execute receiving processing to receive a user operation for changing a position of the item (para. 0094: “the user can move the position of the focus guide 301”), wherein in the display control processing, control is performed so that the item is displayed in a first form (Fig. 9A; form 301) in a case where the position of the item corresponds to a position in a predetermined area (enlarged range 405) of the image, and control is performed so that the item is displayed in a second form (Fig. 9C; form 801) in a case where the position of the item corresponds to a position outside the predetermined area (Figs. 3, 9; paras. 0125-0134, 0116).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to modify Kinoshita with the teachings of Yoshikawa to perform control so that an item indicating a position in the display image is displayed; and to execute receiving processing to receive a user operation for changing a position of the item, wherein in the display control processing, control is performed so that the item is displayed in a first form in a case where the position of the item corresponds to a position in a predetermined area of the image, and control is performed so that the item is displayed in a second form in a case where the position of the item corresponds to a position outside the predetermined area. Doing so would allow the user, in a case where the enlarged image display function is executed, to figure out the position of the enlarged range relative to the entirety of the image while visually confirming the focus detection position and the focus status at the focus detection position with the focus guide on the display, yielding a predictable result.
Regarding claim 8, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Kinoshita teaches wherein in the display control processing, control is further performed to display a second item (enlarged range 1214) indicating an area corresponding to the predetermined area in the display image (Figs. 12, 14; paras. 0153, 0169).
Regarding claim 9, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Yoshikawa teaches wherein the program, when executed by the processor, further causes the electronic device to execute setting processing to set, according to an instruction from a user (para. 0094: “the user can move the position of the focus guide 301”), whether to change a form of the item according to whether the position of the item corresponds to a position in the predetermined area (Figs. 3, 9; paras. 0125-0134, 0116).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to further modify the combination with the teachings of Yoshikawa so that the program, when executed by the processor, further causes the electronic device to execute setting processing to set, according to an instruction from a user, whether to change a form of the item according to whether the position of the item corresponds to a position in the predetermined area. Doing so would allow the user, in a case where the enlarged image display function is executed, to figure out the position of the enlarged range relative to the entirety of the image while visually confirming the focus detection position and the focus status at the focus detection position with the focus guide on the display, yielding a predictable result.
Regarding claims 10-11, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Yoshikawa teaches
Claim 10: The electronic device according to claim 1, wherein in the display control processing, control is performed so that the item is displayed in a form corresponding to an operation member used for the user operation among a plurality of operation members (Figs. 1, 3; paras. 0074-0083; “the user can visually confirm the focus guide 301 (the focus guides 304 to 307) and thereby confirm the focus detection position and the focus status at the focus detection position”; para. 0094: “the user can move the position of the focus guide 301”; para. 0093: “the user may move the focus guide by, for example, performing a touch operation on the touch panel”).
Claim 11: The electronic device according to claim 10, wherein the plurality of operation members include at least one of a mouse, a keyboard, and a touch panel (para. 0093: “the user may move the focus guide by, for example, performing a touch operation on the touch panel”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to further modify the combination with the teachings of Yoshikawa to provide the features of claims 10-11, for enabling selection of different focus detection positions while visually confirming the focus detection position and the focus status at the focus detection position with the focus guide on the display, yielding a predictable result.
Regarding claim 12, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Yoshikawa teaches wherein a plurality of areas of different types are used as the predetermined area (Figs. 5A-5C; para. 0080; an enlarged range display 405 at different locations is selected), and in the display control processing, control is performed so that the item (301, 801) is displayed in a form according to a positional relationship between the plurality of areas and the item (Figs. 3, 9; paras. 0125-0134, 0116).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to further modify the combination with the teachings of Yoshikawa so that a plurality of areas of different types are used as the predetermined area, and in the display control processing, control is performed so that the item is displayed in a form according to a positional relationship between the plurality of areas and the item. Doing so would allow the user, in a case where the enlarged image display function is executed, to figure out the position of the enlarged range relative to the entirety of the image while visually confirming the focus detection position and the focus status at the focus detection position with the focus guide on the display, yielding a predictable result.
Regarding claim 13, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Kinoshita teaches wherein in the acquisition processing, a live view image is acquired as the image (Figs. 11; para. 0135).
Regarding claim 14, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Kinoshita teaches wherein the predetermined area corresponds to an area suitable for white balance adjustment, enlargement display, exposure adjustment, and focus adjustment in the imaging apparatus (Figs. 12; para. 0153; enlarged range 1214).
Regarding claim 15, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Kinoshita teaches wherein the image is an image in which two optical images are respectively formed in two areas of one imaging element (Figs. 4, 11-12).
Regarding claim 16, claim 16 recites features corresponding to those of claim 1 and is rejected for the same reasons set forth above.
Regarding claim 17, claim 17 recites features corresponding to those of claim 1 and is rejected for the same reasons set forth above. In addition, Kinoshita in the combination of Kinoshita and Yoshikawa teaches a non-transitory computer readable medium (para. 0096) that stores a program, wherein the program causes a computer to execute a control method of an electronic device, the control method comprising the features as taught in claim 1.
Claims 2-7 are rejected under 35 U.S.C. 103 as being unpatentable over Kinoshita et al. (US 20220385829 A1) in view of Yoshikawa (US 20190289220 A1) as applied to claim 1 above, and further in view of Shibuya (JP 2023081041 A).
Regarding claim 2, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Yoshikawa teaches “In the state where distance measurement has failed, the CPU 112a may display the guide frame display 801a in the gray color (the third color)” (para. 0116),
but fails to teach
wherein the program, when executed by the processor, further causes the electronic device to execute information acquisition processing to acquire area information indicating the predetermined area from the imaging apparatus, and in the display control processing, it is determined whether the position of the item corresponds to a position in the predetermined area, based on the area information.
However, in the same field of endeavor Shibuya teaches
wherein the program, when executed by the processor, further causes the electronic device to execute information acquisition processing to acquire area information indicating the predetermined area from the imaging apparatus, and in the display control processing, it is determined whether the position of the item corresponds to a position in the predetermined area, based on the area information (Fig. 8; page 11: “[when] the focus detection frame 311 is arranged inside the horizontal/vertical detectable range frame 821, the focus detection frame 311 is displayed with a solid line,” indicating that focus detection is possible in both the horizontal and vertical directions; “[when] the focus detection frame 311 is positioned outside the horizontal/vertical detectable range frame 821, the focus detection frame 311 is displayed” in the form indicated by a dotted line, like focus detection frame 811, indicating that focus detection is possible only in the horizontal direction).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to further modify the combination with the teachings of Shibuya so that the program, when executed by the processor, further causes the electronic device to execute information acquisition processing to acquire area information indicating the predetermined area from the imaging apparatus, and in the display control processing, it is determined whether the position of the item corresponds to a position in the predetermined area, based on the area information. Doing so would enable users to easily check whether focus detection is possible in any direction of a plurality of directions, yielding a predictable result.
Regarding claim 3, the combination of Kinoshita, Yoshikawa and Shibuya teaches everything as claimed in claim 2. In addition, Kinoshita teaches wherein the display image is an image obtained by performing equirectangular conversion on the image, and in the display control processing, the equirectangular conversion is performed on the area information (Kinoshita: Figs. 11C, 12B; paras. 0140, 0146) and it is determined whether the position of the item corresponds to a position in the predetermined area, based on the area information (as taught by Shibuya for the reason above in claim 2) after the equirectangular conversion (Kinoshita: Figs. 11C, 12B).
Regarding claim 4, the combination of Kinoshita, Yoshikawa and Shibuya teaches everything as claimed in claim 2. In addition, Kinoshita teaches wherein the display image is an image obtained by performing arrangement conversion of the plurality of image areas (Kinoshita: figs. 11A-11B, 12A-C), and in the display control processing, the arrangement conversion is performed on the area information and it is determined whether the position of the item corresponds to a position in the predetermined area, based on the area information (as taught by Shibuya for the reason above in claim 2) after the arrangement conversion (Kinoshita: figs. 11A-11B, 12A-C).
Regarding claim 5, the combination of Kinoshita, Yoshikawa and Shibuya teaches everything as claimed in claim 4. In addition, Kinoshita teaches wherein the image is an image in which two image areas are arranged side by side, and the arrangement conversion is conversion for exchanging positions of the two image areas (figs. 11A-11B, 12A-C).
Regarding claim 6, the combination of Kinoshita and Yoshikawa teaches everything as claimed in claim 1. In addition, Yoshikawa teaches “In the state where distance measurement has failed, the CPU 112a may display the guide frame display 801a in the gray color (the third color)” (para. 0116),
but fails to teach
wherein the program, when executed by the processor, further causes the electronic device to execute control processing to control to perform predetermined processing based on the position of the item, and in the control processing, control is performed not to perform the predetermined processing in a case where the position of the item corresponds to a position outside the predetermined area and to perform the predetermined processing in a case where the position of the item corresponds to a position in the predetermined area.
However, in the same field of endeavor Shibuya teaches
wherein the program, when executed by the processor, further causes the electronic device to execute control processing to control to perform predetermined processing based on the position of the item, and in the control processing, control is performed not to perform the predetermined processing in a case where the position of the item corresponds to a position outside the predetermined area and to perform the predetermined processing in a case where the position of the item corresponds to a position in the predetermined area (Fig. 8; page 11: “[when] the focus detection frame 311 is arranged inside the horizontal/vertical detectable range frame 821, the focus detection frame 311 is displayed with a solid line,” indicating that focus detection is possible in both the horizontal and vertical directions; “[when] the focus detection frame 311 is positioned outside the horizontal/vertical detectable range frame 821, the focus detection frame 311 is displayed” in the form indicated by a dotted line, like focus detection frame 811, indicating that focus detection is possible only in the horizontal direction; “when the focus detection frame 311 exceeds the horizontal/vertical detectable range frame 821, a mode may be provided in which the lens driving operation by the imaging plane phase difference AF is stopped”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to further modify the combination with the teachings of Shibuya so that the program, when executed by the processor, further causes the electronic device to execute control processing to control to perform predetermined processing based on the position of the item, and in the control processing, control is performed not to perform the predetermined processing in a case where the position of the item corresponds to a position outside the predetermined area and to perform the predetermined processing in a case where the position of the item corresponds to a position in the predetermined area. Doing so would enable users to easily check whether focus detection is possible in any direction of a plurality of directions, yielding a predictable result.
Regarding claim 7, the combination of Kinoshita, Yoshikawa and Shibuya teaches everything as claimed in claim 6. In addition, Yoshikawa teaches wherein the image is acquired from the imaging apparatus, and the predetermined processing is processing of transmitting an instruction of focus adjustment, white balance adjustment, or enlargement display to the imaging apparatus (Figs. 8-9; para. 0029).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention (AIA) to further modify the combination with the teachings of Yoshikawa so that the image is acquired from the imaging apparatus, and the predetermined processing is processing of transmitting an instruction of focus adjustment, white balance adjustment, or enlargement display to the imaging apparatus, for visually confirming the focus detection position and the focus status at the focus detection position with the focus guide on the display, yielding a predictable result.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Quan Pham whose telephone number is (571)272-4438. The examiner can normally be reached Mon-Fri 9am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sinh Tran can be reached at (571) 272-7564. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Quan Pham/Primary Examiner, Art Unit 2637