Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Claims 3, 10, 16, & 19 have been canceled. Accordingly, Claims 1, 2, 4-9, 11-15, 17, & 18 are pending. Applicant's arguments filed 11/14/2025 have been fully considered but they are not persuasive.
On page 7 of Applicant's arguments dated 11/14/2025, the Applicant states that the distinguishable feature of the claimed invention (i.e., Claim 1) includes "in a case where the flicker detection unit detects a flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system with priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous over to opening a diaphragm of the optical system." Applicant goes on to elaborate that a key feature of the claimed invention is that, when flicker is detected, instead of immediately adjusting the aperture, the exposure time is first set to a value where the flicker is less noticeable. The crux of the Applicant's argument appears to be that Shibuno does not disclose or suggest the distinguishable feature of the claimed invention, which includes "wherein in a case where the flicker detection unit detects a flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system with priority given to setting an exposure time in which a flicker is inconspicuous over to opening a diaphragm of the optical system."
The Examiner respectfully disagrees with the Applicant's assertion that Shibuno does not disclose or suggest the distinguishable feature of the claimed invention (i.e., Claim 1). However, the Examiner concedes that the initial citation of Shibuno, para. [0086], ln. 6-11, may have lacked the context necessary to present a prima facie case of obviousness. As such, the final rejection has been updated to include para. [0086] in its entirety, as the first half of the paragraph provides clarity and context. Paragraph [0086] reads (all brackets added by Examiner), "When a flicker occurs or the imaging is performed under a fluorescent lamp, if the camera body 2 shortens the exposure time, a flicker in the image data becomes highly visible. Thus, when a flicker occurs in the image data, if the camera body 2 suppresses the shortening of the exposure time [i.e., sets the exposure time], more desirable image data can be provided for the user [flicker is minimized or inconspicuous]. When the obtained diaphragm value differs from the already set diaphragm value (Yes at step J1), the CPU 211 determines whether or not a flicker occurs in image data generated by the CMOS image sensor (J2). When the CPU 211 determines that the flicker occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 (J3). When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 (J4). In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time [i.e., priority given to setting the exposure time]. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment [may determine which of a setting of exposure time or opening of a diaphragm to give priority to]."
The Examiner interprets this paragraph to be describing “…determining, in a case of detecting the in-focus state in the focus detection, which of a setting of an exposure time of the sensor to an exposure time in which a flicker is inconspicuous and opening of a diaphragm of the optical system to give priority to based on a detection result of a flicker by the flicker detection, and, based on the determination, determining a combination of an exposure time of the sensor and a diaphragm value of the optical system, wherein in a case where the detecting detects a flicker, the determining determines a combination of an exposure time of the sensor and a diaphragm value of the optical system with priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous over to opening a diaphragm of the optical system.” Examiner respectfully recommends that claim language more specific to the novelty of the Applicant’s invention be used to differentiate said invention from that of Shibuno in this regard, as the current language is overly broad.
Rejection citations for Claims 4, 14, & 17 have also been updated to include Shibuno para. [0086] in its entirety for similar reasons.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 4-9, 11, 12, 14, & 17 are rejected under 35 U.S.C. 103 as being unpatentable over Shibuno (US 20090295940 A1, hereinafter, "Shibuno") in view of Inagaki (US 20220351395 A1, hereinafter, "Inagaki").
Regarding Claim 1, Shibuno teaches An apparatus comprising: a sensor that captures a subject image formed by an optical system (Shibuno, Fig. 2, [0030], ln. 1, "The camera body 2 includes a CMOS sensor [CMOS image sensor] 201..."); and at least one processor (Shibuno, Fig. 2, [0030], ln. 2, "…a signal processing processor 203..."); and a memory coupled to the at least one processor storing instructions (Shibuno, Fig. 2, [0030], ln. 2, "…a buffer memory 204...") that, when executed by the at least one processor, cause the at least one processor to function as: a flicker detection unit that detects a flicker using an image captured by the sensor (Shibuno, Fig. 16, Step J2, [0086], ln. 5, "…the CPU 211 determines whether or not a flicker occurs…"), and a first determination unit that determines, in a case where the focus detection unit detects the in-focus state, which of a setting of an exposure time of the sensor to an exposure time in which a flicker is inconspicuous and an opening of a diaphragm of the optical system to give priority to based on a detection result of a flicker by the flicker detection unit, and, based on the determination, determines a combination of an exposure time of the sensor and a diaphragm value of the optical system, wherein in a case where the flicker detection unit detects a flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system with priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous over to opening a diaphragm of the optical system (Shibuno, Fig. 16, Steps J3 & J4, [0086], all lines, "When a flicker occurs or the imaging is performed under a fluorescent lamp, if the camera body 2 shortens the exposure time, a flicker in the image data becomes highly visible. 
Thus, when a flicker occurs in the image data, if the camera body 2 suppresses the shortening of the exposure time, more desirable image data can be provided for the user. When the obtained diaphragm value differs from the already set diaphragm value (Yes at step J1), the CPU 211 determines whether or not a flicker occurs in image data generated by the CMOS image sensor (J2). When the CPU 211 determines that the flicker occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 (J3). When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 (J4). In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment." It is the Examiner's interpretation that the diaphragm drive unit 308 being driven to prevent a shorter exposure time constitutes a "…priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous…" since it is stipulated that shortening the exposure time would make the flicker highly visible.). Shibuno does not teach a focus detection unit that detects an in-focus state of the subject image by using a light flux that has passed through the optical system. However, Inagaki teaches a focus detection unit that detects an in-focus state of the subject image by using a light flux that has passed through the optical system (Inagaki, Fig. 1, [0039], ln. 1-4, "The phase detection AF unit 129 performs focus detection processing using the phase detection method, based on image signals of focus detection image data (signals for phase detection AF) obtained from the image sensor 122 and the image processing circuit 124. 
To be more specific, the image processing circuit 124 generates a pair of image data formed by light fluxes…"). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Inagaki with those of Shibuno because it is well known in the art to use focus detection units that utilize a light flux that has passed through an optical system.
Regarding Claim 4, Shibuno and Inagaki teach the limitations of independent Claim 1 as noted above. Shibuno teaches in a case where the flicker detection unit detects a flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system so as to open a diaphragm of the optical system within a range in which the exposure time can be maintained to an exposure time in which a flicker is inconspicuous (Shibuno, Fig. 16, Steps J3 & J4, [0086], all lines, "When a flicker occurs or the imaging is performed under a fluorescent lamp, if the camera body 2 shortens the exposure time, a flicker in the image data becomes highly visible. Thus, when a flicker occurs in the image data, if the camera body 2 suppresses the shortening of the exposure time, more desirable image data can be provided for the user. When the obtained diaphragm value differs from the already set diaphragm value (Yes at step J1), the CPU 211 determines whether or not a flicker occurs in image data generated by the CMOS image sensor (J2). When the CPU 211 determines that the flicker occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 (J3). When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 (J4). In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment." It is the Examiner's interpretation that the diaphragm drive unit 308 being driven to prevent a shorter exposure time constitutes a "…priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous…" since it is stipulated that shortening the exposure time would make the flicker highly visible.).
Regarding Claim 5, Shibuno and Inagaki teach the limitations of independent Claim 1 as noted above. Shibuno teaches in a case where the flicker detection unit detects no flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system with priority given to opening a diaphragm of the optical system (Shibuno, Fig. 16, [0086], ln. 7-11, "When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 (J4). In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment.").
Regarding Claim 6, Shibuno and Inagaki teach the limitations of independent Claim 1 as noted above. Shibuno teaches the at least one processor further functions as a second determination unit that determines a diaphragm value, which is a diaphragm value in a case where the sensor images a still image, wherein in a case where the flicker detection unit detects a flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system in a case where the in-focus state is detected with the diaphragm value as a limit on a small diaphragm side (Shibuno, Fig. 16, Steps J3 & J4, [0086], ln. 6-11, "When the CPU 211 determines that the flicker occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 [J3]. When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 [J4]. In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment." It is interpreted that "the diaphragm value as a limit on a small diaphragm side" is preventing a shorter exposure time.).
Regarding Claim 7, Shibuno and Inagaki teach the limitations of independent Claim 1 as noted above. Inagaki teaches the sensor includes a plurality of pixels each including a plurality of photoelectric conversion units that receive a light flux passing through different pupil regions of the optical system, and the focus detection unit detects the in-focus state based on a shift amount of a signal from a plurality of conversion units of the pixel (Inagaki, Fig. 1, [0039], ln. 3-6, "To be more specific, the image processing circuit 124 generates a pair of image data formed by light fluxes passing through a pair of pupil regions of the imaging optical system as focus detection data, and the phase detection AF unit 129 detects a focus shift amount based on a shift amount in the pair of image data.").
Regarding Claim 8, Shibuno and Inagaki teach the limitations of independent Claim 1 as noted above. Inagaki teaches the focus detection unit detects the in-focus state by using at least one of a shift amount of the signal in a first direction, which is a reading row direction of the pixel, and a shift amount of the signal in a second direction (Inagaki, Figs. 1, 2, & 5, [0054], ln. 3-7, "Light fluxes passing through different partial pupil regions of the first partial pupil region 501 and the second partial pupil region 502 are incident on each pixel of the image sensor 122 at different angles, and are received by the first focus detection pixel 201 and the second focus detection pixel 202 divided into 2×1. The present embodiment describes an example in which the pupil region is divided in two in the horizontal direction. The pupil region may be divided in the vertical direction as necessary.").
Regarding Claim 9, Shibuno and Inagaki teach the limitations of dependent Claim 8 as noted above. Inagaki teaches the second direction is a direction perpendicular to the first direction (Inagaki, Figs. 1, 2, & 5, [0054], ln. 3-7, "Light fluxes passing through different partial pupil regions of the first partial pupil region 501 and the second partial pupil region 502 are incident on each pixel of the image sensor 122 at different angles, and are received by the first focus detection pixel 201 and the second focus detection pixel 202 divided into 2×1. The present embodiment describes an example in which the pupil region is divided in two in the horizontal direction. The pupil region may be divided in the vertical direction as necessary.").
Regarding Claim 11, Shibuno and Inagaki teach the limitations of dependent Claim 8 as noted above. Shibuno teaches the at least one processor further functions as a second determination unit that determines a diaphragm value, which is a diaphragm value in a case where the sensor images a still image (Shibuno, Fig. 17, [0088], ln. 12-15, "When the CPU 211 determines that the highlight detail loss occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 (K3). When the CPU 211 determines that the highlight detail loss does not occur, it adjusts the shooting conditions in the camera body 2 (K4)."), and in a case where the flicker detection unit detects a flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system in a case where the in-focus state is detected with the diaphragm value as a limit on a small diaphragm side (Shibuno, Fig. 16, Steps J3 & J4, [0086], ln. 6-11, "When the CPU 211 determines that the flicker occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 [J3]. When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 [J4]. In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment." It is interpreted that "the diaphragm value as a limit on a small diaphragm side" is preventing a shorter exposure time.). Shibuno does not teach in a case where the focus detection unit detects the in-focus state without using a shift amount of the signal in the second direction. However, Inagaki teaches in a case where the focus detection unit detects the in-focus state without using a shift amount of the signal in the second direction (Inagaki, Figs. 
1, 2, & 5, [0054], ln. 3-6, "Light fluxes passing through different partial pupil regions of the first partial pupil region 501 and the second partial pupil region 502 are incident on each pixel of the image sensor 122 at different angles, and are received by the first focus detection pixel 201 and the second focus detection pixel 202 divided into 2×1.").
Regarding Claim 12, Shibuno and Inagaki teach the limitations of dependent Claim 8 as noted above. Shibuno teaches in a case where the flicker detection unit detects no flicker, the first determination unit determines a combination of an exposure time of the sensor and a diaphragm value of the optical system with priority given to opening a diaphragm of the optical system (Shibuno, Fig. 16, [0086], ln. 7-11, "When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 (J4). In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment.").
Regarding Claim 14, Shibuno teaches A method for controlling an apparatus including a sensor that captures a subject image formed by an optical system, the method comprising: detecting a flicker by using an image captured by the sensor (Shibuno, Fig. 16, Step J2, [0086], ln. 5, "…the CPU 211 determines whether or not a flicker occurs…"), and determining, in a case of detecting the in-focus state in the focus detection, which of a setting of an exposure time of the sensor to an exposure time in which a flicker is inconspicuous and an opening of a diaphragm of the optical system to give priority to based on a detection result of a flicker by the flicker detection, and, based on the determination, determining a combination of an exposure time of the sensor and a diaphragm value of the optical system, wherein in a case where the detecting detects a flicker, the determining determines a combination of an exposure time of the sensor and a diaphragm value of the optical system with priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous over to opening a diaphragm of the optical system (Shibuno, Fig. 16, Steps J3 & J4, [0086], all lines, "When a flicker occurs or the imaging is performed under a fluorescent lamp, if the camera body 2 shortens the exposure time, a flicker in the image data becomes highly visible. Thus, when a flicker occurs in the image data, if the camera body 2 suppresses the shortening of the exposure time, more desirable image data can be provided for the user. When the obtained diaphragm value differs from the already set diaphragm value (Yes at step J1), the CPU 211 determines whether or not a flicker occurs in image data generated by the CMOS image sensor (J2). When the CPU 211 determines that the flicker occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 (J3). When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 (J4). 
In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment." It is the Examiner's interpretation that the diaphragm drive unit 308 being driven to prevent a shorter exposure time constitutes a "…priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous…" since it is stipulated that shortening the exposure time would make the flicker highly visible.). Shibuno does not teach detecting an in-focus state of the subject image by using a light flux that has passed through the optical system. However, Inagaki teaches detecting an in-focus state of the subject image by using a light flux that has passed through the optical system (Inagaki, Fig. 1, [0039], ln. 1-4, "The phase detection AF unit 129 performs focus detection processing using the phase detection method, based on image signals of focus detection image data (signals for phase detection AF) obtained from the image sensor 122 and the image processing circuit 124. To be more specific, the image processing circuit 124 generates a pair of image data formed by light fluxes…"). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Inagaki with those of Shibuno for the same reasons set forth in the rejection of Claim 1 above.
Regarding Claim 17, Shibuno teaches A non-transitory computer-readable storage medium storing a program for causing a computer to execute each process of a control method for an apparatus including a sensor that captures a subject image formed by an optical system (Shibuno, Fig. 2, [0032], ln. 9-10, "The moving image recorded in the storage medium such as the memory card 218 is referred to as 'a moving image for recording'."), the method comprising: detecting a flicker by using an image captured by the sensor (Shibuno, Fig. 16, Step J2, [0086], ln. 5, "…the CPU 211 determines whether or not a flicker occurs…"), and determining, in a case of detecting the in-focus state in the focus detection, which of a setting of an exposure time of the sensor to an exposure time in which a flicker is inconspicuous and opening of a diaphragm of the optical system to give priority to based on a detection result of a flicker by the flicker detection, and, based on the determination, determining a combination of an exposure time of the sensor and a diaphragm value of the optical system (Shibuno, Fig. 16, Steps J3 & J4, [0086], all lines, "When a flicker occurs or the imaging is performed under a fluorescent lamp, if the camera body 2 shortens the exposure time, a flicker in the image data becomes highly visible. Thus, when a flicker occurs in the image data, if the camera body 2 suppresses the shortening of the exposure time, more desirable image data can be provided for the user. When the obtained diaphragm value differs from the already set diaphragm value (Yes at step J1), the CPU 211 determines whether or not a flicker occurs in image data generated by the CMOS image sensor (J2). When the CPU 211 determines that the flicker occurs, it adjusts the diaphragm 307 in the interchangeable lens 3 (J3). When the CPU 211 determines that the flicker does not occur, it adjusts the shooting conditions in the camera body 2 (J4).
In this way, when the flicker occurs, the diaphragm drive unit 308 in the interchangeable lens 3 is driven to prevent the shorter exposure time. Accordingly, more desirable moving image can be provided for the user. The CPU 211 may preferentially control the diaphragm similarly to the above other embodiment." It is the Examiner's interpretation that the diaphragm drive unit 308 being driven to prevent a shorter exposure time constitutes a "…priority given to setting the exposure time to an exposure time in which a flicker is inconspicuous…" since it is stipulated that shortening the exposure time would make the flicker highly visible.). Shibuno does not teach detecting an in-focus state of the subject image by using a light flux that has passed through the optical system. However, Inagaki teaches detecting an in-focus state of the subject image by using a light flux that has passed through the optical system (Inagaki, Fig. 1, [0039], ln. 1-4, "The phase detection AF unit 129 performs focus detection processing using the phase detection method, based on image signals of focus detection image data (signals for phase detection AF) obtained from the image sensor 122 and the image processing circuit 124. To be more specific, the image processing circuit 124 generates a pair of image data formed by light fluxes…"). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Inagaki with those of Shibuno for the same reasons set forth in the rejection of Claim 1 above.
Claims 2, 15, & 18 are rejected under 35 U.S.C. 103 as being unpatentable over Shibuno in view of Inagaki and Sugawara (US 20220337736 A1, hereinafter, "Sugawara").
Regarding Claim 2, Shibuno and Inagaki teach the limitations of independent Claim 1 as noted above. Sugawara teaches a display device that displays, as a live view image, an image captured by the sensor, in a state where the in-focus state is being detected (Sugawara, Fig. 3, Step S106, [0073], ln. 4-7, "However, when actually capturing the images for flicker detection in step S106, there are disturbances, such as the focusing lens 202 of the photographing lens moving, the user panning, and the subjects changing, in order to maintain the live view display at a proper focus."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Sugawara with those of Shibuno and Inagaki because it is well known in the art to install live view image displays on digital cameras.
Regarding Claim 15, Shibuno and Inagaki teach the limitations of independent Claim 14 as noted above. Sugawara teaches displaying, as a live view image, an image captured by the sensor, in a state where the in-focus state is being detected (Sugawara, Fig. 3, Step S106, [0073], ln. 4-7, "However, when actually capturing the images for flicker detection in step S106, there are disturbances, such as the focusing lens 202 of the photographing lens moving, the user panning, and the subjects changing, in order to maintain the live view display at a proper focus.").
Regarding Claim 18, Shibuno and Inagaki teach the limitations of independent Claim 17 as noted above. Sugawara teaches displaying, as a live view image, an image captured by the sensor, in a state where the in-focus state is being detected (Sugawara, Fig. 3, Step S106, [0073], ln. 4-7, "However, when actually capturing the images for flicker detection in step S106, there are disturbances, such as the focusing lens 202 of the photographing lens moving, the user panning, and the subjects changing, in order to maintain the live view display at a proper focus.").
Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Shibuno in view of Inagaki and Kobayashi (US 20220353405 A1, hereinafter, "Kobayashi").
Regarding Claim 13, Shibuno and Inagaki teach the limitations of independent Claim 1 as noted above. Kobayashi teaches the first determination unit sets an exposure time of the sensor to an integral multiple of a flickering cycle as the exposure time in which the flicker is inconspicuous (Kobayashi, [0046], ln. 1-3, "As described above, it is known that by setting the charge accumulation period to an integral multiple of the flicker cycle, an image in which the influence of flicker is suppressed can be taken."). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Kobayashi with those of Shibuno and Inagaki because it is well known in the art to set the exposure time to an integral multiple of a flickering cycle to render said flickering inconspicuous.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to STEVEN DANIEL BARRY whose telephone number is (571)270-0432. The examiner can normally be reached M-Th 0730-1630.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Lin Ye, can be reached at 517-272-7372. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/STEVEN DANIEL BARRY/Examiner, Art Unit 2638
/LIN YE/Supervisory Patent Examiner, Art Unit 2638