DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
2. Receipt is acknowledged of certified copies of documents required by 37 CFR 1.55.
Information Disclosure Statement
3. The information disclosure statement (IDS) submitted on 03/20/2024 is in compliance with the provisions of 37 CFR 1.97 and was considered by the examiner.
Claim Rejections - 35 USC § 102
4. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
5. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
6. Claims 1-2 and 14-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Thumpudi (US-PGPUB 2018/0367734).
Regarding claim 1, Thumpudi discloses an apparatus (Device 102, see fig. 1 and paragraph 0018) comprising one or more processors (Processor(s) 42; see fig. 1 and paragraphs 0018, 0021, 0057) and/or circuitry which function as:
a detection unit (Scene analyzer component 28; see figs. 1, 4) configured to detect an object from an image (The scene analyzer can determine whether a subject is moving in the scene; see paragraph 0015);
a first obtaining unit configured to obtain object motion information indicating information about a temporal change of a position of the object detected in the image (Local movement component 31 includes any detected motion in the scene 106 captured by the imaging device 104. That is, the local movement analyzer determines whether any objects 20 may be moving in the scene 106 and the location of the detected movement; see paragraph 0024 and fig. 1. The scene analyzer analyzes the pixels in the image frames 18 and the motion analysis can be performed five times a second; see paragraph 0027);
a second obtaining unit configured to obtain image motion information indicating a motion of a capturing apparatus configured to capture the image (Global movement component 32 includes any movement of the imaging device 104. The global movement 34 is based on the amount of detected movement by one or more sensors 10 and a pixel analysis of the image frames 18; see paragraph 0023 and fig. 1);
a control unit configured to control a stabilization unit using a first mode (Selected mode to correct detected local movement) for giving priority to object tracking to correct the position of the object in the image and a second mode (Selected mode to correct detected global movement) for giving priority to stabilization to correct an image blur due to the motion of the capturing apparatus (Processor 42 automatically selects different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected while capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image; see paragraphs 0014, 0017, 0018 and figs. 1, 3);
a determination unit (Evaluator component 36; see fig. 1) configured to determine whether to switch from the first mode to the second mode (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion by selecting a mode of operation 43 of imaging device 104 that can obtain a clear picture under the detected motion condition. Movement thresholds 39 can vary and are modified for a targeted system and/or for a desired outcome; see paragraphs 0032-0034); and
a setting unit configured to set a reference position (Component 28 selects a weight 30 for each of the detected areas of motion. Component 28 scales the weights based on a location of the motion relative to a location of the center of the image frame 18. Thus, component 28 applies a higher weight for the detected motion 48, 50, 51 and 52 because the motion is closer toward the center of the image; see paragraph 0028) being a control center of the stabilization based on the object motion information in a case where the determination unit determines to switch from the first mode to the second mode (Evaluator component 36 receives the global movement 34 information and/or the local movement 31 information and compensates for the detected motion by selecting a mode of operation 43 for imaging device 104 based on the received movement information. Component 36 evaluates the detected motion (e.g., the global movement 34 and/or the local movement 31) and determines whether the detected motion is within a movement threshold 39 for the current mode of operation of the imaging device 104; see paragraphs 0031, 0039, 0051. Component 28 can determine both of the local movement 31 of objects 20 in the scene 106 and the global movement 34 of the imaging device 24; see paragraph 0024).
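For illustration only (not part of the record), the threshold-based mode selection paraphrased from Thumpudi's paragraphs 0032-0034 above can be sketched as follows; the mode names, the threshold structure, and the tie-breaking between local and global motion are assumptions, not Thumpudi's actual implementation:

```python
# Hypothetical sketch of threshold-based mode selection: if the detected
# motion exceeds the movement threshold for the current mode of operation,
# a mode suited to the detected motion condition is selected instead.

def select_mode(local_motion: float, global_motion: float,
                current_mode: str, thresholds: dict) -> str:
    """Return an operating mode based on detected motion levels.

    `thresholds` maps mode names to the maximum tolerated motion for
    that mode; the names and values used here are illustrative.
    """
    if max(local_motion, global_motion) <= thresholds[current_mode]:
        return current_mode  # detected motion is within the threshold
    # Prioritize the dominant motion source once the threshold is exceeded.
    if global_motion >= local_motion:
        return "stabilization_priority"  # correct camera (global) movement
    return "tracking_priority"           # correct subject (local) movement
```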
Regarding claim 2, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses after switching to the second mode, the control unit controls the stabilization unit in the second mode by calculating a correction amount from the reference position based on the image motion information (Processor 42 automatically selects different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected while capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image. Component 28 determines where movement is acceptable in an image. Movement in peripheral areas of the image is acceptable, while movement in the center of the image may be undesirable; thus, processor 42 corrects undesired motion accordingly; see paragraphs 0014, 0017, 0018, 0024, 0032 and figs. 1, 3).
Regarding claim 14, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses after switching, the control unit controls the stabilization unit such that a still object included in a capturing range changes based on the reference position even in a case where the motion of the capturing apparatus indicated by the capturing apparatus motion information is 0 (Component 36 receives the global movement 34 information and/or the local movement 31 information and compensates for the detected motion by selecting a mode of operation 43 for imaging device 104 based on the received movement information. Imaging device 104 can be stationary while taking pictures of a sports team playing a game. As such, evaluator component 36 determines that there is only local movement 31 (e.g., the sports team playing the sport) and no global movement 34 of the imaging device 104 and adjusts the mode accordingly; see paragraphs 0031-0036).
Regarding claim 15, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses the control unit moves the control center of the stabilization based on the reference position set based on the object motion information and the motion of the capturing apparatus obtained from the image motion information (Component 28 selects a weight 30 for each of the detected areas of motion. Component 28 scales the weights based on a location of the motion relative to a location of the center of the image frame 18. Thus, component 28 applies a higher weight for the detected motion 48, 50, 51 and 52 because the motion is closer toward the center of the image; see paragraph 0028. Evaluator component 36 receives the global movement 34 information and/or the local movement 31 information and compensates for the detected motion by selecting a mode of operation 43 for imaging device 104 based on the received movement information; see paragraphs 0031, 0039, 0051. Component 28 can determine both of the local movement 31 of objects 20 in the scene 106 and the global movement 34 of the imaging device 24; see paragraph 0024).
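For illustration only (not part of the record), the center-weighted motion scoring paraphrased from Thumpudi's paragraph 0028 above can be sketched as follows; the linear falloff from the frame center is an assumption, as Thumpudi is only cited for scaling weights by proximity to the center:

```python
import math

# Hypothetical sketch: detected motion closer to the center of the image
# frame receives a higher weight than motion in the periphery.

def motion_weight(x: float, y: float, width: int, height: int) -> float:
    """Weight in [0, 1]: 1.0 at the frame center, 0.0 at the far corner."""
    cx, cy = width / 2.0, height / 2.0
    dist = math.hypot(x - cx, y - cy)  # distance from the frame center
    max_dist = math.hypot(cx, cy)      # center-to-corner distance
    return 1.0 - dist / max_dist
```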
Regarding claim 16, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses comprising a sensor, wherein the detected object from the image is obtained using the sensor (The scene analyzer can determine whether a subject is moving in the scene; see paragraph 0015).
Regarding claim 17, Thumpudi discloses a method (see fig. 3) for an apparatus (Device 102, see fig. 1 and paragraph 0018), comprising:
detecting an object from an image (The scene analyzer can determine whether a subject is moving in the scene; see paragraphs 0015, 0042);
obtaining object motion information indicating information about a temporal change of a position of the object detected in the image (Obtaining local movement, which includes any detected motion in the scene 106 captured by the imaging device 104. That is, determining whether any objects 20 may be moving in the scene 106 and the location of the detected movement; see paragraphs 0024, 0043 and figs. 1, 3. The scene analyzer analyzes the pixels in the image frames 18 and the motion analysis can be performed five times a second; see paragraph 0027);
obtaining image motion information indicating a motion of a capturing apparatus configured to capture the image (Obtaining global movement which includes any movement of the imaging device 104. The global movement 34 is based on the amount of detected movement by one or more sensors 10 and a pixel analysis of the image frames 18; see paragraphs 0023, 0042 and figs. 1, 3);
controlling a stabilization unit using a first mode (Selected mode to correct detected local movement) for giving priority to object tracking to correct the position of the object in the image and a second mode (Selected mode to correct detected global movement) for giving priority to stabilization to correct an image blur due to the motion of the capturing apparatus (Processor 42 automatically selects different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected while capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image; see paragraphs 0014, 0017, 0018 and figs. 1, 3);
determining whether to switch from the first mode to the second mode (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion by selecting a mode of operation 43 of imaging device 104 that can obtain a clear picture under the detected motion condition. Movement thresholds 39 can vary and are modified for a targeted system and/or for a desired outcome; see paragraphs 0032-0034); and
setting a reference position (Component 28 selects a weight 30 for each of the detected areas of motion. Component 28 scales the weights based on a location of the motion relative to a location of the center of the image frame 18. Thus, component 28 applies a higher weight for the detected motion 48, 50, 51 and 52 because the motion is closer toward the center of the image; see paragraph 0028) being a control center of the stabilization based on the object motion information in a case where it is determined to switch from the first mode to the second mode (Evaluator component 36 receives the global movement 34 information and/or the local movement 31 information and compensates for the detected motion by selecting a mode of operation 43 for imaging device 104 based on the received movement information. Component 36 evaluates the detected motion (e.g., the global movement 34 and/or the local movement 31) and determines whether the detected motion is within a movement threshold 39 for the current mode of operation of the imaging device 104; see paragraphs 0031, 0039, 0051. Component 28 can determine both of the local movement 31 of objects 20 in the scene 106 and the global movement 34 of the imaging device 24; see paragraph 0024).
Regarding claim 18, Thumpudi discloses everything claimed as applied above (see claim 17). In addition, Thumpudi discloses after switching to the second mode, the controlling controls the stabilization unit in the second mode by calculating a correction amount from the reference position based on the image motion information (Processor 42 automatically selects different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected while capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image. Component 28 determines where movement is acceptable in an image. Movement in peripheral areas of the image is acceptable, while movement in the center of the image may be undesirable; thus, processor 42 corrects undesired motion accordingly; see paragraphs 0014, 0017, 0018, 0024, 0032 and figs. 1, 3).
Regarding claim 19, Thumpudi discloses a non-transitory computer-readable storage medium (Memory 44; see paragraphs 0018, 0021) storing a program for causing a computer (Processor(s) 42; see fig. 1 and paragraphs 0018, 0021, 0057) to execute a method (see fig. 3), the method comprising:
detecting an object from an image (The scene analyzer can determine whether a subject is moving in the scene; see paragraphs 0015, 0042);
obtaining object motion information indicating information about a temporal change of a position of the object detected in the image (Obtaining local movement, which includes any detected motion in the scene 106 captured by the imaging device 104. That is, determining whether any objects 20 may be moving in the scene 106 and the location of the detected movement; see paragraphs 0024, 0043 and figs. 1, 3. The scene analyzer analyzes the pixels in the image frames 18 and the motion analysis can be performed five times a second; see paragraph 0027);
obtaining image motion information indicating a motion of a capturing apparatus configured to capture the image (Obtaining global movement which includes any movement of the imaging device 104. The global movement 34 is based on the amount of detected movement by one or more sensors 10 and a pixel analysis of the image frames 18; see paragraphs 0023, 0042 and figs. 1, 3);
controlling a stabilization unit using a first mode (Selected mode to correct detected local movement) for giving priority to object tracking to correct the position of the object in the image and a second mode (Selected mode to correct detected global movement) for giving priority to stabilization to correct an image blur due to the motion of the capturing apparatus (Processor 42 automatically selects different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected while capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image; see paragraphs 0014, 0017, 0018 and figs. 1, 3);
determining whether to switch from the first mode to the second mode (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion by selecting a mode of operation 43 of imaging device 104 that can obtain a clear picture under the detected motion condition. Movement thresholds 39 can vary and are modified for a targeted system and/or for a desired outcome; see paragraphs 0032-0034); and
setting a reference position (Component 28 selects a weight 30 for each of the detected areas of motion. Component 28 scales the weights based on a location of the motion relative to a location of the center of the image frame 18. Thus, component 28 applies a higher weight for the detected motion 48, 50, 51 and 52 because the motion is closer toward the center of the image; see paragraph 0028) being a control center of the stabilization based on the object motion information in a case where it is determined to switch from the first mode to the second mode (Evaluator component 36 receives the global movement 34 information and/or the local movement 31 information and compensates for the detected motion by selecting a mode of operation 43 for imaging device 104 based on the received movement information. Component 36 evaluates the detected motion (e.g., the global movement 34 and/or the local movement 31) and determines whether the detected motion is within a movement threshold 39 for the current mode of operation of the imaging device 104; see paragraphs 0031, 0039, 0051. Component 28 can determine both of the local movement 31 of objects 20 in the scene 106 and the global movement 34 of the imaging device 24; see paragraph 0024).
Regarding claim 20, Thumpudi discloses everything claimed as applied above (see claim 19). In addition, Thumpudi discloses after switching to the second mode, the controlling controls the stabilization unit in the second mode by calculating a correction amount from the reference position based on the image motion information (Processor 42 automatically selects different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected while capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image. Component 28 determines where movement is acceptable in an image. Movement in peripheral areas of the image is acceptable, while movement in the center of the image may be undesirable; thus, processor 42 corrects undesired motion accordingly; see paragraphs 0014, 0017, 0018, 0024, 0032 and figs. 1, 3).
Claim Rejections - 35 USC § 103
7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
8. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
9. Claims 3-5 are rejected under 35 U.S.C. 103 as being unpatentable over Thumpudi in view of Gaizman et al. (US-PGPUB 2020/0412954).
Regarding claim 3, Thumpudi discloses everything claimed as applied above (see claim 1). However, Thumpudi fails to expressly disclose the control unit controls the stabilization by displacing a correction lens in the second mode, the correction lens being included in an optical system configured to capture the image, and wherein the reference position is a position of the correction lens in a case where the motion of the capturing apparatus indicated by the capturing apparatus motion information is 0.
On the other hand, Gaizman discloses the control unit controls the stabilization by displacing a correction lens in the second mode, the correction lens being included in an optical system configured to capture the image, and wherein the reference position is a position of the correction lens in a case where the motion of the capturing apparatus indicated by the capturing apparatus motion information is 0 (The system 400 includes a lens 404 with a fixed position with reference to the image sensor 406, and the lens 404 and the image sensor 406 can tilt or rotate (as indicated by movement 412). A lens position sensor measurement is compared to a gyroscope measurement to determine if the lens 404 is to be shifted (by rotating the lens 404 and the image sensor 406); see paragraphs 0038, 0040. If the device 600 performs lens shift OIS, the lens position is used to determine a change in the lens distortion. A transform that is centered for a captured frame from the image sensor when the lens 602 is in a neutral position (such as centered for the image sensor), is shifted based on a shift of the lens 602 during OIS; see paragraph 0076).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Thumpudi and Gaizman to provide the control unit controls the stabilization by displacing a correction lens in the second mode, the correction lens being included in an optical system configured to capture the image, and wherein the reference position is a position of the correction lens in a case where the motion of the capturing apparatus indicated by the capturing apparatus motion information is 0 for the purpose of effectively compensating for blurring objects in the scene.
Regarding claim 4, Thumpudi discloses everything claimed as applied above (see claim 1). However, Thumpudi fails to expressly disclose the control unit controls the stabilization by displacing a sensor configured to capture the image, and wherein the reference position is a position of the sensor in a case where the motion of the capturing apparatus indicated by the capturing apparatus motion information is 0.
Nevertheless, Gaizman discloses the control unit controls the stabilization by displacing a sensor configured to capture the image, and wherein the reference position is a position of the sensor in a case where the motion of the capturing apparatus indicated by the capturing apparatus motion information is 0 (The system 400 includes a lens 404 with a fixed position with reference to the image sensor 406, and the lens 404 and the image sensor 406 can tilt or rotate (as indicated by movement 412). A lens position sensor measurement is compared to a gyroscope measurement to determine if the lens 404 is to be shifted (by rotating the lens 404 and the image sensor 406); see paragraphs 0038, 0040. If the device 600 performs lens shift OIS, the lens position is used to determine a change in the lens distortion. A transform that is centered for a captured frame from the image sensor when the lens 602 is in a neutral position (such as centered for the image sensor), is shifted based on a shift of the lens 602 during OIS; see paragraph 0076).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Thumpudi and Gaizman to provide the control unit controls the stabilization by displacing a sensor configured to capture the image, and wherein the reference position is a position of the sensor in a case where the motion of the capturing apparatus indicated by the capturing apparatus motion information is 0 for the purpose of effectively compensating for blurring objects in the scene.
Regarding claim 5, Thumpudi discloses everything claimed as applied above (see claim 1). However, Thumpudi fails to expressly disclose the control unit controls the stabilization by geometric deformation, and wherein the reference position is a clipping position in a case where the motion of the capturing apparatus indicated by the image motion information is 0.
On the other hand, Gaizman discloses the control unit controls the stabilization by geometric deformation, and wherein the reference position is a clipping position in a case where the motion of the capturing apparatus indicated by the image motion information is 0 (EIS can compensate for a camera's global motion to reduce shakiness in the video. The device can crop a portion (such as 10 percent) of each of the captured frames (with the cropping location moving based on the global motion) to generate a final, processed stream of images that are a fraction in pixel size of the frames captured by the camera sensor. The cropping window can be shifted to compensate for camera motion; see paragraph 0030. OIS and EIS can both be provided to return the lens to a neutral position; see paragraphs 0076, 0087).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Thumpudi and Gaizman to provide the control unit controls the stabilization by geometric deformation, and wherein the reference position is a clipping position in a case where the motion of the capturing apparatus indicated by the image motion information is 0 for the purpose of effectively compensating for blurring objects in the scene.
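For illustration only (not part of the record), the EIS cropping-window behavior paraphrased from Gaizman's paragraph 0030 above can be sketched as follows; the 10 percent margin and the clamping at the frame edges are illustrative assumptions:

```python
# Hypothetical sketch: crop a fixed-size window out of each captured
# frame and shift the window opposite the measured camera motion, so a
# camera motion of 0 leaves the crop at its centered (neutral) position.

def crop_window(frame_w: int, frame_h: int,
                shake_dx: int, shake_dy: int,
                margin: float = 0.10) -> tuple:
    """Return (left, top, right, bottom) of the stabilized crop."""
    crop_w = int(frame_w * (1 - margin))
    crop_h = int(frame_h * (1 - margin))
    # Neutral position: centered crop; shift opposes the camera shake.
    left = (frame_w - crop_w) // 2 - shake_dx
    top = (frame_h - crop_h) // 2 - shake_dy
    # Clamp so the window stays inside the captured frame.
    left = max(0, min(left, frame_w - crop_w))
    top = max(0, min(top, frame_h - crop_h))
    return (left, top, left + crop_w, top + crop_h)
```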
10. Claims 12-13 are rejected under 35 U.S.C. 103 as being unpatentable over Thumpudi in view of Yuan (US-PGPUB 2024/0040255).
Regarding claim 12, Thumpudi discloses everything claimed as applied above (see claim 1). However, Thumpudi fails to disclose after switching, the control unit gradually moves the control center of the stabilization from a position of the stabilization unit at a timing of the switching to the reference position based on the object motion information.
On the other hand, Yuan discloses after switching, the control unit gradually moves the control center of the stabilization from a position of the stabilization unit at a timing of the switching to the reference position based on the object motion information (If the imaging device switches from the first motion state to the second motion state, a difference between a video frame obtained based on the processing performed in the first anti-shake manner and an original video frame is gradually reduced, so as to reduce the video shake caused by switching from the first anti-shake manner to the second anti-shake manner. Anti-shake processing is performed in the second anti-shake manner on the reference frame and a subsequent video frame, so as to reduce the video shake caused when the first anti-shake manner is switched to the second anti-shake manner; see paragraphs 0055-0058).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Thumpudi and Yuan to provide after switching, the control unit gradually moves the control center of the stabilization from a position of the stabilization unit at a timing of the switching to the reference position based on the object motion information for the purpose of improving image quality by avoiding additional image deterioration that occurs when switching between different image stabilization mechanisms, as the gradual processing limits these side effects.
Regarding claim 13, Thumpudi discloses everything claimed as applied above (see claim 12). However, Thumpudi fails to disclose the control unit performs the stabilization to correct the image blur due to the motion of the capturing apparatus while gradually moving the control center of the stabilization.
Nevertheless, Yuan discloses the control unit performs the stabilization to correct the image blur due to the motion of the capturing apparatus while gradually moving the control center of the stabilization (If the imaging device switches from the first motion state to the second motion state, a difference between a video frame obtained based on the processing performed in the first anti-shake manner and an original video frame is gradually reduced, so as to reduce the video shake caused by switching from the first anti-shake manner to the second anti-shake manner. Anti-shake processing is performed in the second anti-shake manner on the reference frame and a subsequent video frame, so as to reduce the video shake caused when the first anti-shake manner is switched to the second anti-shake manner; see paragraphs 0055-0058).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Thumpudi and Yuan to provide the control unit performs the stabilization to correct the image blur due to the motion of the capturing apparatus while gradually moving the control center of the stabilization for the purpose of improving image quality by avoiding additional image deterioration that occurs when switching between different image stabilization mechanisms, as the gradual processing limits these side effects.
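For illustration only (not part of the record), the gradual transition paraphrased from Yuan's paragraphs 0055-0058 above can be sketched as follows; the frame count and the linear blend are illustrative assumptions, as Yuan is only cited for gradually reducing the difference after an anti-shake switch:

```python
# Hypothetical sketch: after a mode switch, the correction applied under
# the old manner is blended toward the new manner over several frames so
# the switch itself does not introduce visible shake.

def blended_correction(old_correction: float, new_correction: float,
                       frames_since_switch: int,
                       ramp_frames: int = 10) -> float:
    """Linearly move the applied correction from old to new over the ramp."""
    t = min(frames_since_switch / ramp_frames, 1.0)  # 0.0 -> 1.0 across the ramp
    return (1.0 - t) * old_correction + t * new_correction
```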
Allowable Subject Matter
11. Claims 6-11 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all the limitations of the base claim and any intervening claims.
Regarding claim 6, the specific limitation of “the setting unit obtains a centroid position of a motion of the object based on a temporal change of the motion of the object indicated by the object motion information, and sets the reference position based on the obtained centroid position” in the combination as claimed is neither anticipated nor made obvious over the prior art made of record.
Regarding claim 7, it is objected to for depending on claim 6.
Regarding claim 8, the specific limitation of “the setting unit predicts a motion of the object based on a temporal change of the motion of the object indicated by the object motion information, and sets the reference position based on a result of prediction” in the combination as claimed is neither anticipated nor made obvious over the prior art made of record.
Regarding claim 9, it is objected to for depending on claim 8.
Regarding claim 10, it is objected to for depending on claim 8.
Regarding claim 11, it is objected to for depending on claim 10.
Contact Information
12. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CYNTHIA CALDERON whose telephone number is (571)270-3580. The examiner can normally be reached M-F 9:00 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, TWYLER HASKINS can be reached at (571)272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CYNTHIA CALDERON/Primary Examiner, Art Unit 2639 01/30/2026