DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
2. Receipt is acknowledged of certified copies of documents required by 37 CFR 1.55.
Information Disclosure Statement
3. The information disclosure statement (IDS) submitted on 03/05/2024 is in compliance with the provisions of 37 CFR 1.97 and was considered by the examiner.
CLAIM INTERPRETATION
4. The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
5. The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
6. This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitations are: “a detection unit”, “a first obtaining unit”, “a second obtaining unit”, “a control unit” and “a determination unit” in claims 1-19.
Because these claim limitations are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, they are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
a) “a detection unit”, “a first obtaining unit”, “a second obtaining unit”, “a control unit” and “a determination unit” correspond to a processor executing the functions of the claimed units under the control of particular algorithms; see paragraph 0076 of the publication of the instant application.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 102
7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
8. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
9. Claims 1-9, 11-13 and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Thumpudi (US-PGPUB 2018/0367734).
Regarding claim 1, Thumpudi discloses an apparatus (Device 102; see fig. 1 and paragraph 0018) comprising:
a detection unit (Scene analyzer component 28; see figs. 1, 4) configured to detect an object from an image (The scene analyzer can determine whether a subject is moving in the scene; see paragraph 0015);
a first obtaining unit configured to obtain motion information indicating a motion of the detected object (Local movement component 31 includes any detected motion in the scene 106 captured by the imaging device 104. That is, the local movement analyzer determines whether any objects 20 may be moving in the scene 106 and the location of the detected movement; see paragraph 0024 and fig. 1);
a second obtaining unit configured to obtain motion information indicating a motion of a capturing apparatus (Global movement component 32 includes any movement of the imaging device 104. The global movement 34 is based on the amount of detected movement by one or more sensors 10 and a pixel analysis of the image frames 18; see paragraph 0023 and fig. 1);
a control unit configured to control first stabilization (Selected mode to correct detected local movement) to correct the motion of the object in the image and second stabilization (Selected mode to correct detected global movement) to correct an image blur caused by the motion of the capturing apparatus (Processor 42 automatically selects different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected when capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image; see paragraphs 0014, 0017, 0018 and figs. 1, 3); and
a determination unit (Evaluator component 36; see fig. 1) configured to perform a determination of which one of the first stabilization and the second stabilization has a higher priority based on the motion information about the object (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion by selecting a mode of operation 43 of imaging device 104 that can obtain a clear picture under the detected motion condition. Movement thresholds 39 can vary and are modified for a targeted system and/or for a desired outcome; see paragraphs 0032-0034. The higher priority is given by correcting for the specific detected motion and the amount of motion that is detected with it), wherein the control unit controls the first stabilization and the second stabilization based on a result of the determination (Evaluator component 36 receives the global movement 34 information and/or the local movement 31 information and compensates for the detected motion by selecting a mode of operation 43 for imaging device 104 based on the received movement information. Component 36 evaluates the detected motion (e.g., the global movement 34 and/or the local movement 31) and determines whether the detected motion is within a movement threshold 39 for the current mode of operation of the imaging device 104; see paragraphs 0031, 0039, 0051).
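For illustration only, the evaluator-style selection logic described in the mapping above (detected motion exceeding a movement threshold triggers selection of a compensating mode of operation) may be sketched as follows. This sketch is not part of the claims or the Thumpudi reference; the function name, mode labels, and default threshold values are hypothetical.

```python
def select_mode(global_movement: float, local_movement: float,
                global_threshold: float = 25.0, local_threshold: float = 8.0) -> str:
    """Select a compensating mode based on which detected motion exceeds its threshold.

    Hypothetical sketch: local movement maps to correction of object motion
    (analogous to the claimed first stabilization) and global movement maps to
    correction of device motion (analogous to the claimed second stabilization).
    """
    if local_movement > local_threshold and global_movement > global_threshold:
        return "correct_both"
    if local_movement > local_threshold:
        return "correct_local"
    if global_movement > global_threshold:
        return "correct_global"
    return "no_correction"
```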
Regarding claim 2, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses the determination unit performs the determination based on the motion information about the object during execution of the first stabilization (Scene analyzer component 28 determines whether any objects 20 are moving in the scene 106 and/or a location of the detected movement; see paragraphs 0024, 0043. Movement thresholds 39 can vary and can be modified for a targeted system and/or for a desired outcome. Local movements 31 of 4-7% of region dimension per second may be treated as moderate movement, while local movements 31 more than 8% per second may be treated as large movements. If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion, see paragraphs 0032-0033).
Regarding claim 3, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses in the first stabilization, a position of the object in the image is made closer to a target position set in the image (Movement thresholds 39 can vary and can be modified for a targeted system and/or for a desired outcome; see paragraphs 0032-0033. Evaluator component 36 can automatically select a mode of operation 43 for the imaging device 104 based on the local movement 31; see paragraph 0051).
Regarding claim 4, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses a correction control unit calculates a correction amount for correction to be performed by a stabilization unit based on a result of the determination, the motion of the object, and the motion of the capturing apparatus (When motion is detected when capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the devices and methods may compensate for the detected motion in the resulting image by reducing the blur in the resulting image; see paragraphs 0014, 0017. Evaluator 36 can determine that there is both global movement 34 and local movement 31 and correct for both; see paragraphs 0031, 0034).
Regarding claim 5, Thumpudi discloses everything claimed as applied above (see claim 4). In addition, Thumpudi discloses the correction control unit changes weighting for a first correction amount and weighting for a second correction amount based on the result of the determination, the first correction amount being based on the motion of the object and the second correction amount being based on the motion of the capturing apparatus (Scene analyzer component 28 can select a weight 30 for each of the detected areas of motion. Scene analyzer component 28 can apply various weights 30 to the detected motion 46, 48, 50, 51, and 52 based on a location of the detected motion 46, 48, 50, 51, and 52, and/or based on features in the sections of the image frame 18 having the motion. Component 28 can perform both global and local motion analysis as an optimization, when performing pixel analysis and determine both the local movement 31 of objects 20 in the scene 106 and the global movement 34 of the imaging device 104 for motion correction; see paragraphs 0028-0029, 0024).
Regarding claim 6, Thumpudi discloses everything claimed as applied above (see claim 5). In addition, Thumpudi discloses the first correction amount is based on a position of the object in the image and a target position set in the image (Component 28 determines where movement can be acceptable in an image. Movement in peripheral areas of the image can be acceptable, while movement in the center of the image is undesirable. Movement thresholds 39 for motion correction can vary and can be modified for a targeted system and for a desired outcome; see paragraphs 0032-0033, 0024).
Regarding claim 7, Thumpudi discloses everything claimed as applied above (see claim 5). In addition, Thumpudi discloses the first correction amount is based on an amount of movement of the object (Scene analyzer component 28 determines whether any objects 20 are moving in the scene 106 and/or a location of the detected movement; see paragraphs 0024, 0043. Movement thresholds 39 can vary and can be modified for a targeted system and/or for a desired outcome. Local movements 31 of 4-7% of region dimension per second may be treated as moderate movement, while local movements 31 more than 8% per second may be treated as large movements. If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion, see paragraphs 0032-0033).
Regarding claim 8, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses in a case where an amount of movement of the object exceeds a threshold by a predetermined number of times (Movement thresholds 39 can vary and can be modified for a targeted system and for a desired outcome; see paragraph 0032), the determination unit determines that the second stabilization has a higher priority than the first stabilization based on the motion information about the object (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion, see paragraphs 0032-0033. The higher priority is given by correcting for the specific detected motion and the amount of motion that is detected with it).
Regarding claim 9, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses in a case where an amount of movement of the object exceeds a threshold over a predetermined period of time (Movement thresholds 39 can vary and can be modified for a targeted system and for a desired outcome; see paragraph 0032), the determination unit determines that the second stabilization has a higher priority than the first stabilization based on the motion information about the object (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion, see paragraphs 0032-0033. The higher priority is given by correcting for the specific detected motion and the amount of motion that is detected with it).
Regarding claim 11, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses a plurality of thresholds with different magnitudes is set for the determination unit (Movement thresholds 39 can be provided. For global movement 34, hand jitter of a few degrees (e.g., a rotation of within 5 degrees per second) may be treated as no movement or noise, rotations of 6 to 24 degrees per second may be treated as moderate movement, while rotations of higher than 25 degrees per second may be treated as large movement. Local movements 31 of less than 3% of region dimension (e.g., a tile width or tile height) per second may be considered no movement, local movements 31 of 4-7% of region dimension per second may be treated as moderate movement, while local movements 31 more than 8% per second may be treated as large movements; see paragraph 0032), and in a case where an amount of movement of the object exceeds the thresholds in a phased manner, the determination unit determines that the second stabilization has a higher priority than the first stabilization based on the motion information about the object (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion, see paragraphs 0032-0033. The higher priority is given by correcting for the specific detected motion and the amount of motion that is detected with it).
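For illustration only, the tiered movement thresholds recited above from Thumpudi paragraph 0032 may be sketched as follows. This sketch is not part of the record; the function names are hypothetical, and the handling of values falling between the stated bands (e.g., 3-4% local movement) is an assumption, since the reference leaves those gaps unspecified.

```python
def classify_global(rotation_deg_per_s: float) -> str:
    """Classify camera (global) movement: within ~5 deg/s treated as no movement
    or noise, 6-24 deg/s moderate, 25 deg/s and above large."""
    if rotation_deg_per_s <= 5:
        return "none"
    if rotation_deg_per_s < 25:
        return "moderate"
    return "large"

def classify_local(percent_per_s: float) -> str:
    """Classify local (scene) movement as a percentage of region dimension per
    second: under ~3% no movement, 4-7% moderate, above 8% large."""
    if percent_per_s < 4:
        return "none"
    if percent_per_s <= 7:
        return "moderate"
    return "large"
```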
Regarding claim 12, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses the determination unit performs the determination based on the motion information about the capturing apparatus, and in a case where a temporal variation (while taking the picture) in an amount of movement of the capturing apparatus increases, the determination unit determines that the second stabilization has a higher priority than the first stabilization (When a user moves slightly while taking a picture of a building, evaluator component 36 determines that there is global movement 34 of the imaging device while there is no local movement 31 of the scene 106; see paragraphs 0031-0033. The higher priority is given by correcting for the specific detected motion and the amount of motion that is detected with it).
Regarding claim 13, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses the determination unit obtains an amount of change in a capturing range by driving of a stabilization unit configured to perform the first stabilization and the second stabilization to correct the image blur (Detecting motion when capturing an image using an imaging device and compensating for the detected motion in the resulting image by reducing blur in the resulting image. Automatically selecting different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene); see paragraphs 0014, 0017), and performs the determination based on a movement amount obtained by adding the amount of change and an amount of movement of the object that is based on the motion information about the object (The local movement analyzer determines whether any objects 20 may be moving in the scene 106 and the location of the detected movement; see paragraphs 0024, 0043 and fig. 1. The amount of blur in the captured image 19 is reduced; see paragraph 0039).
Regarding claim 19, Thumpudi discloses everything claimed as applied above (see claim 1). In addition, Thumpudi discloses a capturing apparatus (System 100; see fig. 1 and paragraph 0018) comprising: the apparatus according to claim 1 (Device 102; see fig. 1 and the rejection of claim 1 above); a sensor configured to capture the image (Imaging device 104; see fig. 1 and paragraph 0019); and a stabilization unit configured to be controlled by the control unit (Operating system 110; see fig. 1 and paragraph 0021).
Regarding claim 20, Thumpudi discloses a method for an apparatus (See fig. 3 and paragraph 0041), comprising:
detecting an object from an image (The scene analyzer can determine whether a subject is moving in the scene; see paragraphs 0015, 0042);
obtaining first information indicating a motion of the object detected in the image (Obtaining local movement, which includes any detected motion in the scene 106 captured by the imaging device 104. That is, determining whether any objects 20 may be moving in the scene 106 and the location of the detected movement; see paragraphs 0024, 0043 and figs. 1, 3);
obtaining second information indicating a motion of a capturing apparatus (Obtaining global movement which includes any movement of the imaging device 104. The global movement 34 is based on the amount of detected movement by one or more sensors 10 and a pixel analysis of the image frames 18; see paragraphs 0023, 0042 and figs. 1, 3);
performing stabilization control to control first stabilization (Selected mode to correct detected local movement) to correct the motion of the object in the image and second stabilization (Selected mode to correct detected global movement) to correct an image blur caused by the motion of the capturing apparatus (Automatically selecting different modes of operation for the imaging device based on the detected global movement of an imaging device (e.g., movement of the imaging device) and/or local movement of a scene being captured by the imaging device (e.g., detected motion in the scene). When motion is detected when capturing an image, either from the movement of the imaging device and/or the subject of the image moving, the device 102 compensates for the specific detected motion in the resulting image by reducing the blur in the resulting image; see paragraphs 0014, 0017, 0018, 0051 and figs. 1, 3); and
performing a determination of which one of the first stabilization and the second stabilization has a higher priority based on the motion information about the object (If the detected motion exceeds the movement threshold 39 for the mode of operation of the imaging device 104, evaluator component 36 compensates for the detected motion by selecting a mode of operation 43 of imaging device 104 that can obtain a clear picture under the detected motion condition. Movement thresholds 39 can vary and are modified for a targeted system and/or for a desired outcome; see paragraphs 0032-0034. The higher priority is given by correcting for the specific detected motion and the amount of motion that is detected with it), wherein, in the stabilization control, the first stabilization and the second stabilization are controlled based on a result of the determination (Evaluator component 36 receives the global movement 34 information and/or the local movement 31 information and compensates for the detected motion by selecting a mode of operation 43 for imaging device 104 based on the received movement information. Component 36 evaluates the detected motion (e.g., the global movement 34 and/or the local movement 31) and determines whether the detected motion is within a movement threshold 39 for the current mode of operation of the imaging device 104; see paragraphs 0031, 0039, 0051).
Claim Rejections - 35 USC § 103
10. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
11. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
12. Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Thumpudi in view of Le et al. (US-PGPUB 2024/0013441).
Regarding claim 14, Thumpudi discloses everything claimed as applied above (see claim 1). However, Thumpudi does not expressly disclose the determination unit performs the determination using machine learning.
On the other hand, Le discloses that the determination unit performs the determination using machine learning (A machine learning-based video coding system 800 that includes a depth map prediction system 822 and an object motion prediction system 802. The depth map prediction system 822 can be used to compensate camera motion while the object motion prediction system 802 can be used to compensate object motion; see paragraph 0143).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Thumpudi and Le such that the determination unit performs the determination using machine learning, for the purpose of accelerating the motion correction processing while improving image quality.
Allowable Subject Matter
13. Claims 10 and 15-18 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all the limitations of the base claim and any intervening claims.
Regarding claim 10, the specific limitation of “in a case where a slope of an envelope curve representing an amount of movement of the object exceeds a threshold, the determination unit determines that the second stabilization has a higher priority than the first stabilization based on the motion information about the object” in the combination as claimed is neither anticipated nor made obvious over the prior art made of record.
Regarding claim 15, the specific limitation of “in a case where the determination unit determines that the second stabilization has a higher priority than the first stabilization during execution of the first stabilization with a higher priority than the second stabilization, the control unit switches the first stabilization to the second stabilization at a predetermined ratio” in the combination as claimed is neither anticipated nor made obvious over the prior art made of record.
Regarding claim 16, the specific limitation of “in a case where the determination unit determines that the first stabilization becomes unable to be performed during execution of the first stabilization with a higher priority than the second stabilization, the determination unit estimates a time when the first stabilization becomes unable to be performed, and the control unit gradually switches the first stabilization to the second stabilization over a period of time before the estimated time” in the combination as claimed is neither anticipated nor made obvious over the prior art made of record.
Regarding claim 17, the specific limitation of “in a case where the determination unit determines that the second stabilization has a higher priority than the first stabilization, the control unit switches the first stabilization to the second stabilization more rapidly as an amount of movement of the object is larger” in the combination as claimed is neither anticipated nor made obvious over the prior art made of record.
Regarding claim 18, the specific limitation of “in a case where the determination unit determines that the second stabilization has a higher priority than the first stabilization, the control unit switches the first stabilization to the second stabilization at respective different speeds in a horizontal direction and a vertical direction” in the combination as claimed is neither anticipated nor made obvious over the prior art made of record.
Citation of Pertinent Art
14. The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure.
Yumiki (US-PGPUB 2008/0204564) discloses even if the moving speed of part or the whole of the detected photographing object is equal to or higher than the threshold A, as long as the motion of the face of the photographing object is lower than the threshold A, "camera shake correcting mode" is continued and does not change to "sensitivity increasing mode." That is, "camera shake correcting mode" is continued as long as possible until the motion of the face of the photographing object reaches or exceeds the threshold A, and the mode is changed to "sensitivity increasing mode" only when the motion of the face of the photographing object reaches or exceeds the threshold A.
Inomata et al. (US-PGPUB 2019/0268545) discloses that, in a case in which a value of the detected defocus amount is greater than a ranging area determination threshold set in advance, a ranging size of the ranging area is set to a speed priority ranging size, and in a case in which the value of the defocus amount is equal to or less than the ranging area determination threshold, the ranging size of the ranging area is set to a subject priority ranging size that is different from the speed priority ranging size.
Contact Information
15. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CYNTHIA CALDERON whose telephone number is (571)270-3580. The examiner can normally be reached M-F 9:00 AM-5:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, TWYLER HASKINS can be reached at (571)272-7406. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CYNTHIA CALDERON/Primary Examiner, Art Unit 2639 01/22/2026