DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 2-3, 5, 9-10 and 12-13 are currently pending in U.S. Patent Application No. 18/386,051 and an Office action on the merits follows.
Election/Restrictions
Applicant's election without traverse of Group II/Process (e.g. Applicant’s Fig. 6 – see also MPEP 806.05(h)) in the reply filed on 11/28/2025 is acknowledged.
Claim(s) 2-3, 5 and 9-10, directed to various Product/sensor module embodiments, are withdrawn from further consideration pursuant to 37 CFR 1.142(b) as being drawn to a nonelected Group I/Product, there being no allowable generic or linking claim. See MPEP § 821.02. Examiner understands it to be common to amend even withdrawn claims during prosecution so as to retain the right to rejoinder, in the event that prosecution identifies an allowable claim from the elected grouping/invention (see MPEP 821.04). MPEP 821.04(b) describes rejoinder in the context of a restriction requirement as described in MPEP 806.05(h).
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 12-13 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 12 recites the language “configuring a sensor module to perform …”, wherein it is unclear what is required for such a configuring, and/or whether such a ‘configuring’ is actually a step in the process at all. Examiner notes Applicant’s Specification does not appear to use any corresponding/similar language (see PGPUB US 2024/0062394 A1 – the sole instance of “configuring” is in claim 12), nor does e.g. Applicant’s Figure 6 and supporting disclosure feature any explicitly disclosed step of configuring. As currently recited, it is additionally unclear whether the method requires actually performing any ‘object recognition’ and/or ‘human detection for the object’, or whether the claim simply requires a configuring that would enable such steps. For compact prosecution purposes Examiner interprets the claim such that actually performing object recognition and human detection is required.
Claim 12 further recites “human detection for the object”, for which it is unclear how the limitation in question is to be interpreted. The ‘object’ recited is implicitly distinct from a ‘human’ given the manner in which the claim requires a ‘recognition’ of the first and a ‘detection’ of the latter; however, the language ‘for the object’ suggests the ‘human detection’ is perhaps reaching a determination that the object itself is either a human or a portion thereof. It is unclear whether the claims are a literal translation into English from a foreign document and the limitations in question are unidiomatic English as a result; the Examiner requests clarifying remarks and/or amendment. For compact prosecution purposes Examiner understands that the object may indeed be a human/portion thereof, and need not be distinct therefrom.
Dependent claim 13 inherits, and fails to cure, the deficiency identified above for claim 12, and is rejected accordingly.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 12-13 are rejected on the ground of nonstatutory double patenting as being unpatentable over (anticipated by) claims 12-13 (direct correspondence) of U.S. Patent No. 11,854,215.
Claims 12-13 are rejected on the ground of nonstatutory double patenting as being unpatentable over (anticipated by) claim 6 of U.S. Patent No. 11,417,002.
Although the claims at issue are not identical, they are not patentably distinct from each other because the claims of the reference anticipate the claims of the instant application and/or require only minimal/obvious modification to teach/suggest all elements recited in the instant claims. The following additional considerations similarly apply:
• Instant claim(s) and claim(s) of reference recite common subject matter;
• Instant claim(s) recite the open-ended transitional phrase “comprising” and do not preclude those additional elements recited by the claims of reference;
• Language/terminology of the instant claim(s) constituting minor/slight variations from the claims of reference, if/where present, requires interpretation under the Broadest Reasonable Interpretation standard and/or plain-meaning definitions (MPEP 2173 and 2111) equivalent to/met by the language of the reference claims in view of the corresponding/shared Specification. While the disclosure of the reference may not be used as prior art (double patenting concerns the claims of reference), portions of the specification which provide support for the reference claims may also be examined and considered when addressing the scope of the claim(s) of reference and the issue of whether an instant claim defines an obvious variation of, or falls within the scope of, an invention claimed in the claim(s) of reference. See MPEP 804 with reference to In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
1. Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 2013/0229491 A1) (cited by Applicant) in view of Jung et al. (US 2013/0123014 A1).
As to claim 12, Kim discloses a method for image recognition comprising:
detecting whether an object is detected (Fig. 1 S10, Fig. 4 S100, [0048] “the three-dimensional image sensor detects a position change of an object by generating a two-dimensional image in a low power standby mode (S10)”, [0049] “In the low power standby mode, the three-dimensional image sensor may perform not a three-dimensional image sensing operation but a two-dimensional image sensing operation. That is, the three-dimensional image sensor may generate the two-dimensional image to detect the position change of the object in the low power standby mode”, [0050], [0089] “The three-dimensional image sensor 600 may operate in a low power standby mode 2D MODE while no object is detected or while an object does not move”, etc.,);
configuring a sensor module (Fig. 2 3D image sensor comprising pixel array 110 and control unit 150) to perform object recognition and human detection for the object (Fig. 1 S30, Fig. 4 S300, Fig. 5 S450-S550, [0068] “The three-dimensional image sensor 100 may perform the gesture recognition for the object 160 by using the light TX of high luminance in the three-dimensional operating mode”, [0053] “In the three-dimensional operating mode, the three dimensional image sensor performs gesture recognition for the object by generating a three-dimensional image using the light source module (S30). The gesture recognition may be performed by measuring a distance (or depth) of the object from the three-dimensional image sensor and a horizontal movement of the object. For example, in a case where the three-dimensional image sensor employed in an electronic book (E-book), the three-dimensional image sensor may detect a horizontal movement of a hand of a user when the user performs an action, such as flipping or turning E-book pages. In a case where the three-dimensional image sensor included in a video game machine, the three-dimensional image sensor may measure a distance of a user from the video game machine when the user approaches or recedes from the video game machine”), wherein the sensor module includes a plurality of first sensors and a plurality of second sensors (Figs. 8-9, [0058] “The pixel array 110 may further include color pixels for providing color image information. In this case, the three dimensional image sensor 100 may be a three-dimensional color image sensor that provides the color image information and the depth information”, [0099], etc.,), the first sensors are image sensors for the object recognition (e.g. RGB pixels as distinguished from those depth pixels 630) and the second sensors are IR sensors for the human detection ([0058] “In some example embodiments, an infrared filter and/or a near-infrared filter may be formed on the depth pixels … In some example embodiments, three-dimensional image sensor 100 may generate the two-dimensional image using the color pixels in the low power standby mode, and may generate a three-dimensional image using the depth pixels in a three-dimensional operating mode”, [0091] “The three-dimensional image sensor 600 may generate a three-dimensional image using the depth pixels 630, and may analyze a movement direction of the object (e.g., the hand 650) and a type of gesture based on the generated three dimensional image. For example, if the hand 650 of the user moves from a right side to a left side, the three-dimensional image sensor 600 determine the action as turning pages of the E-book”, etc.,; Examiner notes Applicant’s Specification appears to suggest the term thermal sensor and infrared sensor may be synonymous [0035], and the claims do not preclude the use of any IR light source);
wherein, when the first sensors and the second sensors are arranged in columns, each column of the first sensors is adjacent to at least one column of the second sensors, or when the first sensors and the second sensors are arranged in rows, each row of the first sensors is adjacent to at least one row of the second sensors, or when the first sensors and the second sensors are arranged in a checkerboard manner in the matrix, each of the first sensors is adjacent to the second sensors (Figs. 8-9, [0099] “Referring to FIG. 8, a pixel array 300 of a three dimensional image sensor may include a plurality of depth pixels 310 that are arranged in a matrix form having a plurality of rows and a plurality of columns. The three-dimensional image sensor may generate a two-dimensional image using the depth pixels 310 in a portion 330 of the plurality of rows. For example, the three-dimensional image sensor may skip the depth pixels 310 in even-numbered rows, and may use the depth pixels 310 in odd-numbered rows 330 to generate the two-dimensional image”, [0100], etc.,; Examiner notes Applicant may refer to MPEP 2111.04 regarding contingent limitations and how they may or may not limit claim scope/patentably distinguish a method/process for steps that are merely suggested and/or made optional – Examiner also notes specifics regarding any sensor structure, as distinguished from functions/steps of a method/process, are likely drawn to the non-elected grouping. While Applicant may amend process claims to incorporate specifics from the non-elected grouping, such process claims would likely be withdrawn from consideration; claim 12 remains for consideration only because it does not actually require the specifics of the non-elected grouping (instead features a series of contingent limitations all presented in the alternative, such that none are absolutely required if precedent conditions are not met)).
While Kim discloses depth pixels in optional IR embodiments ([0058]), Kim fails to explicitly disclose them as “thermal” sensors (e.g. detecting long-wave (possibly even MWIR) IR radiation, LWIR being approximately 7-8 µm to 14-15 µm in wavelength – see supporting NPL materials cited by the Examiner, Ottney et al. (US 2007/0235634 A1) [0006], etc.), and further discloses the light source module as activated in the three-dimensional operating mode ([0009]) and optionally deactivated and/or emitting light at low luminance in the low power standby mode ([0021]). Stated differently, while the Examiner understands all thermal sensors (species) to be infrared sensors (genus), it may not be the case that all ‘infrared sensors’ are thermal sensors, e.g. if they depend on active illumination and/or detect only short-wave (NIR) IR radiation (and not LWIR), and Kim is silent on the wavelength range detected.
Jung however evidences the obvious nature of second sensors that are thermal sensors for object/human motion/gesture recognition (Fig. 2B Thermal pixel array 14c and object 8, [0016] “The image sensor may include a pixel array, and the pixel array may include a color pixel configured to detect color information of an object, a depth pixel configured to detect depth information of the object and a thermal pixel configured to detect thermal information of the object”, Fig. 10 object recognition module 130, motion extracting module 140, and motion recognition module 150 of motion processing module 120).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to modify the method of Kim so as to further comprise executing a human operation/gesture recognition based on thermal sensor signals as taught/suggested by Jung, the motivation as similarly taught/suggested therein ([0119]) and readily recognized by POSITA being that the resultant imagery may facilitate a more accurate recognition for instances wherein the complementary modalities are otherwise less effective (e.g. low illumination).
2. Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Kim et al. (US 2013/0229491 A1) (cited by Applicant) in view of Jung et al. (US 2013/0123014 A1) and Huang (US 8,773,352 B1) (cited by Applicant).
As to claim 13, Kim as modified by Jung teaches/suggests the method of claim 12.
Kim as modified by Jung further suggests wherein the first sensors and the second sensors are arranged in a matrix, numbers of the first sensor and the second sensor are more than or equal to two (Figs. 8-9, [0099], [0100], etc., see also Jung Fig. 8).
Kim fails to explicitly disclose the first sensors and the second sensors formed in the same plane.
Huang however evidences the obvious nature of first and second sensors formed in the same plane (Huang Fig. 7, col 13 lines 63-66 “image sensor having image sensor pixel elements disposed on a grid of pixels in a single plane”, col 15 lines 30-35).
It would have been obvious to a person of ordinary skill in the art, before the effective filing date, to further modify the method of Kim in view of Jung so as to be implemented in a manner such that the first sensors and second sensors are formed in the same plane, as taught/suggested by Huang, the motivation as similarly taught/suggested therein and readily recognized by POSITA being that such an implementation would enable the method to be performed on devices characterized by a thinner form factor, with a reasonable expectation of success.
Additional References
Prior art made of record and not relied upon that is considered pertinent to applicant's disclosure:
Additionally cited references (see attached PTO-892) otherwise not relied upon above have been made of record in view of the manner in which they evidence the general state of the art.
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to IAN L LEMIEUX whose telephone number is (571)270-5796. The examiner can normally be reached Mon - Fri 9:00 - 6:00 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Chan Park can be reached on 571-272-7409. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/IAN L LEMIEUX/Primary Examiner, Art Unit 2669