DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 19 December 2023, 15 April 2025, and 04 August 2025 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Specification
The title of the invention is not descriptive. A new title is required that is clearly indicative of the invention to which the claims are directed.
Claim Objections
Claims 11 and 13 are objected to because of the following informalities:
Claim 11: “said emitted signal” in line 10 should be “said emitted light” or “emit light” in line 7 should be “emit a signal”; “the difference” in line 12 should be “a difference”; and “the signal” in line 12 should be “the signals” for further clarity and continuity in the claim language.
Claim 13: “an emitting signa” in line 3 should be “an emitted signal” for further clarity.
Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation is:
"a control unit" in claims 1 and 11: information regarding where the control unit may be disposed is explained in ¶16, however, no sufficient structure is provided for the control unit.
Because this claim limitation is being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it is being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this limitation interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation recites sufficient structure to perform the claimed function so as to avoid it being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-15 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim limitation “a control unit” invokes 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. However, the written description fails to disclose the corresponding structure, material, or acts for performing the entire claimed function and to clearly link the structure, material, or acts to the function. The specification does not explain the unit in any detail other than briefly stating in ¶16: “The control unit 31 may be coupled to the light-emitting device 11, the first light-sensing unit 21, and the second light-sensing unit 22, respectively, for controlling their operations and processing the generated signals. Nonetheless, according to another embodiment of the present application, the control unit 31 may be set in an external device, for example, a mobile communication device or a wearable device, to control the light-emitting device 11, the first light-sensing unit 21, and the second light-sensing unit 22 and process signals.” Therefore, claims 1-15 are indefinite and are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph.
Applicant may:
(a) Amend the claim so that the claim limitation will no longer be interpreted as a limitation under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph;
(b) Amend the written description of the specification such that it expressly recites what structure, material, or acts perform the entire claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(c) Amend the written description of the specification such that it clearly links the structure, material, or acts disclosed therein to the function recited in the claim, without introducing any new matter (35 U.S.C. 132(a)).
If applicant is of the opinion that the written description of the specification already implicitly or inherently discloses the corresponding structure, material, or acts and clearly links them to the function so that one of ordinary skill in the art would recognize what structure, material, or acts perform the claimed function, applicant should clarify the record by either:
(a) Amending the written description of the specification such that it expressly recites the corresponding structure, material, or acts for performing the claimed function and clearly links or associates the structure, material, or acts to the claimed function, without introducing any new matter (35 U.S.C. 132(a)); or
(b) Stating on the record what the corresponding structure, material, or acts, which are implicitly or inherently set forth in the written description of the specification, perform the claimed function. For more information, see 37 CFR 1.75(d) and MPEP §§ 608.01(o) and 2181.
Regarding claim 1, “the type” in line 8 is unclear as it lacks proper antecedent basis.
Claims 2-10 are rejected for their dependency on claim 1.
Regarding claim 3, “the light-receiving region” in line 2 is unclear as it lacks proper antecedent basis.
Claim 4 is rejected for its dependency on claim 3.
Regarding claim 4, “the light-receiving region” in line 2 is unclear as it lacks proper antecedent basis.
Regarding claim 6, “the wavelength” in lines 2 and 3 is unclear in both instances as it lacks proper antecedent basis.
Regarding claim 11, “the operation” in line 1 is unclear as it lacks proper antecedent basis. Additionally, “a light sensor” in line 1 is unclear as this limitation has been mentioned previously in the same claim. Is this limitation referring to a different light sensor, or the same light sensor mentioned previously? In light of the specification, the Examiner is interpreting this limitation to be referring to the same light sensor mentioned previously. “the signals” in line 8 is unclear as it lacks proper antecedent basis. “the type” in line 11 is unclear as it lacks proper antecedent basis.
Claims 12-15 are rejected for their dependency on claim 11.
Regarding claim 13, “the wavelength” in lines 3 and 4 is unclear in both instances as it lacks proper antecedent basis.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-9 and 11-14 are rejected under 35 U.S.C. 103 as being unpatentable over Ju et al. (CN 213693974 U) in view of Na et al. (USPGPub 20210381960 A1).
Regarding claim 1, Ju teaches a light sensor, comprising: a light-emitting device (112), generating an emitted signal (see figure 2, light source 112; and ¶57, the hybrid light source 11 needs to emit light beams of at least three different wavelengths); a first light-sensing unit (14), having light-sensing characteristics corresponding to a first wavelength range; and a second light-sensing unit (14), having light-sensing characteristics corresponding to a second wavelength range, said second wavelength range differing from said first wavelength range (see figures 2 and 5, photosensitive element 14; and ¶64, Since each photosensitive element 14 can only receive information light of one wavelength, the information light needs to be split by the beam splitter 13 before the photosensitive element 14 receives the information light, so that each wavelength of information light in the information light is projected onto its corresponding photosensitive element 14); wherein when said emitted signal is reflected by an object (15) and received by said first light-sensing unit (14) and said second light-sensing unit (14), a control unit (16) judges the type of said object (¶73, the 3D recognition module 1 emits light toward the target object 15 and receives information light reflected from the target object 15 to generate image information. In this way, the image information can be deeply analyzed to realize object type sensing and recognition). However, Ju fails to explicitly teach wherein the control unit judges the type of said object according to a difference between a sensed signal of said first light-sensing unit and a sensed signal of said second light-sensing unit.
However, Na teaches wherein the control unit (104/203) judges the type of said object (140) according to a difference between a sensed signal of said first light-sensing unit (108) and a sensed signal of said second light-sensing unit (110) (¶117, As the reflectivity of the first wavelength and the reflectivity of the second wavelength are dependent on the object 140, the magnitude of the first detecting signal detected by the photodetector 108 and the magnitude of the second detecting signal detected by the photodetector 110 may be different. In some implementations, the calculation circuit 104 can be configured to calculate a ratio of the first magnitude and the second magnitude as the calculating result ED. In some embodiments, the calculation circuit 104 can be configured to calculate a difference of the first magnitude and the second magnitude).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ju to incorporate the teachings of Na to further calculate a difference between the two signals because, depending on the object, the reflectivity of the wavelengths may differ, the measurement of which would help determine the type of object.
Regarding claim 2, Ju as modified by Na teaches the light sensor of claim 1, wherein said control unit (Ju 16 | Na 104/203) performing a proximity sensing according to the signal sensed by said first light-sensing unit (Ju 14 | Na 108) or the signal sensed by said second light-sensing unit (Ju 14 | Na 110) (Ju, ¶60, The reflected pulse wave is received by the photosensitive element 14. Then, the 3D recognition module 1 detects and calculates the round-trip time of the light pulse of each pixel point in the pulsed light to obtain the distance of each point in the target object 15 relative to the 3D recognition module 1).
Regarding claim 3, Ju as modified by Na teaches the light sensor of claim 1, wherein said first light-sensing unit (Ju 14 | Na 108) includes an optical filter covering the light-receiving region of said first light-sensing unit (Ju 14 | Na 108) (Ju, ¶10, the beam splitter is a filter, and the same beam splitting region has at least two filters adapted to the information light of each wavelength; each filter is used to filter the information light of different wavelengths to allow the information light adapted to the filter to be projected onto the corresponding photosensitive chip).
Regarding claim 4, Ju as modified by Na teaches the light sensor of claim 3, wherein said second light-sensing unit (Ju 14 | Na 110) includes an optical filter covering the light-receiving region of said second light-sensing unit (Ju 14 | Na 110); and said optical filter of said second light-sensing unit (Ju 14 | Na 110) is different from the optical filter of said first light-sensing unit (Ju 14 | Na 108) (Ju, ¶10, the beam splitter is a filter, and the same beam splitting region has at least two filters adapted to the information light of each wavelength; each filter is used to filter the information light of different wavelengths to allow the information light adapted to the filter to be projected onto the corresponding photosensitive chip).
Regarding claim 5, Ju as modified by Na teaches the light sensor of claim 1, wherein said first light-sensing unit (Ju 14 | Na 108) includes an optoelectronic diode corresponding to said first wavelength range; and said second light-sensing unit (Ju 14 | Na 110) includes an optoelectronic diode corresponding to said second wavelength range (Ju, see figures 2 and 5, photosensitive element 14; and ¶64, Since each photosensitive element 14 can only receive information light of one wavelength, the information light needs to be split by the beam splitter 13 before the photosensitive element 14 receives the information light, so that each wavelength of information light in the information light is projected onto its corresponding photosensitive element 14).
Regarding claim 6, Ju as modified by Na teaches the light sensor of claim 1, wherein said light sensor comprises another light-emitting device (Ju 112 | Na 114) generating an emitted signal; and the wavelength of said emitted signal generated by said light-emitting device (Ju 112 | Na 112) is different from the wavelength of said emitted signal generated by said another light-emitting device (Ju 112 | Na 114) (Ju, ¶59, the hybrid light source 11 may also be equipped with a visible light laser and an infrared laser).
Regarding claim 7, Ju as modified by Na teaches the light sensor of claim 1, wherein said light sensor comprises a third light-sensing unit (Ju 14) with light-sensing characteristics corresponding to a third wavelength range different from said first wavelength range and said second wavelength range (Ju, see figures 2 and 5, photosensitive element 14; ¶55, in order to further improve the identification accuracy of the type of object, the hybrid light source 11 is used to emit light beams of three different wavelengths; and ¶64, Since each photosensitive element 14 can only receive information light of one wavelength, the information light needs to be split by the beam splitter 13 before the photosensitive element 14 receives the information light, so that each wavelength of information light in the information light is projected onto its corresponding photosensitive element 14); and said control unit (Ju 16 | Na 104) judges the type of said object (Na 140) according to the differences between the signals sensed by said first light-sensing unit (Ju 14 | Na 108), said second light-sensing unit (Ju 14 | Na 110), and said third light-sensing unit (Ju 14) (Na, ¶117, As the reflectivity of the first wavelength and the reflectivity of the second wavelength are dependent on the object 140, the magnitude of the first detecting signal detected by the photodetector 108 and the magnitude of the second detecting signal detected by the photodetector 110 may be different. In some implementations, the calculation circuit 104 can be configured to calculate a ratio of the first magnitude and the second magnitude as the calculating result ED. In some embodiments, the calculation circuit 104 can be configured to calculate a difference of the first magnitude and the second magnitude; and see ¶160 for further details).
Regarding claim 8, Ju as modified by Na teaches the light sensor of claim 1, wherein said control unit (Ju 16 | Na 104/203) is coupled to said light-emitting device (Ju 112 | Na 112), said first light-sensing unit (Ju 14 | Na 108), and said second light-sensing unit (Ju 14 | Na 110), respectively (Ju, see figure 1, processor 16; and ¶72, the 3D recognition module 1 also includes a processor 16, which is electrically connected to the photosensitive element 14; and Na, ¶21, the light-emitting unit, the photo-detecting unit, and the controller can be implemented on a common chip; and see ¶¶140-141 for further details).
Regarding claim 9, Ju as modified by Na teaches the light sensor of claim 1, wherein said first light-sensing unit (Ju 14 | Na 108), said second light-sensing unit (Ju 14 | Na 110), and said control unit (Ju 16 | Na 104/203) are integrated on an integrated-circuit chip (Na, ¶21, the light-emitting unit, the photo-detecting unit, and the controller can be implemented on a common chip; and see ¶¶140-141 for further details).
Regarding claim 11, Ju teaches a control method of light sensor, controlling the operation of a light sensor comprising a light-emitting device (112) (see figure 2, light source 112; and ¶57, the hybrid light source 11 needs to emit light beams of at least three different wavelengths), a first light-sensing unit (14), and a second light-sensing unit (14), said first light-sensing unit (14) having light-sensing characteristics corresponding to a first wavelength range, said second light-sensing unit (14) having light-sensing characteristics corresponding to a second wavelength range, said first wavelength range different from said second wavelength range (see figures 2 and 5, photosensitive element 14; and ¶64, Since each photosensitive element 14 can only receive information light of one wavelength, the information light needs to be split by the beam splitter 13 before the photosensitive element 14 receives the information light, so that each wavelength of information light in the information light is projected onto its corresponding photosensitive element 14), and a control unit (16) receiving the signals sensed by said first light-sensing unit (14) and said second light-sensing unit (14); wherein when said emitted signal is reflected by an object and received by said first light-sensing unit (14) and said second light-sensing unit (14), said control unit (16) judges the type of said object (¶73, the 3D recognition module 1 emits light toward the target object 15 and receives information light reflected from the target object 15 to generate image information. In this way, the image information can be deeply analyzed to realize object type sensing and recognition). However, Ju fails to explicitly teach the control unit controlling said light-emitting device to emit light; and wherein the control unit judges the type of said object according to a difference between a sensed signal of said first light-sensing unit and a sensed signal of said second light-sensing unit.
However, Na teaches the control unit (104/203) controlling said light-emitting device (112) to emit light (¶140, the controller 203 (e.g., analog or digital circuitry, or one or more processors) can be configured to generate a first control signal CS1 and a second control signal CS2 to control (e.g., drive) the light emitted from the light sources 112/114); and wherein the control unit (104/203) judges the type of said object (140) according to a difference between a sensed signal of said first light-sensing unit (108) and a sensed signal of said second light-sensing unit (110) (¶117, As the reflectivity of the first wavelength and the reflectivity of the second wavelength are dependent on the object 140, the magnitude of the first detecting signal detected by the photodetector 108 and the magnitude of the second detecting signal detected by the photodetector 110 may be different. In some implementations, the calculation circuit 104 can be configured to calculate a ratio of the first magnitude and the second magnitude as the calculating result ED. In some embodiments, the calculation circuit 104 can be configured to calculate a difference of the first magnitude and the second magnitude).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ju to incorporate the teachings of Na to further include a controller configured to control the light source in order not only to provide functionality to the device and control the wavelength, but also to save energy by preventing the light source from being constantly turned on. Additionally, it would have been obvious to further calculate a difference between the two signals because, depending on the object, the reflectivity of the wavelengths may differ, the measurement of which would help determine the type of object.
Regarding claim 12, Ju as modified by Na teaches the control method of light sensor of claim 11, wherein said control unit (Ju 16 | Na 104/203) senses distance according to the signal sensed by said first light-sensing unit (Ju 14 | Na 108) or the signal sensed by said second light-sensing unit (Ju 14 | Na 110) (Ju, ¶60, The reflected pulse wave is received by the photosensitive element 14. Then, the 3D recognition module 1 detects and calculates the round-trip time of the light pulse of each pixel point in the pulsed light to obtain the distance of each point in the target object 15 relative to the 3D recognition module 1).
Regarding claim 13, Ju as modified by Na teaches the control method of light sensor of claim 11, wherein said light sensor comprises another light-emitting device (Ju 112 | Na 112); and said control method comprises a step of controlling said another light-emitting device (Ju 112 | Na 114) to generate an emitting signa and controlling the wavelength of said emitted signal generated by said light-emitting (Ju 112 | Na 112) device different from the wavelength of said emitted signal generated by said another light-emitting device (Ju 112 | Na 114) (Ju, ¶59, the hybrid light source 11 may also be equipped with a visible light laser and an infrared laser; and Na, ¶140, the controller 203 (e.g., analog or digital circuitry, or one or more processors) can be configured to generate a first control signal CS1 and a second control signal CS2 to control (e.g., drive) the light emitted from the light sources 112/114).
Regarding claim 14, Ju as modified by Na teaches the control method of light sensor of claim 11, wherein said light sensor comprises a third light-sensing unit (Ju 14) with light-sensing characteristics corresponding to a third wavelength range different from said first wavelength range and said second wavelength range (Ju, see figures 2 and 5, photosensitive element 14; ¶55, in order to further improve the identification accuracy of the type of object, the hybrid light source 11 is used to emit light beams of three different wavelengths; and ¶64, Since each photosensitive element 14 can only receive information light of one wavelength, the information light needs to be split by the beam splitter 13 before the photosensitive element 14 receives the information light, so that each wavelength of information light in the information light is projected onto its corresponding photosensitive element 14); and said control method comprises a step of said control unit (Ju 16 | Na 104/203) judging the type of said object (Na 140) according to the differences between the signals sensed by said first light-sensing unit (Ju 14 | Na 108), said second light-sensing unit (Ju 14 | Na 110), and said third light-sensing unit (Ju 14) (Na, ¶117, As the reflectivity of the first wavelength and the reflectivity of the second wavelength are dependent on the object 140, the magnitude of the first detecting signal detected by the photodetector 108 and the magnitude of the second detecting signal detected by the photodetector 110 may be different. In some implementations, the calculation circuit 104 can be configured to calculate a ratio of the first magnitude and the second magnitude as the calculating result ED. In some embodiments, the calculation circuit 104 can be configured to calculate a difference of the first magnitude and the second magnitude; and see ¶160 for further details).
Claims 10 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Ju et al. (CN 213693974 U) in view of Na et al. (USPGPub 20210381960 A1) as applied to claims 1 and 11 above, and further in view of Ruichi et al. (CN 111948669 A).
Regarding claims 10 and 15, Ju as modified by Na teaches the control unit (Ju 16 | Na 104/203), the first light-sensing unit (Ju 14 | Na 108), and the second light-sensing unit (Ju 14 | Na 110) (Ju, ¶72, the 3D recognition module 1 also includes a processor 16, which is electrically connected to the photosensitive element 14; and Na, ¶21, the light-emitting unit, the photo-detecting unit, and the controller can be implemented on a common chip). However, the combination fails to explicitly teach generating an identification rate and storing a range index of said identification rate for judging the type of said object.
However, Ruichi teaches generating an identification rate and storing a range index of said identification rate for judging the type of said object (¶69, The database stores hyperspectral data information of various objects. The identification device is used to compare the hyperspectral data information generated by the hyperspectral data information acquisition system 100 with the hyperspectral data information in the database, and determine the category of the object to be tested corresponding to the hyperspectral data information based on the data comparison result).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the combination of Ju and Na to incorporate the teachings of Ruichi to further include storing a database of object data in order to quickly identify the object under test.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIN R GARBER whose telephone number is (571)272-4663. The examiner can normally be reached M-F 0730-1730.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Georgia Y Epps can be reached at (571)272-2328. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ERIN R GARBER/Examiner, Art Unit 2878