DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-10, 12-23, and 25 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Herman et al (US 2020/0409382 A1).
Regarding claim 1, Herman et al discloses a processor-implemented method for sensing surfaces comprising: using a housing (paragraph [0035]), wherein the housing includes a first light source (105) (114) (paragraphs [0032], [0077]) and a first photosensor (104) (paragraph [0035]) for the first light source, wherein the first light source is mounted to project light downward and the first photosensor is mounted to capture light reflected upward from a surface (See Figs. 5A, 5B), and wherein data from the first photosensor is used with a machine learning model (paragraph [0085]); moving, within a minimum distance, the housing along the surface, wherein the minimum distance allows for detection, by the first photosensor, of reflected light from the first light source off the surface; sending light from the first light source; capturing, by the first photosensor, reflected light from the first light source; interpreting, by the machine learning model (paragraph [0085]), an output of the first photosensor, wherein the interpreting recognizes a texture of the surface; and identifying a composition of the surface, based on the texture (paragraphs [0021], [0073], [0085], [0097]).
[Greyscale image: media_image1.png]
[Greyscale image: media_image2.png]
Regarding claim 2, Herman et al discloses wherein the housing includes a second photosensor (104) for the first light source (105) and wherein the second photosensor for the first light source captures light reflected upward from a surface and wherein the interpreting is based on data from the second photosensor (See Figs. 5A, 5B).
Regarding claim 3, Herman et al discloses wherein the interpreting is based on data from the first photosensor and the second photosensor, even when one of the first photosensor or the second photosensor is occluded (obstacles) from receiving reflected light from the first light source (paragraphs [0004], [0007]).
Regarding claims 4 and 16, Herman et al discloses wherein the first light source is a first infrared (IR) light emitting diode (LED) (paragraph [0078]).
Regarding claim 5, Herman et al discloses wherein the first photosensor is a first IR transistor (104) (paragraphs [0035], [0099]).
Regarding claim 6, Herman et al discloses wherein the first IR LED is mounted at a first angle to the surface (See Fig. 5B).
Regarding claim 7, Herman et al discloses wherein the first IR transistor is mounted at a second angle on an opposite side of the housing to the first IR LED (See Fig. 5B).
Regarding claim 8, Herman et al discloses further comprising detecting, by the first IR transistor, infrared light originating from the first IR LED after it bounces off the surface (paragraph [0035]).
Regarding claim 9, Herman et al discloses wherein the housing includes a second IR transistor (104) for the first IR LED, wherein the second IR transistor is mounted at the first angle to the surface on a same side of the housing as the first LED (See Fig. 5B).
Regarding claim 10, Herman et al discloses further comprising detecting, by the second IR transistor, infrared light originating from the first IR LED after it bounces off the surface (paragraph [0035]).
Regarding claim 12, Herman et al discloses further comprising detecting, by the first IR transistor, infrared light originating from the first IR LED after it bounces off the surface (paragraph [0035]).
Regarding claim 13, Herman et al discloses wherein the housing includes a second IR transistor for the first IR LED (paragraph [0078]).
Regarding claim 14, Herman et al discloses wherein the second IR transistor is mounted at the first angle to the surface on a same side of the housing as the first LED (See Fig. 5B).
Regarding claim 15, Herman et al discloses further comprising detecting, by the second IR transistor, infrared light originating from the first IR LED after it bounces off the surface (See Fig. 5B).
Regarding claim 17, Herman et al discloses wherein the interpreting further comprises examining, from the first photosensor, one or more segments of data collected over a timeframe (paragraph [0028]).
Regarding claim 18, Herman et al discloses a computer system (microcontroller) (paragraph [0027]) for instruction execution comprising: a memory (memory storing software) which stores instructions; one or more processors coupled to the memory wherein the one or more processors, when executing the instructions which are stored, are configured to: use a housing, wherein the housing (paragraph [0035]) includes a first light source (105) (114) (paragraphs [0032], [0077]) and a first photosensor (104) (paragraph [0035]) for the first light source, wherein the first light source is mounted to project light downward and the first photosensor is mounted to capture light reflected upward from a surface (See Figs. 5A, 5B), and wherein data from the first photosensor is used with a machine learning model; move, within a minimum distance, the housing along the surface, wherein the minimum distance allows for detection, by the first photosensor, of reflected light from the first light source off the surface; send light from the first light source; capture, by the first photosensor, reflected light from the first light source; interpret, by the machine learning model (paragraph [0085]), an output of the first photosensor, wherein the interpreting recognizes a texture of the surface; and identify a composition of the surface, based on the texture (paragraphs [0021], [0073], [0085], [0097]).
Regarding claim 19, Herman et al discloses an apparatus (See Figs. 5A, 5B) for sensing surfaces comprising: a first infrared (IR) light emitting diode (LED) located in a housing (paragraph [0035]), wherein the first IR LED (105) (114) (paragraphs [0032], [0077]) is mounted to project infrared light downward toward a surface; a first IR semiconductor sensor (stereo camera) (104) (paragraph [0035]) for the first IR LED located in the housing (See Figs. 5A, 5B), wherein the first IR semiconductor sensor is mounted to capture light reflected from the first IR LED upward from the surface; a microcontroller, wherein the microcontroller hosts a convolutional neural network (paragraph [0085]), and wherein the microcontroller (paragraph [0027]) is coupled to the first IR LED and the first IR semiconductor sensor; and a power source (battery voltage) (122), wherein the power source is connected to provide power to the first IR LED, the first IR semiconductor sensor and the microcontroller, and wherein the power source (battery voltage) (122) (paragraph [0050]) is contained within, on, or next to the housing (docking station) (paragraphs [0050], [0052]).
Regarding claim 20, Herman et al discloses wherein the first IR LED (105) (114) (paragraphs [0032], [0077]), the first IR semiconductor sensor (stereo camera) (104) (paragraph [0035]), and the microcontroller (paragraph [0027]) that hosts the convolutional neural network (paragraph [0085]) are used to identify a composition of the surface, based on interpreting output of the first IR semiconductor sensor using the microcontroller (See Figs. 5A, 5B and paragraphs [0073], [0085], [0097]).
Regarding claim 21, Herman et al discloses wherein the first IR LED is mounted at a first angle to the surface (See Fig. 5B).
Regarding claim 22, Herman et al discloses wherein the first IR semiconductor sensor is mounted at a second angle to the surface on an opposite side of the housing to the first IR LED (See Fig. 5B).
Regarding claim 23, Herman et al discloses wherein a second IR semiconductor sensor (104) for the first IR LED is mounted at the first angle to the surface on a same side of the housing as the first LED (See Fig. 5B).
Regarding claim 25, Herman et al discloses further comprising an external memory (memory storing software), wherein the external memory is coupled to the microcontroller (paragraph [0027]) and wherein the external memory is powered by the power source (battery voltage) (122) (paragraph [0050]).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 11 and 24 are rejected under 35 U.S.C. 103 as being unpatentable over Herman et al (US 2020/0409382 A1) in view of Lu et al (US 2023/0078597 A1).
Regarding claims 11 and 24, Herman et al discloses all of the limitations of claims 5 and 19, as discussed supra; however, Herman et al is silent with regard to mounting the IR transistor at an angle as claimed. Lu et al discloses an electronic device with an optical sensor for sampling surfaces, comprising an IR LED and an IR transistor mounted on a top of a housing at an angle of substantially 45 degrees from a surface (See Fig. 9 and paragraphs [0064]-[0065]). Thus, it would have been obvious to modify Herman et al with the teaching of Lu et al, so as to enable a versatile means of mounting to a housing enclosure of different orientations and shapes.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to FANI POLYZOS BOOSALIS, whose telephone number is (571) 272-2447. The examiner can normally be reached from 7:30 AM to 3:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Uzma Alam can be reached at Uzma.Alam@USPTO.GOV. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/F.P.B./Examiner, Art Unit 2884
/UZMA ALAM/Supervisory Patent Examiner, Art Unit 2884