DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
The reply filed on 26 January 2026 has been entered. Applicant's arguments with respect to claims 1-20 have been considered but are moot in view of the new ground(s) of rejection necessitated by the amendments.
Claims 1-20 are pending in this application and have been considered below.
Priority
Applicant claims the benefit of US Provisional Application No. 67/376,881, filed 23 September 2022. Claims 1-20 have been afforded the benefit of this filing date.
1st Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claims 1-3, 5-11, and 13-20 are rejected under 35 U.S.C. 103 as obvious over US Patent Application Publication No. 2024/0177500 A1 (Edwards) in view of US Patent Application Publication No. 2018/0143320 A1 (Steever et al.). The references are listed on a PTO-892 from the Office Action in which they are first used.
Claim 1
[Edwards, Fig. 6A, showing detecting a person by an infrared device.]
Regarding Claim 1, Edwards teaches a system ("System 100 is further configured for performing various image processing algorithms on the captured images," paragraph [0058]) comprising:
one or more cameras ("an imaging camera 106," paragraph [0059]);
one or more illuminators ("the infrared light sources are described as being LEDs," paragraph [0057]);
one or more processors ("an associated controller 112 which includes a computer processor or microprocessor and memory for storing and buffering the captured images from camera 201," paragraph [0061]); and
one or more computer readable media comprising computer readable code executable by one or more processors ("any form of computer processing device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory," paragraph [0067]) to:
obtain image data captured by the one or more cameras ("System 100 is further configured for performing various image processing algorithms on the captured images," paragraph [0058]), wherein the image data is captured when at least one of the one or more illuminators are illuminated ("LEDs 108 and 110 are activated and deactivated in synchronization with consecutive image frames captured by camera 106 to illuminate the driver during image capture," paragraph [0072]);
determine brightness statistics from the image data ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]);
determine whether the brightness statistics satisfies a predetermined threshold ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]), wherein the computer readable code to determine whether the brightness statistics satisfies the predetermined threshold further comprises computer readable code to:
determine, based on the number of pixels that satisfy the brightness range, whether the brightness statistics satisfies the predetermined threshold ("In one embodiment, the brightness measure is an average pixel intensity of a pixel region within the image. The pixel region may correspond to a face, an eye or both eyes of the subject," paragraph [0033]); and
determine that a proximate object is detected in accordance with a determination that the brightness statistics satisfies the predetermined threshold ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]).
Edwards is not relied upon to explicitly teach identifying a number of pixels in the image data that satisfy a brightness range.
[Steever et al., Fig. 18, showing detecting the presence of a surface by a worn device.]
However, Steever et al. teach identify a number of pixels in the image data that satisfy a brightness range, ("detecting the transparent surface presence includes identifying a population of errant pixels, wherein the population satisfies a predetermined population condition (example shown in FIG. 14). The predetermined population condition can include: matching a predetermined pattern (e.g., wherein the errant pixel population substantially matches the pattern, which can be machine-learned, associated with a modulation pattern or phase, or otherwise determined), having a predetermined number of pixels, having a predetermined concentration of pixels, having an average or median intensity above a threshold intensity, or satisfying any other suitable condition," paragraph [0133]).
Therefore, taking the teachings of Edwards and Steever et al. as a whole, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the "Infrared Light Source Protective System" taught by Edwards to use the "Sensing System and Method" taught by Steever et al. The suggestion/motivation for doing so would have been that "variants of the sensing system and method can confer any other suitable set of benefits over conventional systems," as noted by Steever et al. in paragraph [0039]. The combination is further motivated because it would predictably yield higher efficiency, as there is a reasonable expectation that such systems will need to save energy, and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
The rejection of system claim 1 above applies mutatis mutandis to the corresponding limitations of apparatus claim 9 and method claim 17 while noting that the rejection above cites to both device and method disclosures. Claims 9 and 17 are mapped below for clarity of the record and to specify any new limitations not included in claim 1.
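For clarity of the record, the claim 1 limitation mapped above (identifying a number of pixels in the image data that satisfy a brightness range, and determining from that count whether the brightness statistics satisfy a predetermined threshold) can be sketched as follows. This is a minimal illustration only; the function names, threshold values, and sample frame are hypothetical and are not taken from Edwards or Steever et al.

```python
# Illustrative sketch (hypothetical names and values): count the pixels
# whose intensity falls within a brightness range, then test that count
# against a predetermined threshold to decide whether a proximate object
# is detected.

def count_bright_pixels(image, lo=200, hi=255):
    """Return the number of pixels whose intensity lies in [lo, hi]."""
    return sum(1 for row in image for px in row if lo <= px <= hi)

def proximate_object_detected(image, pixel_count_threshold=4):
    """The brightness statistics satisfy the threshold when enough
    pixels fall within the brightness range."""
    return count_bright_pixels(image) >= pixel_count_threshold

# Usage with a tiny hypothetical frame (intensities 0-255):
frame = [
    [10, 250, 240, 12],
    [220, 255, 30, 11],
]
assert count_bright_pixels(frame) == 4
assert proximate_object_detected(frame)
```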
Claim 2
Regarding claim 2, Edwards teaches the system of claim 1, wherein the one or more cameras are situated in front of an eye when the wearable device is worn ("the images relate to driver's face including the driver's eyes to monitor eye gaze and drowsiness," paragraph [0080]), as noted above.
Edwards is not relied upon to explicitly teach a wearable device.
However, Steever et al. teach wherein the system comprises a wearable device ("Examples of host systems include robots, terrestrial vehicles, aerial vehicles, aquatic vehicles, security systems, or any other suitable host system. In a specific example, the sensing system can be mounted to and used as a safety curtain for industrial robots (e.g., as a sensing system located on an industrial robot arm). However, the sensing system can be otherwise used," paragraph [0043] where a robot host system teaches a wearable).
Edwards and Steever et al. are combined as per claim 1.
Claim 3
Regarding claim 3, Edwards teaches the system of claim 2, further comprising computer readable code to:
power on a display of the device in accordance with detecting that the object is proximate to the wearable device ("when an infrared light source has been switched off for a predetermined delay period, the controller is configured to reactivate the infrared light source for a test period during which one or more test images are captured, the controller being further configured to maintain the infrared light source in an active state if the brightness measure of the test images is equal to or greater than the predetermined brightness threshold, otherwise the controller deactivates the infrared light source," paragraph [0035] and "In addition to providing safety advantages, the disclosed embodiments also have applications in creating a lighting diagnostic function for a driver monitoring system. In these applications, the disclosed system is configured to detect if an infrared light is "blocked" by an obstacle at close range, which can be useful" paragraph [0100]).
Claim 5
Regarding claim 5, Edwards teaches the system of claim 1, wherein the computer readable code to determine whether the brightness statistics satisfies a predetermined threshold further comprises computer readable code to:
determine whether the identified number of pixels satisfies a proximity threshold ("The system described above allows for the proximity between a subject and an LED to be monitored (via the proxy measure of image brightness) and feedback control is fed to device controller 120," paragraph [0098]).
Claim 6
Regarding claim 6, Edwards teaches the system of claim 1, wherein the one or more cameras comprise one or more infrared cameras, and wherein the one or more illuminators comprise one or more light emitting diodes (LEDs) ("the initial stage 701 of illuminating a subject (e.g. driver 102) from two or more spaced apart infrared light sources (e.g. LEDs 108 and 110)," paragraph [0079]).
Claim 7
Regarding claim 7, Edwards teaches the system of claim 1, further comprising computer readable code to:
obtain additional image data captured by the one or more cameras ("in method 700, at least a subset of the captured images must be captured while the subject is illuminated by a single LED," paragraph [0079]);
determine whether brightness statistics for the additional image data satisfy a predetermined threshold ("Alternatively, the brightness threshold may be equal to a percentage of an average pixel intensity calculated over a number of past images, such as 75%, 50%, 40%, 30%, 25%, 20% or 10% of the past average pixel intensity," paragraph [0088]);
determine a lack of proximate object in accordance with a determination that the brightness statistics for the additional image data fail to satisfy the predetermined threshold ("In response to detecting a brightness measure below the predetermined brightness threshold, at stage 705A device controller 120 either switches off or reduces an output illumination intensity of one of the infrared LEDs," paragraph [0089]); and
power down a display in accordance with the determination of the lack of proximate object ("In response to detecting a brightness measure below the predetermined brightness threshold, at stage 705A device controller 120 either switches off or reduces an output illumination intensity of one of the infrared LEDs," paragraph [0089] where reducing the illumination intensity also teaches powering down a display).
Claim 8
Regarding claim 8, Edwards teaches the system of claim 1, wherein the computer readable code to determine brightness statistics from the image data further comprises computer readable code to:
identify a lit image and an unlit image in the image data ("average pixel intensity may be performed on all pixels in an image to calculate the overall average pixel intensity or it may be performed only on a subset of the pixels of each image. The latter operation may be useful where the image includes less useful components such as a dark background behind a driver's face," paragraph [0084]);
remove ambient light from the lit image based on the unlit image to obtain a modified lit image ("a bounding region around the feature may be designated to define the relevant pixel region from which the average pixel intensity is to be calculated," paragraph [0084]); and
apply a mask to the lit image to obtain a masked image ("a bounding region around the feature may be designated to define the relevant pixel region from which the average pixel intensity is to be calculated," paragraph [0084]),
wherein the brightness statistics are determined from a region of interest in the masked image ("it may be performed only on a subset of the pixels of each image. The latter operation may be useful where the image includes less useful components such as a dark background behind a driver's face," paragraph [0084]).
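For clarity of the record, the claim 8 pipeline mapped above (identifying a lit and an unlit image, subtracting the ambient light captured in the unlit image, applying a mask, and computing brightness statistics over the region of interest) can be sketched as follows. This is an illustrative sketch only; all names and sample values are hypothetical and do not come from Edwards.

```python
# Illustrative sketch (hypothetical names and values): subtract an unlit
# (ambient-only) frame from a lit frame, apply a region-of-interest mask,
# then compute the brightness statistic over the masked region.

def remove_ambient(lit, unlit):
    """Per-pixel subtraction of ambient light, clamped at zero."""
    return [[max(l - u, 0) for l, u in zip(lit_row, unlit_row)]
            for lit_row, unlit_row in zip(lit, unlit)]

def masked_mean(image, mask):
    """Average intensity over pixels where the mask is 1 (the ROI)."""
    vals = [px for img_row, mask_row in zip(image, mask)
            for px, m in zip(img_row, mask_row) if m]
    return sum(vals) / len(vals)

# Usage with tiny hypothetical frames:
lit   = [[120, 200], [90, 210]]
unlit = [[20, 20], [10, 10]]     # ambient-only exposure
mask  = [[0, 1], [0, 1]]         # ROI: right column only
modified = remove_ambient(lit, unlit)   # [[100, 180], [80, 200]]
assert masked_mean(modified, mask) == 190.0
```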
Claim 9
Regarding claim 9, Edwards teaches a non-transitory computer readable media comprising computer readable code executable by one or more processors ("any form of computer processing device or portion of a device that processes electronic data, e.g., from registers and/or memory to transform that electronic data into other electronic data that, e.g., may be stored in registers and/or memory," paragraph [0067]) to:
obtain image data captured by one or more cameras of a device ("System 100 is further configured for performing various image processing algorithms on the captured images," paragraph [0058]), wherein the image data is captured when one or more illuminators are illuminated ("LEDs 108 and 110 are activated and deactivated in synchronization with consecutive image frames captured by camera 106 to illuminate the driver during image capture," paragraph [0072]);
determine brightness statistics from the image data ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]);
determine whether the brightness statistics satisfies a predetermined threshold ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]), wherein the computer readable code to determine whether the brightness statistics satisfies the predetermined threshold further comprises computer readable code to:
determine, based on the number of pixels that satisfy the brightness range, whether the brightness statistics satisfies the predetermined threshold ("In one embodiment, the brightness measure is an average pixel intensity of a pixel region within the image. The pixel region may correspond to a face, an eye or both eyes of the subject," paragraph [0033]); and
determine that a proximate object is detected in accordance with a determination that the brightness statistics satisfies the predetermined threshold ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]).
Edwards is not relied upon to explicitly teach identifying a number of pixels in the image data that satisfy a brightness range.
However, Steever et al. teach identify a number of pixels in the image data that satisfy a brightness range, ("detecting the transparent surface presence includes identifying a population of errant pixels, wherein the population satisfies a predetermined population condition (example shown in FIG. 14). The predetermined population condition can include: matching a predetermined pattern (e.g., wherein the errant pixel population substantially matches the pattern, which can be machine-learned, associated with a modulation pattern or phase, or otherwise determined), having a predetermined number of pixels, having a predetermined concentration of pixels, having an average or median intensity above a threshold intensity, or satisfying any other suitable condition," paragraph [0133]).
Edwards and Steever et al. are combined as per claim 1.
Claim 10
Regarding claim 10, Edwards teaches the non-transitory computer readable medium of claim 9, wherein the one or more cameras are situated in front of an eye when the wearable device is worn ("the images relate to driver's face including the driver's eyes to monitor eye gaze and drowsiness," paragraph [0080]), as noted above.
Edwards is not relied upon to explicitly teach a wearable device.
However, Steever et al. teach wherein the device comprises a wearable device ("Examples of host systems include robots, terrestrial vehicles, aerial vehicles, aquatic vehicles, security systems, or any other suitable host system. In a specific example, the sensing system can be mounted to and used as a safety curtain for industrial robots (e.g., as a sensing system located on an industrial robot arm). However, the sensing system can be otherwise used," paragraph [0043] where a robot host system teaches a wearable).
Edwards and Steever et al. are combined as per claim 1.
Claim 11
Regarding claim 11, Edwards teaches the non-transitory computer readable medium of claim 10, further comprising computer readable code to:
power on a display of the device in accordance with detecting that the object is proximate to the wearable device ("when an infrared light source has been switched off for a predetermined delay period, the controller is configured to reactivate the infrared light source for a test period during which one or more test images are captured, the controller being further configured to maintain the infrared light source in an active state if the brightness measure of the test images is equal to or greater than the predetermined brightness threshold, otherwise the controller deactivates the infrared light source," paragraph [0035] and "In addition to providing safety advantages, the disclosed embodiments also have applications in creating a lighting diagnostic function for a driver monitoring system. In these applications, the disclosed system is configured to detect if an infrared light is "blocked" by an obstacle at close range, which can be useful" paragraph [0100]).
Claim 13
Regarding claim 13, Edwards teaches the non-transitory computer readable medium of claim 9, wherein the computer readable code to determine whether the brightness statistics satisfies a predetermined threshold further comprises computer readable code to:
determine whether the identified number of pixels satisfies a proximity threshold ("The system described above allows for the proximity between a subject and an LED to be monitored (via the proxy measure of image brightness) and feedback control is fed to device controller 120," paragraph [0098]).
Claim 14
Regarding claim 14, Edwards teaches the non-transitory computer readable medium of claim 9, wherein the one or more cameras comprise one or more infrared cameras, and wherein the one or more illuminators comprise one or more light emitting diodes (LEDs) ("the initial stage 701 of illuminating a subject (e.g. driver 102) from two or more spaced apart infrared light sources (e.g. LEDs 108 and 110)," paragraph [0079]).
Claim 15
Regarding claim 15, Edwards teaches the non-transitory computer readable medium of claim 9, further comprising computer readable code to:
obtain additional image data captured by the one or more cameras ("in method 700, at least a subset of the captured images must be captured while the subject is illuminated by a single LED," paragraph [0079]);
determine whether brightness statistics for the additional image data satisfy a predetermined threshold ("Alternatively, the brightness threshold may be equal to a percentage of an average pixel intensity calculated over a number of past images, such as 75%, 50%, 40%, 30%, 25%, 20% or 10% of the past average pixel intensity," paragraph [0088]);
determine a lack of proximate object in accordance with a determination that the brightness statistics for the additional image data fail to satisfy the predetermined threshold ("In response to detecting a brightness measure below the predetermined brightness threshold, at stage 705A device controller 120 either switches off or reduces an output illumination intensity of one of the infrared LEDs," paragraph [0089]); and
power down a display in accordance with the determination of the lack of proximate object ("In response to detecting a brightness measure below the predetermined brightness threshold, at stage 705A device controller 120 either switches off or reduces an output illumination intensity of one of the infrared LEDs," paragraph [0089] where reducing the illumination intensity also teaches powering down a display).
Claim 16
Regarding claim 16, Edwards teaches the non-transitory computer readable medium of claim 9, wherein the computer readable code to determine brightness statistics from the image data further comprises computer readable code to:
identify a lit image and an unlit image in the image data ("average pixel intensity may be performed on all pixels in an image to calculate the overall average pixel intensity or it may be performed only on a subset of the pixels of each image. The latter operation may be useful where the image includes less useful components such as a dark background behind a driver's face," paragraph [0084]);
remove ambient light from the lit image based on the unlit image to obtain a modified lit image ("a bounding region around the feature may be designated to define the relevant pixel region from which the average pixel intensity is to be calculated," paragraph [0084]); and
apply a mask to the lit image to obtain a masked image ("a bounding region around the feature may be designated to define the relevant pixel region from which the average pixel intensity is to be calculated," paragraph [0084]),
wherein the brightness statistics are determined from a region of interest in the masked image ("it may be performed only on a subset of the pixels of each image. The latter operation may be useful where the image includes less useful components such as a dark background behind a driver's face," paragraph [0084]).
Claim 17
Regarding claim 17, Edwards teaches a method comprising:
obtaining image data captured by one or more cameras of a device ("System 100 is further configured for performing various image processing algorithms on the captured images," paragraph [0058]), wherein the image data is captured when one or more illuminators are illuminated ("LEDs 108 and 110 are activated and deactivated in synchronization with consecutive image frames captured by camera 106 to illuminate the driver during image capture," paragraph [0072]);
determining brightness statistics from the image data ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]);
determining whether the brightness statistics satisfies a predetermined threshold ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]), wherein determining whether the brightness statistics satisfies the predetermined threshold further comprises:
determining, based on the number of pixels that satisfy the brightness range, whether the brightness statistics satisfies the predetermined threshold ("In one embodiment, the brightness measure is an average pixel intensity of a pixel region within the image. The pixel region may correspond to a face, an eye or both eyes of the subject," paragraph [0033]); and
determining that a proximate object is detected in accordance with a determination that the brightness statistics satisfies the predetermined threshold ("the detection of an object within the caution zone is estimated based on a brightness assessment of captured images," paragraph [0077]).
Edwards is not relied upon to explicitly teach identifying a number of pixels in the image data that satisfy a brightness range.
However, Steever et al. teach identifying a number of pixels in the image data that satisfy a brightness range, ("detecting the transparent surface presence includes identifying a population of errant pixels, wherein the population satisfies a predetermined population condition (example shown in FIG. 14). The predetermined population condition can include: matching a predetermined pattern (e.g., wherein the errant pixel population substantially matches the pattern, which can be machine-learned, associated with a modulation pattern or phase, or otherwise determined), having a predetermined number of pixels, having a predetermined concentration of pixels, having an average or median intensity above a threshold intensity, or satisfying any other suitable condition," paragraph [0133]).
Edwards and Steever et al. are combined as per claim 1.
Claim 18
Regarding claim 18, Edwards teaches the method of claim 17, wherein the one or more cameras are situated in front of an eye when the wearable device is worn ("the images relate to driver's face including the driver's eyes to monitor eye gaze and drowsiness," paragraph [0080]), as noted above.
Edwards is not relied upon to explicitly teach a wearable device.
However, Steever et al. teach wherein the device comprises a wearable device ("Examples of host systems include robots, terrestrial vehicles, aerial vehicles, aquatic vehicles, security systems, or any other suitable host system. In a specific example, the sensing system can be mounted to and used as a safety curtain for industrial robots (e.g., as a sensing system located on an industrial robot arm). However, the sensing system can be otherwise used," paragraph [0043] where a robot host system is a wearable).
Edwards and Steever et al. are combined as per claim 1.
Claim 19
Regarding claim 19, Edwards teaches the method of claim 18, further comprising:
powering on a display of the device in accordance with detecting that the object is proximate to the wearable device ("when an infrared light source has been switched off for a predetermined delay period, the controller is configured to reactivate the infrared light source for a test period during which one or more test images are captured, the controller being further configured to maintain the infrared light source in an active state if the brightness measure of the test images is equal to or greater than the predetermined brightness threshold, otherwise the controller deactivates the infrared light source," paragraph [0035] and "In addition to providing safety advantages, the disclosed embodiments also have applications in creating a lighting diagnostic function for a driver monitoring system. In these applications, the disclosed system is configured to detect if an infrared light is "blocked" by an obstacle at close range, which can be useful" paragraph [0100]).
Claim 20
Regarding claim 20, Edwards teaches the method of claim 17, wherein determining the brightness statistics from the image data further comprises:
identifying a lit image and an unlit image in the image data ("average pixel intensity may be performed on all pixels in an image to calculate the overall average pixel intensity or it may be performed only on a subset of the pixels of each image. The latter operation may be useful where the image includes less useful components such as a dark background behind a driver's face," paragraph [0084]);
removing ambient light from the lit image based on the unlit image to obtain a modified lit image ("a bounding region around the feature may be designated to define the relevant pixel region from which the average pixel intensity is to be calculated," paragraph [0084]); and
applying a mask to the lit image to obtain a masked image ("a bounding region around the feature may be designated to define the relevant pixel region from which the average pixel intensity is to be calculated," paragraph [0084]),
wherein the brightness statistics are determined from a region of interest in the masked image ("it may be performed only on a subset of the pixels of each image. The latter operation may be useful where the image includes less useful components such as a dark background behind a driver's face," paragraph [0084]).
2nd Claim Rejections - 35 USC § 103
Claims 4 and 12 are rejected under 35 U.S.C. 103 as obvious over US Patent Application Publication No. 2024/0177500 A1 (Edwards) and US Patent Application Publication No. 2018/0143320 A1 (Steever et al.) in view of US Patent Application Publication No. 2019/0095602 A1 (Setlak et al.).
Claim 4
Regarding Claim 4, Edwards and Steever et al. teach the system of claim 2, as noted above.
[Setlak et al., Fig. 3, showing a wearable device being worn.]
Edwards and Steever et al. are not relied upon to explicitly teach determining that the wearable device is being worn.
However, Setlak et al. teach further comprising computer readable code to:
determine that the wearable device is being worn in response to a determination that the proximate object is detected ("the light field camera or another sensor of the watch 100 (e.g., a proximity sensor) may detect when the watch 100 is positioned on and/or attached to the user's forearm 300," paragraph [0047]); and
in response to a determination that the wearable device is being worn, boot up one or more systems of the device ("trigger the processor to operate the light emitter(s) and light field camera, obtain a light field image from the light field camera, and perform an operation," paragraph [0047]).
It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify the "Infrared Light Source Protective System" taught by Edwards and the "Sensing System and Method" taught by Steever et al. to use the "Wearable Electronic Device Having a Light Field Camera Usable to Perform Bioauthentication from a Dorsal Side of a Forearm Near a Wrist" taught by Setlak et al.
The suggestion/motivation for doing so would have been that "In some devices, such as a phone or tablet computer, a bioauthentication sensor may be provided adjacent (or as part of) a display of the device. However, in a wearable electronic device such as a watch, there may be little or no room for providing a bioauthentication sensor adjacent (or as part of) a display of the device. User authentication may therefore be provided by means of a password or similar input," as noted by Setlak et al. in paragraph [0003]. The combination is further motivated because it would predictably yield higher efficiency, as there is a reasonable expectation that such systems will need to save energy, and/or because doing so merely combines prior art elements according to known methods to yield predictable results.
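For clarity of the record, the claim 4 flow mapped above (treating a proximate-object detection as an indication that the wearable device is being worn, which in turn triggers booting one or more systems of the device) can be sketched as follows. The function and subsystem names are hypothetical and are not drawn from Setlak et al.

```python
# Illustrative sketch (hypothetical names): a proximate-object detection
# is taken as the wearable being worn; being worn triggers booting
# subsystems of the device.

def is_worn(proximate_object_detected: bool) -> bool:
    """Infer that the device is worn when a proximate object is detected."""
    return proximate_object_detected

def boot_if_worn(proximate_object_detected: bool,
                 subsystems=("display", "sensors")):
    """Boot the listed subsystems only when the device is worn."""
    if is_worn(proximate_object_detected):
        return [f"{name}: booted" for name in subsystems]
    return []

assert boot_if_worn(True) == ["display: booted", "sensors: booted"]
assert boot_if_worn(False) == []
```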
Claim 12
Regarding claim 12, Edwards and Steever et al. teach the non-transitory computer readable medium of claim 10, as noted above.
Edwards and Steever et al. are not relied upon to explicitly teach all of "determine that the wearable device is being worn."
However, Setlak et al. teach further comprising computer readable code to:
determine that the wearable device is being worn in response to a determination that the proximate object is detected ("the light field camera or another sensor of the watch 100 (e.g., a proximity sensor) may detect when the watch 100 is positioned on and/or attached to the user's forearm 300," paragraph [0047]); and
in response to a determination that the wearable device is being worn, boot up one or more systems of the device ("trigger the processor to operate the light emitter(s) and light field camera, obtain a light field image from the light field camera, and perform an operation," paragraph [0047]).
Edwards, Steever et al. and Setlak et al. are combined as per claim 4.
References Cited
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure.
US Patent Publication 2012/0287035 A1 to Valko et al. discloses a method of operating a computing device in a reduced power state and collecting a first set of data from at least one sensor. Based on the first set of data, the computing device determines a probability that an object is within a threshold distance of the computing device and, if so, the device activates at least one secondary sensor to collect a second set of data. Based on the second set of data, the device determines whether the object is a person. If it is a person, a position of the person relative to the computing device is determined, and the computing device changes its state based on the position of the person. If the object is not a person, the computing device remains in a reduced power state.
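For illustration only (this sketch forms no part of the record; the function name, threshold value, and position encoding are hypothetical), the two-stage sensing flow summarized above for Valko et al. can be outlined as follows:

```python
# Hypothetical sketch of the two-stage flow described in Valko et al.
# (US 2012/0287035 A1). All names and the 0.5 threshold are illustrative.

REDUCED_POWER = "reduced_power"
ACTIVE = "active"

def update_state(primary_probability, is_person, person_position,
                 threshold=0.5):
    """Return the device state after the two-stage determination.

    primary_probability: stage-1 probability an object is within the
        threshold distance, from the first sensor's data.
    is_person: stage-2 determination from the secondary sensor's data
        (only meaningful if stage 1 passed).
    person_position: position of the person relative to the device;
        "facing" is a hypothetical encoding for illustration.
    """
    # Stage 1: if no object is likely nearby, stay in reduced power and
    # never activate the secondary sensor.
    if primary_probability < threshold:
        return REDUCED_POWER
    # Stage 2: if the object is not a person, remain in reduced power.
    if not is_person:
        return REDUCED_POWER
    # A person was detected: the state change depends on the person's
    # position (the mapping below is an assumed example).
    return ACTIVE if person_position == "facing" else REDUCED_POWER
```

The key structural point is that the secondary-sensor determination is gated behind the first-stage probability check, so the device can remain in its reduced power state without ever powering the secondary sensor.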
US Patent Publication 2020/0211500 A1 to Canberk et al. discloses an eyewear device that includes an image display and an image display driver coupled to the image display to control a presented image and adjust a brightness level setting of the presented image. The eyewear device includes a user input device having an input surface on a frame, a temple, a lateral side, or a combination thereof to receive a user input selection from the wearer.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HEATH E WELLS whose telephone number is (703)756-4696. The examiner can normally be reached Monday-Friday 8:00-4:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ms. Jennifer Mehmood can be reached on 571-272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/H.E.W/Examiner, Art Unit 2664
Date: 20 March 2026
/JENNIFER MEHMOOD/Supervisory Patent Examiner, Art Unit 2664