DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Summary
This action is responsive to the application filed on 08/25/2025. Applicant has submitted Claims 1-10 and 13-19 for examination.
Examiner finds the following: 1) Claims 1-10 and 13-19 are rejected; 2) no claims are objected to; and 3) no claims are allowable.
Response to Arguments and Remarks
Examiner respectfully acknowledges Applicant's arguments and remarks.
Applicant’s arguments with respect to the claims have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
However, Examiner notes that Applicant’s remarks and Applicant’s amendments are not fully aligned. Examiner believes this to be a simple clerical error, but Applicant calls out how Mathur fails to disclose:
However, [Mathur] is different from the detected light pattern of the present claims, where a flickering light pattern indicates an indoor environment, and a continuous light pattern indicates an outdoor environment. Mathur is silent with respect to any flickering or continuous light patterns that indicate an indoor or outdoor environment.
Examiner agrees with Applicant that Mathur fails to explicitly teach a flickering light pattern and, as such, pivots with regard to Claims 1-10. However, Applicant did not amend the “flickering” limitation into Claim 13.
Perhaps this was intentional, and Applicant instead argues that Mathur fails to disclose “continuous light patterns.” As disclosed and described by Mathur, for Mathur to operate as described, that is, to render augmented reality in real time, it must continuously evaluate the ambient light and adjust its renderings as needed. From [0176]:
It will be appreciated that the sizes and shapes of the volumes may be fixed during production of the display system, e.g., based upon expected tolerances in systems for determining the fixation point, and/or may be adjusted or set in the field depending upon a user's characteristics, the user's environment, and/or changes in software that change the tolerances for the systems for determining the fixation point.
[0191]:
[T]he display system transforms graphical content while preserving visual fidelity, and conserving processing power, as the user looks around their ambient environment.
[0279]:
It will be appreciated that the display system include or may have access to a three-dimensional map of the ambient environment, which can inform locations of any virtual content in this ambient environment.
Again, Examiner agrees that Mathur fails to explicitly disclose “flickering,” but is not persuaded that Mathur fails to disclose “continuous,” given the manner in which Mathur operates. As such, Examiner maintains the rejection of the language amended into Claim 13 from now-cancelled Claim 20.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
Determining the scope and contents of the prior art.
Ascertaining the differences between the prior art and the claims at issue.
Resolving the level of ordinary skill in the pertinent art.
Considering objective evidence present in the application indicating obviousness or non-obviousness.
Claims 1-7 and 9-10 are rejected under 35 U.S.C. 103 as being unpatentable over Page (US 20230128139 A1), in view of Mathur (US 20190287495 A1), and in further view of Gilbert (US 20250189368 A1).
Regarding Claim 1, Page discloses:
An eyewear (Page, FIG. 1, [0009], user device set 102, and [0009], “Since the pathway for light impacting circadian health is primarily through the eyes, the spectral sensor can be mounted in glasses, even contact lenses, or other head worn ornamentation or gear”) comprising:
at least one first sensor attached to a frame of the eyewear (Page, FIG. 1, [0009], “Each user device set 102 includes a spectral sensor”) and wherein the first sensor is selected from at least one of: ultraviolet light sensor (UVS) (Page, FIG. 1, [0009], “Some embodiments include sensors that track light in the near infrared and/or near ultraviolet frequency ranges”), photopic ambient light sensor (ALS) (Page, [0008], “including brightness (e.g., measured as photopic luminance or melanoptic-equivalent luminance),” and FIG. 1, [0090], “Each user device set 102 includes a spectral sensor that tracks exposure to each of several frequency bands in the visible light range”), and
a controller (Page, FIG. 1, [0010], circadian service 106) configured to:
- receive values for a photopic light level and an ultraviolet light level from the at least one first sensor (Page, FIG. 1, [0010], “The resulting time, place, and spectral data 104 can be uploaded to a cloud-based circadian service 106”),
- provide a timeline for the received values of the photopic light level and the ultraviolet light level (Page, FIG. 3, [0019], Step 311),
- calculate from the timeline, the photopic light level, and the ultraviolet light level, a probability of time duration the eyewear has been exposed to outdoor environment (Page, FIG. 1, [0013], “AI engine 112 can identify patterns and trends in spectral time, place, and data 104 that can be used by light-exposure model 114 to make spectral estimates without relying on contemporaneous data from device sets 102,” and FIG. 3, [0018], Steps 312-314), and
- provide the calculated probability of time duration for a user of the eyewear (Page, FIG. 3, [0019], Step 315, and [0089], “monitors personal light exposure and provides information and recommendations to the user on how to optimize their light exposure for their health”).
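For illustration only, the controller steps mapped above (receiving sensor values, maintaining a timeline, and calculating a probability of outdoor exposure duration) could be sketched as follows. The data structure, field names, and thresholds are hypothetical assumptions by Examiner, not values drawn from Page or from the claims:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    timestamp: float       # seconds since start of day
    photopic_lux: float    # photopic ambient light level (ALS)
    uv_index: float        # ultraviolet light level (UVS)

def outdoor_exposure_probability(timeline, uv_threshold=1.0, lux_threshold=10000.0):
    """Estimate the fraction of the timeline spent outdoors.

    A reading is scored as outdoor when either the UV level or the photopic
    level exceeds an illustrative daylight threshold; the mean score over the
    timeline approximates the probability of outdoor exposure duration.
    """
    if not timeline:
        return 0.0
    outdoor = sum(1 for r in timeline
                  if r.uv_index > uv_threshold or r.photopic_lux > lux_threshold)
    return outdoor / len(timeline)
```

For example, a timeline with two daylight readings out of four would yield a probability of 0.5. This is a sketch of the general technique only; Page’s AI engine 112 and light-exposure model 114 are described as making spectral estimates from patterns and trends, not as a simple threshold count.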
Page discloses the above but does not explicitly disclose:
… further comprises a light sensor configured to detect a light pattern of the exposed light,
wherein the controller is configured to determine whether the eyewear is exposed to indoor lighting or ambient lighting based on the detected light pattern of the exposed light, …
However, Mathur, in a similar field of endeavor (DEPTH BASED FOVEATED RENDERING FOR DISPLAY SYSTEMS), discloses:
… further comprises a light sensor configured to detect a light pattern of the exposed light (Mathur, FIG. 6, [0224], “In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”)”).
wherein the controller is configured to determine whether the eyewear is exposed to indoor lighting or ambient lighting based on the detected light pattern of the exposed light (Mathur, FIG. 6, [0224], “In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”)”).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify Page with the patterns of Mathur. PHOSITA would have known about the uses of patterns as disclosed by Mathur and how to use them to modify Page. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the use of diffractive elements and patterns to control the incoming light.
The combination of Page and Mathur discloses the above but does not explicitly disclose:
… wherein a flickering light pattern indicates an indoor environment, and a continuous light pattern indicates an outdoor environment.
However, Gilbert, in a similar field of endeavor (CALCULATION MODULE FOR DETERMINING A LOCALIZATION, SYSTEM, EYEWEAR AND COMPUTER IMPLEMENTED METHOD), discloses:
… wherein a flickering light pattern indicates an indoor environment, and a continuous light pattern indicates an outdoor environment (Gilbert, [0068], “a detection of the indoor or outdoor localization can be realized using a flickering detector. Most of the commercially available indoor light sources generate light via a succession of impulsions at constant frequency, or frequencies when harmonics are present, where main frequency is high enough (typically >50 Hz) not to disturb the user thanks to the retinal persistence. If the frequency of these impulsions is high enough the user will not be disturbed by these impulsions because human photoreceptors have lower functioning frequency. When we detect that the light has impulsions, we can consider with high likelihood that the optical device 102 is located indoors”).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify the combination of Page and Mathur with the flickering detection of Gilbert. PHOSITA would have known about the uses of flickering detection as disclosed by Gilbert and how to use them to modify the combination of Page and Mathur. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the use of known methods to monitor for and detect flickering lights to determine a user’s surroundings.
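For illustration only, the flicker-based indoor/outdoor classification quoted from Gilbert [0068] could be sketched as follows. The sampling rate, threshold frequency, and power-ratio cutoff are hypothetical assumptions, not values from the record; Gilbert discloses only that indoor sources pulse at a constant frequency, typically above 50 Hz:

```python
import numpy as np

def is_indoor(samples, sample_rate_hz, flicker_threshold_hz=50.0, power_ratio=0.1):
    """Classify indoor vs. outdoor from a window of ambient-light samples.

    Mains-powered indoor lighting pulses at a constant frequency (typically
    > 50 Hz, per Gilbert [0068]); sunlight is continuous. A strong spectral
    component above the threshold frequency therefore suggests indoors.
    """
    samples = np.asarray(samples, dtype=float)
    samples = samples - samples.mean()            # remove the steady (DC) component
    spectrum = np.abs(np.fft.rfft(samples)) ** 2  # power spectrum of the light signal
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate_hz)
    total = spectrum.sum()
    if total == 0:
        return False                              # perfectly constant light: outdoor
    flicker_power = spectrum[freqs >= flicker_threshold_hz].sum()
    return flicker_power / total > power_ratio
```

For example, a lamp driven by 60 Hz mains flickers at 120 Hz and would be classified as indoor, while a constant sunlight signal would not.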
Regarding Claim 2, the combination of Page, Mathur, and Gilbert discloses Claim 1, and Page further discloses:
… wherein the eyewear is deemed to be exposed to the outdoor environment when the value of the ultraviolet light level is higher than a predetermined threshold (Page, [0086], “A system would provide simple, actionable feedback to nursing staff (e.g. a dashboard that shows the actual vs target light exposure for all patients; alerts that can be set if specific patients are below/above specific exposure thresholds, etc.)”).
Regarding Claim 3, the combination of Page, Mathur, and Gilbert discloses Claim 1, but does not explicitly disclose:
… wherein the controller is further configured to determine a time of sunrise and a time of sunset via a navigation receiver.
However, Page discloses in [0103]:
A calibration app that uses tools on smartphones (GPS, camera, distance measurements) that assists in mapping the locations of all sensors installed in a space to each other and to other items of interest in the space (e.g. lights, windows, the user’s desk or chair).
Additionally in [0117]:
Such a wearable would monitor the user’s light exposure throughout the day and provide recommendations to the user (via an app and/or other user-friendly UI) on whether they should try to get more or less light at any given moment. The app could also show them when and how much light they should get in the hours ahead so they can plan their activities.
It would have been obvious to PHOSITA before the effective filing date of the claimed invention that Page tracks sunrise and sunset. PHOSITA would have known that a device that tracks the user’s position and can recommend when, where, and how to better get sunlight would have an understanding of when sunrise and sunset are. PHOSITA would have been motivated to do this as a combination of prior art elements according to known methods to yield predictable results (See MPEP § 2143 (I)(A)), specifically the use of positioning and time to understand when the sun is out.
Regarding Claim 4, the combination of Page, Mathur, and Gilbert discloses Claim 3, but does not explicitly disclose:
… wherein the controller is further configured to adjust the probability of duration of time to be provided to the user based on the determined time of sunrise, or the time of sunset.
However, Page discloses in [0103]:
A calibration app that uses tools on smartphones (GPS, camera, distance measurements) that assists in mapping the locations of all sensors installed in a space to each other and to other items of interest in the space (e.g. lights, windows, the user’s desk or chair).
Additionally in [0117]:
Such a wearable would monitor the user’s light exposure throughout the day and provide recommendations to the user (via an app and/or other user-friendly UI) on whether they should try to get more or less light at any given moment. The app could also show them when and how much light they should get in the hours ahead so they can plan their activities.
It would have been obvious to PHOSITA before the effective filing date of the claimed invention that Page tracks sunrise and sunset. PHOSITA would have known that a device that tracks the user’s position and can recommend when, where, and how to better get sunlight would have an understanding of when sunrise and sunset are. PHOSITA would have been motivated to do this as a combination of prior art elements according to known methods to yield predictable results (See MPEP § 2143 (I)(A)), specifically the use of positioning and time to understand when the sun is out.
Regarding Claim 5, the combination of Page, Mathur, and Gilbert discloses Claim 1, and Page further discloses:
… wherein the frame comprises a first temple and a second temple (Page, FIG. 1, [0009], user device set 102, and [0009], “Since the pathway for light impacting circadian health is primarily through the eyes, the spectral sensor can be mounted in glasses, even contact lenses, or other head worn ornamentation or gear.” Examiner notes that temples are part of the basic structure of glasses are inherent to the art), and wherein the at least one first sensor is attached to a front part of the frame to maximize the amount of the light received by the at least one first sensor (Page, [0070], “More sophisticated light exposure trackers that measure a wider spectral range, have better accuracy and/or resolution, and/or are located at or closer to the eye (e.g glasses, contact lenses).” Examiner notes that a sensor located closer the eye on a pair of glasses would inherently place the sensor on the front part of the frame).
Regarding Claim 6, the combination of Page, Mathur, and Gilbert discloses Claim 1, and Mathur further discloses:
… further comprises at least one inertial measurement unit (IMU) for determining an orientation, or a change of orientation, of the eyewear (Mathur, FIG. 9D, [0249], “The data may include data a) captured from sensors (which may be, e.g., operatively coupled to the frame 80 or otherwise attached to the user 90), such as image capture devices (such as cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, gyros, and/or other sensors disclosed herein”).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify the combination of Page, Mathur, and Gilbert with the orientation tracking of Mathur. PHOSITA would have known about the uses of orientation tracking as disclosed by Mathur and how to use them to modify the combination of Page, Mathur, and Gilbert. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the tracking of orientation to better understand what the user is looking at and how.
Regarding Claim 7, the combination of Page, Mathur, and Gilbert discloses Claim 1, and Mathur further discloses:
… wherein the controller is further configured to determine an angle of ultraviolet light, or photopic light, received by the eyewear based on at least one of the photopic light level, the ultraviolet light level, the orientation, or the change of orientation of the eyewear (Mathur, FIG. 12A, [0277], Block 1202).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify the combination of Page, Mathur, and Gilbert with the orientation tracking of Mathur. PHOSITA would have known about the uses of orientation tracking as disclosed by Mathur and how to use them to modify the combination of Page, Mathur, and Gilbert. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the tracking of orientation to better understand what the user is looking at and how.
Regarding Claim 9, the combination of Page, Mathur, and Gilbert discloses Claim 1, and Page further discloses:
… further comprises at least one second sensor for measuring a level of infrared (IR) light received by the eyewear (Page, FIG. 1, [0009], “Some embodiments include sensors that track light in the near infrared and/or near ultraviolet frequency ranges”).
Regarding Claim 10, the combination of Page, Mathur, and Gilbert discloses Claim 1, and Page further discloses:
… wherein the controller is further configured to compute a ratio of the ultraviolet light level and the photopic light level to determine whether the eyewear is exposed to indoor lighting or ambient lighting (Page, [0115], “One or more Specks in the space could provide important information on the relative distribution of light on the ceiling to those areas of importance in the room itself as well as information on the light spectrum in the room (e.g. the amount and spectrum of natural light vs electric light in specific areas of the room)”).
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Page (US 20230128139 A1), in view of Mathur (US 20190287495 A1), in further view of Gilbert (US 20250189368 A1), and in further view of Harrison (US 20210315083 A1).
Regarding Claim 8, the combination of Page, Mathur, and Gilbert discloses Claim 1, but does not explicitly disclose:
… further comprises a thermal sensor for measuring a temperature of the outdoor environment.
However, Harrison, in a similar field of endeavor (CIRCADIAN OUTDOOR EQUIVALENCY METRIC FOR ASSESSING PHOTIC ENVIRONMENT AND HISTORY), discloses:
… further comprises a thermal sensor for measuring a temperature of the outdoor environment (Harrison, [0016], “the external device may be configured to sense one or more of temperature, pressure, ambient lighting conditions, localized lighting conditions, lighting spectrum characteristics, humidity, UV light, sound, particles, pollutants, gases, radiation, location of objects or items, and motion”), or the frame, or the user of the eyewear (Harrison, [0017], “the one or more external devices are configured to sense one or more of a person's temperature, blood pressure, heart rate, oxygen saturation, activity type, activity level, galvanic skin response, respiratory rate, cholesterol level (including HDL, LDL and triglyceride), hormone or adrenal levels (e.g., Cortisol, thyroid, adrenaline, melatonin, and others), histamine levels, immune system characteristics, blood alcohol levels, drug content, macro and micro nutrients, mood, emotional state, alertness, and sleepiness”).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify the combination of Page, Mathur, and Gilbert with the temperature sensing of Harrison. PHOSITA would have known about the uses of temperature sensing as disclosed by Harrison and how to use them to modify the combination of Page, Mathur, and Gilbert. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the use of temperature sensing for both the environment and the user to better understand the lighting conditions.
Claims 13-19 are rejected under 35 U.S.C. 103 as being unpatentable over Page (US 20230128139 A1) in view of Mathur (US 20190287495 A1).
Regarding Claim 13, Page discloses:
A method of determining user exposure via the eyewear (Page, FIG. 1, [0009], user device set 102, and [0009], “Since the pathway for light impacting circadian health is primarily through the eyes, the spectral sensor can be mounted in glasses, even contact lenses, or other head worn ornamentation or gear”), comprising:
- providing at least one first sensor attached to a frame of the eyewear (Page, FIG. 1, [0009], “Each user device set 102 includes a spectral sensor”) and wherein the first sensor is selected from at least one of: ultraviolet light sensor (UVS) (Page, FIG. 1, [0009], “Some embodiments include sensors that track light in the near infrared and/or near ultraviolet frequency ranges”), photopic ambient light sensor (ALS) (Page, [0008], “including brightness (e.g., measured as photopic luminance or melanoptic-equivalent luminance),” and FIG. 1, [0090], “Each user device set 102 includes a spectral sensor that tracks exposure to each of several frequency bands in the visible light range”);
- receiving values for a photopic light level and an ultraviolet light level from the at least one first sensor (Page, FIG. 1, [0010], “The resulting time, place, and spectral data 104 can be uploaded to a cloud-based circadian service 106”),
- providing a timeline for the received values of the photopic light level and the ultraviolet light level (Page, FIG. 3, [0019], Step 311),
- calculating from the timeline, the photopic light level and the ultraviolet light level, a probability of time that the eyewear has been exposed to outdoor environment (Page, FIG. 1, [0013], “AI engine 112 can identify patterns and trends in spectral time, place, and data 104 that can be used by light-exposure model 114 to make spectral estimates without relying on contemporaneous data from device sets 102,” and FIG. 3, [0018], Steps 312-314), and
- providing the probability of time for a user of the eyewear (Page, FIG. 3, [0019], Step 315, and [0089], “monitors personal light exposure and provides information and recommendations to the user on how to optimize their light exposure for their health”), …
Page discloses the above but does not explicitly disclose:
… further comprising detecting a light pattern of the exposed light via a light sensor to determine whether the eyewear is exposed to indoor lighting or ambient lighting.
However, Mathur, in a similar field of endeavor (DEPTH BASED FOVEATED RENDERING FOR DISPLAY SYSTEMS), discloses:
… further comprising detecting a light pattern of the exposed light via a light sensor to determine whether the eyewear is exposed to indoor lighting or ambient lighting (Mathur, FIG. 6, [0224], “In some embodiments, the out-coupling optical elements 570, 580, 590, 600, 610 are diffractive features that form a diffraction pattern, or “diffractive optical element” (also referred to herein as a “DOE”)”).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify Page with the patterns of Mathur. PHOSITA would have known about the uses of patterns as disclosed by Mathur and how to use them to modify Page. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the use of diffractive elements and patterns to control the incoming light.
Regarding Claim 14, the combination of Page and Mathur discloses Claim 13, and Page further discloses:
… further comprising comparing the value of the ultraviolet light level with a predetermined threshold, wherein the eyewear is deemed to be exposed to the outdoor environment (Page, [0086], “A system would provide simple, actionable feedback to nursing staff (e.g. a dashboard that shows the actual vs target light exposure for all patients; alerts that can be set if specific patients are below/above specific exposure thresholds, etc.)”).
Regarding Claim 15, the combination of Page and Mathur discloses Claim 13, but does not explicitly disclose:
… further comprising determining time of sunrise and time of sunset based on the provided timeline.
However, Page discloses in [0103]:
A calibration app that uses tools on smartphones (GPS, camera, distance measurements) that assists in mapping the locations of all sensors installed in a space to each other and to other items of interest in the space (e.g. lights, windows, the user’s desk or chair).
Additionally in [0117]:
Such a wearable would monitor the user’s light exposure throughout the day and provide recommendations to the user (via an app and/or other user-friendly UI) on whether they should try to get more or less light at any given moment. The app could also show them when and how much light they should get in the hours ahead so they can plan their activities.
It would have been obvious to PHOSITA before the effective filing date of the claimed invention that Page tracks sunrise and sunset. PHOSITA would have known that a device that tracks the user’s position and can recommend when, where, and how to better get sunlight would have an understanding of when sunrise and sunset are. PHOSITA would have been motivated to do this as a combination of prior art elements according to known methods to yield predictable results (See MPEP § 2143 (I)(A)), specifically the use of positioning and time to understand when the sun is out.
Regarding Claim 16, the combination of Page and Mathur discloses Claim 13, but does not explicitly disclose:
… further comprising adjusting the value of the probability of time to be provided to the user based on the time of sunrise and sunset.
However, Page discloses in [0103]:
A calibration app that uses tools on smartphones (GPS, camera, distance measurements) that assists in mapping the locations of all sensors installed in a space to each other and to other items of interest in the space (e.g. lights, windows, the user’s desk or chair).
Additionally in [0117]:
Such a wearable would monitor the user’s light exposure throughout the day and provide recommendations to the user (via an app and/or other user-friendly UI) on whether they should try to get more or less light at any given moment. The app could also show them when and how much light they should get in the hours ahead so they can plan their activities.
It would have been obvious to PHOSITA before the effective filing date of the claimed invention that Page tracks sunrise and sunset. PHOSITA would have known that a device that tracks the user’s position and can recommend when, where, and how to better get sunlight would have an understanding of when sunrise and sunset are. PHOSITA would have been motivated to do this as a combination of prior art elements according to known methods to yield predictable results (See MPEP § 2143 (I)(A)), specifically the use of positioning and time to understand when the sun is out.
Regarding Claim 17, the combination of Page and Mathur discloses Claim 13, and Mathur further discloses:
… further comprising determining an orientation, or a change of orientation, of the eyewear via at least one inertial measurement unit (IMU) (Mathur, FIG. 9D, [0249], “The data may include data a) captured from sensors (which may be, e.g., operatively coupled to the frame 80 or otherwise attached to the user 90), such as image capture devices (such as cameras), microphones, inertial measurement units, accelerometers, compasses, GPS units, radio devices, gyros, and/or other sensors disclosed herein”).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify Page with the orientation tracking of Mathur. PHOSITA would have known about the uses of orientation tracking as disclosed by Mathur and how to use them to modify Page. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the tracking of orientation to better understand what the user is looking at and how.
Regarding Claim 18, the combination of Page and Mathur discloses Claim 13, and Mathur further discloses:
… further comprising determining an angle of ultraviolet or photopic light received by the eyewear based on the photopic light level or the ultraviolet light level, and the orientation or change of orientation (Mathur, FIG. 12A, [0277], Block 1202).
It would have been obvious to PHOSITA before the effective filing date of the claimed invention to modify Page with the orientation tracking of Mathur. PHOSITA would have known about the uses of orientation tracking as disclosed by Mathur and how to use them to modify Page. PHOSITA would have been motivated to do this as a use of known technique to improve similar devices in the same way (See MPEP § 2143 (I)(C)), specifically the tracking of orientation to better understand what the user is looking at and how.
Regarding Claim 19, the combination of Page and Mathur discloses Claim 13, and Page further discloses:
… further comprising computing a ratio of the ultraviolet light level and the photopic light level to determine whether the eyewear is exposed to indoor lighting or ambient lighting (Page, [0115], “One or more Specks in the space could provide important information on the relative distribution of light on the ceiling to those areas of importance in the room itself as well as information on the light spectrum in the room (e.g. the amount and spectrum of natural light vs electric light in specific areas of the room)”).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from Examiner should be directed to CHAD ANDREW REVERMAN whose telephone number is (571) 270-0079. Examiner can normally be reached Mon-Fri 9-5 EST (8-4 CST).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, Applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach Examiner by telephone are unsuccessful, Examiner's Supervisor, Uzma Alam, can be reached at (571) 272-3995. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHAD ANDREW REVERMAN/Examiner, Art Unit 2877
/Kara E. Geisel/Supervisory Patent Examiner, Art Unit 2877