Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
1. This communication is responsive to the Amendment filed 12/1/2025.
2. Claims 1-3 and 5-34 are pending in this application. Claims 1, 10, 24 and 32-33 are independent claims. In the instant Amendment, claims 1, 10, 24 and 32-33 were amended and claim 34 was added. This is a Non-Final action on the RCE filed 1/22/2026.
Claim Rejections - 35 USC § 103
3. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
4. Claims 1-3 and 5-34 are rejected under 35 U.S.C. 103 as being unpatentable over Hayter et al. (“Hayter,” US 2021/0030323) in view of Fransen et al. (“Fransen,” US 2020/0265080).
Regarding claim 1, Hayter discloses a method comprising:
receiving, at a computing device, a user input to retroactively log an event to a position on a glucose graph (see paragraph [0094]; e.g., “a meal event can be logged by the user at 302. The user can input meal information directly into reader device 120 (via a user interface) at his or her own discretion, before, during, or after consumption of the meal.”), wherein the user input comprises an image of the event (see paragraphs [0138]-[0141]; e.g., log meals with photo of meal);
responsive to the user input, identifying an image with associated timestamp of the event indicating a time when the image was captured (see paragraphs [0094] and [0138]-[0141]; e.g., retrieve image from photo library and log meals with time on graph);
determining a predicted impact of the event on the glucose measurements of the user based on historical effects of similar events; and classifying the event as having the predicted impact (see paragraphs [0007], [0010], [0012], [0056], [0119]-[0125], [0133] and [0143]; e.g., labeling previous meals); and
displaying, in a user interface, the image of the event at a position on the glucose graph corresponding to the image with associated timestamp of the event (see paragraphs [0138]-[0141] and [0225]; e.g., glucose graph with image of meal plotted at time of meal) as well as a visual indication of the classification of the event (see paragraphs [0170]-[0180]; e.g., display magnitude of glucose response to meal; meal sensitivity settings).
Hayter does not expressly disclose a timestamp stored with the image.
However, it is well known in the art that, upon capture of an image, metadata including a timestamp is stored with the image. For instance, Fransen discloses that “metadata” or “photo metadata” is “information that provides context about a digital image, and is information that can be stored and communicated with an image file of a digital image. The metadata associated with a digital image can include descriptive information added by a user, as well as information based on automated capture with a camera device, such as an identifier of the photo, the date and time of day a photo is captured, a geolocation of where the photo is captured, dimensions and photographic properties of the photo (e.g., lens aperture, focal length, shutter speed, lighting, etc.), a stored location of the photo, keywords that identify content captured in the photo image, rights management information (e.g., copyright and access), and/or any other type of information about a digital image” (see paragraph [0017]). It would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to identify a timestamp stored with an image, as such a teaching is common practice in the art and thus produces an expected result.
Regarding claim 2, Hayter discloses wherein the event comprises a meal consumed by the user (see paragraphs [0138]-[0141] and [0225]; e.g., glucose graph with image of meal plotted at time of meal).
Regarding claim 3, Hayter discloses wherein the image of the meal consumed by the user is captured with a camera of the computing device (see paragraphs [0138]-[0141] and [0225]; e.g., camera to take picture of meal).
Regarding claim 5, Hayter discloses receiving, at the computing device, a modification by the user of the position of the meal on the glucose graph; and displaying, in the user interface, the image of the meal at a modified position on the glucose graph (see figs 5 and 10; e.g., user can update glucose graph with image of meal plotted at time of meal).
Regarding claim 6, Hayter discloses wherein the user input comprises at least one voice command to log the event (see paragraph [0066]; voice input).
Regarding claim 7, Hayter discloses wherein the user input further comprises a selection of the event from a user interface that displays multiple events (see fig 5A, e.g., select meal).
Regarding claim 8, Hayter discloses receiving, at the computing device, a user selection of the meal; identifying historical meals similar to the selected meal; and displaying, in the user interface, a representation of historical effects of a similar meal to the selected meal on the glucose graph of the user (see paragraphs [0010], [0012], [0120]-[0123], [0133], [0143] and [0168]; e.g., user can select image of meal and make historical comparisons of similar meals; plotted on graph).
Regarding claim 9, Hayter discloses wherein displaying, in the user interface, the representation of the historical effects of the similar meal on the glucose graph of the user includes displaying, in the user interface, overlays of historical glucose graphs of the user for one or more instances in which the user has consumed the similar meal to the selected meal (see paragraphs [0010], [0012], [0120]-[0123], [0133], [0143] and [0168]; e.g., user can select image of meal and make historical comparisons of similar meals; plotted on graph).
Regarding claim 10, claim 10 is similar in scope to claim 1 and is therefore rejected under similar rationale. Hayter further discloses obtaining, at a computing device, glucose measurements of a user, the glucose measurements collected by a glucose monitoring device (see fig 2A, reader 120).
Regarding claim 11, Hayter discloses wherein the user interface further includes a selectable meal logging control (see paragraphs [0138]-[0141] and [0225]; e.g., log meals), and wherein the receiving the user input data further comprises receiving, by the computing device, user input to select the meal logging control (see paragraphs [0138]-[0141] and [0225]; e.g., log meals).
Regarding claim 12, Hayter discloses receiving, at the computing device, a modification by the user of the position of the meal on the glucose graph; and displaying, in the user interface, the image of the meal at a modified position on the glucose graph (see figs 5 and 10; e.g., user can update glucose graph with image of meal plotted at time of meal).
Regarding claim 13, Hayter discloses wherein displaying the meal representation comprises displaying the captured image of the meal as the meal representation for the meal consumed by the user (see paragraphs [0138]-[0141] and [0225]; e.g., saving picture of meal along with time).
Regarding claim 14, Hayter discloses wherein the image is received from a meal logging application (see paragraphs [0138]-[0141] and [0225]; e.g., log meals).
Regarding claim 15, Hayter discloses wherein the meal representation is user selectable, and wherein the method further comprises: receiving user input to select the meal representation; and displaying additional information associated with the meal in the user interface (see paragraph [0175]; e.g., “clicking on a meal in ranking report 500 can initiate a screen with additional information about that particular meal.”).
Regarding claim 16, Hayter discloses wherein the additional information includes an additional glucose graph associated with at least one similar meal consumed by the user at a previous time (see figs 14A and 14B).
Regarding claim 17, Hayter discloses wherein the meal representation comprises an image of the meal consumed by the user, and wherein the additional information comprises nutritional information obtained by processing the image of the meal consumed by the user to identify the meal consumed by the user (see paragraphs [0116], [0133], [0140] and [0188]; e.g., nutritional information).
Regarding claim 18, Hayter discloses wherein the meal representation includes a visual indication of a classification of the meal consumed by the user (see paragraphs [0042] and [0083]; e.g., classify meals).
Regarding claim 19, Hayter discloses wherein the visual indication classifies the meal as being high in carbohydrates or low in carbohydrates (see paragraphs [0116], [0133], [0140] and [0188]; e.g., carbohydrates).
Regarding claim 20, Hayter discloses wherein the visual indication classifies the meal as having a negative, neutral, or positive effect on the glucose measurements of the user (see paragraph [0083]; e.g., good meals).
Regarding claim 21, Hayter discloses further comprising: receiving activity data indicative of an activity performed by the user and a time at which the activity was performed; and displaying, in the user interface, an activity representation for the activity performed by the user, the activity representation overlaying the glucose graph at a position corresponding to the time at which the activity was performed (see paragraphs [0138]-[0141] and [0225]; e.g., glucose graph with image of meal plotted at time of meal).
Regarding claim 22, Hayter discloses wherein the activity data is received from a third-party application that is separate from a glucose application that displays the user interface (see paragraphs [0188] and [0219]).
Regarding claim 23, Hayter discloses wherein the glucose monitoring device comprises a wearable glucose monitoring device that is wirelessly coupled to the computing device (see paragraph [0064]; e.g., wearable electronics).
Regarding claim 24, claim 24 is similar in scope to claim 1 and is therefore rejected under similar rationale. Hayter further discloses obtaining, at a computing device, glucose measurements of a user, the glucose measurements collected by a glucose monitoring device (see fig 2A, reader 120).
Regarding claim 25, Hayter discloses wherein the event comprises a meal consumed by the user, and the image of the event comprises an image of the meal (see paragraphs [0138]-[0141] and [0225]; e.g., saving picture of meal along with time).
Regarding claim 26, Hayter discloses further comprising a camera, and wherein the image of the meal is captured by the camera (see paragraphs [0138]-[0141] and [0225]; e.g., capture picture of meal along with camera).
Regarding claim 27, Hayter discloses wherein the event comprises an activity performed by the user (see paragraph [0219]; e.g., duration of exercise).
Regarding claim 28, Hayter discloses wherein the activity performed by the user comprises exercise performed by the user (see paragraph [0219]; e.g., duration of exercise).
Regarding claim 29, Hayter discloses wherein the event representation includes a visual indication of an intensity or duration of the exercise performed by the user (see paragraph [0219]; e.g., duration of exercise).
Regarding claim 30, Hayter discloses wherein the image of the event is received from a third-party application (see paragraphs [0188] and [0219]).
Regarding claim 31, Hayter discloses wherein the event representation is selectable to view additional information associated with the event (see paragraph [0081]; e.g., selecting an entry opens summary).
Regarding claim 32, claim 32 is similar in scope to claim 1 and is therefore rejected under similar rationale. Hayter further discloses obtaining, at a computing device, glucose measurements of a user, the glucose measurements collected by a glucose monitoring device (see fig 2A, reader 120).
Regarding claim 33, claim 33 is similar in scope to claim 1 and is therefore rejected under similar rationale.
Regarding claim 34, Hayter discloses automatically processing the image of the event to identify one or more items depicted in the image; and deriving, based on the identified items, nutritional information associated with the event; wherein the predicted impact of the event on the glucose measurements of the user is further determined based on the derived nutritional information (see paragraphs [0133], [0170]-[0180]; e.g., “the imagery of the meal can be analyzed using image recognition techniques to identify attributes of the meal such as recognizable components of the meal.”).
Response to Arguments
5. Applicant's arguments filed 12/1/2025 have been fully considered but they are not persuasive.
Regarding Applicant’s arguments concerning Hayter failing to disclose “determining a predicted impact of the event on the glucose measurements of the user based on historical effects of similar events; and classifying the event as having the predicted impact,” the Examiner respectfully disagrees.
Hayter discloses labeling and storing nutritional information and the glucose responses to previously logged meals (see paragraphs [0007], [0010], [0012], [0056], [0119]-[0125], [0133] and [0143]). By labeling previous meals together with their associated glucose responses, Hayter teaches determining a predicted impact of a meal event based on the historical effects of similar meals, and classifying the event as having that predicted impact, as recited in the claims.
Conclusion
6. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Grove et al. (US 8,892,677).
7. Any inquiry concerning this communication or earlier communications from the examiner should be directed to RASHAWN N TILLERY whose telephone number is (571)272-6480. The examiner can normally be reached M-F 9:00a - 5:30p.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, William L Bashore can be reached on (571) 272-4088. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/RASHAWN N TILLERY/Primary Examiner, Art Unit 2174