Prosecution Insights
Last updated: April 19, 2026
Application No. 18/646,570

APPARATUS AND METHOD FOR CONTACTLESSLY SENSING BIOLOGICAL SIGNAL AND RECOGNIZING USER HEALTH INFORMATION USING SAME

Final Rejection — §102, §112
Filed
Apr 25, 2024
Examiner
ALDARRAJI, ZAINAB MOHAMMED
Art Unit
3797
Tech Center
3700 — Mechanical Engineering & Manufacturing
Assignee
ELECTRONICS AND TELECOMMUNICATIONS RESEARCH INSTITUTE
OA Round
2 (Final)
Grant Probability: 67% (Favorable)
OA Rounds: 3-4
To Grant: 3y 5m
With Interview: 83%

Examiner Intelligence

Career Allow Rate: 67% — above average (81 granted / 121 resolved; -3.1% vs TC avg)
Interview Lift: +16.1% — strong (based on resolved cases with an interview)
Typical Timeline: 3y 5m avg prosecution; 29 currently pending
Career History: 150 total applications across all art units
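
How cards like these are computed is simple enough to show. Below is a minimal Python sketch of deriving an allow rate and an interview lift from resolved-case records; the record fields are hypothetical, and the choice of baseline for the lift (overall rate vs. non-interviewed cases) is an assumption flagged in the comments.

```python
# Minimal sketch: deriving an allow rate and interview lift from
# resolved-case records. Field names are hypothetical, not from any
# real PTO data schema.
cases = [
    {"granted": True, "interview": True},
    {"granted": False, "interview": False},
    # ... one record per resolved case (121 in this examiner's history)
]

def allow_rate(subset):
    """Fraction of cases in `subset` that ended in a grant."""
    return sum(c["granted"] for c in subset) / len(subset)

overall = allow_rate(cases)  # 81 granted / 121 resolved = 66.9% ("67%")
with_iv = allow_rate([c for c in cases if c["interview"]])

# Assumption: "interview lift" compares interviewed cases against the
# overall rate (67% + 16.1% = 83.1%, matching the 83% card above).
lift = with_iv - overall
print(f"allow rate {overall:.1%}, interview lift {lift:+.1%}")
```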

Statute-Specific Performance

§101: 2.8% (-37.2% vs TC avg)
§103: 50.2% (+10.2% vs TC avg)
§102: 20.4% (-19.6% vs TC avg)
§112: 21.6% (-18.4% vs TC avg)
Tech Center average is an estimate. Based on career data from 121 resolved cases.
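
The "vs TC avg" figures are plain differences against a Tech Center baseline, and all four reported deltas are consistent with a single 40.0% baseline (for example, 2.8% sits 37.2 points below 40.0%). A minimal sketch of that arithmetic, with the 40% baseline treated as an inferred assumption:

```python
# Per-statute rates from the panel above. The 40% Tech Center baseline
# is inferred from the reported deltas (2.8% - 40.0% = -37.2%, etc.)
# and is an assumption, not published data.
examiner = {"§101": 0.028, "§103": 0.502, "§102": 0.204, "§112": 0.216}
TC_AVG = 0.400

for statute, rate in examiner.items():
    print(f"{statute}: {rate:.1%} ({rate - TC_AVG:+.1%} vs TC avg)")
```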

Office Action

Grounds: §102, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

The reply filed on 09/19/2025 has been entered. Claims 1-14 remain pending in the current application. The amendment to claim 10 has overcome the objection to claim 10.

Claim Objections

Claims 8 and 14 are objected to because of the following informalities: Claim 8 recites “the sensing apparatus”, which should read “the contactless sensing apparatus”. Claim 14 recites “the contactless sensing apparatus is fixedly attached to a side of a display device worn on a user's face”, which should read “the contactless sensing apparatus is fixedly attached to a side of the display device worn on the user's face”. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 1-14 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.

Claim 1 recites the limitation “thereby simplifying or eliminating a step of recognizing and tracking the user's face from the video data”. It is unclear how the method simplifies other processes or eliminates an unnecessary step, and whether the limitation is claiming that no other tracking of the user's face is performed. The examiner notes that this is a subjective determination for which the metes and bounds cannot be determined. The examiner is interpreting the limitation as meaning that no face tracking is performed.

Claim 8 recites the limitation “wherein the sensing apparatus is configured as a stand-alone type when the display device does not comprise a controller, and as an integrated type when the display device comprises a controller”. It is unclear how the display device can both comprise the controller and not comprise the controller at the same time. The examiner is interpreting the limitation as: the sensing apparatus is configured as a stand-alone type when the display device does not comprise a controller, or as an integrated type when the display device comprises a controller.

Claim 9 recites the limitation “thereby simplifying or eliminating a step of recognizing and tracking the user's face from the video data”. It is unclear how the apparatus simplifies other apparatus or eliminates an unnecessary step, and whether the limitation is claiming that no other tracking of the user's face is performed. The examiner notes that this is a subjective determination for which the metes and bounds cannot be determined. The examiner is interpreting the limitation as meaning that no face tracking is performed.

Claim 14 recites the limitation “thereby simplifying or eliminating a step of recognizing and tracking the user's face from the video data”. It is unclear how the apparatus simplifies other apparatus or eliminates an unnecessary step, and whether the limitation is claiming that no other tracking of the user's face is performed. The examiner notes that this is a subjective determination for which the metes and bounds cannot be determined. The examiner is interpreting the limitation as meaning that no face tracking is performed.

All dependent claims are rejected based on their dependency.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-14 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Frank et al. (US 2020/0397306).

Regarding claim 1, Frank teaches an operating method of a sensing apparatus for contactlessly sensing a biological signal, the method comprising (para. 0262; a system configured to calculate an extent of CHF and/or identify exacerbation of CHF):

performing an initial setting in a contactless sensing apparatus (paras. 0263 and 0269-0272; the smartglasses are configured to be worn on a user's head. Optionally, various sensors and/or cameras that are physically coupled to the smartglasses, e.g., by being attached to and/or embedded in the frame of the smartglasses, are used to measure the user while the user wears the smartglasses. The system may include an optical emitter configured to direct electromagnetic radiation at an area on the user's head that appears in images captured by an inward-facing camera. Optical emitters may be directed at a region of interest (ROI), such as an area appearing in images captured by an inward-facing camera, and the optical emitter may be positioned in various locations relative to the ROI. The sensor plane is tilted by a fixed angle greater than 2° relative to the lens plane according to the Scheimpflug principle in order to capture a sharper image when the smartglasses are worn by the user. The examiner notes that the initial setting process starts by wearing the smartglasses and attaching the sensors and the optical emitters at a desired position to focus on a desired region);

obtaining video data from the contactless sensing apparatus, wherein the contactless sensing apparatus is fixedly attached to a side of a display device worn on a user's face to directly capture a predetermined skin area of the user as a region of interest, thereby simplifying or eliminating a step of recognizing and tracking the user's face from the video data (figure 2b, elements 802b and 802c, paras. 0262-0263, 0265, and 0267; the system includes at least a pair of smartglasses (e.g., smartglasses 800 or smartglasses 805, illustrated in FIG. 2b and FIG. 2c, respectively), and an inward-facing camera 820, such as a camera from among cameras 802a, 802b, and 802c illustrated in FIG. 2b, or a camera from among cameras 806a and 806b illustrated in FIG. 2c. Inward-facing cameras 802b and 802c are located on the left and right sides of the smartglasses 800, respectively; they capture images that include areas 803b and 803c on the left and right sides of the user's face, respectively. The video camera may capture images at various rates. The examiner notes that the contactless sensors are video cameras 802b and 802c that are fixed to the sides of the head-mounted display device to obtain video (image data) of a region of interest (left and right cheeks) of the user);

removing noise from the video data (para. 0271; the computer 828 may utilize various preprocessing approaches in order to assist in calculations and/or in extraction of an iPPG signal from the images 821. Additionally or alternatively, images may undergo various preprocessing to improve the signal, such as color space transformation (e.g., transforming RGB images into a monochromatic color or images in a different color space), blind source separation using algorithms such as independent component analysis (ICA) or principal component analysis (PCA), and various filtering techniques, such as detrending, bandpass filtering, and/or continuous wavelet transform (CWT). Various preprocessing techniques known in the art that may assist in extracting an iPPG signal from the images 821 are discussed in Zaunseder et al. (2018), “Cardiovascular assessment by imaging photoplethysmography: a review”, Biomedical Engineering 63(5), 617-634. An example of preprocessing that may be used in some embodiments is given in U.S. Pat. No. 9,020,185, titled “Systems and methods for non-contact heart rate sensing”, which describes how time-series signals obtained from video of a user can be filtered and processed to separate an underlying pulsing signal by, for example, using an ICA algorithm);

extracting only the region of interest from the video data (para. 0284; sentences of the form “a facial blood flow pattern recognizable in the images (of an area comprising skin on the user's head)” refer to effects of blood volume changes due to pulse waves that may be extracted from one or more images of the area. The examiner notes that the facial blood flow pattern is extracted for the region of interest in the image data); and

estimating biological activity information by analyzing the region of interest (paras. 0279-0290, 0287, and 0306; the images 821 may provide values of coloration intensities (i.e., intensities detected at one or more light wavelengths) at different portions of the area on the user's head, which correspond to the different pixels in the images, and the computer 828 may utilize various computational techniques described herein to extract a photoplethysmogram signal (iPPG signal) from the images 821. The coloration intensities may represent a facial blood flow pattern that is recognizable in the images 821. A facial blood flow pattern, such as one of the examples described above, may be calculated, in some embodiments, from the images 821 by the computer 828. Optionally, the facial blood flow pattern may be utilized to generate one or more feature values that are used in a machine learning-based approach by the computer 828 to calculate the extent of CHF and/or identify an exacerbation of CHF. The computer 828 calculates first and second series of heart rate values from portions of iPPG signals extracted from the first and second sets of images, respectively. The computer 828 may calculate the extent of the CHF also based on the extent to which heart rate values in the second series were above heart rate values in the first series. For example, the computer 828 may generate one or more feature values indicative of these differences, and utilize them in the calculation of the extent of the CHF. The examiner notes that the data representing the region of interest is extracted from the image data and the computer performs calculation and analysis of the data to generate the extent of congestive heart failure and heart rate values).

Regarding claim 2, Frank teaches the method of claim 1, wherein the performing the initial setting in the contactless sensing apparatus comprises adjusting a focus of the contactless sensing apparatus or operating a light source (paras. 0263 and 0269-0272; same initial-setting passage as quoted for claim 1: the smartglasses are worn on the user's head, the optical emitter is directed at a region of interest appearing in images captured by an inward-facing camera, and the sensor plane is tilted by a fixed angle greater than 2° relative to the lens plane according to the Scheimpflug principle to capture a sharper image. The examiner notes that the initial setting process starts by wearing the smartglasses and attaching the sensors and the optical emitters at a desired position to focus on a desired region).

Regarding claim 3, Frank teaches the method of claim 1, wherein the obtaining the video data from the contactless sensing apparatus comprises obtaining data based on at least one of an RGB sensor, a near-infrared sensor, or a thermal imaging sensor (para. 0268; at least one of the inward-facing cameras may capture light in the near infrared spectrum (NIR)).

Regarding claim 4, Frank teaches the method of claim 1, wherein the obtaining the video data from the contactless sensing apparatus comprises contactlessly obtaining the video data based on at least one of a cheek or an ear region of the user's face (figure 2b, elements 803b and 803c, paras. 0265 and 0272; inward-facing cameras 802b and 802c are located on the left and right sides of the smartglasses 800, respectively; they capture images that include areas 803b and 803c on the left and right sides of the user's face, respectively. The area captured by images taken by said camera (e.g., when the area is on, and/or includes a portion of, the forehead or a cheek)).

Regarding claim 5, Frank teaches the method of claim 1, wherein the estimating the biological activity information by analyzing the region of interest comprises estimating the biological activity information based on an AI model or a pixel-based analysis algorithm (para. 0287; the facial blood flow pattern may be utilized to generate one or more feature values that are used in a machine learning-based approach by the computer 828 to calculate the extent of CHF and/or identify an exacerbation of CHF).

Regarding claim 6, Frank teaches the method of claim 1, wherein the biological activity information comprises heart activity information related to a heart rate or a heart rate variability (para. 0306; dynamics of the user's heart rate following physical activity may also be used to calculate the extent of CHF. In one embodiment, the computer 828 calculates first and second series of heart rate values from portions of iPPG signals extracted from the first and second sets of images, respectively).

Regarding claim 7, Frank teaches the method of claim 6, wherein the estimated heart activity information is provided to the user through the display device (figure 2a, para. 0246; the user interface 388 may be utilized to present values calculated by the computer 380. Optionally, the user interface 388 is a component of a device of the user, such as an augmented reality display).

Regarding claim 8, Frank teaches the method of claim 1, wherein the sensing apparatus is configured as a stand-alone type when the display device does not comprise a controller, and as an integrated type when the display device comprises a controller (para. 0278; the computer 828 may refer to different components and/or a combination of components. In some embodiments, the computer 828 may include a processor located on the smartglasses (as illustrated in FIG. 2c)).

Regarding claim 9, Frank teaches a sensing apparatus for contactlessly sensing a biological signal, the apparatus comprising (para. 0262; a system configured to calculate an extent of CHF and/or identify exacerbation of CHF):

a memory (figure 2a, element 828, para. 0249; a system including a processor and memory); a communication device (para. 0262; the system also includes computer 828, and may include additional elements such as a user interface 832); and a processor operably connected to the memory and the communication device (figure 2a, para. 0262; the system also includes computer 828 and may include additional elements such as a user interface 832. The examiner notes that the computer includes the processor and the memory, which is connected to a user interface);

wherein the processor performs an initial setting in the contactless sensing apparatus (para. 0272; in order to improve the sharpness of images captured by said camera, the camera may be configured to operate in a way that takes advantage of the Scheimpflug principle. In one embodiment, the camera includes a sensor and a lens; the sensor plane is tilted by a fixed angle greater than 2° relative to the lens plane according to the Scheimpflug principle in order to capture a sharper image when the smartglasses are worn by the user (where the lens plane refers to a plane that is perpendicular to the optical axis of the lens, which may include one or more lenses). In another embodiment, the camera includes a sensor, a lens, and a motor; the motor tilts the lens relative to the sensor according to the Scheimpflug principle. The tilt improves the sharpness of images when the smartglasses are worn by the user. Additional details regarding the application of the Scheimpflug principle are provided in the reference. The examiner notes that the computer drives the motor to tilt the sensor and the lens to adjust focus),

obtains video data from the contactless sensing apparatus, wherein the contactless sensing apparatus is fixedly attached to a side of a display device worn on a user's face to directly capture a predetermined skin area of the user as a region of interest, thereby simplifying or eliminating a step of recognizing and tracking the user's face from the video data (figure 2b, elements 802b and 802c, paras. 0262-0263, 0265, and 0267; same passage as quoted for claim 1: the inward-facing cameras 802b and 802c are fixed to the left and right sides of the smartglasses 800 and capture areas 803b and 803c on the left and right sides of the user's face);

removes noise from the video data (para. 0271; same preprocessing passage as quoted for claim 1: color space transformation, blind source separation (ICA or PCA), detrending, bandpass filtering, and/or continuous wavelet transform to assist in extraction of an iPPG signal from the images 821);

extracts only the region of interest from the video data (para. 0284; same facial blood flow pattern passage as quoted for claim 1); and

estimates biological activity information by analyzing the region of interest (paras. 0279-0290, 0287, and 0306; same iPPG and CHF-calculation passage as quoted for claim 1. The examiner notes that the data representing the region of interest is extracted from the image data and the computer performs calculation and analysis of the data to generate the extent of congestive heart failure and heart rate values).

Regarding claim 10, Frank teaches the apparatus of claim 9, wherein the processor adjusts a focus of the contactless sensing apparatus or operates a light source in order to perform the initial setting in the contactless sensing apparatus (paras. 0263 and 0269-0272; same initial-setting passage as quoted for claims 1 and 2).

Regarding claim 11, Frank teaches the apparatus of claim 9, wherein the processor obtains the video data based on at least one of an RGB sensor, a near-infrared sensor, or a thermal image sensor in order to obtain the video data from the contactless sensing apparatus (para. 0268; at least one of the inward-facing cameras may capture light in the near infrared spectrum (NIR)).

Regarding claim 12, Frank teaches the apparatus of claim 9, wherein the processor estimates the biological activity information based on an AI model or a pixel-based analysis algorithm in order to estimate the biological activity information by analyzing the region of interest (para. 0287; the facial blood flow pattern may be utilized to generate one or more feature values that are used in a machine learning-based approach by the computer 828 to calculate the extent of CHF and/or identify an exacerbation of CHF).

Regarding claim 13, Frank teaches the apparatus of claim 9, wherein the biological information comprises heart activity information related to a heart rate or a heart rate variability (para. 0306; dynamics of the user's heart rate following physical activity may also be used to calculate the extent of CHF. In one embodiment, the computer 828 calculates first and second series of heart rate values from portions of iPPG signals extracted from the first and second sets of images, respectively).

Regarding claim 14, Frank teaches a sensing apparatus for contactlessly sensing a biological signal in an integrated form that is included as part of a display device worn on a user's face, the apparatus comprising (figures 2a-2c, paras. 0262-0263; a system configured to calculate an extent of CHF and/or identify exacerbation of CHF. The system includes at least a pair of smartglasses (e.g., smartglasses 800 or smartglasses 805, illustrated in FIG. 2b and FIG. 2c, respectively), and an inward-facing camera 820, such as a camera from among cameras 802b and 802c illustrated in FIG. 2b):

a memory (figure 2a, element 828, para. 0249; a system including a processor and memory); a communication device (para. 0262; the system also includes computer 828, and may include additional elements such as a user interface 832); and a processor operably connected to the memory and the communication device (figure 2a, para. 0262; the system also includes computer 828 and may include additional elements such as a user interface 832. The examiner notes that the computer includes the processor and the memory, which is connected to a user interface);

wherein the processor performs an initial setting in the contactless sensing apparatus (para. 0272; same Scheimpflug passage as quoted for claim 9. The examiner notes that the computer drives the motor to tilt the sensor and the lens to adjust focus),

obtains video data from the contactless sensing apparatus, wherein the contactless sensing apparatus is fixedly attached to a side of a display device worn on a user's face to directly capture a predetermined skin area of the user as a region of interest, thereby simplifying or eliminating a step of recognizing and tracking the user's face from the video data (figure 2b, elements 802b and 802c, paras. 0262-0263, 0265, and 0267; same passage as quoted for claims 1 and 9);

removes noise from the video data (para. 0271; same preprocessing passage as quoted for claims 1 and 9);

extracts only the region of interest from the video data (para. 0284; same facial blood flow pattern passage as quoted for claims 1 and 9); and

estimates biological activity information by analyzing the region of interest (paras. 0279-0290, 0287, and 0306; same iPPG and CHF-calculation passage as quoted for claims 1 and 9. The examiner notes that the data representing the region of interest is extracted from the image data and the computer performs calculation and analysis of the data to generate the extent of congestive heart failure and heart rate values).

Response to Arguments

Applicant's arguments with respect to claims 1-14 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ZAINAB M ALDARRAJI, whose telephone number is (571) 272-8726. The examiner can normally be reached Monday-Thursday, 7AM-5PM EST. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Michael Carey, can be reached at (571) 270-7235. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ZAINAB MOHAMMED ALDARRAJI/
Patent Examiner, Art Unit 3797

/MICHAEL J CAREY/
Supervisory Patent Examiner, Art Unit 3795
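
The §102 mapping leans on Frank's iPPG preprocessing chain (detrending, bandpass filtering, blind source separation via ICA) for the "removing noise" limitation. As background only, here is a minimal sketch of that generic chain on synthetic RGB traces, assuming SciPy and scikit-learn are available; it illustrates the technique the Office Action names and is not Frank's implementation.

```python
import numpy as np
from scipy.signal import butter, detrend, filtfilt
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
fs = 30.0                                    # assumed camera frame rate (Hz)
t = np.arange(0, 30, 1 / fs)                 # 30 s of video
pulse = 0.02 * np.sin(2 * np.pi * 1.2 * t)   # 1.2 Hz pulse = 72 bpm

# Synthetic per-channel mean skin intensities: pulse + drift + noise
rgb = np.stack(
    [0.5 + 0.01 * t + pulse + 0.005 * rng.standard_normal(t.size)
     for _ in range(3)],
    axis=1,
)

# 1) Detrend to remove slow illumination drift
x = detrend(rgb, axis=0)

# 2) Bandpass 0.7-4.0 Hz (42-240 bpm), a plausible heart-rate band
b, a = butter(3, [0.7 / (fs / 2), 4.0 / (fs / 2)], btype="band")
x = filtfilt(b, a, x, axis=0)

# 3) Blind source separation (ICA) to isolate the pulsatile component
sources = FastICA(n_components=3, random_state=0).fit_transform(x)

# Pick the component with the strongest spectral peak in the HR band
spec = np.abs(np.fft.rfft(sources, axis=0))
freqs = np.fft.rfftfreq(sources.shape[0], 1 / fs)
band = (freqs >= 0.7) & (freqs <= 4.0)
comp = spec[band].max(axis=0).argmax()
hr_hz = freqs[band][spec[band][:, comp].argmax()]
print(f"estimated heart rate: {hr_hz * 60:.0f} bpm")  # ~72 expected
```

Note that Frank's paragraph 0271 presents these steps as interchangeable options ("additionally or alternatively") rather than a fixed pipeline, so the ordering above is one plausible arrangement.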

Prosecution Timeline

Apr 25, 2024: Application Filed
Jun 17, 2025: Non-Final Rejection — §102, §112
Sep 19, 2025: Response Filed
Dec 02, 2025: Final Rejection — §102, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12599331
Hyperspectral Image-Guided Ocular Imager for Alzheimer's Disease Pathologies
2y 5m to grant (granted Apr 14, 2026)
Patent 12594038
ESTIMATION OF CONTACT FORCE OF CATHETER EXPANDABLE ASSEMBLY
2y 5m to grant (granted Apr 07, 2026)
Patent 12588887
MEDICAL DEVICE POSITION SENSING COMPONENTS
2y 5m to grant (granted Mar 31, 2026)
Patent 12582479
METHOD AND SYSTEM FOR AUTOMATIC PLANNING OF A MINIMALLY INVASIVE THERMAL ABLATION AND METHOD FOR TRAINING A NEURAL NETWORK
2y 5m to grant (granted Mar 24, 2026)
Patent 12569189
DEVICE, METHOD AND COMPUTER PROGRAM FOR DETERMINING SLEEP EVENT USING RADAR
2y 5m to grant (granted Mar 10, 2026)
Study what changed to get past this examiner, based on the 5 most recent grants.

AI Strategy Recommendation

Get an AI-powered prosecution strategy using examiner precedents, rejection analysis, and claim mapping.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 67%
With Interview: 83% (+16.1%)
Median Time to Grant: 3y 5m
PTA Risk: Moderate
Based on 121 resolved cases by this examiner. Grant probability derived from career allow rate.
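
The footnote's derivation is simple arithmetic; a minimal check, treating the +16.1% interview lift as additive on the base rate (an assumption consistent with the figures shown):

```python
granted, resolved = 81, 121
base = granted / resolved        # 0.669 -> displayed as 67%
with_interview = base + 0.161    # 0.830 -> displayed as 83%
print(f"base {base:.0%}, with interview {with_interview:.0%}")
```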
