Prosecution Insights
Last updated: April 19, 2026
Application No. 18/330,925

INFORMATIVE DISPLAY FOR NON-CONTACT PATIENT MONITORING

Final Rejection (§103)
Filed: Jun 07, 2023
Examiner: MCCORMACK, ERIN KATHLEEN
Art Unit: 3791
Tech Center: 3700 — Mechanical Engineering & Manufacturing
Assignee: Covidien LP
OA Round: 2 (Final)
Grant Probability: 14% (At Risk)
Expected OA Rounds: 3-4
Time to Grant: 3y 10m
Grant Probability with Interview: 74%

Examiner Intelligence

Career Allow Rate: 14% — grants only 14% of cases (3 granted / 22 resolved; -56.4% vs TC avg)
Interview Lift: +60.0% across resolved cases with interview
Typical Timeline: 3y 10m average prosecution
Career History: 122 total applications across all art units; 100 currently pending

Statute-Specific Performance

§101: 10.9% (-29.1% vs TC avg)
§103: 43.5% (+3.5% vs TC avg)
§102: 13.5% (-26.5% vs TC avg)
§112: 32.1% (-7.9% vs TC avg)
Tech Center averages are estimates. Based on career data from 22 resolved cases.
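The per-statute "vs TC avg" deltas above are internally consistent: each one appears to back out the same Tech Center baseline. A minimal sketch that recomputes the derived figures from the raw counts (function names are illustrative, not from any analytics tool):

```python
def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def implied_tc_average(examiner_rate: float, delta_vs_tc: float) -> float:
    """Back out the Tech Center average implied by a 'vs TC avg' delta."""
    return examiner_rate - delta_vs_tc

# 3 granted out of 22 resolved, displayed as 14%
career = allow_rate(3, 22)

# Each statute's delta implies the same ~40% TC baseline
tc_101 = implied_tc_average(10.9, -29.1)
tc_103 = implied_tc_average(43.5, +3.5)
tc_102 = implied_tc_average(13.5, -26.5)
tc_112 = implied_tc_average(32.1, -7.9)
```

Running the sketch shows the career allow rate rounds to the displayed 14%, and all four statutes imply a Tech Center average of about 40%, which suggests the black-line estimate was a single shared baseline rather than a per-statute figure.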

Office Action

§103
DETAILED ACTION

Applicant’s arguments, filed on 12/17/2025, have been fully considered. The following rejections and/or objections are either reiterated or newly applied. They constitute the complete set presently being applied to the instant application. Applicant has amended the claims, filed on 12/17/2025, and therefore rejections newly made in the instant Office action have been necessitated by amendment. Claims 1-20 are the current claims under examination.

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Objections

Claims 1, 8, and 15 are objected to because of the following informalities:
In claim 1, line 17, “non-respirator” should read “non-respiratory”.
In claim 1, line 22, “overly” should read “overlay”.
In claim 8, line 21, “non-respirator” should read “non-respiratory”.
In claim 15, line 14, “the graph” should read “a graph”, as there is no antecedent basis for the graph.
Appropriate correction is required.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 5, 7-9, 12, 14-17, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Heinrich (US 10687712) in further view of Gore (US 20130226527) and Regan (US 20110074788).

Regarding independent claim 1, Heinrich teaches a patient monitoring system (Column 2, lines 15-16: “a patient monitoring system”), comprising: an image capture sensor (Column 3, lines 26-32: “The system monitors individual subjects 12 with at least one camera 14 such as a video camera, a thermal camera, a near infrared camera, e.g. nightvision, or a combination thereof. The camera continuously stream images of the subject such as hospital patients in normal light room conditions and darkened room conditions such as during the night.”); a display (Column 4, lines 43-45: “A monitoring unit 54 receives and configures for display the classified motion and corresponding history portion of the video image, e.g.
time segment of video”) having a graphical user interface configured to provide a graph of patient respiratory data over time (Column 4, lines 43-45: “A monitoring unit 54 receives and configures for display the classified motion and corresponding history portion of the video image, e.g. time segment of video”; Column 6, lines 52-55: “The illustrated difference signal is plotted with the difference signal value on the y-axis and time on the x-axis. The signal peaks correspond to respiratory cycles”); at least one processor (Column 2, lines 15-19: “a patient monitoring system includes a plurality of thermal or near infrared video cameras which include at least one camera configured to continuously receive video of one patient, and at least one configured processor”); and at least one memory storing computer-executable instructions that when executed by the at least one processor cause the patient monitoring system to perform operations (Column 3, lines 42-54: “The video stream from each camera is streamed to one or more processors 30 such as the processor of a computer or workstation 32. The video images can be stored in a video image data store 34. The data store can include random access memory (RAM) or non-transitory computer readable media such as disk, solid state disk, server storage, etc. The data store can include file structure, database structure, and the like. The processing can include separate monitoring, e.g. dedicated workstation, or combined monitoring, e.g. configured server. The workstation can include other functions such as central monitoring of vital signs for one or more subjects. 
The workstation can be part of or connect to a central monitoring system with alerts or alarms”), comprising: capturing, at a first time by the image capture sensor, a first image of a region of a patient; capturing, at a second time by the image capture sensor, a second image of the region of the patient (Column 3, lines 55-56: “The system includes a motion unit 40 which receives the video images or streamed video of the subject”; Column 2, lines 19-23: “The at least one processor is configured for the at least one camera to compute a difference signal based on the absolute differences between a current image and a plurality of reference images in a temporal neighborhood from the received video of the subject”); comparing the first captured image and the second captured image to identify a motion pattern (Column 6, lines 21-23: “absolute difference images are computed between the current image and the reference image”; Column 2, lines 2-5: “The motion unit identifies clusters of motion of the subject based on the received video of the subject. The segmentation unit segments body parts of the subject based on the identified clusters of subject motion.”), the motion pattern satisfying a non-respiratory motion condition (Column 5, lines 41-42: “Non-respiratory motion clusters are identified”); in response to the motion pattern satisfying the non-respirator motion condition, classifying a motion of the patient as non-respiratory motion (Column 2, lines 19-28: “The at least one processor is configured for the at least one camera to compute a difference signal based on the absolute differences between a current image and a plurality of reference images in a temporal neighborhood from the received video of the subject, and identify a cluster of respiratory motion of the subject based on the difference signal. 
The at least one processor is further configured to segment an upper body of the body parts based on the identified cluster of respiratory motion of the subject and identify at least one cluster of non-respiratory motion”; Column 5, lines 41-42: “Non-respiratory motion clusters are identified”). However, Heinrich does not teach displaying a visual indicator via the graphical user interface, the visual indicator being a partially transparent graphical overly displayed on at least a portion of the graph between the first time and the second time, the visual indicator providing an indication of a reduced veracity of the patient respiratory data displayed between the first time and the second time due to the classified non-respiratory motion. Gore discloses a system and method for determining physiological parameters. Specifically, Gore teaches displaying a visual indicator via the graphical user interface ([0051]: “a graph 160 illustrating the results of various embodiments using signal pre-conditioning and MIMO AIC for extracting respiratory/breathing data along with motion/activity index information for conditions of normal breathing, as well as patient motion (illustrated in the portion 162), deep breathing (illustrated in the portion 164), apnea (illustrated in the portion 166), shallow breathing (illustrated in the portion 168), and fast breathing (illustrated in the portion 170). … It should be noted that the MIMO architecture also provides non-respiratory motion information shown by the ambulatory motion index”; Fig. 9), the visual indicator providing an indication of a reduced veracity of the patient respiratory data displayed between the first time and the second time due to the classified non-respiratory motion (Fig. 
9 shows patient motion in portion 162; [0002]: “Conventional measurement methods and systems that rely on capturing chest motion often suffer from poor accuracy due to motion artifacts, thus making the measurements unsatisfactory for monitoring”; [0031]: “various embodiments provide signal conditioning and a data acquisition method or algorithm for the separation, characterization and/or event attribution of physiological signals captured using electrical impedance measurements in the presence of noise sources and motion artifacts”; [0052]: “the channel 1 signal (signal 172) may be used as a threshold. In particular, if only breathing is detected, then the data from the remaining channels (signals 176) may be weighted less in classifying the current breathing condition of the patient. However, if motion is detected, then there is less confidence in the breathing signal and the signals from the remaining channels (signals 176) are given more weight. Thus, the channel 1 signal (signal 172) tracks a breathing rate, which is supplemented by the ambulatory motion index”. Fig. 9 shows the graph of respiratory data, and the box 162 is the visual indicator. The visual indicator 162 indicates a reduced veracity of the respiratory data, since the visual indicator 162 shows when there is motion present, and as stated in paragraph [0052], when motion is detected, there is less confidence in the breathing signal.).

Heinrich and Gore are analogous arts as they are both related to measuring respiratory parameters and motions of a user. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the non-respiratory motion on the graph from Gore into the method from Heinrich, as doing so allows the method to display more information to the user and provides a more comprehensive analysis of the measured parameters.
However, the Heinrich/Gore combination does not teach the visual indicator being a partially transparent graphical overly displayed on at least a portion of the graph between the first time and the second time.

Regan discloses methods and apparatus for displaying and analyzing medical data. Specifically, Regan teaches the visual indicator being a partially transparent graphical overlay displayed on at least a portion of the graph ([0042]: “the overlaid graph 502 is overlying the graph illustrated in FIG. 4 and is semi-transparent such that the underlying graph is viewable concurrently with the overlaid graph 502”). Heinrich, Gore, and Regan are analogous arts as they are all related to measuring respiratory parameters of a user and displaying the results. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the partially transparent visual indicator from Regan in the Heinrich/Gore combination, as both are known visual indicators and the change would be a simple substitution.
Regarding claim 2, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 1, the operations further comprising: generating a first distance signal based on at least one first distance between at least one point in the region of the patient and a distance sensor of the image capture sensor at the first time; and generating a second distance signal based on at least one second distance from the at least one point in the region of the patient and the distance sensor at the second time, wherein the comparing is further based on the first generated distance signal and the second generated distance signal, and the motion pattern is identified based on the first generated distance signal and the second generated distance signal (Heinrich, Column 2, lines 29-34: “The at least one processor is yet further configured to segment at least a head and a trunk of the body parts based on the identified at least one cluster of non-respiratory motion and body proportions, and classify subject motion based on a frequency and a change in distance of identified motion and the segmented body parts”; Column 4, lines 8-12: “A classification unit 44 classifies subject motion based on a frequency and measures of motion of the clusters and segmented body parts such as angle, speed, location, distance, acceleration, and the like”; Column 7, lines 42-45: “The selected feature set attributes are compared with the attributes of motions clusters such as size, distance, direction, speed, orientation, frequency, etc. and association with segmented body parts to classify the movement”). Regarding claim 5, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 1. However, the Heinrich/Gore/Regan combination does not teach wherein the patient respiratory data over time is obtained from one or more of a transthoracic impedance sensor, an electrocardiograph, capnograph, spirometer, pulse oximeter, or a manual user entry. 
Gore teaches wherein the patient respiratory data over time is obtained from one or more of a transthoracic impedance sensor, an electrocardiograph, capnograph, spirometer, pulse oximeter, or a manual user entry ([0021]: “One embodiment of an impedance measurement system 20 is illustrated in FIG. 1, which may be a transducer-based system, for example, an electrode-based system, such as a patient monitor that may form part of a patient monitoring device, such as an electrocardiography (ECG) monitoring device or an impedance cardiography module … electrical impedance measurements obtained may be used in at least one embodiment to separate respiratory rate from patient motion”; [0051]: “the primary channel … tracks the referenced spirometer signal”). Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the respiratory data being obtained from the sensors from Gore into the Heinrich/Gore/Regan combination, as the combination is silent on what is used to measure respiratory data, and Gore discloses suitable sensors for measuring respiratory parameters in an analogous device.

Regarding claim 7, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 1, the operations further comprising: obtaining the patient respiratory data during periods in which the motion of the patient is classified as respiratory motion of the patient (Heinrich, Column 6, lines 54-57: “The signal peaks correspond to respiratory cycles, e.g. peak to peak corresponds to one respiratory cycle. The periodicity of the signal clearly indicates movement indicative of respiration”).
Regarding independent claim 8, Heinrich teaches an image-based patient monitoring system (Column 2, lines 15-16: “a patient monitoring system”), comprising: a display (Column 4, lines 43-45: “A monitoring unit 54 receives and configures for display the classified motion and corresponding history portion of the video image, e.g. time segment of video”) having a graphical user interface configured to provide a graph of patient respiratory data over time (Column 4, lines 43-45: “A monitoring unit 54 receives and configures for display the classified motion and corresponding history portion of the video image, e.g. time segment of video”; Column 6, lines 52-55: “The illustrated difference signal is plotted with the difference signal value on the y-axis and time on the x-axis. The signal peaks correspond to respiratory cycles”); one or more hardware processors (Column 2, lines 15-19: “a patient monitoring system includes a plurality of thermal or near infrared video cameras which include at least one camera configured to continuously receive video of one patient, and at least one configured processor”); a distance sensor (Column 7, lines 42-45: “The selected feature set attributes are compared with the attributes of motions clusters such as size, distance, direction, speed, orientation, frequency, etc. and association with segmented body parts to classify the movement”; Column 3, lines 26-32: “The system monitors individual subjects 12 with at least one camera 14 such as a video camera, a thermal camera, a near infrared camera, e.g. nightvision, or a combination thereof. 
The camera continuously stream images of the subject such as hospital patients in normal light room conditions and darkened room conditions such as during the night.”); and memory storing computer-executable instructions that when executed by the one or more hardware processors cause the image-based patient monitoring system to perform operations (Column 3, lines 42-54: “The video stream from each camera is streamed to one or more processors 30 such as the processor of a computer or workstation 32. The video images can be stored in a video image data store 34. The data store can include random access memory (RAM) or non-transitory computer readable media such as disk, solid state disk, server storage, etc. The data store can include file structure, database structure, and the like. The processing can include separate monitoring, e.g. dedicated workstation, or combined monitoring, e.g. configured server. The workstation can include other functions such as central monitoring of vital signs for one or more subjects. 
The workstation can be part of or connect to a central monitoring system with alerts or alarms”), comprising: generate, at a first time by the distance sensor, a first distance signal based on detecting a first distance between at least one point in a region of the patient and the distance sensor; generate, at a second time by the distance sensor, a second distance signal based on detecting a second distance from the at least one point in the region of the patient and the distance sensor (Column 3, lines 55-56: “The system includes a motion unit 40 which receives the video images or streamed video of the subject”; Column 2, lines 19-23: “The at least one processor is configured for the at least one camera to compute a difference signal based on the absolute differences between a current image and a plurality of reference images in a temporal neighborhood from the received video of the subject”; Column 2, lines 29-34: “The at least one processor is yet further configured to segment at least a head and a trunk of the body parts based on the identified at least one cluster of non-respiratory motion and body proportions, and classify subject motion based on a frequency and a change in distance of identified motion and the segmented body parts”; Column 4, lines 8-12: “A classification unit 44 classifies subject motion based on a frequency and measures of motion of the clusters and segmented body parts such as angle, speed, location, distance, acceleration, and the like”; Column 7, lines 42-45: “The selected feature set attributes are compared with the attributes of motions clusters such as size, distance, direction, speed, orientation, frequency, etc. 
and association with segmented body parts to classify the movement”); process the first generated distance signal and the second generated distance signal to identify a motion pattern (Column 6, lines 21-23: “absolute difference images are computed between the current image and the reference image”; Column 2, lines 2-5: “The motion unit identifies clusters of motion of the subject based on the received video of the subject. The segmentation unit segments body parts of the subject based on the identified clusters of subject motion.”), the motion pattern satisfying a non-respiratory motion condition (Column 5, lines 41-42: “Non-respiratory motion clusters are identified”); in response to the motion pattern satisfying the non-respirator motion condition, classify a motion of the patient as non-respiratory motion when the non-respiratory motion (Column 2, lines 19-28: “The at least one processor is configured for the at least one camera to compute a difference signal based on the absolute differences between a current image and a plurality of reference images in a temporal neighborhood from the received video of the subject, and identify a cluster of respiratory motion of the subject based on the difference signal. The at least one processor is further configured to segment an upper body of the body parts based on the identified cluster of respiratory motion of the subject and identify at least one cluster of non-respiratory motion”; Column 5, lines 41-42: “Non-respiratory motion clusters are identified”). However, Heinrich does not teach displaying a visual indicator via the graphical user interface, the visual indicator being a partially transparent graphical overly displayed on at least a portion of the graph between the first time and the second time, the visual indicator providing an indication of a reduced veracity of the patient respiratory data displayed between the first time and the second time due to the classified non-respiratory motion. 
Gore discloses a system and method for determining physiological parameters. Specifically, Gore teaches displaying a visual indicator via the graphical user interface ([0051]: “a graph 160 illustrating the results of various embodiments using signal pre-conditioning and MIMO AIC for extracting respiratory/breathing data along with motion/activity index information for conditions of normal breathing, as well as patient motion (illustrated in the portion 162), deep breathing (illustrated in the portion 164), apnea (illustrated in the portion 166), shallow breathing (illustrated in the portion 168), and fast breathing (illustrated in the portion 170). … It should be noted that the MIMO architecture also provides non-respiratory motion information shown by the ambulatory motion index”; Fig. 9), the visual indicator providing an indication of a reduced veracity of the patient respiratory data displayed between the first time and the second time due to the classified non-respiratory motion (Fig. 9 shows patient motion in portion 162; [0002]: “Conventional measurement methods and systems that rely on capturing chest motion often suffer from poor accuracy due to motion artifacts, thus making the measurements unsatisfactory for monitoring”; [0031]: “various embodiments provide signal conditioning and a data acquisition method or algorithm for the separation, characterization and/or event attribution of physiological signals captured using electrical impedance measurements in the presence of noise sources and motion artifacts”; [0052]: “the channel 1 signal (signal 172) may be used as a threshold. In particular, if only breathing is detected, then the data from the remaining channels (signals 176) may be weighted less in classifying the current breathing condition of the patient. However, if motion is detected, then there is less confidence in the breathing signal and the signals from the remaining channels (signals 176) are given more weight. 
Thus, the channel 1 signal (signal 172) tracks a breathing rate, which is supplemented by the ambulatory motion index”. Fig. 9 shows the graph of respiratory data, and the box 162 is the visual indicator. The visual indicator 162 indicates a reduced veracity of the respiratory data, since the visual indicator 162 shows when there is motion present, and as stated in paragraph [0052], when motion is detected, there is less confidence in the breathing signal.).

Heinrich and Gore are analogous arts as they are both related to measuring respiratory parameters and motions of a user. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the non-respiratory motion on the graph from Gore into the method from Heinrich, as doing so allows the method to display more information to the user and provides a more comprehensive analysis of the measured parameters.

However, the Heinrich/Gore combination does not teach the visual indicator being a partially transparent graphical overly displayed on at least a portion of the graph between the first time and the second time.

Regan discloses methods and apparatus for displaying and analyzing medical data. Specifically, Regan teaches the visual indicator being a partially transparent graphical overlay displayed on at least a portion of the graph ([0042]: “the overlaid graph 502 is overlying the graph illustrated in FIG. 4 and is semi-transparent such that the underlying graph is viewable concurrently with the overlaid graph 502”; [0286]: “the generated health indicator can be output for overlay on at least a portion of a display, such as a display for displaying the representation of the surgical site. The overlay may be partially transparent”). Heinrich, Gore, and Regan are analogous arts as they are all related to measuring respiratory parameters of a user and displaying the results.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the partially transparent visual indicator from Regan in the Heinrich/Gore combination, as both are known visual indicators and the change would be a simple substitution.

Regarding claim 9, the Heinrich/Gore/Regan combination teaches the image-based patient monitoring system of claim 8, further comprising an image capture sensor, the operations further comprising capture, by the image capture sensor (Heinrich, Column 3, lines 26-27: “The system monitors individual subjects 12 with at least one camera”), a first image of the region of the patient at the first time; and capture, by the image capture sensor, a second image of the region of the patient at the second time (Heinrich, Column 3, lines 55-56: “The system includes a motion unit 40 which receives the video images or streamed video of the subject”; Column 2, lines 19-23: “The at least one processor is configured for the at least one camera to compute a difference signal based on the absolute differences between a current image and a plurality of reference images in a temporal neighborhood from the received video of the subject”), wherein identifying the motion pattern further includes comparing the first captured image and the second captured image (Heinrich, Column 10, lines 6-9: “identify a cluster of non-respiratory motion in the streamed video of the subject different from the identified clusters of respiratory and body part motion”; Column 5, lines 42-46: “Each motion cluster can include attributes of a size, a shape, a direction, a distance, physical location relative to chest or other identified cluster, and/or a velocity. The attributes can be relative to the respiratory cluster and be relative to the body proportions of the subject”).
Regarding claim 12, the Heinrich/Gore/Regan combination teaches the image-based patient monitoring system of claim 9. However, the Heinrich/Gore/Regan combination does not teach wherein the patient respiratory data over time is obtained from one or more of a transthoracic impedance sensor, an electrocardiograph, capnograph, spirometer, pulse oximeter, or a manual user entry. Gore teaches wherein the patient respiratory data over time is obtained from one or more of a transthoracic impedance sensor, an electrocardiograph, capnograph, spirometer, pulse oximeter, or a manual user entry ([0021]: “One embodiment of an impedance measurement system 20 is illustrated in FIG. 1, which may be a transducer-based system, for example, an electrode-based system, such as a patient monitor that may form part of a patient monitoring device, such as an electrocardiography (ECG) monitoring device or an impedance cardiography module … electrical impedance measurements obtained may be used in at least one embodiment to separate respiratory rate from patient motion”; [0051]: “the primary channel … tracks the referenced spirometer signal”). Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the respiratory data being obtained from the sensors from Gore into the Heinrich/Gore/Regan combination, as the combination is silent on what is used to measure respiratory data, and Gore discloses suitable sensors for measuring respiratory parameters in an analogous device.

Regarding claim 14, the Heinrich/Gore/Regan combination teaches the image-based patient monitoring system of claim 9, the operations further comprising: determine patient respiratory data during periods of time in which the motion of the patient is classified as substantially respiratory motion of the patient (Heinrich, Column 6, lines 54-57: “The signal peaks correspond to respiratory cycles, e.g.
peak to peak corresponds to one respiratory cycle. The periodicity of the signal clearly indicates movement indicative of respiration”). Regarding independent claim 15, Heinrich teaches a patient monitoring system (Column 2, lines 15-16: “a patient monitoring system”), comprising: a display (Column 4, lines 43-45: “A monitoring unit 54 receives and configures for display the classified motion and corresponding history portion of the video image, e.g. time segment of video”) having a graphical user interface configured to provide patient respiratory data over time (Column 4, lines 43-45: “A monitoring unit 54 receives and configures for display the classified motion and corresponding history portion of the video image, e.g. time segment of video”; Column 6, lines 52-55: “The illustrated difference signal is plotted with the difference signal value on the y-axis and time on the x-axis. The signal peaks correspond to respiratory cycles”); at least one processor (Column 2, lines 15-19: “a patient monitoring system includes a plurality of thermal or near infrared video cameras which include at least one camera configured to continuously receive video of one patient, and at least one configured processor”); and memory storing computer-executable instructions that when executed by the at least one processor cause the patient monitoring system to perform operations (Column 3, lines 42-54: “The video stream from each camera is streamed to one or more processors 30 such as the processor of a computer or workstation 32. The video images can be stored in a video image data store 34. The data store can include random access memory (RAM) or non-transitory computer readable media such as disk, solid state disk, server storage, etc. The data store can include file structure, database structure, and the like. The processing can include separate monitoring, e.g. dedicated workstation, or combined monitoring, e.g. configured server. 
The workstation can include other functions such as central monitoring of vital signs for one or more subjects. The workstation can be part of or connect to a central monitoring system with alerts or alarms”), comprising: receive, from a depth camera (Column 3, lines 26-32: “The system monitors individual subjects 12 with at least one camera 14 such as a video camera, a thermal camera, a near infrared camera, e.g. nightvision, or a combination thereof. The camera continuously stream images of the subject such as hospital patients in normal light room conditions and darkened room conditions such as during the night.”; Column 7, lines 42-45: “The selected feature set attributes are compared with the attributes of motions clusters such as size, distance, direction, speed, orientation, frequency, etc. and association with segmented body parts to classify the movement”), time series image data of a patient over a period of time (Column 3, lines 55-56: “The system includes a motion unit 40 which receives the video images or streamed video of the subject”; Column 2, lines 19-23: “The at least one processor is configured for the at least one camera to compute a difference signal based on the absolute differences between a current image and a plurality of reference images in a temporal neighborhood from the received video of the subject”); process the time series image data to identify a motion pattern over the period of time (Column 6, lines 21-23: “absolute difference images are computed between the current image and the reference image”; Column 2, lines 2-5: “The motion unit identifies clusters of motion of the subject based on the received video of the subject. 
The segmentation unit segments body parts of the subject based on the identified clusters of subject motion.”), the motion pattern corresponding to non-respiratory motion (Column 5, lines 41-42: “Non-respiratory motion clusters are identified”); in response to identifying the motion pattern, determine patient motion over the period of time is non-respiratory motion (Column 2, lines 19-28: “The at least one processor is configured for the at least one camera to compute a difference signal based on the absolute differences between a current image and a plurality of reference images in a temporal neighborhood from the received video of the subject, and identify a cluster of respiratory motion of the subject based on the difference signal. The at least one processor is further configured to segment an upper body of the body parts based on the identified cluster of respiratory motion of the subject and identify at least one cluster of non-respiratory motion”; Column 5, lines 41-42: “Non-respiratory motion clusters are identified”). However, Heinrich does not teach displaying a visual indicator via the graphical user interface, the visual indicator being a partially transparent graphical overlay displayed on at least a portion of the graph between the first time and the second time, the visual indicator providing an indication of a reduced veracity of the patient respiratory data displayed between the first time and the second time due to the classified non-respiratory motion. Gore discloses a system and method for determining physiological parameters.
Specifically, Gore teaches displaying, via the graphical user interface on a portion of the graph corresponding to the period of time, a visual indicator ([0051]: “a graph 160 illustrating the results of various embodiments using signal pre-conditioning and MIMO AIC for extracting respiratory/breathing data along with motion/activity index information for conditions of normal breathing, as well as patient motion (illustrated in the portion 162), deep breathing (illustrated in the portion 164), apnea (illustrated in the portion 166), shallow breathing (illustrated in the portion 168), and fast breathing (illustrated in the portion 170). … It should be noted that the MIMO architecture also provides non-respiratory motion information shown by the ambulatory motion index”; Fig. 9), the visual indicator providing an indication of a reduced veracity of the patient respiratory data displayed between the first time and the second time due to the classified non-respiratory motion (Fig. 9 shows patient motion in portion 162; [0002]: “Conventional measurement methods and systems that rely on capturing chest motion often suffer from poor accuracy due to motion artifacts, thus making the measurements unsatisfactory for monitoring”; [0031]: “various embodiments provide signal conditioning and a data acquisition method or algorithm for the separation, characterization and/or event attribution of physiological signals captured using electrical impedance measurements in the presence of noise sources and motion artifacts”; [0052]: “the channel 1 signal (signal 172) may be used as a threshold. In particular, if only breathing is detected, then the data from the remaining channels (signals 176) may be weighted less in classifying the current breathing condition of the patient. However, if motion is detected, then there is less confidence in the breathing signal and the signals from the remaining channels (signals 176) are given more weight. 
Thus, the channel 1 signal (signal 172) tracks a breathing rate, which is supplemented by the ambulatory motion index”. Fig. 9 shows the graph of respiratory data and the box 162 is the visual indicator. The visual indicator 162 indicates a reduced veracity of the respiratory data, since the visual indicator 162 shows that there is motion present, and as stated in paragraph [0052], when there is motion detected, there is less confidence in the breathing signal.). Heinrich and Gore are analogous arts as they are both related to measuring respiratory parameters and motions of a user. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the non-respiratory motion on the graph from Gore into the method from Heinrich, as it allows the method to display more information to the user, providing a more comprehensive analysis of the measured parameters.

However, the Heinrich/Gore combination does not teach the visual indicator being a partially transparent graphical overlay displayed on at least a portion of the graph between the first time and the second time. Regan discloses methods and apparatus for displaying and analyzing medical data. Specifically, Regan teaches the visual indicator being a partially transparent graphical flag ([0042]: “the overlaid graph 502 is overlying the graph illustrated in FIG. 4 and is semi-transparent such that the underlying graph is viewable concurrently with the overlaid graph 502”). Heinrich, Gore, and Regan are analogous arts as they are all related to measuring respiratory parameters of a user and displaying the results.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the partially transparent visual indicator from Regan into the Heinrich/Gore combination, as they both contain known visual indicators, and therefore it would be a simple substitution.

Regarding claim 16, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 15, wherein the time series image data is a video of the patient (Heinrich, Column 2, lines 15-18: “a patient monitoring system includes a plurality of thermal or near infrared video cameras which include at least one camera configured to continuously receive video of one patient”).

Regarding claim 17, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 15, wherein the partially transparent graphical flag is displayed over or under the portion of the graph corresponding to the period of time (Gore, Fig. 9 shows patient motion in portion 162 displayed over the graph).

Regarding claim 20, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 15, the operations further comprising: emit an alert when the partially transparent graphical flag is displayed via the graphical user interface (Heinrich, Column 6, lines 11-13: “Classifying movement can include providing alerts and/or alarms based on the classification.”). However, the Heinrich/Gore/Regan combination does not teach the alert being an audio alert. Gore teaches an audio alert ([0052]: “using this monitoring approach different warnings or notifications (e.g., visual or audible notifications) may be provided”).
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the audio alert from Gore into the Heinrich/Gore/Regan combination, as the combination is silent on the type of alert, and Gore provides a suitable type of alert in an analogous device.

Claims 3 and 10 are rejected under 35 U.S.C. 103 as being unpatentable over the Heinrich/Gore/Regan combination as applied to claims 1 and 8 above, and further in view of Xia (US 20210393215).

Regarding claim 3, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 1. However, the Heinrich/Gore/Regan combination does not teach wherein the non-respiratory motion condition includes a magnitude of motion threshold, and wherein the motion pattern satisfies the magnitude of motion threshold. Xia discloses systems and methods for motion detection. Specifically, Xia teaches wherein the motion condition includes a magnitude of motion threshold, and wherein the motion pattern satisfies the magnitude of motion threshold ([0117]: “the physiological motion at a certain time point may be regarded as being smooth or minimal if the motion amplitude at the certain time point is below a first threshold”; [0110]: “The information relating to a physiological motion may include a motion rate, a motion amplitude (or displacement), a motion cycle, a motion phase, or the like, or any combination thereof. In some embodiments, the motion data may include … a posture signal relating to the posture motion of the subject”. The physiological motion can be the motion of the user, which is analyzed and determined to be minimal when the motion is below a first threshold.
Conversely, the motion is classified as not minimal, and therefore as affecting the data, when it exceeds the threshold, which can be interpreted as the magnitude of motion threshold.). Heinrich, Gore, and Xia are analogous arts as they are all related to monitoring respiratory data and motion from a user. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the threshold from Xia into the Heinrich/Gore/Regan combination, as the combination is silent on the specific process used to determine non-respiratory data, and Xia discloses a suitable threshold in an analogous device.

Regarding claim 10, the Heinrich/Gore/Regan combination teaches the image-based patient monitoring system of claim 8. However, the Heinrich/Gore/Regan combination does not teach wherein the non-respiratory motion condition includes a magnitude of motion threshold, and wherein the motion pattern satisfies the magnitude of motion threshold. Xia discloses systems and methods for motion detection. Specifically, Xia teaches wherein the non-respiratory motion condition includes a magnitude of motion threshold, and wherein the motion pattern satisfies the magnitude of motion threshold ([0117]: “the physiological motion at a certain time point may be regarded as being smooth or minimal if the motion amplitude at the certain time point is below a first threshold”; [0110]: “The information relating to a physiological motion may include a motion rate, a motion amplitude (or displacement), a motion cycle, a motion phase, or the like, or any combination thereof. In some embodiments, the motion data may include … a respiratory signal relating to a respiratory motion of the subject, a posture signal relating to the posture motion of the subject”).
Heinrich, Gore, and Xia are analogous arts as they are all related to monitoring respiratory data and motion from a user. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the threshold from Xia into the Heinrich/Gore/Regan combination as the combination is silent on the specific process used to determine non-respiratory data, and Xia discloses a suitable threshold in an analogous device.

Claims 4 and 11 are rejected under 35 U.S.C. 103 as being unpatentable over the Heinrich/Gore/Regan combination as applied to claims 2 and 9 above, and further in view of Strasser (US 20250281116).

Regarding claim 4, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 2, wherein the non-respiratory motion condition includes a motion condition, and wherein the motion condition is satisfied when the first generated distance signal and the second generated distance signal represent movement of portions of the region of the patient relative to a medial axis of the region of the patient (Heinrich, Column 4, lines 8-26: “classification unit 44 classifies subject motion based on a frequency and measures of motion of the clusters and segmented body parts such as angle, speed, location, distance, acceleration, and the like. For example, a movement of the trunk/chest area of the subject from the bed to a floor with the body axis remaining parallel to the bed/floor is indicative of a patient falling out of bed. In another example, a movement of the trunk/chest area of the subject from the bed to an elevated level and a change in body axis from parallel to the floor to perpendicular to the floor is indicative of a patient getting out of bed. The classification unit can interpret the repetitiveness of body part motions and motion measures of higher-level motions.
For example, higher-level motions such as pinching skin, grabbing at the air indicative of delirium can be classified. The classification unit can also de-identify recorded video, e.g. insert into the video a covering over a patient's face and other parts of the body which may identify the patient”). However, the Heinrich/Gore/Regan combination does not teach that the movement relative to the medial axis is asymmetrical. Strasser teaches systems and methods for analyzing a user’s movement. Specifically, Strasser teaches wherein the motion condition includes an asymmetrical motion condition relative to an axis ([0154]: “a device for stroke detection may include a wearable device for measuring changes in motion (e.g., in three axes), for example asymmetrical changes in motion to detect tremors”). Heinrich, Gore, and Strasser are analogous arts as they are all related to systems that monitor and classify the motion of a user for analysis. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the asymmetrical motion condition into the Heinrich/Gore/Regan combination as it allows the combination to classify tremors and provide a more comprehensive analysis to the user.

Regarding claim 11, the Heinrich/Gore/Regan combination teaches the image-based patient monitoring system of claim 9, wherein the non-respiratory motion condition includes a motion condition, and wherein the motion condition is satisfied when the first generated distance signal and the second generated distance signal represent movement of portions of the region of the patient relative to a medial axis of the region of the patient (Heinrich, Column 4, lines 8-26: “classification unit 44 classifies subject motion based on a frequency and measures of motion of the clusters and segmented body parts such as angle, speed, location, distance, acceleration, and the like.
For example, a movement of the trunk/chest area of the subject from the bed to a floor with the body axis remaining parallel to the bed/floor is indicative of a patient falling out of bed. In another example, a movement of the trunk/chest area of the subject from the bed to an elevated level and a change in body axis from parallel to the floor to perpendicular to the floor is indicative of a patient getting out of bed. The classification unit can interpret the repetitiveness of body part motions and motion measures of higher-level motions. For example, higher-level motions such as pinching skin, grabbing at the air indicative of delirium can be classified. The classification unit can also de-identify recorded video, e.g. insert into the video a covering over a patient's face and other parts of the body which may identify the patient”). However, the Heinrich/Gore/Regan combination does not teach that the movement relative to the medial axis is asymmetrical. Strasser teaches systems and methods for analyzing a user’s movement. Specifically, Strasser teaches wherein the motion condition includes an asymmetrical motion condition relative to an axis ([0154]: “a device for stroke detection may include a wearable device for measuring changes in motion (e.g., in three axes), for example asymmetrical changes in motion to detect tremors”). Heinrich, Gore, and Strasser are analogous arts as they are all related to systems that monitor and classify the motion of a user for analysis. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the asymmetrical motion condition into the Heinrich/Gore/Regan combination as it allows the combination to classify the motion into tremors and provide a more comprehensive analysis to the user. Claims 6, 13, and 18-19 are rejected under 35 U.S.C. 
103 as being unpatentable over the Heinrich/Gore/Regan combination as applied to claims 1, 9, and 15 above, and further in view of Redtel (US 20210068670).

Regarding claim 6, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 1. However, the Heinrich/Gore/Regan combination does not teach wherein the partially transparent graphical overlay of the visual indicator is displayed in a contrasting color relative to the graph plotting the patient respiratory data over time. Redtel teaches devices to record and analyze images of a user. Specifically, Redtel teaches wherein the partially transparent graphical overlay of the visual indicator is displayed in a contrasting color relative to the graph plotting the patient respiratory data over time ([0213]: “The color or transparency of the overlay in the regions of the tiles is derived from the currently determined intensity of the color change.”). Heinrich, Gore, Regan, and Redtel are analogous arts as they are all related to measuring respiratory parameters of a user and analyzing images. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the color from Redtel into the Heinrich/Gore/Regan combination, as the combination is silent on the color of the indicator, and Redtel discloses a suitable color in an analogous device.

Regarding claim 13, the Heinrich/Gore/Regan combination teaches the image-based patient monitoring system of claim 9. However, the Heinrich/Gore/Regan combination does not teach wherein the partially transparent graphical overlay of the visual indicator is displayed in a contrasting color relative to the graph plotting the patient respiratory data over time. Redtel teaches devices to record and analyze images of a user.
Specifically, Redtel teaches wherein the partially transparent graphical flag of the visual indicator is displayed in a different color based on an extent of the reduced veracity of the patient respiratory data between the first time and the second time ([0213]: “The color or transparency of the overlay in the regions of the tiles is derived from the currently determined intensity of the color change.”). Heinrich, Gore, Regan, and Redtel are analogous arts as they are all related to measuring respiratory parameters of a user and analyzing images. Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the color from Redtel into the Heinrich/Gore/Regan combination, as the combination is silent on the color of the indicator, and Redtel discloses a suitable color in an analogous device.

Regarding claim 18, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 15. However, the Heinrich/Gore/Regan combination does not teach wherein the partially transparent graphical overlay of the visual indicator is displayed in a contrasting color relative to the graph plotting the patient respiratory data over time. Redtel teaches devices to record and analyze images of a user. Specifically, Redtel teaches wherein the partially transparent graphical flag is displayed in a different color based on an extent of the reduced veracity of the patient respiratory data over the period of time ([0213]: “The color or transparency of the overlay in the regions of the tiles is derived from the currently determined intensity of the color change.”). Heinrich, Gore, Regan, and Redtel are analogous arts as they are all related to measuring respiratory parameters of a user and analyzing images.
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the color from Redtel into the Heinrich/Gore/Regan combination as the combination is silent on the color of the indicator, and Redtel discloses a suitable color in an analogous device. Regarding claim 19, the Heinrich/Gore/Regan combination teaches the patient monitoring system of claim 18, wherein the extent of the reduced veracity corresponds to an extent of the motion of the patient being attributable to the non-respiratory motion during the period of time (Gore, Fig. 9 shows patient motion in portion 162; [0002]: “Conventional measurement methods and systems that rely on capturing chest motion often suffer from poor accuracy due to motion artifacts, thus making the measurements unsatisfactory for monitoring”; [0031]: “various embodiments provide signal conditioning and a data acquisition method or algorithm for the separation, characterization and/or event attribution of physiological signals captured using electrical impedance measurements in the presence of noise sources and motion artifacts”; [0052]: “the channel 1 signal (signal 172) may be used as a threshold. In particular, if only breathing is detected, then the data from the remaining channels (signals 176) may be weighted less in classifying the current breathing condition of the patient. However, if motion is detected, then there is less confidence in the breathing signal and the signals from the remaining channels (signals 176) are given more weight. Thus, the channel 1 signal (signal 172) tracks a breathing rate, which is supplemented by the ambulatory motion index”; [0051]: “It should be noted that the MIMO architecture also provides non-respiratory motion information shown by the ambulatory motion index”. The system monitors both breathing rate and motion, and determines that if motion is detected, then there is less confidence in the respiratory signal.). 
Response to Arguments

All of applicant’s arguments regarding the rejections and objections previously set forth have been fully considered and are persuasive unless directly addressed subsequently. Applicant’s arguments with respect to claims 1-14 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ERIN K MCCORMACK whose telephone number is (703)756-1886. The examiner can normally be reached Mon-Fri 7:30-5. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jason Sims, can be reached at 571-272-7540. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/E.K.M./Examiner, Art Unit 3791
/MATTHEW KREMER/Primary Examiner, Art Unit 3791

Prosecution Timeline

Jun 07, 2023
Application Filed
Sep 12, 2025
Non-Final Rejection — §103
Dec 17, 2025
Response Filed
Mar 09, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12558004
SENSOR DEVICE MONITORS FOR CALIBRATION
2y 5m to grant Granted Feb 24, 2026
Patent 12484793
APPARATUS AND METHOD FOR ESTIMATING BLOOD PRESSURE
2y 5m to grant Granted Dec 02, 2025
Patent 12419557
PRESSURE SENSOR ARRAY FOR URODYNAMIC TESTING AND A TEST APPARATUS INCLUDING THE SAME
2y 5m to grant Granted Sep 23, 2025
Based on 3 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
14%
Grant Probability
74%
With Interview (+60.0%)
3y 10m
Median Time to Grant
Moderate
PTA Risk
Based on 22 resolved cases by this examiner. Grant probability derived from career allow rate.
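The headline projection figures are consistent with the stated career data. A minimal sketch of the arithmetic, assuming (as the caption suggests, but the page does not document) that grant probability is simply the career allow rate rounded to the nearest percentage point and that the interview lift is added in percentage points:

```python
# Hypothetical reconstruction of the dashboard figures; the additive
# interview lift is an assumption, not documented methodology.
def grant_probability(granted: int, resolved: int) -> int:
    """Career allow rate as a whole percentage."""
    return round(100 * granted / resolved)

base = grant_probability(3, 22)   # 3 granted / 22 resolved -> 14
with_interview = base + 60        # +60.0% interview lift, in points -> 74

print(base, with_interview)
```

Under these assumptions the 3-of-22 record yields the displayed 14% baseline and 74% with-interview figure.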
