Prosecution Insights
Last updated: April 19, 2026
Application No. 17/693,579

SENSOR DEVICE WITH A SELECTIVELY ACTIVATABLE DISPLAY

Final Rejection: §103, §112
Filed: Mar 14, 2022
Examiner: KREMER, MATTHEW
Art Unit: 3791
Tech Center: 3700 (Mechanical Engineering & Manufacturing)
Assignee: Leaf Healthcare Inc.
OA Round: 4 (Final)
Grant Probability: 44% (Moderate)
OA Rounds: 5-6
To Grant: 4y 5m
With Interview: 96%

Examiner Intelligence

Career Allow Rate: 44% (196 granted / 448 resolved; -26.2% vs TC avg)
Interview Lift: +51.9% (strong; based on resolved cases with interview)
Typical Timeline: 4y 5m avg prosecution; 58 currently pending
Career History: 506 total applications across all art units
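The card values above are simple ratios over the examiner's resolved cases. Below is a minimal sketch of that arithmetic; the helper names and the Tech Center average figure are illustrative assumptions, not taken from any real analytics API (a ~70% TC average is inferred from the -26.2% delta shown above).

```python
# Illustrative arithmetic behind the examiner-statistics cards above.
# Helper names and the TC-average figure are assumptions, not a real API.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate: share of resolved cases that were granted."""
    return granted / resolved

def delta_vs_tc(rate: float, tc_average: float) -> float:
    """Signed difference versus the Tech Center average, in percentage points."""
    return (rate - tc_average) * 100

rate = allow_rate(196, 448)       # 0.4375, displayed as 44%
delta = delta_vs_tc(rate, 0.70)   # an assumed ~70% TC average gives about -26.2
```

The interview-lift figure would come from the same ratio computed separately for cases with and without an examiner interview, but the underlying counts for that split are not shown on this page.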

Statute-Specific Performance

§101: 6.2% (-33.8% vs TC avg)
§103: 35.5% (-4.5% vs TC avg)
§102: 14.0% (-26.0% vs TC avg)
§112: 36.2% (-3.8% vs TC avg)
Tech Center averages are estimates. Based on career data from 448 resolved cases.
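Each statute-specific delta above is just the examiner's rate minus the Tech Center estimate, so the implied baseline can be recovered directly. A quick sketch, with the data transcribed from the list above and an illustrative helper name:

```python
# Recover the implied Tech Center baseline from each statute card above.
# Values are percentages transcribed from the section; names are illustrative.

rates_and_deltas = {
    "101": (6.2, -33.8),
    "103": (35.5, -4.5),
    "102": (14.0, -26.0),
    "112": (36.2, -3.8),
}

def implied_tc_average(rate: float, delta: float) -> float:
    """Baseline implied by a rate and its signed delta (rate = baseline + delta)."""
    return rate - delta

baselines = {k: implied_tc_average(r, d) for k, (r, d) in rates_and_deltas.items()}
# every statute card implies roughly the same ~40% Tech Center baseline
```

Notably, all four cards are consistent with a single estimated Tech Center baseline of about 40%.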

Office Action

Grounds: §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Interpretation

The following is a quotation of 35 U.S.C. 112(f):

(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:

An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.

The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.

As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:

(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always, linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.

Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.

Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material, or acts to entirely perform the recited function.

Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. No claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.

Claim Objections

Claims 36 and 38 are objected to because of the following informalities: in claim 36, line 1, “Claim 0” should be “claim 34”; and in claim 38, line 1, “Claim 0” should be “claim 34”. Appropriate correction is required.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

Claims 23-25, 27, 29, 31-32, 34, 36, and 38 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 23 recites “without manual interaction with the user-wearable sensor device: automatically identity a repositioning of the user that remedies the defined pressurization-related condition based on the monitored orientation of the user over time; and automatically switch the display unit to the inactive state in response to the automatic identification of the repositioning of the user that remedies the defined pressurization-related condition” in lines 21-26, which renders the claim unclear in light of the specification. When a user is repositioned, the sensor device detects this movement (see paragraph 00275 of the specification) such that there is clearly manual interaction with the sensor device so as to automatically identify the repositioning. This understanding of the specification runs counter to the claim language such that there is confusion as to what is meant by the recitation in light of the teachings of the specification. This confusion renders claim 23 indefinite. Claims 24-25 and 27 are rejected by virtue of their dependence from claim 23.

Claim 29 recites “without manual interaction with the user-wearable sensor device: automatically identify a repositioning of the user that remedies the defined pressurization-related condition based on the monitored orientation of the user over time; and in response to the automatic identification of the repositioning of the user that remedies the defined pressurization-related condition, automatically switch the display unit to the inactive state” in lines 22-27, which renders the claim unclear in light of the specification. When a user is repositioned, the sensor device detects this movement (see paragraph 00275 of the specification) such that there is clearly manual interaction with the sensor device so as to automatically identify the repositioning. This understanding of the specification runs counter to the claim language such that there is confusion as to what is meant by the recitation in light of the teachings of the specification. This confusion renders claim 29 indefinite. Claims 31-32 are rejected by virtue of their dependence from claim 29.

Claim 34 recites “without manual interaction with the user-wearable sensor device: automatically identifying a repositioning of the user that remedies the defined pressurization-related condition based on the monitored orientation of the user over time, and in response to the automatic identification of the repositioning of the user that remedies the defined pressurization-related condition, automatically switching the display unit from the activated state to the inactive state” in lines 20-25, which renders the claim unclear in light of the specification. When a user is repositioned, the sensor device detects this movement (see paragraph 00275 of the specification) such that there is clearly manual interaction with the sensor device so as to automatically identify the repositioning. This understanding of the specification runs counter to the claim language such that there is confusion as to what is meant by the recitation in light of the teachings of the specification. This confusion renders claim 34 indefinite. Claims 36 and 38 are rejected by virtue of their dependence from claim 34.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 23-25, 27, 29, 31-32, 34, 36, and 38 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Application Publication No. 2011/0263950 (Larson) (previously cited), in view of U.S. Patent Application Publication No. 2006/0097983 (Haggman) (previously cited), and in view of U.S. Patent No. 6,397,190 (Goetz) (previously cited), and further in view of U.S. Patent Application Publication No. 2014/0118138 (Cobelli) (previously cited).

Larson teaches a user-wearable sensor device (paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson; paragraph 0054 of Larson teaches that the sensors can be embedded in articles worn by the patient, such as shirts or underwear bracelets, belts, or collars) configured to be directly or indirectly secured to a user or to an article worn by the user, the user-wearable sensor device comprising: a sensor device housing (paragraph 0054 of Larson teaches that the sensors can be embedded in articles worn by the patient, such as shirts or underwear bracelets, belts, or collars such that the shirt, underwear bracelet, belt, or collar is considered to be the sensor housing, with the sensor housing schematically drawn in FIG. 3 of Larson); at least one accelerometer (the accelerometer of Larson; paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson); a display unit including at least one visual indicator (the display of Larson; paragraphs 0047-0048, 0050-0053, 0139, and 0279 of Larson); and a battery (the battery of Larson; paragraphs 0052-0054, 0114, 0164, and 0171 of Larson).

Paragraph 0052 of Larson teaches that the viewing terminal can be integrated into the patient sensor. Also, Larson teaches that the host system may provide directions and display messages (paragraph 0047 of Larson) and that the host functionality can largely reside in the sensor itself (paragraph 0052 of Larson). Thus, Larson teaches or suggests that the display unit may be integrated into or reside in the sensor device. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have the display unit integrated into and/or residing in the sensor device since (1) Larson teaches that viewing terminals can be integrated into the patient sensor and/or the host functionality, including the directions and display messages, may reside on the sensor; (2) it permits the review of the display at the patient itself without a separate monitor; (3) it provides a more compact system; and/or (4) it is a simple substitution of one known element for another to obtain predictable results.

Larson teaches various modes of activating the sensing device, including by switch (paragraphs 0053, 0115, 0117, and 0135 of Larson). Haggman discloses a method of interaction with an input component on the device in the form of a tapping interface along a plurality of sides of the device using one or more accelerometers as the motion sensors (FIGS. 3-4C and paragraphs 0020-0032 of Haggman). Haggman further teaches that the accelerometers already existing in the device may be used in the tapping interface (paragraph 0020 of Haggman). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to use the tapping control of Haggman in the system and method of Larson since it provides a simple interface for input and/or it is a simple substitution of one known element for another to obtain predictable results. Further, it would have been obvious to use the already existing accelerometers of Larson as the accelerometers in the tapping control since it reduces the number of accelerometers.

Goetz teaches that placing devices in sleep mode may conserve battery life, in which the device is placed in sleep mode when there is no activity or alarm condition (col. 7, lines 25-56 of Goetz). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to place the sensing device including display into active mode when there is activity or an alarm condition and into sleep mode when there is no activity or alarm condition after a predetermined amount of time, as suggested by Goetz, since it would conserve battery life while giving the user an opportunity to monitor the device when desired.

Cobelli teaches that a transition from sleep/inactive mode to active mode can be instituted after the conditions of an alarm condition are met (paragraphs 0146 and 0163-0165 of Cobelli) and teaches that a sleep/inactive mode can be instituted after the conditions of an alarm condition are no longer met (paragraphs 0178-0180 of Cobelli).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to place the sensing device including display into an active mode from a sleep mode after conditions of an alarm condition are met and into a sleep mode after an alarm condition when the conditions for the alarm condition are no longer met, as suggested by Cobelli, so as to partially automate the process.

With respect to claim 23, the combination teaches or suggests a user-wearable sensor device (paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson; paragraph 0054 of Larson teaches that the sensors can be embedded in articles worn by the patient, such as shirts or underwear bracelets, belts, or collars) configured to be directly or indirectly secured to a user or to an article worn by the user, the user-wearable sensor device comprising: a sensor device housing (paragraph 0054 of Larson teaches that the sensors can be embedded in articles worn by the patient, such as shirts or underwear bracelets, belts, or collars such that the shirt, underwear bracelet, belt, or collar is considered to be the sensor housing, with the sensor housing schematically drawn in FIG. 3 of Larson); at least one accelerometer (the accelerometer of Larson; paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson); a display unit provided in or on the sensor device housing (the display unit of Larson integrated into and/or residing in the sensor device, which necessarily means that the display unit is provided in or on the sensor device housing); a control system comprising at least one processor configured to: identify a defined user interaction with the user-wearable sensor device based on first acceleration data generated by the at least one accelerometer (the tapping activation suggested by Haggman using the acceleration data from the accelerometer of Larson); in response to identifying the defined user interaction with the user-wearable sensor device, activate the display unit from an inactive state to an activated state to visually indicate a pressurization at one or more body regions (turning on the device including the display of Larson from the sleep mode as suggested by Goetz using the tapping activation suggested by Haggman; paragraphs 0053, 0115, 0117, and 0135 of Larson; paragraphs 0047-0048, 0050-0053, 0139, and 0279 of Larson); subsequently return the display unit to the inactive state (instigating the sleep mode after a predetermined period of time (no activity) of the sleep mode operation suggested by Goetz); automatically monitor an orientation of the user over time based on second acceleration data generated by the at least one accelerometer (the monitoring in FIG. 2B of Larson); automatically determine a defined pressurization-related condition based on the monitored orientation of the user over time (the step 245 in FIG. 2B of Larson); in response to the automatic determination of the defined pressurization-related condition, automatically switch the display unit from the inactive state to the activated state to visually indicate the defined pressurization-related condition or a repositioning notification (the step 260 in FIG. 2B of Larson with the display turned on from the sleep mode as suggested by Cobelli; paragraphs 0047-0048, 0050-0053, 0115, 0117, 0135, 0139, and 0279 of Larson); without manual interaction with the user-wearable sensor device: automatically identity a repositioning of the user that remedies the defined pressurization-related condition based on the monitored orientation of the user over time (detecting the removal of the alarm condition so as to instigate the sleep mode as suggested by Cobelli); and automatically switch the display unit to the inactive state in response to the automatic identification of the repositioning of the user that remedies the defined pressurization-related condition (instigating the sleep mode after removal of the alarm condition as suggested by Cobelli).

With respect to claim 24, the combination teaches or suggests that the at least one processor configured to subsequently return the display unit to the inactive state comprises the at least one processor configured to automatically switch the display unit from the activated state to the inactive state upon reaching a defined time-out period (instigating the sleep mode after a predetermined period of time (no activity) of the sleep mode operation suggested by Goetz).

With respect to claim 25, the combination teaches or suggests that the defined pressurization-related condition comprises a needed repositioning condition (the step 245 in FIG. 2B of Larson).
With respect to claim 27, the combination teaches or suggests that the display unit comprises an LED array or an LCD display (paragraphs 0047-0048, 0050-0053, 0139, and 0279 of Larson; paragraph 0279 of Larson teaches that an LCD display is a suitable form of display such that it would have been obvious to have the display unit of Larson integrated into and/or residing in the sensor device be an LCD display since a type of display is required and Larson teaches one such display and/or it is a simple substitution of one known element for another to obtain predictable results).

With respect to claim 29, the combination teaches or suggests a user-wearable sensor device (paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson; paragraph 0054 of Larson teaches that the sensors can be embedded in articles worn by the patient, such as shirts or underwear bracelets, belts, or collars) configured to be directly or indirectly secured to a user or to an article worn by the user, the user-wearable sensor device comprising: a sensor device housing (paragraph 0054 of Larson teaches that the sensors can be embedded in articles worn by the patient, such as shirts or underwear bracelets, belts, or collars such that the shirt, underwear bracelet, belt, or collar is considered to be the sensor housing, with the sensor housing schematically drawn in FIG. 3 of Larson); at least one accelerometer (the accelerometer of Larson; paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson); a display unit including a plurality of display elements provided in or on the sensor device housing (the display unit of Larson integrated into and/or residing in the sensor device, which necessarily means that the display unit is provided in or on the sensor device housing; paragraph 0279 of Larson teaches that an LCD display is a suitable form of display such that it would have been obvious to have the display unit of Larson integrated into and/or residing in the sensor device be an LCD display since a type of display is required and Larson teaches one such display and/or it is a simple substitution of one known element for another to obtain predictable results; the LCD display of Larson has a plurality of display elements with its liquid crystals); a control system comprising at least one processor configured to: identify a defined user interaction with the user-wearable sensor device based on first acceleration data generated by the at least one accelerometer (the tapping activation suggested by Haggman using the acceleration data from the accelerometer of Larson); in response to identifying the defined user interaction with the user-wearable sensor device, automatically awaken the display unit from an inactive state to visually indicate a pressurization at one or more body regions (turning on the device including the display of Larson from the sleep mode as suggested by Goetz using the tapping activation suggested by Haggman; paragraphs 0053, 0115, 0117, and 0135 of Larson; paragraphs 0047-0048, 0050-0053, 0139, and 0279 of Larson); subsequently return the display unit to the inactive state (instigating the sleep mode after a predetermined period of time (no activity) of the sleep mode operation suggested by Goetz); automatically monitor an orientation of the user over time based on second acceleration data generated by the at least one accelerometer (the monitoring in FIG. 2B of Larson); automatically determine a defined pressurization-related condition based on the monitored orientation of the user over time (the step 245 in FIG. 2B of Larson); in response to the automatic determination of the defined pressurization-related condition, awaken the display unit from the inactive state to visually indicate the defined pressurization-related condition (the step 260 in FIG. 2B of Larson with the display turned on from the sleep mode as suggested by Cobelli; paragraphs 0047-0048, 0050-0053, 0115, 0117, 0135, 0139, and 0279 of Larson); and without manual interaction with the user-wearable sensor device: automatically identify a repositioning of the user that remedies the defined pressurization-related condition based on the monitored orientation of the user over time (detecting the removal of the alarm condition so as to instigate the sleep mode as suggested by Cobelli); and in response to the automatic identification of the repositioning of the user that remedies the defined pressurization-related condition, automatically switch the display unit to the inactive state (instigating the sleep mode after removal of the alarm condition as suggested by Cobelli).

With respect to claim 31, the combination teaches or suggests that the display unit comprises an LED array or an LCD display (the LCD display of Larson).

With respect to claim 32, the combination teaches or suggests that the defined pressurization-related condition comprises a needed repositioning condition (the step 245 in FIG. 2B of Larson); and the at least one processor configured to automatically awaken the display unit from the inactive state to visually indicate the defined pressurization-related condition comprises the at least one processor configured to selectively control the plurality of display elements to visually indicate the needed repositioning condition (the step 260 in FIG.
2B of Larson with the display turned on from the sleep mode as suggested by Cobelli; paragraphs 0053, 0115, 0117, and 0135 of Larson; paragraphs 0047-0048, 0050-0053, 0139, and 0279 of Larson).

With respect to claim 34, the combination teaches or suggests a method of operating a user-wearable sensor device (paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson; paragraph 0054 of Larson teaches that the sensors can be embedded in articles worn by the patient, such as shirts or underwear bracelets, belts, or collars) configured to be directly or indirectly secured to a user or to an article worn by the user, the user-wearable sensor device including at least one accelerometer (the accelerometer of Larson; paragraphs 0014, 0051-0055, 0057, 0059, 0069, 0181, 0189, and 0226-0227 and FIG. 3 of Larson) and a display unit including a plurality of display elements onboard the user-wearable sensor device (the display unit of Larson integrated into and/or residing in the sensor device, which necessarily means that the display unit is onboard the sensor device; paragraph 0279 of Larson teaches that an LCD display is a suitable form of display such that it would have been obvious to have the display unit of Larson integrated into and/or residing in the sensor device be an LCD display since a type of display is required and Larson teaches one such display and/or it is a simple substitution of one known element for another to obtain predictable results; the LCD display of Larson has a plurality of display elements with its liquid crystals), the method comprising: identifying a defined user interaction with the user-wearable sensor device based on first acceleration data generated by the at least one accelerometer (the tapping activation suggested by Haggman using the acceleration data from the accelerometer of Larson); in response to identifying the defined user interaction with the sensor device, automatically awakening the display unit from an inactive state to an activated state, wherein the display elements are selectively controlled to indicate a pressurization at one or more body regions (turning on the device including the display of Larson from the sleep mode as suggested by Goetz using the tapping activation suggested by Haggman; paragraphs 0053, 0115, 0117, and 0135 of Larson; paragraphs 0047-0048, 0050-0053, 0139, and 0279 of Larson); subsequently returning the display unit to the inactive state (instigating the sleep mode after a predetermined period of time (no activity) of the sleep mode operation suggested by Goetz); while the display unit is in an inactive state, automatically monitoring an orientation of the user based on second acceleration data generated by the at least one accelerometer (the monitoring in FIG. 2B of Larson); and automatically determining a defined pressurization-related condition based on the monitored orientation of the user over time (the step 245 in FIG. 2B of Larson); in response to the automatic determination of the defined pressurization-related condition, automatically switching the display unit from the inactive state to the activated state, wherein the display elements are selectively controlled as a function of the defined pressurization-related condition (the step 260 in FIG. 2B of Larson with the display turned on from the sleep mode as suggested by Cobelli; paragraphs 0047-0048, 0050-0053, 0115, 0117, 0135, 0139, and 0279 of Larson); and without manual interaction with the user-wearable sensor device: automatically identifying a repositioning of the user that remedies the defined pressurization-related condition based on the monitored orientation of the user over time (detecting the removal of the alarm condition so as to instigate the sleep mode as suggested by Cobelli); and in response to the automatic identification of the repositioning of the user that remedies the defined pressurization-related condition, automatically switching the display unit from the activated state to the inactive state (instigating the sleep mode after removal of the alarm condition as suggested by Cobelli).

With respect to claim 36, the combination teaches or suggests that the display element unit comprises an LED display or an LCD display (the LCD display of Larson).

With respect to claim 38, the combination teaches or suggests that: the defined pressurization-related condition comprises a needed repositioning condition (the step 245 in FIG. 2B of Larson); and the plurality of display elements are selectively controlled to visually indicate the needed repositioning condition (the step 260 in FIG. 2B of Larson; paragraphs 0047-0048, 0050-0053, 0115, 0117, 0135, 0139, and 0279 of Larson).

Response to Arguments

The Applicant’s arguments filed 1/12/2026 have been fully considered.

Claim objections

In view of the claim amendments filed on 1/12/2026, there are new claim objections.

35 U.S.C. 112(b) or 35 U.S.C.
112 (pre-AIA), second paragraph

With respect to the use of the expression “without manual interaction with the user-wearable sensor device” in claims 23, 29, and 34, the Applicant asserts:

[Applicant’s argument reproduced as an image in the original Office Action; image not transcribed.]

This argument is not persuasive since “without manual interaction” means more than simply interaction not performed by the hands. For example, the definition by Merriam-Webster.com Dictionary that accompanies this Office Action provides that “manual” also includes “requiring or using physical skill or energy” (see page 1 of the accompanying entry from Merriam-Webster.com). Also, the definition by Cambridge.org Dictionary that accompanies this Office Action provides that “manual” also includes “relating to physical work rather than mental work” (see page 5 of the accompanying entry from Cambridge.org). From these definitions, the expression “without manual interaction” includes definitions in which “there is no interaction that requires physical energy” or “there is no interaction that requires physical work”. Both of these definitions run counter to the specification. When a user is repositioned (that is, there is physical energy or physical work that moves the user), the sensor device detects this movement (see paragraph 00275 of the specification) such that there is clearly manual interaction with the sensor device (that is, there is physical energy or physical work performed on the sensor device) so as to automatically identify the repositioning. This understanding of the specification runs counter to the claim language such that there is confusion as to what is meant by the recitation in light of the teachings of the specification. This confusion renders claims 23, 29, and 34 indefinite.

35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, fourth paragraph

In view of the claim amendments filed on 1/12/2026, the claim rejections under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, fourth paragraph, are withdrawn.
Prior art rejections

In view of the claim amendments filed on 1/12/2026, the prior art rejections have been modified so as to constitute new grounds of rejection that have been necessitated by the claim amendments.

The Applicant asserts:

[Applicant’s arguments reproduced as images in the original Office Action; images not transcribed.]

This argument is not persuasive. Paragraph 0052 of Larson teaches that the viewing terminal can be integrated into the patient sensor. Also, Larson teaches that the host system may provide directions and display messages (paragraph 0047 of Larson) and that the host functionality can largely reside in the sensor itself (paragraph 0052 of Larson). Thus, Larson teaches or suggests that the display unit may be integrated into or reside in the sensor device. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have the display unit integrated into and/or residing in the sensor device since (1) Larson teaches that viewing terminals can be integrated into the patient sensor and/or the host functionality, including the directions and display messages, may reside on the sensor; (2) it permits the review of the display at the patient itself without a separate monitor; (3) it provides a more compact system; and/or (4) it is a simple substitution of one known element for another to obtain predictable results. Since the display unit of Larson is integrated into and/or resides in the sensor device, the display unit is necessarily provided in or on the sensor device housing and/or is necessarily onboard the sensor device. Thus, the combination teaches or suggests the display unit of claims 23, 29, and 34.
The Applicant asserts:

[Applicant's argument reproduced as an image in the original action.]

After citing page 12 of the Office Action mailed on 10/24/2025, the Applicant asserts:

[Applicant's argument reproduced as an image in the original action.]

Further, after citing page 20 of the Office Action mailed on 10/24/2025, the Applicant asserts:

[Applicant's argument reproduced as an image in the original action.]

These arguments are not persuasive. Cobelli is relied upon for the generalized teaching of a transition from a sleep/inactive mode to an active mode that can be instituted after the conditions of an alarm condition are met (paragraphs 0146 and 0163-0165 of Cobelli), and of a sleep/inactive mode that can be instituted after the conditions of an alarm condition are no longer met (paragraphs 0178-0180 of Cobelli). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to place the sensing device, including the display, into an active mode from a sleep mode after the conditions of an alarm condition are met (that is, when there is a determination as to whether the data suggests that the patient should be repositioned soon in step 245 of Larson) and into a sleep mode after an alarm condition when the conditions for the alarm condition are no longer met (that is, when the caregiver at step 265 of Larson accepts the suggestion for a new position after receiving the message sent at step 260 of Larson), as suggested by Cobelli, so as to partially automate the process. It is Larson that teaches the automatic monitoring of an orientation of the user over time based on second acceleration data generated by the at least one accelerometer (the monitoring in FIG. 2B of Larson); the automatic determination of a defined pressurization-related condition based on the monitored orientation of the user over time (step 245 in FIG.
2B of Larson, when there is a determination as to whether the data suggests that the patient should be repositioned soon); and, in response to the automatic determination of the defined pressurization-related condition, visually indicating the defined pressurization-related condition or a repositioning notification (step 260 in FIG. 2B of Larson; paragraphs 0047-0048, 0050-0053, 0115, 0117, 0135, 0139, and 0279 of Larson). The incorporation of Cobelli means that step 260 in FIG. 2B of Larson is carried out by the display turning on from the sleep mode, as suggested by Cobelli, upon detection of the defined pressurization-related alarm condition.

Next, Larson also teaches that, after an alarm condition, the conditions for the alarm condition are no longer met. This takes the form of the caregiver at step 265 accepting the suggestion for a new position after receiving the message sent at step 260. Cobelli teaches that a sleep/inactive mode can be instituted after the conditions of an alarm condition are no longer met (paragraphs 0178-0180 of Cobelli). The incorporation of Cobelli means that, at step 265 in FIG. 2B of Larson, when the caregiver accepts the suggestion for a new position, the control system triggers the institution of the sleep/inactive mode based on the detection of the alarm conditions no longer being met, as suggested by Cobelli. Such a condition means that there is a step to automatically identify a repositioning of the user that remedies the defined pressurization-related condition based on the monitored orientation of the user over time, since Cobelli discloses the detection of the removal of the alarm condition (which occurs at step 265 of Larson) so as to institute the sleep mode. One cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references.
See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). For the above reasons, the combination teaches or suggests all the features of claims 23, 29, and 34, along with all the dependent claims.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW KREMER, whose telephone number is (571) 270-3394. The examiner can normally be reached Monday - Friday, 8 am to 6 pm, every other Friday off.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JACQUELINE CHENG, can be reached at (571) 272-5596.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/MATTHEW KREMER/
Primary Examiner, Art Unit 3791

1. Paragraph 0120 of U.S. Patent Application Publication No. 2017/0316677 (previously cited) also teaches that an accelerometer in a device can have multiple functions, including changing an LCD display mode and detecting movement. Additionally, paragraph 0061 of U.S. Patent Application Publication No. 2015/0334079 (previously cited) teaches that an accelerometer may have the dual functions of sensing movement of an object and activating the electronic display device of the object.
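The Larson/Cobelli combination argued above amounts to a simple two-state display controller: wake from sleep when the pressurization-related alarm condition is met, return to sleep once it is no longer met. The sketch below is illustrative only; the class and method names are invented for this example and appear in neither reference.

```python
# Illustrative sketch (not from Larson or Cobelli): a display controller that
# wakes from a sleep/inactive mode when an alarm condition is met and returns
# to sleep once the condition clears, as in the combination proposed above.

class DisplayController:
    def __init__(self):
        self.mode = "sleep"  # display starts in the sleep/inactive mode

    def update(self, alarm_condition_met: bool) -> str:
        if alarm_condition_met and self.mode == "sleep":
            # Alarm condition detected (e.g., monitored orientation suggests
            # repositioning is needed): turn the display on.
            self.mode = "active"
        elif not alarm_condition_met and self.mode == "active":
            # Alarm condition no longer met (e.g., a caregiver accepts the
            # repositioning suggestion): return the display to sleep.
            self.mode = "sleep"
        return self.mode

ctrl = DisplayController()
print(ctrl.update(False))  # sleep (no alarm yet)
print(ctrl.update(True))   # active (alarm condition met)
print(ctrl.update(True))   # active (condition persists)
print(ctrl.update(False))  # sleep (condition cleared)
```

Nothing in the sketch depends on what the alarm condition is; that generality is why the same transition logic can be mapped onto steps 245 and 265 of Larson's FIG. 2B.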


Prosecution Timeline

Mar 14, 2022
Application Filed
Jan 29, 2025
Non-Final Rejection — §103, §112
Mar 12, 2025
Response Filed
Apr 08, 2025
Final Rejection — §103, §112
Jun 30, 2025
Response after Non-Final Action
Oct 10, 2025
Request for Continued Examination
Oct 16, 2025
Response after Non-Final Action
Oct 21, 2025
Non-Final Rejection — §103, §112
Jan 12, 2026
Response Filed
Feb 25, 2026
Final Rejection — §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594008
PUSH-TO-CHARGE LANCING DEVICE
2y 5m to grant · Granted Apr 07, 2026
Patent 12594220
METHOD AND APPARATUS FOR MONITORING MANUAL CHEST COMPRESSION EFFICIENTLY DURING CPR
2y 5m to grant · Granted Apr 07, 2026
Patent 12558075
DEVICE FOR COLLECTING A BIOLOGICAL SAMPLE
2y 5m to grant · Granted Feb 24, 2026
Patent 12484825
STRETCH-DEFORMING ELECTRODE AND BIOLOGICAL SENSING SYSTEM
2y 5m to grant · Granted Dec 02, 2025
Patent 12419619
ASPIRATION DEVICE
2y 5m to grant · Granted Sep 23, 2025
Based on the examiner's 5 most recent grants.


Prosecution Projections

5-6
Expected OA Rounds
44%
Grant Probability
96%
With Interview (+51.9%)
4y 5m
Median Time to Grant
High
PTA Risk
Based on 448 resolved cases by this examiner. Grant probability derived from career allow rate.
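The "96% With Interview" figure above appears to combine the examiner's career allow rate with the interview lift additively. A minimal sketch of that assumed arithmetic (the additive model is an inference from the displayed numbers, not a documented methodology):

```python
# Assumed derivation of the dashboard's "With Interview" figure:
# career allow rate plus observed interview lift, capped at 100%.

base_grant_probability = 196 / 448 * 100  # career allow rate, percent (~43.75 -> shown as 44%)
interview_lift = 51.9                     # lift for cases resolved with an interview, percent

with_interview = min(round(base_grant_probability) + interview_lift, 100.0)
print(round(with_interview))  # 96
```

Note that an additive lift of this size only makes sense for examiners with a low base rate; for a high-allowance examiner the cap at 100% would dominate.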
