DETAILED ACTION
This action is responsive to the Applicant’s response filed 12/17/25.
As indicated in Applicant’s response, claims 1, 4, 6, 8, 10-13, 16, and 18-19 have been amended, and claim 5 has been cancelled. Claims 1-4 and 6-20 remain pending.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1, 16, and 19 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Claims 1, 16, and 19 are directed to an Abstract Idea. The claims do not include additional elements that are sufficient to amount to significantly more than the judicial exception, per the two-step eligibility analysis that follows.
I) Claim 1:
Step I: the claim falls within the system/machine category of statutory subject matter.
Step II, A:
The system of claim 1 includes (i) an arrangement of devices providing a firmware service from firmware instructions executed on a plurality of devices, at least one of which operates as an orchestrator configured to (ii) monitor, determine, or detect a change in the engagement of a user based on inferences made from information captured by a camera; and (iii), responsive to a detected change in that engagement, deliver interrupted workflow notifications to one or more devices.
The claim provides no specific details regarding how the firmware instructions, as executed by the devices in the IHS, perform the monitoring or inferencing from captured camera information in a manner that necessarily precludes achievement by a mental process, i.e., a human tracking camera data and determining a change therefrom. That is, the steps of (ii), monitoring, making inferences, and detecting a change, can be viewed as mental steps performed on the basis of presented video information/capture, which falls within the Abstract Idea type of Judicial Exception. First, the arrangement of devices per (i) is viewed as pre-solution activity that fails to define how the acts of monitoring, inferencing, and detecting are specifically implemented beyond mere use of visual information from which a human can make a derivation. Second, the act of delivering notifications per (iii) is construed as post-solution activity that adds nothing to how the monitoring, inferencing, or detecting based on captured video is actually done. As a whole, the "system" claim is not carried out by any particular technique that rises above generic use of displayed/video information to derive or detect a change. The steps of monitoring, making inferences, and detecting, without specific implementation details, are construed as an Abstract Idea type of Judicial Exception, as no particular Practical Application is achieved thereby.
Step II, B:
The additional elements, such as the "heterogeneous computing platform", the memory-stored "set of firmware instructions", "execution by a respective device among the plurality" thereof, "enables … to provide a corresponding firmware service", "engagement of a user", and the "orchestrator configured to … deliver notifications to one or more devices", have been considered; however, for lack of specificity, these elements fail to render the identified Judicial Exception significantly more than process steps that can be performed and retained inside a human mind. For example, an orchestrator device delivering notifications is not a novel way of deriving inferences or determinations from captured video or images; rather, it is post-solution activity that is merely routine once the mentally derived result has been made available. Accordingly, the Step II(B) analysis shows that the additional elements, when considered in the eligibility analysis, fail to transform the Judicial Exception identified in Step II(A) into significantly more than an Abstract Idea or to integrate it into a Practical Application.
II) Claims 16 and 19.
Step I: claim 16 belongs to the device (manufacture) category of statutory subject matter, and claim 19 belongs to the method (process) category.
Step II, A:
Device claim 16 and method claim 19 each contain the same features as system claim 1, including the steps of "monitor engagement" (the engagement being "determined through inferences" on video content) and "detect a change" (in the engagement), which, when interpreted with the claim language as a whole, are construed as activities that can be achieved by a mental process. Hence, based on the analysis of claim 1 (Step II, A), the device of claim 16 and the method of claim 19 are each directed to a Judicial Exception of the Abstract Idea type, since the mental steps of monitoring, inferring, and detecting per Step II(A) above can yield results that remain inside a human mind, and no Practical Application is necessarily generated from these internal activities.
Step II, B:
The additional elements recited in claims 16 and 19 include the same "heterogeneous computing platform", the memory-stored "set of firmware instructions" (claim 16), "execution by a respective device among the plurality of devices … to provide a firmware service" (claim 16), "first firmware service on an orchestrator" (claim 19), "engagement of a user", "orchestrator device" (claim 16), and "delivering by a second firmware service" (claim 19). When considered in this Step II(B) analysis, these additional elements fail to transform the Judicial Exception of claims 16 and 19 (identified in Step II(A)) into significantly more, and the § 101 statutory deficiency set forth above remains.
Therefore, claims 16 and 19 are rejected under 35 U.S.C. § 101 for reciting an Abstract Idea type of Judicial Exception that is not integrated into a practical application.
III) Eligibility Analysis of dependent claims.
Claims 2-3 recite examples of the computing platform and the orchestrator, and hence do not add significantly more to the Judicial Exception with respect to the acts of monitoring, inferencing, and detecting.
Claim 4 recites a basis for evaluating user engagement, in terms of combining telemetry data and inferences into the evaluation; this does not render the mental process of claim 1 significantly more than its Judicial Exception status.
Claim 6 recites a characterization of the "interrupted workflow notifications", such as image capture, but this additional element fails to convert the mental steps of monitoring, inferencing, and detecting into a practical application.
Claim 7 recites a response of the I.H.S. regarding audio power, but this response fails to convert the mental-process aspect of the Judicial Exception into a Practical Application.
Claims 8-9 characterize what constitutes a change in user engagement based on an audio stream and keywords indicative of user presence; but this additional element provides no concrete means that converts the mental steps of claim 1 into a practical application.
Claims 10-11 depict possible destinations for the delivery of notifications; this feature cannot be seen as rendering the mental process of claim 1 significantly more than an Abstract Idea type of Judicial Exception.
Claims 12-15 recite the nature of the interface, the network, or the composing structure thereof to which the delivery of notifications is directed and, as with claims 10-11, are considered additional elements of no significance toward resolving the Judicial Exception of claim 1.
Claim 17 recites a possible response to the notifications relating to audio adjustment; but this response falls short of converting the mental process of claim 16 into a practical application necessarily realized with an algorithm and a machine.
Claim 18 recites the nature of a destination to which delivery of notifications is directed and, as in claims 10-12, this additional element does not render the Judicial Exception of claim 16 a Practical Application.
Claim 20, which recites the audio-setting adjustment of claim 7, likewise fails to render the Judicial Exception of claim 19 significantly more than an Abstract Idea.
In all, claims 1, 16, and 19 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception without significantly more.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 6, and 8-20 are rejected under 35 U.S.C. § 103 as being unpatentable over Lee et al., USPubN 2019/0353379 (herein Lee), in view of Wright et al., USPubN 2023/0177948 (herein Wright), Baba Kenji et al., JP 2010154134 (translation), 07-08-2010, 13 pgs (herein Kenji), and Shimura Tomoya, JP 2018067755 (translation), 4-26-2018, 38 pgs (herein Tomoya).
As per claim 1, Lee discloses an Information Handling System (I.H.S. - management system, BMS - Fig. 4, 6; para 0071-0072), comprising: a heterogeneous computing platform comprising a plurality of devices (para 0227, 0263); a memory coupled to the heterogeneous computing platform, wherein the memory comprises (programmable processor, FPGA, ASIC code that constitutes processor firmware - para 0263) a plurality of sets of firmware instructions,
wherein each of the sets of firmware instructions, upon execution by a respective device among the plurality of devices (see ASIC, FPGA from above), enables the respective device to provide a corresponding firmware service (Fig. 6, Fig. 11 – Note1: firmware as SW used in the monitoring reporting 634, coupled with BMS Interface/Building subsystems and data Collector to implement services of security 622, analytics 624, Timeseries 628, Entity 626, Agent 638, Voice Assist 640 - Fig. 6; Fig. 9 ;Voice Assist devices 1144, 1148 - Fig. 11 - reads on firmware associated instructions of the plurality of devices - para 0227, 0263 - in the BMS to support via a collector or monitoring service, execution of respective services - Fig. 12, para 0227-0236 - such as security, analytics, entity, Timeseries related voice Assist services, HVAC control and classification), and
wherein at least one of the plurality of devices operates as an orchestrator (see aggregator – para 0189, 0193, 0197 – Note2: aggregator functionality of one of the smart BMS entities - see below for monitoring/reporting - para 0093, 0097 - and supplying collected data and enacting execution of one of the services - security 622, analytics 624, Timeseries 628, Entity 626, Agent 638, Voice Assist 640 - Fig. 6 - or a device function such as notifications or workflow actions - see below - from acquired or collected occupancy information and usage mode data - see below - reads on orchestrator entity or service) configured to:
monitor engagement of a user of the I.H.S. (occupancy status – para 0254, 0246; occupancy sensor, that zone or space is no longer or not occupied - para 0236; feedback from utterance data – para 0246), wherein the engagement (occupant feedback received via the voice assist devices ... in response to the action taken ... occupant feedback (from utterance data) - para 0246; series of statements made by an occupant - para 0131; sentiment in the received utterance data – para 0259) is determined in part through inferences (expected occupancy (predicted using historical ... data - para 0246, para 0105 - Note3: layer of the BMS using acquired usage information on predicted occupancy, cost, usage, pricing and energy availability, projected pricing for configuring an optimization or from expected occupancy learned from Q-learning - para 0244 - and reinforcement machine learning - para 0249 - to predictively determine potential temperature setting associated with room occupancy reads on determination made in part via inference or projected derivation by a layer or predictive control entity - para 0086 – in regard to the IHS usage, occupancy or sensor-based telemetry to provide derivation, adjustment configuration over costs, usage, distribution, storage, price and operational setpoint, all as results from determined user engagement with the I.H.S expressed via a) the occupancy – para 0245 – user utterance – conversations ... uttered between occupants – para 0232 - b) responsive feedback – occupant feedback - para 0246 - over reward or to query by the system on climate/temperature setting – para 0247; depending on the sentiment in the received utterance data – para 0259 – or c) manifestation, reactive behavior captured from the user; e.g. engagement expressed on occupancy – para 0245 - utterance – conversations ... 
uttered between occupants – para 0232 - or manifestation/response – occupant feedback - para 0246 - over action reward or its climate/temperature setting – para 0247; sentiment in the received utterance data – para 0259 from above) of the IHS;
based on the monitored engagement of the user (see user manifestation such as occupancy expression, utterance, and sentiment feedback/response to climate/setpoint settings from Note3; see para 0246-0247, 0259) of the IHS, detect a change in the engagement (that the zone or space is no longer or not occupied – para 0236; detected statements for utterances relating to temperature – para 0232; detecting a utterance relating to the temperature, may acknowledge utterance or describe a propose action “increasing the temperature” – para 0233; feedback received via voice … in response to the action taken for a state … occupancy status or time to expected occupancy, temperature setpoint, Feedback: “Feeling warm”, “Feeling alright”, “Feeling cold” – 0246-0247) of the user of the I.H.S.
A) Lee does not explicitly disclose user engagement determined through inferences in terms of an orchestrator device to
monitor engagement of a user with video content presented on the I.H.S., where the engagement is determined through inferences based on information captured by a camera of the I.H.S.
Lee discloses use of cameras and a video recorder for a surveillance or security subsystem, such as those for detecting occupancy and motion (para 0096, 0222), as an aspect of user engagement, in that movement of the occupant into or out of a premise indicates a change in the user's engagement with occupying a space; and a response to a system query, in terms of feedback indicating the sentiment or feeling of the user (cold, warm – para 0246-0247), also indicates a change in the user's engagement with the system toward an attempt to improve comfort and climate.
Wright discloses recording occupancy and engagement activities of the user using sensing devices in monitoring a controlled space environment, including motion detectors, cameras, and combinations thereof (para 0182, 0266, 0268, 0269, 0275, 0295), the information or images captured by the detectors or cameras including date/time, occurrence, event and/or alert, software and power status (para 0229), or being indicative of a serious problem (para 0342), an alarm, notifications, an escalation event (para 0346, 0351), and associated commands (para 0220, 0255, 0291) requiring response from authorized operators, personnel, or tenants (para 0202, 0263).
Therefore, as capture of user engagement can be determined from images, taken by a surveillance camera, of the user entering or leaving a premise, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to implement use of a surveillance camera and occupancy detector in Lee so that user engagement determined through inferences would be established via analysis and determination of information captured from video content presented on the I.H.S., where the inferences on the user engagement are determined based on images – as in Wright – captured by a camera of the I.H.S.; because
Video or image capture indicative of a person's degree of interest or engagement with respect to a premise or building entrance provides readily discernible evidence that the person's interest, reflected in a timeline of activities or chain of actions with respect to a location or premise, has been interrupted, disengaged, or ended. Such an interruption can be indicative of a change or new condition to which a management service or subsystem, such as the surveillance system in Lee, can provide a response or initiate a proper counter-measure, including an adjustment made to accommodate a change in climate affected by diminishing occupancy.
B) Lee does not explicitly disclose an orchestrator device to,
in response to the detected change in the engagement of the user of the I.H.S. with the video content presented on the I.H.S., deliver one or more interrupted user workflow notifications to one or more of the plurality of devices of the heterogeneous computing platform.
A security alarm and surveillance system is always equipped with a notification subsystem to raise an alarm or emit a notification in a dire fault situation or upon detection of a critical issue; this is shown in Lee in terms of automated responses upon detection of a fault (e.g., "the responses to detected … faults can include providing an alert message to … a maintenance scheduling system or a control algorithm" – para 0113).
Wright discloses motion detectors or cameras operating in a surveillance system with transmission of information/signals, based on the tracking by sensors and cameras, in response to detection of serious conditions or escalation events (para 0342, 0346, 0351) indicative of an alarm or catastrophic threat that requires responders' actions (para 0202, 0263).
Further, Kenji discloses a monitoring camera arranged as a security device for capturing images of a person passing through a gate, with notification of the person's information from an entering/leaving management device causing storage of the video and the meta-information associated therewith (see Abstract; pg. 4). The entrance/exit time recorded for the person is combined with person-shape and affiliation-name authentication in the form of a time series and stored as a shape or behavior pattern (pg. 5-6), such that, upon notification by the camera caused by a person entering or leaving a space, the stored metadata and image can be mapped against the actual capture of image and meta-information obtained by the entering/leaving management device to detect a deviant event (pg. 13), as part of video surveillance established with notification coupling the entering/leaving management device and the gate camera (pg. 10). Hence, notification by a surveillance front camera, enabling an exit/entry management device to record a person's attributes and shape in order to establish the legitimacy of the person entering or leaving a monitored space, entails delivery of one or more interrupted user workflow (entering/exiting event) notifications in the form of camera captures transmitted to an entry/exit management device.
Tomoya discloses use of a surveillance camera to return images and a time series (pg. 14) expressing motion/features of a human as part of an occupancy management device that tracks occupants along with their graphical characteristics (pg. 3-4), i.e., a person being present in or exiting a room; this entails interruption of a user engagement, established from a real-time user presence/absence interruption, in accordance with one or more images captured from a graphics frame buffer of a front-facing camera (pg. 5-6) disposed at an entrance management and information system.
Therefore, as the state of user engagement can be determined (from a surveillance camera) by video images of the user or tenant entering or leaving a premise, and as the condition of a premise can be affected by this change in tenant occupancy, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to implement a response by the orchestrator device in Lee so that, in response to the detected change in the engagement by the user of the I.H.S. on the basis of the video content presented on the I.H.S., the orchestrator would deliver one or more interrupted user workflow notifications – as shown by the camera frame in Tomoya, the alarm signal in Wright, or the image capture in Kenji – to one or more of the plurality of devices of the heterogeneous computing platform for remediation and management purposes as set forth by Kenji and Wright; because
A service or system by an I.H.S. operating as security surveillance is intended to prevent, avert, or remedy alarming or serious events and catastrophic conditions/threats to an occupant or person within the environment and premises whose security is being managed or monitored by the I.H.S. surveillance equipment, such as motion or occupancy detection by camera or video capture as set forth above. The capability to identify an interruption in the user engagement, or disinterest in the building, from captured video surveillance indicative of a user leaving or exiting a premise or entrance, coupled with a real-time response by the surveillance system to deliver notifications indicative of the state of a user's, tenant's, or occupant's interrupted activity or behavior, would enable timely communication of a notice or message indicator to a responder subsystem by which a remediation action or operational adjustment can be undertaken to meet the critical, affected condition of the premises, or to avert further worsening of the conditions caused by the change in user behavior. In the case of the tenant monitoring system of Lee, this can include making a change to the premise climate setpoint in response to a detected change in occupancy per the effect of a tenant disengaging himself/herself from entertaining further business or interest in staying in the premise.
As per claim 2, Lee discloses IHS of claim 1, wherein the heterogeneous computing platform comprises at least one of: a System-On-Chip (SoC), a Field-Programmable Gate Array (e.g. FPGA - para 0127), or an Application-Specific Integrated Circuit (ASIC - para 0099, para 0200).
As per claim 3, Lee discloses IHS of claim 1, wherein the orchestrator comprises at least one of: a sensing hub (Smart Hub device ... local devices may include thermostats, valves, actuators, sensors and the like - para 0220), an Embedded Controller (microprocessors ... embedded in another device - para 0266), or a Baseboard Management Controller (BMC).
As per claim 4, Lee discloses IHS of claim 1, wherein the engagement of the user is evaluated based on a user engagement score (reward … points ... for an action taken - para 0244-0245; gesture recognition - para 0222; voice assist ... may acknowledge ... a proposed action, setting the temperature - para 0233; analyzed utterance ... voice assist device ... temperature ... increased or decreased by an amount determined by the historical utterance data, setpoint ... according to utterance data - para 0235; second signal ... space is no longer occupied ...that time from the last ... utterance was detected as ... exceeds a threshold - para 0236 - Note5: reward obtained from user engagement data or proposed action obtained via a gesture service, utterance capture or voice assist service on basis of user audio/presence/gesture data captured within a space/location expressed in terms of an inference-based or computed reward based on questions - para 0233 - on adjustable increase or decrease in Temperature, action taken and dynamics of occupancy or non-occupancy determined via time exceeding threshold reads on user engagement data established as a score) that is determined based on the inferences (refer to Note3) and telemetry data (e.g. time-series indicating the performance of the BMS, measured values ... exhibit statistical characteristics, information about... a process (e.g. temperature ... flow control process - para 0115; Co2 sensors, motion sensors -para 0117; sensors send utterance data, measurement to the building management system, utterance data received ... ingest sensor data received, translate inbound sensor data into a common format, utterance data ... collected by the sensor - para 0119; series of statements ... and the corresponding times at which the statements were made by the occupant ... timeseries ID, timestamp identifies the time at which the ith sample was collected and value of the ith sample - para 0131).
As per claim 6, Lee does not explicitly disclose IHS of claim 1, wherein the delivery of the one or more interrupted user workflow notifications comprises delivery of one or more images captured from the graphics frame buffer of the camera of the IHS.
As per claim 8, Lee discloses IHS of claim 1, wherein the detected change in
the engagement of the user is determined based at least in part on interpreting an audio stream for
one or more keywords (monitor or listen to ... utterances related to temperature ... via keywords ... uttered between one or more occupants- para 0232) that are indicative of the user's engagement.
As per claim 9, Lee discloses IHS of claim 8, wherein the one or more keywords are indicative of an inquiry concerning the presence of the user (utterances related to temperature ...
uttered between one or more occupants - para 0232; "how hot are you? how cold are you" - para
0233).
As per claim 10, Lee discloses IHS of claim 1, wherein the delivery of the one or more interrupted user workflow notifications (refer to rationale B of claim 1) comprises delivery of the notifications to an optimization service of an OS (e.g. overrides of temperature setpoint, energy costs, lead to thermal efficiency, zone temperature, occupancy status ... decrease the temperature set-point - para 0253-0254; Fig. 13; decrease the temperature ... before the zone is expected to be occupied - para 0237; para 0251-0252; Fig. 9) of the IHS.
As per claim 11, Lee discloses IHS of claim 1, wherein the delivery of the one or more interrupted user workflow notifications comprises delivery of the notifications to a network device (client devices 448, remote applications 444 - Fig. 6; Network 504: Device 514,522,532,542 - Fig. 5) of the IHS.
As per claim 12, Lee discloses IHS of claim 11, wherein the delivery of the one or more interrupted user workflow notifications (refer to rationale B of claim 1) comprises a wireless notification (para 0097-0098; para 0124) delivered to the network device (see above).
As per claim 13, Lee discloses IHS of claim 1, wherein the delivery of the one or more interrupted user workflow notifications comprises delivery of notifications to a wireless device (Mobile Voice Assist Devices 1148 - Fig. 11; para 0225; para 0241) connected to the IHS.
As per claim 14, Lee discloses IHS of claim 13, wherein the notifications are delivered via a wireless interface (refer to claim 12; wireless sensors - para 0117; para 0124) implemented by a first of the plurality of devices (Fig. 5-6, Fig. 11) of the heterogeneous computing platform.
As per claim 15, Lee discloses IHS of claim 14, wherein the wireless device (mobile phone, tablet - para 0225) connected to the I.H.S. comprises at least one of a wireless phone and a tablet (para 0219).
As per claim 16, Lee discloses a memory device having a plurality of sets of firmware instructions, wherein each of the sets of firmware instructions is executable by a respective device among a plurality of devices of a heterogeneous computing platform to enable the respective device to provide a corresponding firmware service (see Note1), and wherein a given one of the plurality of sets of firmware instructions (refer to claim 1), upon execution by a given device, causes the given device to:
monitor (refer to claim 1) engagement of a user of the IHS with video content presented on the IHS, wherein the engagement is determined in part through inferences (see Note3) based on information captured by a camera (refer to rationale A of claim 1) of the IHS;
based on the monitored engagement of the user (see Note3) of the IHS, detect a change in the engagement of the user (refer to claim 1) of the IHS with the video content (refer to rationale A of claim 1) presented on the IHS; and
in response to the detected change in the engagement (see Note3), deliver one or more interrupted user workflow notifications (refer to rationale B of claim 1) to one or more of the plurality of devices (refer to claim 1) of the heterogeneous computing platform.
As per claim 17, Lee discloses memory device of claim 16, wherein in response to the delivery of the one or more interrupted user workflow notifications, one or more of the plurality of devices of the heterogeneous computing platform respond by adjusting one or more IHS power settings (all of which have been addressed in claim 10).
As per claim 18, Lee discloses memory device of claim 16, wherein the delivery of the one or more interrupted user workflow notifications comprises delivery of the notifications to an optimization service (refer to claim 10) of an operating system of the IHS.
As per claim 19, Lee discloses a method comprising:
configuring, by a first firmware service (refer to claim 1, see Note1) running on an orchestrator device (see Note2) of a heterogeneous computing platform of an Information Handling System (IHS), the monitoring of engagement of a user of the IHS with video content (refer to rationale A of claim 1) presented on the IHS,
wherein the engagement is determined in part through inferences (see Note3) based on information captured by a camera (refer to rationale A of claim 1) of the IHS;
sending, by the first firmware service, the inferences (see inferencing by an orchestrator device per Note2) to a second firmware service running in an embedded controller device (e.g. ASIC - para 0099, para 0200) of the heterogeneous computing platform (refer to claim 1) of the IHS;
based on the monitored engagement of the user (refer to claim 1) of the IHS, detecting, by the second firmware service (see above), a change in the engagement of the user (refer to claim 1) of the IHS with the video content (refer to rationale A of claim 1) presented on the IHS; and
delivering, by the second firmware service (see above), in response to the detected change in the engagement, one or more interrupted user workflow notifications (refer to rationale B of claim 1) to one or more of the plurality of devices (refer to claim 1) of the heterogeneous computing platform.
As per claim 20, Lee discloses method of claim 19, wherein, in response to the delivery of the one or more interrupted user workflow notifications, one or more of the plurality of devices of the heterogeneous computing platform respond by conserving IHS power (refer to claim 10) by adjusting one or more IHS settings (overrides of temperature setpoint, energy costs, lead to thermal efficiency, zone temperature, occupancy status ... decrease the temperature set-point - para 0253-0254; Fig. 13; decrease the temperature ... before the zone is expected to be occupied - para 0237; para 0251-0252; Fig. 9).
But delivery of capture video or image as a form of notifications destined for a management subsystem to take corrective measure or actions has been rendered obvious with the teachings by Wright, Kenji and Tomoya with use of front-end camera as set forth with the rationale B in claim 1.
Hence, delivery of the one or more interrupted user workflow notifications comprising one or more images captured from the graphics frame buffer of the camera of the IHS would have been obvious for the same reasons set forth in rationale B of claim 1.
Claim 7 is rejected under 35 U.S.C. § 103 as being unpatentable over Lee et al., USPubN 2019/0353379 (herein Lee) in view of Wright et al., USPubN 2023/0177948 (herein Wright), Baba Kenji et al., JP 2010154134 (translation), 07-08-2010, 13 pgs (herein Kenji), and Shimura Tomoya, JP 2018067755 (translation), 4-26-2018, 38 pgs (herein Tomoya), further in view of JP 3742474 (translation), 02-01-2006, 20 pgs (herein '474).
As per claim 7, Lee does not explicitly disclose
in response to the delivery of the one or more interrupted user workflow notifications, one or more of the plurality of devices of the heterogeneous computing platform respond by conserving IHS power by at least one of: muting audio of the IHS and adjusting IHS performance of one or more IHS processors.
Lee discloses a response by the IHS orchestrator adjusting control of operations by the HVAC based on the utterance engagement or presence conditions of the user (para 0251-0258; transmit a subsequent natural language utterance - para 0232; proposed action such as "increasing the temperature", "lowering the temperature", "setting the temperature to 72 degrees" - para 0233).
The lowering of temperature in Lee entails managed provision and consumption of thermal power to mitigate energy costs caused by insufficient control over the energy consumed by the devices and over the energy supplied to manage the proper climate of the IHS in accommodating the user feedback.
Analogous to the voice/utterance detection and capture service in Lee (para 0223, 0225), '474 discloses an energy management system that targets proper power utilization by the computers of the system, such as decreasing the power supply of the audio amplifier as a way to save energy (para 0005). The audio power control thereof includes muting the amplifier component of the audio (see Abstract) by asserting a control that triggers a speaker muting signal (para 0008) in conjunction with the muting of the amplifier, the muting being enhanced with elimination of the pops, clicks and snaps likely caused by computer power fluctuations that accompany the muting signal (para 0052). Hence, conservation of power or energy to computers by sending a muting command to the audio components is recognized.
Therefore, because the occupancy state in Lee requires adjusting control over operational and energy set points to reduce power spending (para 0247-0253), it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to adjust the settings of the heterogeneous platform in Lee so that conservation of energy or of power expenditure by components and electronics of the platform can be achieved by increasing or decreasing the performance of one or more IHS central processing unit clusters, IHS networking components and one or more IHS graphics processing units, including a form of energy conservation as shown in '474 with a control that asserts a muting signal sent to a respective component to turn off audio output or sound amplification, which in turn would reduce the energy consumed by the hardware controller; because
activating workflow actions in response to acquired information indicative of user engagement or comfort level, as intended by Lee's building management system (BMS), would necessarily identify portions of the various processing units whose performance or activity can be optimized in view of the risk of excessive energy cost. The related conservation of power includes elimination of excessive power expenditure, carried out in part by adjusting relevant IHS settings to decrease the performance of one or more IHS central processing unit clusters, IHS networking components and one or more IHS graphics processing units. Such a reduction or decrease in performance includes, for instance, sending a control that asserts a muting effect on a respective hardware component (the amplifier) of the voice assisting service to turn off the audio output aspect of a service that receives and analyzes sound from occupants inside a place or location, thereby conserving a measure of power consumption and alleviating the energy costs likely to be incurred as part of the SW/HW automation provided by the voice assisting service in Lee.
Response to Arguments
Applicant's arguments filed 12/17/25 have been fully considered but they are not persuasive. Following are the Examiner’s observations in regard thereto.
Applicants have submitted that Lee's use of a voice assist device with an occupancy sensor cannot teach or suggest "monitor engagement of a user with video content presented on the IHS" so as to "detect a change in the engagement of the user of the IHS" (Applicants Remarks, pg. 8). The grounds of rejection have now been adjusted to prosecute this above-mentioned limitation, and any premature assertion as to the patentability merits of this feature is deemed not to present a prima facie case for rebuttal.
Applicants have submitted that neither Kenji (personal authentication) nor Tomoya (dual detectors on a same target) is seen as remedying the deficiency of Lee with regard to the feature of "monitor engagement of a user with video content presented on the IHS" so as to "detect a change in the engagement of the user of the IHS," as now recited in claims 1, 16 and 19 (Applicants Remarks, bottom of pg. 8, pg. 9). Any position expressed in regard to the merits of a feature or language of an amendment is construed as largely MOOT for the reasons set forth above in response to the allegation against Lee.
In all, the claims as amended stand rejected as set forth above.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Tuan A Vu, whose telephone number is (571) 272-3735. The examiner can normally be reached 8AM-4:30PM, Monday-Friday.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Chat Do, can be reached at (571) 272-3721.
The fax phone number for the organization where this application or proceeding is assigned is (571) 273-3735 (for non-official correspondence - please consult the Examiner before using) or 571-273-8300 (for official correspondence); inquiries may also be redirected to customer service at 571-272-3609.
Any inquiry of a general nature or relating to the status of this application should be directed to the TC 2100 Group receptionist: 571-272-2100.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
/Tuan A Vu/
Primary Examiner, Art Unit 2193
January 27, 2026