Prosecution Insights
Last updated: April 19, 2026
Application No. 18/286,633

ENTERTAINMENT CONTENT PROVIDER

Status: Final Rejection (§102)
Filed: Oct 12, 2023
Examiner: JOHNSON-CALDERON, FRANK J
Art Unit: 2425
Tech Center: 2400 — Computer Networks
Assignee: Koninklijke Philips N.V.
OA Round: 2 (Final)

Grant Probability: 57% (Moderate)
Estimated OA Rounds: 3-4
Estimated Time to Grant: 2y 11m
Grant Probability With Interview: 77%
Examiner Intelligence

Career Allow Rate: 57% (127 granted / 222 resolved; -0.8% vs TC avg)
Interview Lift: +20.0% (strong; allow rate among resolved cases with interview vs. without)
Typical Timeline: 2y 11m avg prosecution; 21 applications currently pending
Career History: 243 total applications across all art units
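The raw counts above determine the headline figures; a quick arithmetic check (treating the 77% with-interview figure as the rounded allow rate plus the reported 20-point lift, which is an assumption about how the tool computes it):

```python
# Back out the dashboard's headline examiner statistics from the raw counts.
granted, resolved = 127, 222

career_allow_rate = granted / resolved * 100   # career allowance rate, in percent
interview_lift = 20.0                          # reported lift, percentage points
with_interview = round(career_allow_rate) + interview_lift

print(f"Career allow rate: {career_allow_rate:.1f}%")  # ~57.2%, displayed as 57%
print(f"With interview:    {with_interview:.0f}%")     # 77%
```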

Statute-Specific Performance

§101: 4.3% (-35.7% vs TC avg)
§103: 67.1% (+27.1% vs TC avg)
§102: 17.0% (-23.0% vs TC avg)
§112: 7.2% (-32.8% vs TC avg)

Tech Center averages are estimates • Based on career data from 222 resolved cases
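As a consistency check on these figures, treating each "vs TC avg" delta as a simple difference (an assumption about the tool's methodology), every statute backs out to the same Tech Center baseline:

```python
# Recover the implied Tech Center average from each statute's rate and delta,
# assuming "vs TC avg" means (examiner rate - TC average).
rates  = {"101": 4.3, "103": 67.1, "102": 17.0, "112": 7.2}
deltas = {"101": -35.7, "103": 27.1, "102": -23.0, "112": -32.8}

tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(tc_avg)  # every statute backs out to the same 40.0% baseline
```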

Office Action — §102
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 12/18/2025 have been fully considered but they are not persuasive.

Regarding claim 1, Applicant argues (pg. 14 of the Remarks) that Chan fails to teach “at least one risk factor that is based on a previously determined detrimental effect of patient behavior on the outcome of the medical procedure” because Chan “discloses patient responses such as pain, anxiety (see 0012), and the need for compliance (see 0173), but these are general patient states or anticipated responses and not structured 'risk factors' that are explicitly tied to previously determined detrimental effects of specific patient behaviors (such as motion, non-compliance, or other actions that negatively impact the procedure outcome).”

Examiner respectfully disagrees. Both anxiety and pain are considered risk factors under the broadest reasonable interpretation, and Chan teaches reframing the patient's perception (i.e., patient behavior) so that the patient has a positive experience (i.e., outcome of the medical procedure). Chan teaches (¶0057-¶0058 and ¶0135) that VR may provide an effective drug-free way to reduce fear and pain associated with a variety of medical procedures, and that procedures/actions are associated with differing levels of pain/intensity (e.g., mild pain, moderate pain, severe pain); (¶0059 and ¶0127) the VR system aims to manipulate perceptions of pain by using VR experiences designed to facilitate re-imagining of a physical procedure experience by the patient, reframing to modify the patient experience and resulting in reduced pain and/or anxiety; (¶0135) different creative scenarios may characterize the physical sensation in an alternative way and therefore be applied for reframing each sensation in a VR experience; (¶0155, ¶0159, ¶0161) reframing aims to alter the patient's perception of the sensation into a non-threatening or more positive experience; (¶0156) a key aspect of the VR experience is to reframe perceptions (i.e., patient behavior) associated with the procedure. Therefore, as an initial step, it is important to understand the anticipated patient response associated with a procedure (i.e., previously determined), for example anticipated pain and anxiety responses and desired transposition through VR to target response perceptions. FIGS. 14a and 14b illustrate a model of pain and anxiety responses for each procedure and requirements for cognitive presence or response from the patient during the procedure. This model is used to develop a resource library of transposition VR or AR experiences for each procedure. The models may vary based on patient age and/or mental acuity; (¶0205) many different “creative” scenarios may be utilised in the transposition of sensations or reframing of procedural steps, and the characteristics of the transposition are based on the procedural requirements or response modulation in a desired way in accordance with the model described with reference to FIGS. 14a-b, reframing the context of the VR experience based on the patient's expected response; (¶0210) the VR transposition resource library stores, for each procedural action associated with a physical sensation and potentially inducing an anxiety or pain response, defined characteristics of a VR transposition to modify perception for the action of any one or more of pain, anxiety or presence.

Examiner note: Chan additionally teaches (¶0198-¶0200) using VR to keep the patient still during a procedure when it is particularly critical for the patient to keep still, and (¶0160-¶0161) using VR for patient cooperation with a physical examination.

Applicant argues (pg. 14-15 of the Remarks) that Chan does not teach "an entertainment block length" or "predetermined expected effect data on one or more procedural steps from the procedural step database." Examiner respectfully disagrees. Chan teaches (¶0175) that the procedure time may vary and VR experiences of different lengths may be generated accordingly, for example 1 minute (e.g. for injection), 3 minutes (e.g. for venepuncture), and 7 minutes (e.g. for IV cannulation); other types of procedures are also envisaged which may have a different duration or number of steps; (¶0208) obtaining the VR experience components may involve selecting the VR experience component from a library of pre-prepared VR experience components. For example, the library or database may store a plurality of VR experience components—short VR experience segments (i.e., entertainment blocks) which may be joined/edited together to form a longer VR experience—indexed by theme and VR transposition characteristics; (¶0210) these sequences define the steps required for performing each procedure and may optionally include some timing data, such as typical time ranges for execution of the procedural step and/or whether or not the step may be repeated in a sequence (for example, cleaning a wound may require multiple swipes depending on the size of the wound). The VR transposition resource library stores, for each procedural action associated with a physical sensation and potentially inducing an anxiety or pain response, defined characteristics of a VR transposition to modify perception for the action of any one or more of pain, anxiety or presence. This library also stores a plurality of VR experience components for each defined VR transposition. The VR experience component is a small portion of a VR experience which is directly associated with the physical action, and the VR experience component fulfils the characteristics of the defined VR transposition in the context of one or more VR experience themes; (¶0206) for example, in a VR context, transposition actions (characteristics) are characterised as mimicking the duration (i.e., length) and attributes of the physical sensation (such as pressure, vibration or sharpness), but in the context of an interaction which is typically associated with less pain or anxiety than the actual physical action.

Examiner is glad to grant Applicant an interview, but requests an agenda indicating in advance what issues Applicant desires to discuss at the interview, submitted in writing as a proposed amendment or argument. This would permit the examiner to prepare in advance for the interview.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-7, 9-14, and 16-22 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Chan et al. (US 20200233485, hereinafter Chan).

Regarding claim 1, “An entertainment content provider configured to provide an entertainment sequence to a patient during a medical procedure”: Chan teaches (¶0056) a virtual reality device providing a virtual reality (VR) experience to the wearer. The VR experiences can be designed to facilitate re-imagining of a physical procedure experience by the wearer. Examples of applications for this device and VR experiences include medical procedures for children such as giving injections or taking blood samples.
As to “wherein the medical procedure comprises a succession of procedural steps”: Chan teaches (Fig. 17, Fig. 20, and ¶0203) that procedures involve various steps.

As to “comprising: a procedural step database comprising a catalogue of procedural steps associated with the medical procedure and wherein each procedural step in the catalogue includes procedural step metadata including a procedural step identifier”: Chan teaches (¶0208 and Fig. 20) a library/database that may store a plurality of VR experience components—short VR experience segments which may be joined/edited together to form a longer VR experience—indexed by theme and VR transposition characteristics. The VR experience components can be obtained by look-up and selected based on theme and transposition characteristics for the physical action. Alternatively, a VR experience component can be created based on the characteristics of the defined VR transposition; (¶0201-¶0203) the steps for common procedures may be defined in a medical resource library.

As to “a procedural step length”: Chan teaches (¶0210 and ¶0175) the medical procedure library stores one or more sequences of procedural actions for one or more medical procedures. These sequences define the steps required for performing each procedure and may optionally include some timing data, such as typical time ranges for execution of the procedural step and/or whether or not the step may be repeated in a sequence (for example, cleaning a wound may require multiple swipes depending on the size of the wound); (¶0201 and ¶0225) choreography between the physical procedure and the VR experience stems from the procedural requirements and in particular the timing for the procedural steps.

As to “and at least one risk factor that is based on a previously determined detrimental effect of patient behavior on the outcome of the medical procedure”: Chan teaches (¶0057-¶0058 and ¶0135) that VR may provide an effective drug-free way to reduce fear and pain associated with a variety of medical procedures, and that procedures/actions are associated with differing levels of pain/intensity (e.g., mild pain, moderate pain, severe pain); (¶0059 and ¶0127) the VR system aims to manipulate perceptions of pain by using VR experiences designed to facilitate re-imagining of a physical procedure experience by the patient, reframing to modify the patient experience and resulting in reduced pain and/or anxiety; (¶0135) different creative scenarios may characterize the physical sensation in an alternative way and therefore be applied for reframing each sensation in a VR experience; (¶0155, ¶0159, ¶0161) reframing aims to alter the patient's perception of the sensation into a non-threatening or more positive experience; (¶0156) a key aspect of the VR experience is to reframe perceptions (i.e., patient behavior) associated with the procedure. Therefore, as an initial step, it is important to understand the anticipated patient response associated with a procedure (i.e., previously determined), for example anticipated pain and anxiety responses and desired transposition through VR to target response perceptions. FIGS. 14a and 14b illustrate a model of pain and anxiety responses for each procedure and requirements for cognitive presence or response from the patient during the procedure. This model is used to develop a resource library of transposition VR or AR experiences for each procedure. The models may vary based on patient age and/or mental acuity; (¶0205) many different “creative” scenarios may be utilised in the transposition of sensations or reframing of procedural steps, and the characteristics of the transposition are based on the procedural requirements or response modulation in a desired way in accordance with the model described with reference to FIGS. 14a-b, reframing the context of the VR experience based on the patient's expected response; (¶0210) the VR transposition resource library stores, for each procedural action associated with a physical sensation and potentially inducing an anxiety or pain response, defined characteristics of a VR transposition to modify perception for the action of any one or more of pain, anxiety or presence.

Examiner note: Chan additionally teaches (¶0198-¶0200) using VR to keep the patient still during a procedure when it is particularly critical for the patient to keep still, and (¶0160-¶0161) using VR for patient cooperation with a physical examination.

As to “a curated content database, comprising a catalogue of entertainment blocks, wherein each entertainment block includes: - entertainment content; and - metadata including: - an entertainment block length; and - predetermined expected effect data on one or more procedural steps from the procedural step database”: Chan teaches (¶0203) for each step of a medical procedure a VR transposition library may store multiple different VR scenarios that may be used to reframe the physical sensations of the procedure or encourage relaxation and patient compliance; (¶0210-¶0211) these sequences define the steps required for performing each procedure and may optionally include some timing data, such as typical time ranges for execution of the procedural step and/or whether or not the step may be repeated in a sequence (for example, cleaning a wound may require multiple swipes depending on the size of the wound). The VR transposition resource library stores, for each procedural action associated with a physical sensation and potentially inducing an anxiety or pain response, defined characteristics of a VR transposition to modify perception for the action of any one or more of pain, anxiety or presence. This library also stores a plurality of VR experience components for each defined VR transposition. The VR experience component is a small portion of a VR experience which is directly associated with the physical action, and the VR experience component fulfils the characteristics of the defined VR transposition in the context of one or more VR experience themes; (¶0173, ¶0175) the patient variable/controllable aspects relate to guiding selection of perception modulation strategies for building a patient- and procedure-specific VR (or AR) experience. The VR experiences can be designed and created from a VR resource library, which enables customization of the VR experience based on the procedure, therapeutic intent, and individual end-user preferences. The procedure time may vary and VR experiences of different lengths may be generated accordingly, for example 1 minute (e.g. for injection), 3 minutes (e.g. for venepuncture), and 7 minutes (e.g. for IV cannulation); other types of procedures are also envisaged which may have a different duration or number of steps; (¶0208) obtaining the VR experience components may involve selecting the VR experience component from a library of pre-prepared VR experience components. For example, the library or database may store a plurality of VR experience components—short VR experience segments (i.e., entertainment blocks) which may be joined/edited together to form a longer VR experience—indexed by theme and VR transposition characteristics; (¶0206) for example, in a VR context, transposition actions (characteristics) are characterised as mimicking the duration (i.e., length) and attributes of the physical sensation (such as pressure, vibration or sharpness), but in the context of an interaction which is typically associated with less pain or anxiety than the actual physical action.

As to “a content scheduler, configured to receive medical procedure data of the medical procedure, comprising data relating to the procedural steps in the succession of procedural steps, including an order of the procedural steps in the succession of procedural steps; consult the procedural step database and select procedural step options corresponding with the successive procedural steps of the medical procedure; consult the curated content database and select entertainment blocks by matching the risk factor of the selected procedural options with a most suitable predetermined expected effect data of an entertainment block; create an entertainment content schedule by listing the selected entertainment blocks based on the order of the procedural steps in the succession of procedural steps”: Chan teaches (¶0134) storyboards and wireframes are designed: multiple medical procedures are observed and timed to identify activities within the procedure and specific time-points for each of these activities. This storyboarding is used to choreograph the VR experience with the procedure. A storyboard is designed to reflect the specific timing and activities that occur during these time-points; (¶0210-¶0211 and ¶0208) a virtual reality (VR) experience generation system. This system can comprise a medical procedure library, a VR transposition library and a VR experience compiler. The medical procedure library stores one or more sequences of procedural actions for one or more medical procedures. These sequences define the steps required for performing each procedure and may optionally include some timing data, such as typical time ranges for execution of the procedural step and/or whether or not the step may be repeated in a sequence (for example, cleaning a wound may require multiple swipes depending on the size of the wound). The VR transposition resource library stores, for each procedural action associated with a physical sensation and potentially inducing an anxiety or pain response, defined characteristics of a VR transposition to modify perception for the action of any one or more of pain, anxiety or presence. This library also stores a plurality of VR experience components for each defined VR transposition. The VR experience component is a small portion of a VR experience which is directly associated with the physical action, and the VR experience component fulfils the characteristics of the defined VR transposition in the context of one or more VR experience themes. For example, a nibble of a fish is a VR experience component associated with an action such as a needle prick, or the plucking of a suture—it should be appreciated that the same VR experience component may be suitable for association with more than one physical action if the VR transposition characterization is the same for each of the different actions. Each VR experience component is developed in accordance with a theme (for example, scuba diving, fishing, forest, space) to enable VR experiences to be generated using different components selected based on procedural actions but having a common theme, so that the individual VR experience components can be compiled into an end-to-end VR experience and narrative, ordered based on the procedural steps.

As to “and a content executor, configured to: receive the entertainment content schedule from the content scheduler; retrieve the selected entertainment blocks in the content schedule from the curated content database; - provide entertainment content to the patient during the medical procedure by presenting the retrieved entertainment blocks according to the entertainment content schedule.”: Chan teaches (¶0211) the VR experience compiler is configured to compile a VR experience for a medical procedure by retrieving from the medical procedure library a sequence of procedural actions for the medical procedure. The compiler selects from the VR transposition resource library a VR experience component for each defined VR transposition using a common VR experience theme. Compiling the selected VR experience components into a VR experience is based on the action sequence for the procedure. This may include adding additional VR experience components for linking the action-based VR components into a sensible narrative and choreographing the VR experience with the procedure—for example, linking VR components may be used (and allowed to be repeated or skipped) to ensure alignment of timing between physical and virtual actions during procedure execution. The VR generation system may be implemented using conventional computer hardware processing and memory resources, such as a PC, server or distributed (cloud) processing and memory resources, with the compiler implemented in software and the databases storing the medical procedure and VR transposition resource libraries. It should be appreciated that the data stored in the VR transposition library comprises VR transposition definitions and VR experience components as discussed above. The VR resource library may also store end-to-end VR experiences (having one or more themes) for procedures, for example for common procedures, to avoid the need to compile a new VR experience each time the VR experience is required.
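The scheduler limitation at issue reduces to a match-and-order loop: for each procedural step, pick the block whose expected effect counters the step's risk factor, then list the picks in step order. A hypothetical Python sketch of that data flow (the class names, fields, and the closest-length tie-break are illustrative assumptions, not code from the application or from Chan):

```python
from dataclasses import dataclass

@dataclass
class ProceduralStep:
    step_id: str
    length_s: int
    risk_factor: str        # e.g. "pain", "anxiety", "motion"

@dataclass
class EntertainmentBlock:
    content: str
    length_s: int
    expected_effect: str    # the risk factor this block is meant to counter

def schedule(steps, blocks):
    """For each step in procedural order, select the block whose expected
    effect matches the step's risk factor and whose length fits best."""
    plan = []
    for step in steps:
        candidates = [b for b in blocks if b.expected_effect == step.risk_factor]
        best = min(candidates, key=lambda b: abs(b.length_s - step.length_s))
        plan.append((step.step_id, best.content))
    return plan

steps = [ProceduralStep("swab", 30, "anxiety"),
         ProceduralStep("needle_insert", 10, "pain")]
blocks = [EntertainmentBlock("boat intro", 40, "anxiety"),
          EntertainmentBlock("fish nibble", 8, "pain")]
print(schedule(steps, blocks))
# [('swab', 'boat intro'), ('needle_insert', 'fish nibble')]
```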
Regarding claim 2, “The entertainment content provider according to claim 1, wherein the content scheduler is further configured to also select entertainment blocks by matching or adapting an entertainment block length to a procedural step length.”: Chan teaches (¶0171 and ¶0175) the VR experience for the procedure reflects the real-world experience, and the VR experience is coordinated and timed to the real-world experience; (¶0210) the medical procedure library stores one or more sequences of procedural actions for one or more medical procedures. These sequences define the steps required for performing each procedure and may optionally include some timing data, such as typical time ranges for execution of the procedural step and/or whether or not the step may be repeated in a sequence (for example, cleaning a wound may require multiple swipes depending on the size of the wound); (¶0211) this may include adding additional VR experience components for linking the action-based VR components into a sensible narrative and choreographing the VR experience with the procedure—for example, linking VR components may be used (and allowed to be repeated or skipped) to ensure alignment of timing between physical and virtual actions during procedure execution; (¶0213 and Fig. 21) the VR experience can be automatically modified; (¶0215) automatic modification of the VR experience (i.e. for a needle procedure) as informed by sensor(s) (e.g. camera, voice, worn band or badge, in-room sensors operating on light or infrared) to appropriately alter stimuli, for example by inserting additional graphical figures to where the operator is or is not, additionally to alter the length of the experience, or to play pre- or post-procedure music; see also ¶0225 and ¶0192.

Regarding claim 3, “The entertainment content provider according to claim 1, wherein the entertainment blocks further comprise metadata relating to storyline priority and wherein the content scheduler is further configured to also select entertainment blocks based on the storyline priority metadata.”: Chan teaches (¶0205, ¶0210) many different “creative” scenarios may be utilized in the transposition of sensations or reframing of procedural steps; however, the characteristics of the transposition are based on the procedural requirements or response modulation in a desired way in accordance with the model described with reference to FIGS. 14a-b. For example, for the steps of applying a face mask and taking deep breaths to inhale the anesthetic, the anticipated patient response would be one of anxiety rather than pain, but the patient needs to cognitively respond—follow the instruction to take deep breaths—thus the objective for reframing is to reduce anxiety while retaining sufficient cognitive presence to follow instructions reframed in the context of the VR experience. Using the space or underwater scuba diving scenarios provides alignment of the physical sensations and cognitive response to instructions with the procedural requirements, in an entertaining fantasy context. It should be appreciated that for patients with a fear of water a diving scenario may be inappropriate and not serve to reduce anxiety; for such patients a space theme may be more appropriate; (¶0203) transpositions may also be grouped via creative theme for ease of reference to build VR experiences. For the preparation steps of the procedure during the introduction phase, the VR theme context is established, for example on a boat near a tropical island preparing to go diving, or on a space ship getting ready for a spacewalk. In each of these scenarios the patient is provided with some distraction and introduced to a mindset of using artificial breathing apparatus, which correlates well with the physical requirement for a mask for delivery of the anaesthetic; (¶0208) selected based on theme; see also ¶0155, ¶0183-¶0184.

Regarding claim 4, “The entertainment content provider according to claim 1, wherein the entertainment block length is fixed, adaptable or dynamic.”: Chan teaches (¶0201, ¶0225, ¶0235, ¶0175, ¶0192-¶0193) choreography between the physical procedure and the VR experience stems from the procedural requirements and in particular the timing for the procedural steps. This timing may be fixed or responsive to feedback from the patient or clinician—for example, waiting for a patient to be sufficiently relaxed; some steps of a procedure may take longer or shorter depending on the circumstances (e.g., dental drilling or giving blood).

Regarding claim 5, “The entertainment content provider according to claim 1, wherein the entertainment block further includes patient target group information indicating for which patient target group the entertainment content is most suitable, for instance a patient target group based on age or age group, gender, cognitive ability, eye sight, hearing, literacy, diagnosis, special interests and the like and the content scheduler is further configured to also select entertainment blocks based on the patient target group information.”: Chan teaches (¶0232) the patient may be able to choose content (i.e. the theme for the VR experience) according to personal interests/preferences. The patient may also be able to choose the extent to which they will be immersed in the VR/AR experience. For example, for adult patients less immersion may be desirable; (¶0244) the system may be utilised for personalised and adaptable VR treatment programs for phobias or rehabilitation; (¶0241) analyse patient data from multiple procedures to understand success and risk factors for specific patient demographics, processes and procedures. Utilisation of AI may enable improved automation of patient-specific customisation/personalisation of VR experiences.

Regarding claim 6, “The entertainment content provider according to claim 1, wherein the content scheduler is further configured to: receive updated medical procedure data before or during the medical procedure, comprising updated data relating to the procedural steps in the succession of procedural steps; compare the updated medical procedure data with the medical procedure data; and in case any of the procedural steps has changed, has been removed and/or added: - consult the procedural step database and select the procedural step options relating to the successive procedural steps of the updated medical procedure; consult the curated content database and select entertainment blocks based on the selected procedural options; update the entertainment content schedule by listing the selected entertainment blocks based on the order of the procedural steps in the succession of procedural steps of the updated medical procedure.”: Chan teaches (¶0235 and ¶0239) the modification may be triggered by sensing of changes in the environment, such as proximity of needles to the skin or movement of the clinician. The experience is modified in anticipation of the patient needs 2145 and the changes are delivered to the patient 2130 to encourage maintaining or attaining the target mental state; (¶0201) choreography between the physical procedure and the VR experience stems from the procedural requirements and in particular the timing for the procedural steps. This timing may be fixed or responsive to feedback from the patient or clinician—for example, waiting for a patient to be sufficiently relaxed; some steps of a procedure may take longer or shorter depending on the circumstances (e.g., dental drilling or giving blood); (¶0192-¶0193) interactivity: the VR experiences are designed to be interactive, in this case visually by looking at objects, but could also be via hand-held sensors or controllers and voice. The interactivity is choice-based. This can enable some patient control over the pace of the procedure, based on monitoring the interactions within the VR experience, for example by delaying the next phase, and the narrative coordinating the actions of the clinician, until the patient interaction is complete, indicating readiness for the next phase. Readiness for a next phase may also be gauged by the VR device based on physiologic feedback from sensors, i.e. heart rate, respiratory rate, pupil dilation, blood oxygenation, etc. Specific timing of sensory information is designed to reflect what is happening in the procedure; (¶0126, ¶0155, ¶0184, ¶0186, ¶0230) dynamically modify VR experiences, for example to repeat relaxation exercises until a patient's heart or respiratory rate is within a target range to move on to the next step of a procedure (see Fig. 20, repetition of step 4, for example).
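Claim 2's length matching and Chan's ¶0211 linking components (repeatable or skippable to align timing) both describe padding a VR segment out to the step's expected duration. The sketch below is an illustration of that reading only, not code from either document; the function name and parameters are hypothetical:

```python
def align_length(block_len, step_len, linker_len):
    """Return how many repeats of a short linking segment are needed so that
    the entertainment block plus linkers covers the procedural step's
    expected duration, and the resulting total length."""
    repeats, total = 0, block_len
    while total < step_len:
        total += linker_len
        repeats += 1
    return repeats, total

# e.g. an 8 s block for a 20 s step, padded with repeatable 5 s linkers
print(align_length(8, 20, 5))   # (3, 23): three linker repeats, 23 s total
```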
Regarding claim 7, “The entertainment content provider according to claim 1, further comprising:- a patient behavior module, comprising: a patient behavior data receiver configured to receive patient behavior data relating to the medical procedure wherein the patient behavior data is previously obtained patient behavior data, library patient data, modeled patient behavior and/or real-time acquired patient behavior data.; a patient behavior data provider for providing the patient behavior data to the content scheduler; and wherein the content scheduler is further configured to receive the patient behavior data from the patient behavior module and to consult the curated content database and select entertainment blocks also based on the patient behavior data.” Chan teaches (¶0158 and ¶0219-¶0223) The overarching framework used to determine the most appropriate way to support patients during procedures is determine by 3 continuums: 1. Level of pain caused by the medical procedure; 2. Level of baseline anxiety of the patient/client; 3. Level of presence either required by the procedure (e.g. need to be present and aware as the patient is required to follow specific commands or actions) or preferred by the patient/client (e.g. a patient personally prefers to be fully aware of the procedural steps versus being very distracted), FIG. 14a illustrates these three continuums as axes for a 3D block model although this may be equally conceptualised as a spectrum. Depending on where the patient undergoing the procedure lies on these spectrums determines the VR approach; (¶0232-¶0234) personalized modulation of VR content; (¶0240-¶0241, ¶0244) personalized modification based on previous healthcare data. 
Regarding claim 9, “The entertainment content provider according to claim 7, wherein a neural network is configured to analyze the previously obtained patient behavior data, library patient data, modeled patient behavior data and/or real-time acquired patient behavior data to generate expected patient behavior data forming or replacing the patient behavior data.” Chan teaches (¶0241) the system may also be configured to utilize artificial intelligence (AI) or machine learning algorithms to analyze patient data from multiple procedures to understand success and risk factors for specific patient demographics, processes and procedures. Utilization of AI may enable improved automation of patient-specific customization/personalization of VR experiences.

Regarding claim 10, “The entertainment content provider according to claim 1, wherein the content executor is further configured to receive updated medical procedure data; - adapt the entertainment schedule by adapting, removing or adding at least one entertainment content blocks; and - provide the adapted entertainment content.” Chan teaches (¶0235 and ¶0239) the modification may be triggered by sensing of changes in the environment, such as proximity of needles to the skin or movement of the clinician. The experience is modified in anticipation of the patient needs 2145 and the changes are delivered to the patient 2130 to encourage maintaining or attaining the target mental state; (¶0201) choreography between the physical procedure and VR experience stems from the procedural requirements and in particular the timing of the procedural steps. This timing may be fixed or responsive to feedback from the patient or clinician; for example, waiting for a patient to be sufficiently relaxed, as some steps of a procedure may take longer or shorter depending on the circumstances (e.g., dental drilling or giving blood); (¶0192-¶0193) interactivity: the VR experiences are designed to be interactive, in this case visually by looking at objects, but could also be via hand-held sensors or controllers and voice. The interactivity is choice-based. This can enable some patient control over the pace of the procedure, based on monitoring the interactions within the VR experience, for example by delaying the next phase, and the narrative coordinating the actions of the clinician, until the patient interaction is complete, indicating readiness for the next phase. Readiness for a next phase may also be gauged by the VR device based on physiologic feedback from sensors, e.g., heart rate, respiratory rate, pupil dilation, blood oxygenation, etc. Specific timing of sensory information is designed to reflect what is happening in the procedure; (¶0126, ¶0155, ¶0184, ¶0186, ¶0230) dynamically modify VR experiences, for example to repeat relaxation exercises until a patient's heart or respiratory rate is within a target range to move on to the next step of a procedure (see FIG. 20, repetition of step 4, for example).

Regarding claim 11, its rejection is similar to that of claim 1.
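The feedback-gated progression cited against claim 10 (repeating a relaxation step until physiologic readings fall within a target range before advancing to the next procedural step) can be sketched roughly as follows. This is an illustrative sketch only: the function names, signal names, and thresholds are hypothetical and are not code from Chan or from the application.

```python
# Illustrative sketch of feedback-gated phase progression: repeat a step
# until physiologic readings are within target range, then advance to the
# next procedural step. All names and thresholds here are hypothetical.

def within_target(readings: dict, targets: dict) -> bool:
    """True when every monitored signal falls inside its target range."""
    return all(lo <= readings[k] <= hi for k, (lo, hi) in targets.items())

def run_procedure(steps, read_sensors, targets, max_repeats=10):
    """Deliver each step's content, repeating it until the patient is ready."""
    for step in steps:
        step()  # deliver the VR content block for this procedural step
        repeats = 0
        # Gate advancement on physiologic readiness (e.g., heart rate,
        # respiratory rate), repeating the step if the patient is not ready.
        while not within_target(read_sensors(), targets) and repeats < max_repeats:
            step()
            repeats += 1
```

With a target heart-rate range of (50, 90), for instance, a relaxation step would repeat until the monitored rate drops into range, after which the next step is delivered.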
Regarding claim 12, “The method according to claim 11, further comprising the steps of: - receiving updated medical procedure data before or during the medical procedure, comprising updated data relating to the procedural steps in the succession of procedural steps; comparing the updated medical procedure data with the medical procedure data; and in case any of the procedural steps has changed, has been removed and/or added: - consulting the procedural step database and select the procedural step options relating to the successive procedural steps of the updated medical procedure; consulting the curated content database and select entertainment blocks based on the selected procedural options; updating the entertainment content schedule by listing the selected entertainment blocks based on the order of the procedural steps in the succession of procedural steps of the updated medical procedure.” Chan teaches (¶0192-¶0193) interactivity: the VR experiences are designed to be interactive, in this case visually by looking at objects, but could also be via hand-held sensors or controllers and voice. The interactivity is choice-based. This can enable some patient control over the pace of the procedure, based on monitoring the interactions within the VR experience, for example by delaying the next phase, and the narrative coordinating the actions of the clinician, until the patient interaction is complete, indicating readiness for the next phase. Readiness for a next phase may also be gauged by the VR device based on physiologic feedback from sensors, e.g., heart rate, respiratory rate, pupil dilation, blood oxygenation, etc. Specific timing of sensory information is designed to reflect what is happening in the procedure; (¶0126, ¶0155, ¶0184, ¶0186, ¶0230) dynamically modify VR experiences, for example to repeat relaxation exercises until a patient's heart or respiratory rate is within a target range to move on to the next step of a procedure.
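The update flow recited in claim 12 (compare updated procedure data against the original, then rebuild the schedule from the changed succession of steps) can be sketched as follows. All names, data shapes, and lookup tables here are hypothetical illustrations, not structures taken from the application or from Chan.

```python
# Hypothetical sketch of the claim-12 update flow: when the succession of
# procedural steps changes, re-select entertainment blocks and rebuild the
# schedule in step order. Data shapes and lookup tables are illustrative.

def update_schedule(old_steps, new_steps, step_options, content_db):
    """Rebuild the entertainment schedule only if the steps changed."""
    if new_steps == old_steps:
        return None  # nothing changed; keep the existing schedule
    schedule = []
    for step in new_steps:                   # preserve procedural order
        option = step_options[step]          # consult procedural step database
        schedule.append(content_db[option])  # consult curated content database
    return schedule
```

For example, if a second scan step is added mid-procedure, the function returns a new block list in the order of the updated steps; if the updated data matches the original, the existing schedule is left untouched.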
Regarding claim 13, “Method according to claim 11, further comprising the steps of: - receiving patient behavior data relating to the medical procedure; - additionally using the patient behavior data to select entertainment blocks.” Chan teaches (¶0219-¶0223, ¶0126, ¶0192, ¶0230) data collected during procedures may also be utilised for modification of VR experiences (either dynamically as the procedure is underway or for future procedures). This may include, but is not limited to, any one or more of: physiological observations (heart rate, respiratory rate, oxygen saturation, galvanic skin conductance); eye tracking, gaze, blink, pupil dilation; procedure (timing, major movements, key stages, procedural site, procedural success/failure); voice (to respond to specific voice-activated commands, and to identify changes in pitch, volume and rate which may provide indications as to whether the patient is in pain or is overly sedated).

Regarding claim 14, “A method for providing adaptive entertainment content according to claim 11, wherein the medical procedure is a magnetic resonance imaging (MRI) procedure and the succession of procedural steps includes at least one MRI sequence.” Chan teaches (¶0169, ¶0166, ¶0238) use with an MRI procedure.

Regarding claim 15, “A computer program product comprising executable instructions stored on a non-transitory computer readable medium, wherein execution of the executable instructions causes a processor to perform the method according to claim 11.”

Regarding claim 16, “The entertainment content provider according to claim 1, wherein the medical procedure is a magnetic resonance imaging (MRI) procedure and the succession of procedural steps includes at least one MRI sequence.” Chan teaches (¶0169, ¶0166, ¶0238) use with an MRI procedure.
Regarding claim 17, “A computer program product comprising executable instructions stored on a non-transitory computer readable medium, wherein execution of the executable instructions causes a processor to” Chan teaches (¶0002) Virtual Reality is a computer technology (i.e., inherent processor, memory, and software). The remainder of claim 17 is similar to claim 1; therefore, its rejection is similar to that of claim 1.

Regarding claim 18, its rejection is similar to that of claim 6. Regarding claim 19, its rejection is similar to that of claim 7. Regarding claim 20, its rejection is similar to that of claim 3. Regarding claim 21, its rejection is similar to that of claim 4. Regarding claim 22, its rejection is similar to that of claim 5.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Mason et al. (US 10083363), 16:39-48: the video clip may have been created for use in separate, unrelated media, or in other embodiments it may have been created specifically to populate the object characteristics database 506. Further in the example, if the one-second video clip is originally shot in 48-frames-per-second format, the stored information may provide up to 48 individual video frame images of the actor's head and face from varying angles, thereby providing additional three-dimensional information for mapping the actor's face and head in the generation of the VR content. Hughes et al. (US 20100238362), ¶0004: an entertainment system for use with a magnetic resonance imaging (MRI) device that includes video glasses and headphones.

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to FRANK J JOHNSON whose telephone number is (571)272-9629. The examiner can normally be reached 9:00AM-3:00PM EST.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Brian T. Pendleton, can be reached on 571-272-7527. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format.

For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/Frank Johnson/Primary Examiner, Art Unit 2425

Prosecution Timeline

Oct 12, 2023
Application Filed
Jul 25, 2025
Non-Final Rejection — §102
Dec 18, 2025
Response Filed
Mar 12, 2026
Final Rejection — §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597262
DETECTING AND IDENTIFYING OBJECTS REPRESENTED IN SENSOR DATA GENERATED BY MULTIPLE SENSOR SYSTEMS
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12583386
METHOD FOR DETECTING TARGET PEDESTRIAN AROUND VEHICLE, METHOD FOR MOVING VEHICLE, AND DEVICE
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12575718
UNIVERSAL ENDOSCOPE ADAPTER
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12574588
Image Selection Using Motion Data
Granted Mar 10, 2026 (2y 5m to grant)
Patent 12573219
DEVICE AND METHOD FOR COUNTING AND IDENTIFICATION OF BACTERIAL COLONIES USING HYPERSPECTRAL IMAGING
Granted Mar 10, 2026 (2y 5m to grant)
Based on the 5 most recent grants by this examiner.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 57%
With Interview: 77% (+20.0%)
Median Time to Grant: 2y 11m
PTA Risk: Moderate
Based on 222 resolved cases by this examiner. Grant probability derived from career allow rate.
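The projection figures above are simple functions of the examiner's career counts. A minimal sketch of how they could be computed, assuming the allow rate is granted/resolved and the interview lift is additive (the function names are illustrative, not part of any real API):

```python
# Hypothetical derivation of the displayed projections from career counts
# shown on this page: 127 granted of 222 resolved, +20-point interview lift.

def career_allow_rate(granted: int, resolved: int) -> float:
    """Baseline grant probability: share of resolved cases that granted."""
    return granted / resolved

def with_interview(base_rate: float, lift: float) -> float:
    """Apply an additive interview lift, capped at 100%."""
    return min(base_rate + lift, 1.0)

base = career_allow_rate(127, 222)   # ~0.572, shown as 57%
lifted = with_interview(base, 0.20)  # ~0.772, shown as 77%
print(f"{base:.0%} baseline, {lifted:.0%} with interview")
```

Whether the tool's actual model is this additive adjustment or something more elaborate (e.g., conditioned on art unit or statute) is not stated on the page; this sketch only reproduces the displayed arithmetic.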
