DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
2. The information disclosure statements (IDS) submitted on 04/16/2024 and 08/01/2024 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Rejections - 35 USC § 103
3. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
6. Claims 1-15 are rejected under 35 U.S.C. 103 as being unpatentable over Rydberg (US 2021/0298868 A1) in view of Mandwal et al. (US 2020/0005481 A1).
7. With reference to claim 1, Rydberg teaches An augmented-reality system configured to provide real-time guidance in placing one or more electrocardiogram electrodes on a subject, (“Various embodiments encompass an Augmented Reality (AR) system to facilitate placement of electrodes onto a patient's body (e.g., electrodes of an ECG monitoring device). Specifically, certain embodiments are configured for generating AR-based instrument positioning maps comprising visual instrument positioning identifiers corresponding to appropriate electrode placements on a patient's body. … Various embodiments of the invention utilize real-time image tracking of a patient's body (specifically, a patient's torso) within an AR-view environment to facilitate placement of electrodes and ECG leads by a clinician during a mapping process. Additionally, a system is provided for recording the placement of the electrodes and ECG leads via the AR-view during the mapping process, so that patients can later retrieve the generated instrument positioning map and utilize the instrument positioning map within a patient-oriented AR view to self-place the electrodes and ECG leads using the same guidance as provided by the clinician and reflected within the instrument positioning map. The guidance is provided via an AR overlay reflective of the generated instrument positioning map over the digital image of the patient's body, utilizing a known visual landmark for positioning of the instrument positioning map. The instrument positioning map of certain embodiments thus indicates where specific electrodes should be placed on the patient's body.” [0021-0022]) Rydberg also teaches the system comprising: at least one camera configured to generate visual data comprising one or more images; (“The AR system provides the AR visual environment in real time by identifying and registering a known visual landmark that is centered and vertically aligned to a patient's chest. The visual landmark is identified and compared against a reference image (e.g., known visual landmark or reference image target) of an image target database or a cloud-based image target database used for identification. If there is a match, the AR system retrieves AR content linked with the identified visual landmark and renders the AR content on a display screen of an AR device such as a head-mounted display (HMD), computer monitor, mobile device, high definition TV, and the like. After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “the image tracking device 170 is configured to utilize a visual-inertial odometry technique which combines information from the plurality of motion sensors with computer vision analysis of the real-world environment visible to the tracking device 170 camera.” [0062], Fig. 5) Rydberg further teaches a display unit configured to display a visual feed comprising visual data and/or computer-augmented objects; (“The AR system provides the AR visual environment in real time by identifying and registering a known visual landmark that is centered and vertically aligned to a patient's chest. 
The visual landmark is identified and compared against a reference image (e.g., known visual landmark or reference image target) of an image target database or a cloud-based image target database used for identification. If there is a match, the AR system retrieves AR content linked with the identified visual landmark and renders the AR content on a display screen of an AR device such as a head-mounted display (HMD), computer monitor, mobile device, high definition TV, and the like. After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “As shown in FIG. 1, the system 100 may comprise an augmented reality instrument positioning (ARIP) system 130, one or more user computing entities 110A-110N, one or more networks 120, one or more image tracking devices 170, one or more display devices 180, and/or the like.” [0036]) Rydberg teaches one or more processors in communication with the display unit and the at least one camera; and a memory in communication with the one or more processors, the memory having stored thereon machine-readable instructions that, when executed by the one or more processors, cause the one or more processors to perform operations (“an apparatus comprising at least one processor and at least one memory including computer program code is provided. In one embodiment, at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to receive image data of a patient's body,” [0005] “The AR system provides the AR visual environment in real time by identifying and registering a known visual landmark that is centered and vertically aligned to a patient's chest. The visual landmark is identified and compared against a reference image (e.g., known visual landmark or reference image target) of an image target database or a cloud-based image target database used for identification. If there is a match, the AR system retrieves AR content linked with the identified visual landmark and renders the AR content on a display screen of an AR device such as a head-mounted display (HMD), computer monitor, mobile device, high definition TV, and the like. After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “circuitry 200 can includes various means, such as processing element 205, volatile memory 207, non-volatile memory 206, communications interface 208, instrument positioning repository 150, ARIP application 130, and/or input/output circuitry 216. As referred to herein, “circuitry” includes hardware, software and/or firmware configured to perform one or more particular functions. 
In this regard, the means of circuitry 200 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions stored on a non-transitory computer-readable medium (e.g., non-volatile memory 206) that is executable by a suitably configured processing device (e.g., processing element 205), or some combination thereof.” [0044]) Rydberg also teaches (i) receiving, via the at least one camera, a first visual data comprising at least one image showing at least a portion of the subject, wherein the portion of the subject includes a torso of the subject; (“the image data comprises a real-time image of the patient's body and a known visual landmark positioned at a defined position on the patient's body; receive user input identifying mapping locations of each of one or more instruments relative to the known visual landmark on the image data of the patient's body; generate an instrument positioning map for the patient by overlaying the mapping locations of each of the one or more instruments onto the image data, wherein the instrument positioning map comprises data locating the mapping locations of each of the one or more instruments relative to the known visual landmark; and store the instrument positioning map for access by the patient.” [0005] “The AR system provides the AR visual environment in real time by identifying and registering a known visual landmark that is centered and vertically aligned to a patient's chest. The visual landmark is identified and compared against a reference image (e.g., known visual landmark or reference image target) of an image target database or a cloud-based image target database used for identification. If there is a match, the AR system retrieves AR content linked with the identified visual landmark and renders the AR content on a display screen of an AR device such as a head-mounted display (HMD), computer monitor, mobile device, high definition TV, and the like. After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “an image tracking device (which in certain embodiments may be a part of one or more user computing entities 110A-110N) provides real-time images of a user's body (specifically, a user's torso) and additional information identifying the user or the user's environment to the ARIP system 130 (e.g., via network 120).” [0041] “the image tracking device 170 is configured to utilize a visual-inertial odometry technique which combines information from the plurality of motion sensors with computer vision analysis of the real-world environment visible to the tracking device 170 camera.” [0062]) Rydberg further teaches (ii) analyzing the first visual data to generate a visual overlay template corresponding to the portion of the subject shown in the first visual data, (iii) generating a composite visual feed based on the first visual data and the generated visual overlay template, wherein the visual overlay template is superimposed on the portion of the subject shown in the first visual data; and (iv) displaying, via the display unit, the composite visual feed. 
(“the ARIP system uses information from the plurality of sensors to determine one or both of a position and orientation of the image tracking device relative to the tracked imagery from the real-world environment. For example, as the user moves and interacts with real-world objects (e.g., electrodes) in the AR-view environment provided by the ARIP system 130, the image tracking device may detect changes in the position/orientation of the image tracking device (and the user's body) and render for display new AR content (e.g., the generated instrument positioning map) to overlay over the image of the user's body.” [0061] “The AR content overlaid on the digital image of the user's body is generated using AR processing by processing element 308. The AR processing performed by processing element 308 may include object identification algorithms or other AR processing techniques used to create or identify AR information or real-world objects in the digital image or video sequence. In embodiments, the ARIP system 130 may utilize the image tracking device and AR processing algorithms to identify a known visual landmark in the digital image or video sequence scene. As discussed herein, the known visual landmark may be embodied as a specially designed AR image target (e.g. comprising one or more 2D bar codes having a known orientation relative to the patient's torso) linked to an AR-based instrument positioning map. In an example embodiment, the AR image target is utilized by processing element 308 as a reference image. The reference image may be stored locally on user computing entity 110A and once detected in the real-world environment, processing element 308 triggers AR content associated with the known visual landmark to be rendered. Detecting the known visual landmark positioned at a defined position on the user's body, the ARIP system 130 may then infer the 6-dimensional space of poses (e.g., X/Y/Z positions and roll/pitch/yaw orientations) of the image tracking device with respect to the known visual marker. Thereafter, the ARIP system 130 may render the AR-based instrument positioning map and graphics or positioning identifiers that overlay the digital image of the world in such a way that the AR-based instrument positioning map and positioning identifiers would be positioned or point to specific physical locations of the user's body (e.g., right clavicle at the midclavicular line).” [0063] “The AR ECG testing preparation module can include, for example the following features: (1) provision of an AR-view environment to facilitate placement of electrodes and ECG leads by a clinician during a mapping process using a preliminary AR-based instrument positioning map; (2) calibration of the preliminary AR-based instrument positioning map based on appropriate electrode and ECG lead placements on a patient's body and estimated electrode and lead placements utilizing a known visual landmark; (3) customization and personalization of the AR-based instrument positioning map comprising visual instrument positioning identifiers corresponding to appropriate electrode and ECG lead placements with respect to a patient's body; … the projection of the AR-view is generated by using display device 180, image tracking device 170, the AR image target (ARIT) 503, user computing entity 110A, 12-lead Holter monitor 400, or other specialized equipment. As illustrated in FIG. 
5, ARIP system 130 is configured to superimpose an AR-based instrument positioning map 505 on a real-world imagery (e.g., imagery of the patient's 501 body) processed by the image tracking device 170 and display the combination of these on the display device 180 and/or the user computing entity 110A. The exact alignment of the patient's 501 body and the AR-based instrument positioning map 505 is important. As such, the ARIP system 130 is configured to accurately align both the patient's 501 body and the AR-based instrument positioning map 505 using an ARIT 503 placed at a defined position on the patient's torso, such as on the patient's jugular notch 1101 as shown in FIG. 11. The AR-view facilitates the optimal placement of electrodes of the 12-lead Holter monitor 400.” [0069-0070])
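For illustration, the marker-based AR pipeline the quoted passages describe (detect the known image target, infer the camera's 6-DOF pose relative to it, and project the instrument positioning map onto the live image) can be sketched as follows. This is a minimal sketch assuming OpenCV 4.7+ and its ArUco module; the camera intrinsics, marker size, and electrode offsets are hypothetical placeholders, not values from Rydberg's disclosure.

```python
import cv2
import numpy as np

# Hypothetical camera intrinsics; a real system would calibrate these.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)        # assume negligible lens distortion
MARKER_SIZE_M = 0.05      # 5 cm image target placed at the jugular notch

# Electrode offsets (meters) in the marker's coordinate frame -- illustrative
# stand-ins for the calibrated instrument positioning map.
ELECTRODE_OFFSETS = {
    "RA": np.array([-0.10, 0.02, 0.0]),
    "LA": np.array([0.10, 0.02, 0.0]),   # mirror of RA, cf. Rydberg [0085]
    "V1": np.array([-0.02, -0.08, 0.0]),
    "V2": np.array([0.02, -0.08, 0.0]),
}

DETECTOR = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))

def overlay_positions(frame):
    """Detect the image target, infer the camera pose, and draw the mapped
    electrode positions over the live frame."""
    corners, ids, _ = DETECTOR.detectMarkers(frame)
    if ids is None:
        return frame  # target obstructed: render no map (cf. Rydberg [0095])
    half = MARKER_SIZE_M / 2.0
    obj_pts = np.array([[-half, half, 0], [half, half, 0],
                        [half, -half, 0], [-half, -half, 0]], dtype=np.float32)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners[0][0], K, DIST)
    if not ok:
        return frame
    for name, offset in ELECTRODE_OFFSETS.items():
        pt, _ = cv2.projectPoints(offset.reshape(1, 3), rvec, tvec, K, DIST)
        x, y = pt.ravel().astype(int)
        cv2.circle(frame, (x, y), 8, (0, 255, 0), 2)
        cv2.putText(frame, name, (x + 10, y), cv2.FONT_HERSHEY_SIMPLEX,
                    0.6, (0, 255, 0), 1)
    return frame
```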
[media_image1.png: greyscale image, 459 × 592 pixels]
Rydberg does not explicitly teach that the visual overlay template includes recommended positions for one or more electrocardiogram electrodes. Mandwal, however, teaches this feature (“The positioning analysis module is executable by a processor to receive an image taken with the camera of a patient's chest before the electrodes have been placed thereon. The positioning analysis module obtains an overlay image which includes a visual representation of desired location for the electrodes. The overlay image, in some embodiments, can also include anatomical landmarks to further aid in training and electrode placement. The overlay image is displayed with the real time image of the patient on the display of the augmented reality device.” [0010] “The positioning analysis module 15 would then communicate to the overlay image module 16 to retrieve and generate an overlay image 52 that would be presented to the clinician on the display 18a. The overlay image 52 would be layered over the real time image 50 to create the augmented image 54 shown in FIG. 5. In the embodiment shown in FIG. 5, the overlay image 52 includes a graphical image of hidden anatomical structures of the patient, such as ribcage, sternum and clavicle bones. The overlay image 52 further includes visual representations of desired positions 56 for the precordial electrodes used in a standard 12-lead ECG. These standard precordial electrodes include the V1, V2, V3, V4, V5 and V6 chest electrodes. If the clinician shifts his or her field of view, the overlay image 52 would shift and show the desired locations for the four limb electrodes RA, LA, RL and LL or locations for alternative lead sets that may be selected by the clinician such as right sided precordial leads V3r, V4r, or V5r (not shown in Figures). As can be understood in the augmented image 54 shown in FIG. 5, the augmented image 54 provides both the real time image of the patient 3 with the overlay image 52 of the desired electrode positions superimposed over the patient. The desired positions provide location information for the electrodes relative to the anatomically significant structures of the patient, such as the sternum and individual ribs of the ribcage.” [0039]). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Rydberg to incorporate the teachings of Mandwal, in order to further aid in training and electrode placement (Mandwal [0010]).
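Mandwal's overlay mechanism, per the quoted passages, is a pre-built image of the desired electrode locations (optionally with hidden anatomy) layered over the real-time camera image. A minimal sketch of such alpha compositing follows; the file names and blend weight are assumptions for illustration, not details from Mandwal.

```python
import cv2

def augment(frame, overlay, alpha=0.4):
    """Layer the desired-electrode-positions overlay (Mandwal's overlay image
    52) onto the real-time image (50) to form the augmented image (54)."""
    overlay = cv2.resize(overlay, (frame.shape[1], frame.shape[0]))
    return cv2.addWeighted(overlay, alpha, frame, 1.0 - alpha, 0.0)

# Usage sketch with hypothetical files:
# frame = cv2.imread("patient_chest.png")       # real-time camera frame
# template = cv2.imread("v1_v6_template.png")   # V1-V6 markers + ribcage graphic
# cv2.imshow("augmented", augment(frame, template)); cv2.waitKey(0)
```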
8. With reference to claim 2, Rydberg teaches the first visual data received from the at least one camera comprises real-time images showing at least the portion of the subject, wherein the portion of the subject includes the torso of the subject. (“the image data comprises a real-time image of the patient's body and a known visual landmark positioned at a defined position on the patient's body; receive user input identifying mapping locations of each of one or more instruments relative to the known visual landmark on the image data of the patient's body; generate an instrument positioning map for the patient by overlaying the mapping locations of each of the one or more instruments onto the image data, wherein the instrument positioning map comprises data locating the mapping locations of each of the one or more instruments relative to the known visual landmark; and store the instrument positioning map for access by the patient.” [0005] “The AR system provides the AR visual environment in real time by identifying and registering a known visual landmark that is centered and vertically aligned to a patient's chest. The visual landmark is identified and compared against a reference image (e.g., known visual landmark or reference image target) of an image target database or a cloud-based image target database used for identification. If there is a match, the AR system retrieves AR content linked with the identified visual landmark and renders the AR content on a display screen of an AR device such as a head-mounted display (HMD), computer monitor, mobile device, high definition TV, and the like. After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “an image tracking device (which in certain embodiments may be a part of one or more user computing entities 110A-110N) provides real-time images of a user's body (specifically, a user's torso) and additional information identifying the user or the user's environment to the ARIP system 130 (e.g., via network 120).” [0041] “the image tracking device 170 is configured to utilize a visual-inertial odometry technique which combines information from the plurality of motion sensors with computer vision analysis of the real-world environment visible to the tracking device 170 camera.” [0062] “the projection of the AR-view is generated by using display device 180, image tracking device 170, the AR image target (ARIT) 503, user computing entity 110A, 12-lead Holter monitor 400, or other specialized equipment. As illustrated in FIG. 5, ARIP system 130 is configured to superimpose an AR-based instrument positioning map 505 on a real-world imagery (e.g., imagery of the patient's 501 body) processed by the image tracking device 170 and display the combination of these on the display device 180 and/or the user computing entity 110A. The exact alignment of the patient's 501 body and the AR-based instrument positioning map 505 is important. As such, the ARIP system 130 is configured to accurately align both the patient's 501 body and the AR-based instrument positioning map 505 using an ARIT 503 placed at a defined position on the patient's torso, such as on the patient's jugular notch 1101 as shown in FIG. 11. The AR-view facilitates the optimal placement of electrodes of the 12-lead Holter monitor 400.” [0070])
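The "visual-inertial odometry technique" cited from [0062] fuses motion-sensor readings with computer-vision tracking. One minimal illustration of such fusion is a complementary filter; the single-axis simplification and the gain below are assumptions, not details from Rydberg.

```python
def fuse_yaw(prev_yaw, gyro_rate, dt, vision_yaw, k=0.98):
    """One complementary-filter step: gyro integration is smooth at high
    rate but drifts; the vision-derived angle is drift-free but noisy.
    Blending them illustrates the sensor/vision fusion of Rydberg [0062]."""
    gyro_yaw = prev_yaw + gyro_rate * dt          # integrate angular rate
    return k * gyro_yaw + (1.0 - k) * vision_yaw  # vision corrects drift

# e.g. 200 Hz gyro updates (dt = 0.005 s) corrected by slower marker-tracking
# estimates:  yaw = fuse_yaw(yaw, gyro_z, 0.005, marker_yaw)
```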
9. With reference to claim 3, Rydberg teaches a user interface in communication with the one or more processors and configured to receive user input from an associated user, wherein the memory further includes machine-readable instructions stored thereon that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: receiving, via the user interface, a user input indicating whether the generated visual overlay template accurately aligns with the portion of the subject as shown in the composite visual feed. (“an apparatus comprising at least one processor and at least one memory including computer program code is provided. In one embodiment, at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to receive image data of a patient's body, wherein the image data comprises a real-time image of the patient's body and a known visual landmark positioned at a defined position on the patient's body; receive user input identifying mapping locations of each of one or more instruments relative to the known visual landmark on the image data of the patient's body;” [0005] “circuitry 200 can includes various means, such as processing element 205, volatile memory 207, non-volatile memory 206, communications interface 208, instrument positioning repository 150, ARIP application 130, and/or input/output circuitry 216. As referred to herein, “circuitry” includes hardware, software and/or firmware configured to perform one or more particular functions. In this regard, the means of circuitry 200 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions stored on a non-transitory computer-readable medium (e.g., non-volatile memory 206) that is executable by a suitably configured processing device (e.g., processing element 205), or some combination thereof.” [0044] “The user computing entity 110A may also comprise a user interface device comprising one or more user input/output interfaces (e.g., a display 316 and/or speaker/speaker driver coupled to a processing element 309 and a touch screen, keyboard, mouse, and/or microphone coupled to a processing element 309). For example, the user output interface may be configured to provide an application, browser, user interface, dashboard, webpage, and/or similar words used herein interchangeably executing on and/or accessible via the user computing entity 110A to cause display or audible presentation of information/data and for user interaction therewith via one or more user input interfaces.” [0056] “the projection of the AR-view is generated by using display device 180, image tracking device 170, the AR image target (ARIT) 503, user computing entity 110A, 12-lead Holter monitor 400, or other specialized equipment. As illustrated in FIG. 5, ARIP system 130 is configured to superimpose an AR-based instrument positioning map 505 on a real-world imagery (e.g., imagery of the patient's 501 body) processed by the image tracking device 170 and display the combination of these on the display device 180 and/or the user computing entity 110A. The exact alignment of the patient's 501 body and the AR-based instrument positioning map 505 is important. 
As such, the ARIP system 130 is configured to accurately align both the patient's 501 body and the AR-based instrument positioning map 505 using an ARIT 503 placed at a defined position on the patient's torso, such as on the patient's jugular notch 1101 as shown in FIG. 11. The AR-view facilitates the optimal placement of electrodes of the 12-lead Holter monitor 400.” [0070] “Once the clinician or patient places the ARIT on the patient, invokes the ARIP app on their user computing entity, and scans the ARIT to render the AR-view, guidance is provided via an AR overlay reflective of the generated instrument positioning map over the digital image of the patient's body. For example, FIG. 13 illustrates the AR overlay positioned using the ARIT 1301 providing guidance to the clinician to locate the Right Arm (RA) 1305 electrode location below the patient's right clavicle, and between the right midclavicular line 1302 and shoulder. Once the first mapping location (e.g., RA 1305) is identified by the clinician marking the location of the first instrument using a wireless air mouse, a second instrument location (e.g., electrode location) may automatically be identified, for example, if the second instrument location is dependent on the positioning of the first mapping location. For example, the Left Arm (LA) 1304 electrode mapping location is automatically added and displayed in a mirror position to the RA 1305 mapping location.” [0085])
10. With reference to claim 4, Rydberg teaches the memory further includes machine-readable instructions stored thereon that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: generating instructions for placing one or more electrocardiogram electrodes on the torso of the subject if the user input indicates that the generated visual overlay template accurately aligns with the portion of the subject as shown in the composite visual feed; and generating instructions for adjusting either (i) a position of the at least one camera relative to the subject or (ii) one or more features of the generated visual overlay template if the user input indicates that the generated visual overlay template does not accurately align with the portion of the subject as shown in the composite visual feed. (“an apparatus comprising at least one processor and at least one memory including computer program code is provided. In one embodiment, at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to receive image data of a patient's body, wherein the image data comprises a real-time image of the patient's body and a known visual landmark positioned at a defined position on the patient's body; receive user input identifying mapping locations of each of one or more instruments relative to the known visual landmark on the image data of the patient's body;” [0005] “After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “circuitry 200 can includes various means, such as processing element 205, volatile memory 207, non-volatile memory 206, communications interface 208, instrument positioning repository 150, ARIP application 130, and/or input/output circuitry 216. As referred to herein, “circuitry” includes hardware, software and/or firmware configured to perform one or more particular functions. In this regard, the means of circuitry 200 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions stored on a non-transitory computer-readable medium (e.g., non-volatile memory 206) that is executable by a suitably configured processing device (e.g., processing element 205), or some combination thereof.” [0044] “The user computing entity 110A may also comprise a user interface device comprising one or more user input/output interfaces (e.g., a display 316 and/or speaker/speaker driver coupled to a processing element 309 and a touch screen, keyboard, mouse, and/or microphone coupled to a processing element 309). 
For example, the user output interface may be configured to provide an application, browser, user interface, dashboard, webpage, and/or similar words used herein interchangeably executing on and/or accessible via the user computing entity 110A to cause display or audible presentation of information/data and for user interaction therewith via one or more user input interfaces.” [0056] “the projection of the AR-view is generated by using display device 180, image tracking device 170, the AR image target (ARIT) 503, user computing entity 110A, 12-lead Holter monitor 400, or other specialized equipment. As illustrated in FIG. 5, ARIP system 130 is configured to superimpose an AR-based instrument positioning map 505 on a real-world imagery (e.g., imagery of the patient's 501 body) processed by the image tracking device 170 and display the combination of these on the display device 180 and/or the user computing entity 110A. The exact alignment of the patient's 501 body and the AR-based instrument positioning map 505 is important. As such, the ARIP system 130 is configured to accurately align both the patient's 501 body and the AR-based instrument positioning map 505 using an ARIT 503 placed at a defined position on the patient's torso, such as on the patient's jugular notch 1101 as shown in FIG. 11. The AR-view facilitates the optimal placement of electrodes of the 12-lead Holter monitor 400.” [0070] “Once the clinician or patient places the ARIT on the patient, invokes the ARIP app on their user computing entity, and scans the ARIT to render the AR-view, guidance is provided via an AR overlay reflective of the generated instrument positioning map over the digital image of the patient's body. For example, FIG. 13 illustrates the AR overlay positioned using the ARIT 1301 providing guidance to the clinician to locate the Right Arm (RA) 1305 electrode location below the patient's right clavicle, and between the right midclavicular line 1302 and shoulder. Once the first mapping location (e.g., RA 1305) is identified by the clinician marking the location of the first instrument using a wireless air mouse, a second instrument location (e.g., electrode location) may automatically be identified, for example, if the second instrument location is dependent on the positioning of the first mapping location. For example, the Left Arm (LA) 1304 electrode mapping location is automatically added and displayed in a mirror position to the RA 1305 mapping location.” [0085] “In an example embodiment the user may move, turn, walk around as long as the ARIT can be tracked and processed by the ARIP app. In an embodiment, when a change in movement is detected by the ARIP app, the instrument positioning map is provided in a geometrically correct orientation with respect to change of movement by the user. The instrument positioning map may be determined to be properly positioned when the ARIT is just below the user's right clavicle. If the ARIT is not properly positioned, a warning that the ARIT is not properly positioned may be displayed to the user via the ARIP app. In another example embodiment, if the ARIT is obstructed in any way, the ARIP app may not render the instrument positioning map and may further provide a warning to the user. Once the ARIT is properly positioned, the instrument positioning map is generated and shown to the user using graphical mapping locations in an AR view. The ARIT may be determined to be properly positioned when image target requirements for the instrument positioning map are satisfied. 
… As shown in FIG. 21, the ARIP app provides a resizing graphical user interface 2100 comprising a variety of actions to re-size the AR overlay reflective of the generated instrument positioning map over the digital image of the patient's body. For example, upon selecting action 2101, the instrument positioning map increases its width up to a predetermined limit. In FIG. 22, the ARIP app provides an action 2200 to scale up 2201 the instrument positioning map to a predetermined limit. In some embodiments, the ARIP app is further configured to scale up the mapping locations. In FIG. 23, the ARIP app provides an action 2300 to scale down 2301 the instrument positioning map to a predetermined limit. FIG. 24 illustrates the user selecting action 2400 to decrease the width 2401 of the instrument positioning map. As can be seen in FIGS. 21-24, the AR overlay may be further resized and manipulated to better suit the user's body size.” [0095-0097])
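The claim 4 logic, as read onto the quoted passages, branches on the user's alignment confirmation and, when alignment fails, permits repositioning the camera or resizing the template within predetermined limits. A hedged sketch follows; the scale limits and step size are assumed values, not figures from Rydberg.

```python
from dataclasses import dataclass

MIN_SCALE, MAX_SCALE = 0.5, 1.5   # "predetermined limits" -- values assumed
STEP = 0.1                        # per-action increment -- assumed

@dataclass
class OverlayTemplate:
    scale: float = 1.0

    def scale_up(self):           # cf. action 2200 / FIG. 22
        self.scale = min(self.scale + STEP, MAX_SCALE)

    def scale_down(self):         # cf. action 2300 / FIG. 23
        self.scale = max(self.scale - STEP, MIN_SCALE)

def next_instructions(aligned: bool, template: OverlayTemplate) -> str:
    """Branch on the user's alignment confirmation (claim 4)."""
    if aligned:
        # Template matches the torso: proceed to placement guidance.
        return "Place each electrode at its highlighted position."
    # Misaligned: adjust the camera relative to the subject, or adjust
    # features of the template (e.g., resize within the predetermined limits).
    return ("Reposition the camera so the image target is centered and "
            "unobstructed, or resize the overlay to fit the torso.")
```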
11. With reference to claim 5, Rydberg teaches the memory further includes machine-readable instructions stored thereon that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: receiving, via the user interface, a user input comprising one or more inputs for the subject, the inputs including at least one of an age of the subject, a gender of the subject, and a body mass index of the subject; wherein the visual overlay template is generated based on the first visual data received and the one or more inputs for the subject. (“an apparatus comprising at least one processor and at least one memory including computer program code is provided. In one embodiment, at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to receive image data of a patient's body, wherein the image data comprises a real-time image of the patient's body and a known visual landmark positioned at a defined position on the patient's body; receive user input identifying mapping locations of each of one or more instruments relative to the known visual landmark on the image data of the patient's body;” [0005] “circuitry 200 can includes various means, such as processing element 205, volatile memory 207, non-volatile memory 206, communications interface 208, instrument positioning repository 150, ARIP application 130, and/or input/output circuitry 216. As referred to herein, “circuitry” includes hardware, software and/or firmware configured to perform one or more particular functions. In this regard, the means of circuitry 200 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions stored on a non-transitory computer-readable medium (e.g., non-volatile memory 206) that is executable by a suitably configured processing device (e.g., processing element 205), or some combination thereof.” [0044] “The user computing entity 110A may also comprise a user interface device comprising one or more user input/output interfaces (e.g., a display 316 and/or speaker/speaker driver coupled to a processing element 309 and a touch screen, keyboard, mouse, and/or microphone coupled to a processing element 309). For example, the user output interface may be configured to provide an application, browser, user interface, dashboard, webpage, and/or similar words used herein interchangeably executing on and/or accessible via the user computing entity 110A to cause display or audible presentation of information/data and for user interaction therewith via one or more user input interfaces.” [0056] “the ARIP server provides a recommended image target based on the Holter/ECG monitor associated with the patient. To facilitate selection of a patient, the ARIP server displays a listing of patients as shown in an exemplary list of patients illustrated by tables 900 and 903 of FIG. 9. Table 900 illustrates a list of registered patients where each patient row 902 includes patient information 901 including the patient identification, last name, first name, middle name, date of birth, gender, and last visit date (e.g., last visit with the clinician). 
Once the clinician using the clinician user computing entity 701 selects a particular patient such as, for example patient John Smith, the ECG testing preparation support interface renders for display table 903 illustrating a list of requested ECG tests. Each row 905 of table 903 includes ECG test information 904 including the ECG order identification, the visitation identification, the requested date and time for the ECG test, the status of the ECG test, the ECG test, and the ECG device.” [0077])
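Claim 5's use of patient inputs when generating the template can be illustrated as parameterizing the overlay by demographics. The formulas below are purely hypothetical stand-ins; neither reference discloses specific age, gender, or body-mass-index adjustments.

```python
def template_params(age: int, gender: str, bmi: float) -> dict:
    """Hypothetical mapping from claim 5's patient inputs to overlay-template
    proportions. These formulas only illustrate conditioning the template
    on age, gender, and BMI; they are not taken from either reference."""
    torso_scale = 1.0 + 0.02 * max(bmi - 22.0, 0.0)  # wider torso at higher BMI
    rib_spacing = 0.95 if gender.upper().startswith("F") else 1.0
    sag_offset = 0.001 * max(age - 60, 0)            # assumed age adjustment
    return {"torso_scale": round(torso_scale, 3),
            "rib_spacing": rib_spacing,
            "vertical_offset": round(sag_offset, 3)}

# e.g. template_params(72, "F", 31.0)
# -> {'torso_scale': 1.18, 'rib_spacing': 0.95, 'vertical_offset': 0.012}
```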
12. With reference to claim 6, Rydberg teaches the visual overlay template comprises either (i) a two-dimensional outline of a portion of a body that corresponds to the portion of the subject shown in the first visual data, or (ii) a three-dimensional model of a portion of a body that corresponds to the portion of the subject shown in the first visual data. (“The AR content overlaid on the digital image of the user's body is generated using AR processing by processing element 308. The AR processing performed by processing element 308 may include object identification algorithms or other AR processing techniques used to create or identify AR information or real-world objects in the digital image or video sequence. In embodiments, the ARIP system 130 may utilize the image tracking device and AR processing algorithms to identify a known visual landmark in the digital image or video sequence scene. As discussed herein, the known visual landmark may be embodied as a specially designed AR image target (e.g. comprising one or more 2D bar codes having a known orientation relative to the patient's torso) linked to an AR-based instrument positioning map. In an example embodiment, the AR image target is utilized by processing element 308 as a reference image. The reference image may be stored locally on user computing entity 110A and once detected in the real-world environment, processing element 308 triggers AR content associated with the known visual landmark to be rendered. Detecting the known visual landmark positioned at a defined position on the user's body, the ARIP system 130 may then infer the 6-dimensional space of poses (e.g., X/Y/Z positions and roll/pitch/yaw orientations) of the image tracking device with respect to the known visual marker. Thereafter, the ARIP system 130 may render the AR-based instrument positioning map and graphics or positioning identifiers that overlay the digital image of the world in such a way that the AR-based instrument positioning map and positioning identifiers would be positioned or point to specific physical locations of the user's body (e.g., right clavicle at the midclavicular line).” [0063] “A 12-lead Holter monitor may be preferred by some care providers for its precise ECG signal information required to analyze and detect arrhythmias and myocardial ischemia. To measure the heart's electrical activity accurately, proper electrode placement on the chest 401 is crucial. In a 12-lead ECG such as Holter monitor 400, there are 12 leads calculated using 10 electrodes. The 10 electrodes illustrated in FIG. 4 (and their corresponding intended placement) include: V1 (408)—Fourth intercostal space on the right sternum located between the right midclavicular line (402) and the midline (403); V2 (409)—Fourth intercostal space at the left sternum located between the midline (403) and the left midclavicular line (404); V3 (410)—Midway between placement of V2 (409) and V4 (411); V4 (411)—Fifth intercostal space at the left midclavicular line (404); V5 (412)—Anterior axillary line on the same horizontal level as V4 (411); V6 (413)—Midaxillary line (405) on the same horizontal level as V4 (411) and V5 (412); RA—Right Arm (406)—Anywhere between the right shoulder and right elbow; RL—Right Leg (415)—Anywhere below the right torso and above the right ankle; LA—Left Arm (407)—Anywhere between the left shoulder and the left elbow; and LL—Left Leg (416)—Anywhere below the left torso and above the left ankle.” [0065])
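The placement rules quoted from Rydberg [0065] are stated relative to anatomical landmarks and to other electrodes (e.g., V3 midway between V2 and V4; V5 and V6 level with V4), which is exactly what a 2D outline or 3D torso model must resolve into concrete points. A sketch encoding those rules as data follows; the coordinates and helper names are illustrative.

```python
# Placement rules from Rydberg [0065], encoded as data. Textual rules need
# the body model (2D outline or 3D torso) to supply landmark coordinates;
# dependent rules resolve from already-placed electrodes.
PLACEMENT_RULES = {
    "V1": "4th intercostal space, right sternal border",
    "V2": "4th intercostal space, left sternal border",
    "V3": ("midpoint", "V2", "V4"),
    "V4": "5th intercostal space, left midclavicular line",
    "V5": ("same_level_as", "V4", "anterior axillary line"),
    "V6": ("same_level_as", "V4", "midaxillary line"),
    "LA": ("mirror_of", "RA"),  # mirror position, cf. Rydberg [0085]
}

def resolve(name, placed, midline_x=0.0):
    """Resolve a dependent rule once its referenced electrodes are placed."""
    rule = PLACEMENT_RULES[name]
    if isinstance(rule, tuple) and rule[0] == "midpoint":
        (x1, y1), (x2, y2) = placed[rule[1]], placed[rule[2]]
        return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
    if isinstance(rule, tuple) and rule[0] == "mirror_of":
        x, y = placed[rule[1]]
        return (2.0 * midline_x - x, y)
    return None  # textual rule: requires landmark lookup in the body model

# e.g. resolve("V3", {"V2": (0.02, -0.08), "V4": (0.08, -0.12)}) -> (0.05, -0.10)
```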
13. With reference to claim 7, Rydberg teaches the memory further includes machine-readable instructions stored thereon that, when executed by the one or more processors, cause the one or more processors to perform operations comprising: receiving, via the at least one camera, a second visual data comprising at least one image showing one or more electrocardiogram electrodes placed on at least the portion of the subject; analyzing the second visual data to determine whether the one or more electrocardiogram electrodes shown in the second visual data are properly positioned; generating instructions for correcting the positioning of one or more of the electrocardiogram electrodes if it is determined that one or more of the electrocardiogram electrodes are not properly positioned; and generating a notification for a user associated with the system if it is determined that the one or more electrocardiogram electrodes are properly positioned, wherein the notification indicates that the one or more electrocardiogram electrodes are properly positioned. (“an apparatus comprising at least one processor and at least one memory including computer program code is provided. In one embodiment, at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to receive image data of a patient's body, wherein the image data comprises a real-time image of the patient's body and a known visual landmark positioned at a defined position on the patient's body; receive user input identifying mapping locations of each of one or more instruments relative to the known visual landmark on the image data of the patient's body;” [0005] “After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “circuitry 200 can includes various means, such as processing element 205, volatile memory 207, non-volatile memory 206, communications interface 208, instrument positioning repository 150, ARIP application 130, and/or input/output circuitry 216. As referred to herein, “circuitry” includes hardware, software and/or firmware configured to perform one or more particular functions. In this regard, the means of circuitry 200 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions stored on a non-transitory computer-readable medium (e.g., non-volatile memory 206) that is executable by a suitably configured processing device (e.g., processing element 205), or some combination thereof.” [0044] “Once the clinician or patient places the ARIT on the patient, invokes the ARIP app on their user computing entity, and scans the ARIT to render the AR-view, guidance is provided via an AR overlay reflective of the generated instrument positioning map over the digital image of the patient's body. For example, FIG. 13 illustrates the AR overlay positioned using the ARIT 1301 providing guidance to the clinician to locate the Right Arm (RA) 1305 electrode location below the patient's right clavicle, and between the right midclavicular line 1302 and shoulder. 
Once the first mapping location (e.g., RA 1305) is identified by the clinician marking the location of the first instrument using a wireless air mouse, a second instrument location (e.g., electrode location) may automatically be identified, for example, if the second instrument location is dependent on the positioning of the first mapping location. For example, the Left Arm (LA) 1304 electrode mapping location is automatically added and displayed in a mirror position to the RA 1305 mapping location.” [0085] “In an example embodiment the user may move, turn, walk around as long as the ARIT can be tracked and processed by the ARIP app. In an embodiment, when a change in movement is detected by the ARIP app, the instrument positioning map is provided in a geometrically correct orientation with respect to change of movement by the user. The instrument positioning map may be determined to be properly positioned when the ARIT is just below the user's right clavicle. If the ARIT is not properly positioned, a warning that the ARIT is not properly positioned may be displayed to the user via the ARIP app. In another example embodiment, if the ARIT is obstructed in any way, the ARIP app may not render the instrument positioning map and may further provide a warning to the user. Once the ARIT is properly positioned, the instrument positioning map is generated and shown to the user using graphical mapping locations in an AR view. The ARIT may be determined to be properly positioned when image target requirements for the instrument positioning map are satisfied.” [0095-0096])
14. With reference to claim 8, Rydberg teaches the step of analyzing the second visual data includes: generating a projection for the subject based on the second visual data, wherein the projection includes expected positions of one or more electrocardiogram electrodes; registering a condition of each of the one or more electrocardiogram electrodes, wherein the condition of an electrocardiogram electrode includes a relative position of the electrocardiogram electrode and an identity of the electrocardiogram electrode; and comparing the projection with the registered conditions of each of the one or more electrocardiogram electrodes to determine whether the one or more electrocardiogram electrodes deviate from the expected positions. (“an apparatus comprising at least one processor and at least one memory including computer program code is provided. In one embodiment, at least one memory and the computer program code may be configured to, with the processor, cause the apparatus to receive image data of a patient's body, wherein the image data comprises a real-time image of the patient's body and a known visual landmark positioned at a defined position on the patient's body; receive user input identifying mapping locations of each of one or more instruments relative to the known visual landmark on the image data of the patient's body;” [0005] “After the visual landmark is identified, users (e.g., patients and clinicians) may activate a camera or the AR device to perform real-time tracking of the visual landmark positioned at a defined position on the patient's body to interact with the AR content linked to the visual landmark.” [0026] “circuitry 200 can includes various means, such as processing element 205, volatile memory 207, non-volatile memory 206, communications interface 208, instrument positioning repository 150, ARIP application 130, and/or input/output circuitry 216. As referred to herein, “circuitry” includes hardware, software and/or firmware configured to perform one or more particular functions. In this regard, the means of circuitry 200 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions stored on a non-transitory computer-readable medium (e.g., non-volatile memory 206) that is executable by a suitably configured processing device (e.g., processing element 205), or some combination thereof.” [0044] “Once the clinician or patient places the ARIT on the patient, invokes the ARIP app on their user computing entity, and scans the ARIT to render the AR-view, guidance is provided via an AR overlay reflective of the generated instrument positioning map over the digital image of the patient's body. For example, FIG. 13 illustrates the AR overlay positioned using the ARIT 1301 providing guidance to the clinician to locate the Right Arm (RA) 1305 electrode location below the patient's right clavicle, and between the right midclavicular line 1302 and shoulder. Once the first mapping location (e.g., RA 1305) is identified by the clinician marking the location of the first instrument using a wireless air mouse, a second instrument location (e.g., electrode location) may automatically be identified, for example, if the second instrument location is dependent on the positioning of the first mapping location. 
For example, the Left Arm (LA) 1304 electrode mapping location is automatically added and displayed in a mirror position to the RA 1305 mapping location.” [0085] “In an example embodiment the user may move, turn, walk around as long as the ARIT can be tracked and processed by the ARIP app. In an embodiment, when a change in movement is detected by the ARIP app, the instrument positioning map is provided in a geometrically correct orientation with respect to change of movement by the user. The instrument positioning map may be determined to be properly positioned when the ARIT is just below the user's right clavicle. If the ARIT is not properly positioned, a warning that the ARIT is not properly positioned may be displayed to the user via the ARIP app. In another example embodiment, if the ARIT is obstructed in any way, the ARIP app may not render the instrument positioning map and may further provide a warning to the user. Once the ARIT is properly positioned, the instrument positioning map is generated and shown to the user using graphical mapping locations in an AR view. The ARIT may be determined to be properly positioned when image target requirements for the instrument positioning map are satisfied.” [0095-0096])
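The comparison step recited in claims 7 and 8 (projected expected positions versus the registered condition of each electrode, with correction instructions or a success notification) reduces to a per-electrode distance check. A minimal sketch; the pixel tolerance and message wording are assumptions.

```python
import math

TOLERANCE_PX = 15.0  # acceptance radius in pixels -- assumed value

def check_placement(expected: dict, registered: dict) -> list:
    """Compare projected (expected) positions against the registered condition
    of each electrode (identity + relative position). Returns correction
    instructions, or a single success notification if all are in tolerance."""
    corrections = []
    for name, (ex, ey) in expected.items():
        if name not in registered:
            corrections.append(f"{name}: not detected; place at ({ex:.0f}, {ey:.0f}).")
            continue
        rx, ry = registered[name]
        if math.hypot(rx - ex, ry - ey) > TOLERANCE_PX:
            horiz = "left" if rx > ex else "right"
            vert = "up" if ry > ey else "down"   # image y grows downward
            corrections.append(f"{name}: move {horiz} and {vert} toward ({ex:.0f}, {ey:.0f}).")
    return corrections or ["All electrodes are properly positioned."]

# e.g. check_placement({"V1": (120, 200)}, {"V1": (150, 205)})
# -> ['V1: move left and up toward (120, 200).']
```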
15. Claim 9 is similar in scope to claim 1, and thus is rejected under similar rationale. Rydberg additionally teaches A non-transitory computer-readable storage medium having stored thereon machine-readable instructions that, when executed by one or more processors, cause the one or more processors to perform operations (“circuitry 200 can includes various means, such as processing element 205, volatile memory 207, non-volatile memory 206, communications interface 208, instrument positioning repository 150, ARIP application 130, and/or input/output circuitry 216. As referred to herein, “circuitry” includes hardware, software and/or firmware configured to perform one or more particular functions. In this regard, the means of circuitry 200 as described herein may be embodied as, for example, circuitry, hardware elements (e.g., a suitably programmed processor, combinational logic circuit, and/or the like), a computer program product comprising computer-readable program instructions stored on a non-transitory computer-readable medium (e.g., non-volatile memory 206) that is executable by a suitably configured processing device (e.g., processing element 205), or some combination thereof.” [0044] “A computer program product comprising a non-transitory computer readable medium having computer program instructions stored therein, the computer program instructions when executed by a processor, cause the processor to” claim 15)
16. Claims 10-12 are similar in scope to claims 1-3, and they are rejected under similar rationale.
17. Claims 13 and 14, in combination, are similar in scope to claim 4, and they are rejected under similar rationale.
18. Claim 15 is similar in scope to claim 7, and thus is rejected under similar rationale.
Conclusion
19. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michelle Chin whose telephone number is (571)270-3697. The examiner can normally be reached on Monday-Friday 8:00 AM-4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kent Chang can be reached on (571)272-7667. The fax phone number for the organization where this application or proceeding is assigned is (571)273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHELLE CHIN/
Primary Examiner, Art Unit 2614