DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The amendment filed November 26, 2025 has been entered. Claims 1 and 3-11 remain pending in the application. Claims 12-19 are newly added by the Applicant. Applicant’s amendments to the claims have overcome each and every objection previously set forth in the Non-Final Office Action mailed August 27, 2025.
Response to Arguments
Applicant’s arguments filed November 26, 2025, with respect to the rejection of claims 1 and 11 under 35 USC § 101 have been fully considered and are persuasive. The amended language appropriately addresses the issues previously set forth in the Non-Final Office Action mailed August 27, 2025. The rejection of claims 1 and 11 under 35 USC § 101 has been withdrawn.
Regarding amended claims 1 and 11 in the present instance, Applicant argues that Ekkel (US 20230152157 A1) in view of Watanabe (US 20140176599 A1) fails to disclose the method of claim 1, wherein a time difference is calculated between the first and second times acquired from a first and a second sensor, stating on pages 12-13 of the remarks filed November 26, 2025 that “[…] the subject matter of claim 1 uses an advantageous two-sensor approach and associated time-difference calculation, for an improved, simplified approach. No proposed combination of Ekkel and Watanabe has such combined features.”
As cited in the previous Office Action and restated below, Ekkel teaches in at least figures 2A-2C and at least paragraphs 76, 82-88, and 96 the method of claim 1 (now included as lines 1-10 of amended claim 1), including the acquiring of first and second signals and calculating a time difference between them, but does not explicitly disclose a second sensor. Watanabe is cited in the previous Office Action and restated below as an analogous art explicitly teaching a method for calculating a time difference between a first and second time (Watanabe fig. 3 steps S112-S116 and steps S144-S148 and par. 74: server 4 calculates movement speed of subject 5 from first and second signals) acquired from the outputs of a first and second sensor (Watanabe fig. 4 and par. 82: camera 1A detects subject 5 walking in the X direction at first position and camera 1B detects subject 5 walking in the X direction at later position).
The words of the claims are given their plain meaning under the broadest reasonable interpretation as would be understood by a person having ordinary skill in the art (MPEP 2111.01(I)). In the present instance one of ordinary skill in the art would understand that the cameras 1A and 1B and the calculation of server 4 of Watanabe teach the “two-sensor approach and associated time-difference calculation” as claimed.
In response to applicant’s argument that there is no teaching, suggestion, or motivation to combine the references, the examiner recognizes that obviousness may be established by combining or modifying the teachings of the prior art to produce the claimed invention where there is some teaching, suggestion, or motivation to do so found either in the references themselves or in the knowledge generally available to one of ordinary skill in the art. See In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988), In re Jones, 958 F.2d 347, 21 USPQ2d 1941 (Fed. Cir. 1992), and KSR International Co. v. Teleflex, Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007). In this case, the motivation to combine to produce a device capable of accurately displaying outputs to a moving subject in a wide variety of spaces is established in both Ekkel (Ekkel fig. 2A and par.’s 12-13. See par. 13: “the present invention provides a sensor device for controlling an electrical device upon determining a movement characteristic of a person moving across said surface.”) and Watanabe (Watanabe fig.’s 2-4 and par.’s 48 and 53: “the predictor predicts a future action by estimating factors such as a subject's goal, something the subject is searching for, or the fact that an activity was aborted partway through, according to a captured image capturing factors such as the direction in which the subject is facing, a path selection at a corner or turn, a walking speed, a gesture, or line of sight movement”).
Accordingly, Applicant’s arguments with respect to the rejection under 35 USC § 103 are not persuasive, and amended claims 1 and 11 remain rejected.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3-7, 9-16, and 18-19 are rejected under 35 U.S.C. 103 as being unpatentable over Ekkel (US 20230152157 A1) in view of Watanabe (US 20140176599 A1).
Regarding Claim 1: Ekkel discloses (in at least figures 2A-3, the description, and the claims) a display method (fig. 3 and par. 97: method of determining a movement characteristic of a person moving across a surface and a corresponding computer program product. See also par. 1) comprising:
acquiring output corresponding to a movement speed of a human from a first position to a second position (fig. 3 and par. 97: step 903 the controller determines a movement characteristic of the person moving across a surface from a pattern of detection signals, fig. 2A and par. 96: “[…] the controller 42 determines the movement speed of the person 49, 49′ based on the respective time period 53, 54 of the pattern corresponding to the first and second movement location.”) using at least one sensor (fig. 2A and par. 76: sensor device 40);
determining an image according to a content used for alert based on the output corresponding to the movement speed (fig. 2A, par. 16: “the controller may be configured to output the output signal, wherein the output signal is configured to control the electrical device based on the movement characteristic, e.g. the movement direction, movement location and/or movement speed”, and par. 90: “The output signal 61 is arranged to control a first portion 655 of the display 60 upon determining the first movement location 55 of the person 49. The first portion 655 of the display 60 thereby corresponds to the first movement location 55 […] Similarly: The output signal 61 is arranged to control a second portion 656 of the display 60 upon determining the second movement location 56 of the person 49′.” See also par. 75: electrical display device 60 “may be used for digital and/or interactive signage.”);
determining a time to display the image based on the output corresponding to the movement speed (fig. 2A, and par. 96: controller determines movement speed of person from first time period 53 to second time period 54 as the movement characteristic, fig. 3 and par.’s 97-98: controller determines movement speed as movement characteristic then at a third time (step 904) signals the display 60 to display an image based on the movement characteristic. See also par. 16); and
displaying the image at the time in a third position using a display apparatus (fig. 2A and par. 90: electrical device 60 comprises a display), wherein
the second position is located between the first position and the third position (fig. 2A and par. 90: “The person 49 may therefore walk towards the display by traversing the detection region 53”, i.e., the display device 60 is located beyond the detection region that includes first movement location and second movement location leading to display 60 located at a third position),
the at least one sensor (fig. 2A and par. 76: sensor device 40 comprises a Single Pixel Thermopile 41 (SPT) and controller 42) includes
a first sensor outputting a first signal indicating detection of passing of the human through the first position (fig.’s 2A-2C and par.’s 82-88: SPT 41 detects and outputs temperature signals 50 and 50’ of human traversing within detection region 43. Controller 42 uses these signals to determine movement location 55 and second movement location 56.), and
a second sensor outputting a second signal indicating detection of passing of the human through the second position (fig.’s 2A-2C and par.’s 82-88: SPT 41 detects and outputs temperature signals 50 and 50’ of human traversing within detection region 43. Controller 42 uses these signals to determine movement location 55 and second movement location 56.), and
the acquiring output corresponding to the movement speed includes
acquiring the first signal of the first sensor, acquiring the second signal of the second sensor (fig. 2A and par. 96: “controller 42 determines the movement speed of the person 49, 49′ based on the respective time period 53, 54 of the pattern corresponding to the first and second movement location. Namely, the controller 42 may utilize prestored information about the installation location of the sensor device 40 relative to the surface 44 and the angle 45 […] controller may retrieve or receive said information (or the installation location) from a further device, such as a (location) server or a user input device.”), and calculating a time difference between a first time when acquiring the first signal and a second time when acquiring the second signal (fig. 2A and par. 96: “[…], the controller 42 may determine the (average) movement speed of the person 49, 49′ with said (determined) time period 53, 54.”).
Ekkel does not explicitly disclose wherein a second sensor outputs the second signal.
Watanabe discloses an analogous art (fig.’s 1-7: display control system and corresponding method of operation) wherein the at least one sensor includes a first sensor outputting a first signal indicating detection of passing of the human through the first position (fig. 4 and par. 82: camera 1A detects subject 5 walking in the X direction at first position), and a second sensor outputting a second signal indicating detection of passing of the human through the second position (fig. 4 and par. 82: camera 1B detects subject 5 walking in the X direction at later position), and the acquiring output corresponding to the movement speed includes acquiring the first signal of the first sensor, acquiring the second signal of the second sensor, and calculating a time difference between a first time when acquiring the first signal and a second time when acquiring the second signal (fig. 3 steps S112-S116 and steps S144-S148 and par. 74: server 4 calculates movement speed of subject 5 from first and second signals).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for the second sensor, as taught by Watanabe, to be included in the system of Ekkel, thereby expanding the detection region of the device, allowing it to operate along longer corridors and walkways and at corners or turns, and ensuring accurate display outputs in a wider variety of spaces (Watanabe fig.’s 2-4 and par.’s 48 and 53).
Regarding Claim 3: Ekkel and Watanabe disclose the display method according to claim 1, and Watanabe further discloses wherein the second sensor is placed between the first sensor and the display apparatus (fig. 4: camera 1B between camera 1A and devices 2B and 2C of display apparatus).
The rationale to combine is the same as for claim 1.
Regarding Claim 4: Ekkel and Watanabe disclose the display method according to claim 1, and Ekkel further discloses wherein the determining the image includes not displaying the image (par. 25: controller can control electrical display device to power on or power off based upon output signal 61 determined by controller 42 based upon person’s location, movement direction, and movement speed. See also par. 98) when the second time is before the first time (fig.’s 2A-2C and par.’s 90 and 96: output signal 61 controls the first and second display portions 655 and 656 independently based upon whether person was detected at corresponding locations 55 or 56 at respective times walking toward display 60. Accordingly, it is inherent that since the controller 42 powers on portions of 60 when movement direction is toward the display (first time before second time), it can power off said portions of 60 when movement direction is away from the display (second time before the first time)).
Regarding Claim 5: Ekkel and Watanabe disclose the display method according to claim 1, and Ekkel further discloses wherein the determining the image includes determining use of a first content as the content when the time difference from the first time to the second time is equal to or smaller than a predetermined value, and determining use of a second content different from the first content as the content when the time difference is larger than the predetermined value (par.’s 41, 86-87, 96, and 100-101: controller compares detection timestamps to predefined time durations or timetables to determine movement detection locations and movement speed therebetween, par. 16: output signal and controller are configured to control the display based upon the movement speed. Accordingly, the display content is determined based on whether the movement speed is faster or slower relative to some predetermined value. See also fig. 3 and par.’s 99-101: steps 903, 9034 and 904).
Regarding Claim 6: Ekkel and Watanabe disclose the display method according to claim 1, and Ekkel further discloses wherein the determining the image includes determining a third time when displaying the image based on the time difference from the first time to the second time (fig. 2A, and par. 96: controller determines movement speed of person from first time period 53 to second time period 54 as the movement characteristic, fig. 3 and par.’s 97-98: controller determines movement speed as movement characteristic then at a third time (step 904) signals the display 60 to display an image based on the movement characteristic. See also par. 16).
Regarding Claim 7: Ekkel and Watanabe disclose the display method according to claim 1, and Ekkel discloses the method further comprising, when determining the image, stopping acquisition of the output of the first sensor in a predetermined period from the first time (par. 96: “Knowing the size of the detection region 43, and in particular the elongated detection area 46, the controller 42 may determine the (average) movement speed of the person 49, 49′ with said (determined) time period 53, 54.”, i.e., the controller 42 stops acquiring the output of the first sensor once the time periods 53 and 54, and the movement speed between them, have been determined. See also fig. 3: steps 903 to 904.).
Regarding Claim 9: Ekkel and Watanabe disclose the display method according to claim 1, and Ekkel further discloses wherein the displaying the image includes determining an orientation of the image to be displayed in a direction different from a direction from the first position to the second position (fig. 2A and par. 90: “The person 49 may therefore walk towards the display by traversing the detection region 53, and the first portion 655 of the display 60 may be controlled in response thereto […] The person 49′ may therefore walk towards the display 60 by traversing the detection region 53, and the second portion 656 of the display 60 may be controlled in response thereto.” See also par. 98: “movement characteristic may be movement direction, movement speed, and/or movement location,”).
Regarding Claim 10: Ekkel and Watanabe disclose the display method according to claim 1, and Ekkel further discloses wherein the determining the image includes determining the content based on a date, a time, or a day of week (par. 75 and fig. 2A: system 200 is installed in a retail environment and electrical display device 60 “may be used for digital and/or interactive signage.” It is inherent that digital and interactive signage in a retail space is determined based on date, time, or day of the week for events, seasonal advertising, etc.).
Regarding Claim 11: Ekkel discloses (in at least figures 2A-3, the description, and the claims) a display system (fig. 3 and par. 97: method of determining a movement characteristic of a person moving across a surface and a corresponding computer program product. See also par. 1) comprising:
a display apparatus (fig. 2A and par. 90: electrical device 60 comprises a display);
a sensor; and a control apparatus controlling operation of the display apparatus, the control apparatus (fig. 2A and par. 76: sensor device 40 comprises a Single Pixel Thermopile 41 (SPT) and controller 42) executing
acquiring output corresponding to a movement speed of a human from a first position to a second position (fig. 3 and par. 97: step 903 the controller determines a movement characteristic of the person moving across a surface from a pattern of detection signals, fig. 2A and par. 96: “[…] the controller 42 determines the movement speed of the person 49, 49′ based on the respective time period 53, 54 of the pattern corresponding to the first and second movement location.”) from the sensor (fig.’s 2A-2C and par.’s 82-88: SPT 41 detects and outputs temperature signals 50 and 50’ of human traversing within detection region 43. Controller 42 uses these signals to determine movement location 55 and second movement location 56.),
determining an image according to a content used for alert based on the output corresponding to the movement speed (fig. 2A, par. 16: “the controller may be configured to output the output signal, wherein the output signal is configured to control the electrical device based on the movement characteristic, e.g. the movement direction, movement location and/or movement speed”, and par. 90: “The output signal 61 is arranged to control a first portion 655 of the display 60 upon determining the first movement location 55 of the person 49. The first portion 655 of the display 60 thereby corresponds to the first movement location 55 […] Similarly: The output signal 61 is arranged to control a second portion 656 of the display 60 upon determining the second movement location 56 of the person 49′.” See also par. 75: electrical display device 60 “may be used for digital and/or interactive signage.”),
determining a time to display the image based on the output corresponding to the movement speed (fig. 2A, and par. 96: controller determines movement speed of person from first time period 53 to second time period 54 as the movement characteristic, fig. 3 and par.’s 97-98: controller determines movement speed as movement characteristic then at a third time (step 904) signals the display 60 to display an image based on the movement characteristic. See also par. 16), and
displaying the image at the time in a third position using the display apparatus (fig. 2A and par. 90: electrical device 60 comprises a display), and
the second position is located between the first position and the third position (fig. 2A and par. 90: “The person 49 may therefore walk towards the display by traversing the detection region 53”, i.e., the display device 60 is located beyond the detection region that includes first movement location and second movement location leading to display 60 located at a third position), wherein
the sensor (fig. 2A and par. 76: sensor device 40 comprises a Single Pixel Thermopile 41 (SPT) and controller 42) includes
a first sensor outputting a first signal indicating detection of passing of the human through the first position (fig.’s 2A-2C and par.’s 82-88: SPT 41 detects and outputs temperature signals 50 and 50’ of human traversing within detection region 43. Controller 42 uses these signals to determine movement location 55 and second movement location 56.), and
a second sensor outputting a second signal indicating detection of passing of the human through the second position (fig.’s 2A-2C and par.’s 82-88: SPT 41 detects and outputs temperature signals 50 and 50’ of human traversing within detection region 43. Controller 42 uses these signals to determine movement location 55 and second movement location 56.), and
the acquiring output corresponding to the movement speed includes
acquiring the first signal of the first sensor, acquiring the second signal of the second sensor (fig. 2A and par. 96: “controller 42 determines the movement speed of the person 49, 49′ based on the respective time period 53, 54 of the pattern corresponding to the first and second movement location. Namely, the controller 42 may utilize prestored information about the installation location of the sensor device 40 relative to the surface 44 and the angle 45 […] controller may retrieve or receive said information (or the installation location) from a further device, such as a (location) server or a user input device.”), and calculating a time difference between a first time when acquiring the first signal and a second time when acquiring the second signal (fig. 2A and par. 96: “[…], the controller 42 may determine the (average) movement speed of the person 49, 49′ with said (determined) time period 53, 54.”).
Ekkel does not explicitly disclose wherein a second sensor outputs the second signal.
Watanabe discloses an analogous art (fig.’s 1-7: display control system and corresponding method of operation) wherein a sensor includes a first sensor outputting a first signal indicating detection of passing of the human through the first position (fig. 4 and par. 82: camera 1A detects subject 5 walking in the X direction at first position), and a second sensor outputting a second signal indicating detection of passing of the human through the second position (fig. 4 and par. 82: camera 1B detects subject 5 walking in the X direction at later position), and the acquiring output corresponding to the movement speed includes acquiring the first signal of the first sensor, acquiring the second signal of the second sensor, and calculating a time difference between a first time when acquiring the first signal and a second time when acquiring the second signal (fig. 3 steps S112-S116 and steps S144-S148 and par. 74: server 4 calculates movement speed of subject 5 from first and second signals).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for the second sensor, as taught by Watanabe, to be included in the system of Ekkel, thereby expanding the detection region of the device, allowing it to operate along longer corridors and walkways and at corners or turns, and ensuring accurate display outputs in a wider variety of spaces (Watanabe fig.’s 2-4 and par.’s 48 and 53).
Regarding Claim 12: Ekkel and Watanabe disclose the display system according to claim 11, and Watanabe further discloses wherein the second sensor is placed between the first sensor and the display apparatus (fig. 4: camera 1B between camera 1A and devices 2B and 2C of display apparatus).
The rationale to combine is the same as for claim 11.
Regarding Claim 13: Ekkel and Watanabe disclose the display system according to claim 11, and Ekkel further discloses wherein the determining the image includes not displaying the image (par. 25: controller can control electrical display device to power on or power off based upon output signal 61 determined by controller 42 based upon person’s location, movement direction, and movement speed. See also par. 98) when the second time is before the first time (fig.’s 2A-2C and par.’s 90 and 96: output signal 61 controls the first and second display portions 655 and 656 independently based upon whether person was detected at corresponding locations 55 or 56 at respective times walking toward display 60. Accordingly, it is inherent that since the controller 42 powers on portions of 60 when movement direction is toward the display (first time before second time), it can power off said portions of 60 when movement direction is away from the display (second time before the first time)).
Regarding Claim 14: Ekkel and Watanabe disclose the display system according to claim 11, and Ekkel further discloses wherein the determining the image includes determining use of a first content as the content when the time difference from the first time to the second time is equal to or smaller than a predetermined value, and determining use of a second content different from the first content as the content when the time difference is larger than the predetermined value (par.’s 41, 86-87, 96, and 100-101: controller compares detection timestamps to predefined time durations or timetables to determine movement detection locations and movement speed therebetween, par. 16: output signal and controller are configured to control the display based upon the movement speed. Accordingly, the display content is determined based on whether the movement speed is faster or slower relative to some predetermined value. See also fig. 3 and par.’s 99-101: steps 903, 9034 and 904).
Regarding Claim 15: Ekkel and Watanabe disclose the display system according to claim 11, and Ekkel further discloses wherein the determining the image includes determining a third time when displaying the image based on the time difference from the first time to the second time (fig. 2A, and par. 96: controller determines movement speed of person from first time period 53 to second time period 54 as the movement characteristic, fig. 3 and par.’s 97-98: controller determines movement speed as movement characteristic then at a third time (step 904) signals the display 60 to display an image based on the movement characteristic. See also par. 16).
Regarding Claim 16: Ekkel and Watanabe disclose the display system according to claim 11, and Ekkel discloses the method further comprising, when determining the image, stopping acquisition of the output of the first sensor in a predetermined period from the first time (par. 96: “Knowing the size of the detection region 43, and in particular the elongated detection area 46, the controller 42 may determine the (average) movement speed of the person 49, 49′ with said (determined) time period 53, 54.”, i.e., the controller 42 stops acquiring the output of the first sensor once the time periods 53 and 54, and the movement speed between them, have been determined. See also fig. 3: steps 903 to 904.).
Regarding Claim 18: Ekkel and Watanabe disclose the display system according to claim 11, and Ekkel further discloses wherein the displaying the image includes determining an orientation of the image to be displayed in a direction different from a direction from the first position to the second position (fig. 2A and par. 90: “The person 49 may therefore walk towards the display by traversing the detection region 53, and the first portion 655 of the display 60 may be controlled in response thereto […] The person 49′ may therefore walk towards the display 60 by traversing the detection region 53, and the second portion 656 of the display 60 may be controlled in response thereto.” See also par. 98: “movement characteristic may be movement direction, movement speed, and/or movement location,”).
Regarding Claim 19: Ekkel and Watanabe disclose the display system according to claim 11, and Ekkel further discloses wherein the determining the image includes determining the content based on a date, a time, or a day of week (par. 75 and fig. 2A: system 200 is installed in a retail environment and electrical display device 60 “may be used for digital and/or interactive signage.” It is inherent that digital and interactive signage in a retail space is determined based on date, time, or day of the week for events, seasonal advertising, etc.).
Claims 8 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Ekkel and Watanabe as applied to claims 1 and 11 above, and further in view of Bury (US 20170139471 A1).
Regarding Claim 8: Ekkel and Watanabe disclose the display method according to claim 1, and Ekkel discloses wherein the acquiring the output corresponding to the movement speed (Ekkel fig. 2A, par. 16, and par. 96) includes discarding information on the first signal (par. 44: the device is implemented in a computer program product with instructions provided as executable programs that can be modified and/or updated).
However, neither Ekkel nor Watanabe discloses wherein information on the first time is discarded when the second signal is not output by the second sensor in a predetermined period from the first time.
Bury discloses an analogous art (fig.’s 1-2 and par. 25: interactive digital display device (IDD) 100) wherein the acquiring the output corresponding to the movement speed includes discarding information on the first time when the second signal is not output by the second sensor in a predetermined period from the first time (fig. 1 and par. 27: plurality of sensors 106 detect presence of people in vicinity of device 100 including “the position, velocity, acceleration and orientation relative to the IDD of people in the vicinity of the display”, par. 41: “after going active, when no user presence is detected for some predetermined period of time, the IDD may return to sleep mode or otherwise go inactive.”, fig. 16 and par. 80: computing system 104 or server 150 includes memory 1604 with volatile memory 1605 such as RAM. It is well known that the contents of RAM are discarded when a device is inactive. See also par. 73: It is desirable to reset IDD 100 to discard historical data such that the device learning exercise can start anew).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for Bury’s method, wherein output data is discarded after a predetermined time when no second signal is detected, to be included in the system of Ekkel and Watanabe, thereby allowing the system to fully reset and to adapt to new detection scenarios and environments (Bury par. 73).
Regarding Claim 17: Ekkel and Watanabe disclose the display system according to claim 11, and Ekkel discloses wherein the acquiring the output corresponding to the movement speed (Ekkel fig. 2A, par. 16, and par. 96) includes discarding information on the first signal (par. 44: the device is implemented in a computer program product with instructions provided as executable programs that can be modified and/or updated).
However, neither Ekkel nor Watanabe discloses wherein information on the first time is discarded when the second signal is not output by the second sensor in a predetermined period from the first time.
Bury discloses an analogous art (fig.’s 1-2 and par. 25: interactive digital display device (IDD) 100) wherein the acquiring the output corresponding to the movement speed includes discarding information on the first time when the second signal is not output by the second sensor in a predetermined period from the first time (fig. 1 and par. 27: plurality of sensors 106 detect presence of people in vicinity of device 100 including “the position, velocity, acceleration and orientation relative to the IDD of people in the vicinity of the display”, par. 41: “after going active, when no user presence is detected for some predetermined period of time, the IDD may return to sleep mode or otherwise go inactive.”, fig. 16 and par. 80: computing system 104 or server 150 includes memory 1604 with volatile memory 1605 such as RAM. It is well known that the contents of RAM are discarded when a device is inactive. See also par. 73: It is desirable to reset IDD 100 to discard historical data such that the device learning exercise can start anew).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention for Bury’s method, wherein output data is discarded after a predetermined time when no second signal is detected, to be included in the system of Ekkel and Watanabe, thereby allowing the system to fully reset and to adapt to new detection scenarios and environments (Bury par. 73).
Conclusion
The prior art made of record and not relied upon that is considered pertinent to applicant's disclosure includes:
Fujiune (US 20160286186 A1) discloses at least the display system according to claim 11 and the display method according to claims 1-2, 4, and 6-10.
Barlow (US 8884229 B2) discloses at least the display system according to claim 11 and the display method according to claims 1-2, 4, and 6-10.
Kim (US 20170285760 A1) discloses at least the display system according to claim 11 and the display method according to claims 1-2, 4, and 6-10.
Kimura (JP 2018181776 A) discloses at least the display system according to claim 11 and the display method according to claims 1-10.
Makino (JP 2016040694 A) discloses at least the display system according to claim 11 and the display method according to claims 1-10.
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to EVAN MANCINI whose telephone number is (703)756-5796. The examiner can normally be reached Mon-Fri 8AM-5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KRISTINA DEHERRERA can be reached at (303)297-4237. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/EVAN MANCINI/Examiner, Art Unit 2855
/KRISTINA M DEHERRERA/Supervisory Patent Examiner, Art Unit 2855 1/29/26