DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
2. In response to this Office action, the Examiner respectfully requests that support be shown for language added to any original claims by amendment and for any new claims. That is, indicate support for newly added claim language by specifically pointing to the page(s) and line number(s) in the specification and/or drawing figure(s). This will assist the Examiner in prosecuting this application.
Information Disclosure Statement
3. The information disclosure statement filed on July 17, 2024 has been considered and placed in the application file.
Drawings
4. The drawings are objected to as failing to comply with 37 CFR 1.84(p)(4) because reference character “109” in Figure 1 has been used to designate both the microphone (see Specification, page 6, paragraph [0019]) and the gyroscope (see Specification, page 7, paragraph [0021]). Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Rejections - 35 USC § 112
5. The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
6. Claims 1-9 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for pre-AIA, the applicant) regards as the invention.
Independent claim 1 is indefinite because it is unclear whether the limitation “a first signal” in line 8 is the same as the “a first signal” recited in line 3. If it is, the Examiner suggests amending “a first signal” in line 8 to read “the first signal” to overcome this problem.
Claims 2-9 depend from claim 1, and are also rejected for the same reasons.
Claim Rejections - 35 USC § 103
7. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
8. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
9. Claims 1, 7-10, and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Abrahamsson et al. U.S. Patent Application Publication 20120114132 (hereinafter, “Abrahamsson”) in view of Gui et al. U.S. Patent Application Publication 20200174558 (hereinafter, “Gui”).
Regarding claim 1, Abrahamsson teaches a headphone assembly (Headphones, also sometimes referred to as earphones, are a type of headset (also referred to as listening device) that have been used to listen to audio content or material, e.g., sounds, such as music, lectures and so on. Headphones typically have used speakers that are positioned over the ears of a user to convey audio content to the respective ears and a support bar on which the speakers are mounted, par [0004]. In Figs. 1 and 2, an audio headset system 10 is illustrated in position with respect to a user 11, who may listen to sounds provided by the audio headset system, par [0069], see Abrahamsson), comprising:
headphones (The audio headset system 10 includes a pair of earpieces 12R, 12L that are illustrated in position with respect to respective ears 13R, 13L of the user 11 to provide sounds to those ears, Figs. 1, 2, par [0070], see Abrahamsson);
a switch (At step 83 the user may press a reference direction switch of the audio headset system 10, e.g., a switch located on an earpiece, see Figs. 2, 8, par [0117]) configured to transmit a first signal indicative of a command to establish a relative orientation for the headphones (FIG. 8 is a flowchart or a logic diagram 80 representing, for example, steps for setting a reference direction for the audio headset system 10. At step 81 the user may determine that it is intended to set a reference direction. At step 82 the user may face a reference direction. For example, the user may face north or some other known reference direction. At step 83 the user may press a reference direction switch of the audio headset system 10, e.g., a switch located on an earpiece, Figs. 2, 8, par [0117], see Abrahamsson. Whether the three-axis accelerometers 14L, 14R are positioned identically in the respective earbuds 12L, 12R or whether they are randomly mounted in or on the respective earbuds, the orientation of the two accelerometer axes may not be aligned with each other, i.e., the x, y and z axes of one accelerometer may not be generally parallel to the respective x, y and z axes of the other accelerometer. This may be due to the fact that the accelerometers are not identically mounted or positioned on or in the respective earbuds or may be due to the different orientations of the earbuds in the respective ears 13 of the user 11. One earbud and the accelerometer thereof may be oriented with respect to an ear differently from the orientation, i.e., relative orientation, of the earbud and accelerometer positioned with respect to the other ear of the user 11, Figs. 7A-7E, par [0101], see Abrahamsson).
Abrahamsson further teaches that, according to another aspect, the processing includes considering the accelerometers as generally symmetrically located relative to the axis of rotation of the head, and that the processing includes using the relative movement of the ear pieces in relation to each other (in other words, also relative movement of the ear pieces in relation to the user’s head) as an indication of angular motion or direction of angular motion (par [0033], see Abrahamsson). A user should be confident that the ear pieces 12 are appropriately in position in his ears 13. Various detectors are available to detect that an ear piece, such as an earbud, is properly in position in a user's ear. Capacitive sensors and infrared proximity sensors have been used in the past for this purpose (Fig. 1, par [0154], see Abrahamsson).
However, Abrahamsson does not explicitly disclose at least one motion detector configured to transmit a second signal indicative of movement of the headphones relative to a user's head.
Gui teaches on/off detection in wearable electronic devices (see Title) in which in one or more embodiments, computer system 800 provides a system for operating a wearable electronic device such as a headset, head-mounted display, helmet-mounted device, hat-mounted device, eyeglasses, safety glasses, and/or smart glasses (par [0108], see Gui). Processing apparatus 110 may then use the root mean square of displacement measurements from inertial sensors 116 to detect a "nudge" or "wobble" as wearable electronic device 100 transitions from being in motion (e.g., as wearable electronic device 100 is moved toward the user's head) to being relatively still (e.g., on the user's head). Finally, processing apparatus 110 may use inertial sensors 116 to detect head movements as wearable electronic device 100 is worn (Fig. 1, par [0054], see Gui).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the on/off detection in wearable electronic devices taught by Gui with the headphone assembly of Abrahamsson to obtain at least one motion detector configured to transmit a second signal indicative of movement of the headphones relative to a user's head, in order to reduce sound leakage from the transducer, as suggested by Gui in paragraph [0074].
Abrahamsson in view of Gui, as modified, teaches at least one processor (According to a further aspect, the processor is configured to process acceleration information to determine amount and/or direction of angular motion relative to a reference direction, and wherein the accelerometers provide acceleration information indicative of the reference direction, par [0014], see Abrahamsson. The acceleration signal portions 55 alternatively may represent a bit of extra motion, e.g., acceleration/deceleration to bring the shaft 44s to a desired orientation relative to the forward facing direction. Thus, it will be appreciated that the acceleration signals 50 represent rotation from a start position represented at 56 along the time axis 42, a deceleration in the general area 57, a reversal in the area 58, and a stopping in the area 59, see Figs. 5A, 5B, par [0096], see Abrahamsson) programmed to:
receive a first signal from the switch to establish the relative orientation of the headphones (FIG. 8 is a flowchart or a logic diagram 80 representing, for example, steps for setting a reference direction for the audio headset system 10. At step 81 the user may determine that it is intended to set a reference direction. At step 82 the user may face a reference direction. For example, the user may face north or some other known reference direction. At step 83 the user may press a reference direction switch of the audio headset system 10, e.g., a switch located on an earpiece, Figs. 2, 8, par [0117], see Abrahamsson. Whether the three-axis accelerometers 14L, 14R are positioned identically in the respective earbuds 12L, 12R or whether they are randomly mounted in or on the respective earbuds, the orientation of the two accelerometer axes may not be aligned with each other, i.e., the x, y and z axes of one accelerometer may not be generally parallel to the respective x, y and z axes of the other accelerometer. This may be due to the fact that the accelerometers are not identically mounted or positioned on or in the respective earbuds or may be due to the different orientations of the earbuds in the respective ears 13 of the user 11. One earbud and the accelerometer thereof may be oriented with respect to an ear differently from the orientation, i.e., relative orientation, of the earbud and accelerometer positioned with respect to the other ear of the user 11, Figs. 7A-7E, par [0101], see Abrahamsson);
receive the second signal that indicates that the headphones are moving relative to the user's head (Processing apparatus 110 may then use the root mean square of displacement measurements from inertial sensors 116 to detect a "nudge" or "wobble" as wearable electronic device 100 transitions from being in motion (e.g., as wearable electronic device 100 is moved toward the user's head) to being relatively still (e.g., on the user's head). Finally, processing apparatus 110 may use inertial sensors 116 to detect head movements as wearable electronic device 100 is worn, Fig. 1, par [0054], see Gui);
determine that the headphones are in a stable state after moving relative to the user's head (The above-described acceleration signals are with respect to clockwise rotation of the shaft 44s from zero or stand-still represented, for example, at 47 on the graph 40, showing the acceleration signal 41; the rotation tends to slow down at the area 48, where the polarity of the acceleration signals 45, 46 switches to opposite and, thus, the acceleration signals are shown, respectively, at 45d, 46d. At location 49 along the time axis 42, the shaft 44s has come to a stop. No acceleration signal in the horizontal plane would occur, and, therefore, the acceleration signals would be, for example, at a zero level relative to the y axis 43 (see Figs. 5A, 5B, par [0093], see Abrahamsson). Processing apparatus 110 may then use the root mean square of displacement measurements from inertial sensors 116 to detect a "nudge" or "wobble" as wearable electronic device 100 transitions from being in motion (e.g., as wearable electronic device 100 is moved toward the user's head) to being relatively still (e.g., on the user's head), i.e., stable state, Fig. 1, par [0054], see Gui); and
establish the relative orientation of the headphones while positioned on the user's head (The acceleration signal portions 55 alternatively may represent a bit of extra motion, e.g., acceleration/deceleration to bring the shaft 44s to a desired orientation relative to the forward facing direction. Thus, it will be appreciated that the acceleration signals 50 represent rotation from a start position represented at 56 along the time axis 42, a deceleration in the general area 57, a reversal in the area 58, and a stopping in the area 59, see Figs. 5A, 5B, par [0096], see Abrahamsson. One earbud and the accelerometer thereof may be oriented with respect to an ear differently from the orientation, i.e., relative orientation, of the earbud and accelerometer positioned with respect to the other ear of the user 11, Figs. 7A-7E, par [0101], see Abrahamsson) in response to at least determining that the headphones are in the stable state (At location 49 along the time axis 42, the shaft 44s has come to a stop. No acceleration signal in the horizontal plane would occur, and, therefore, the acceleration signals would be, for example, at a zero level relative to the y axis 43, see Figs. 5A, 5B, par [0093], see Abrahamsson. Processing apparatus 110 may then use the root mean square of displacement measurements from inertial sensors 116 to detect a "nudge" or "wobble" as wearable electronic device 100 transitions from being in motion (e.g., as wearable electronic device 100 is moved toward the user's head) to being relatively still (e.g., on the user's head) (corresponds to stable state), Fig. 1, par [0054], see Gui). The motivation is to reduce sound leakage from the transducer, as suggested by Gui in paragraph [0074].
Regarding claim 7, Abrahamsson in view of Gui teaches the headphone assembly of claim 1. Abrahamsson in view of Gui, as modified, teaches wherein the at least one processor (the processor is configured to process acceleration information to determine amount and/or direction of angular motion relative to a reference direction, and wherein the accelerometers provide acceleration information indicative of the reference direction, par [0014], see Abrahamsson) is further programmed to transmit an alert to the user to notify the user (At step 94 a compass, GPS, navigation system, and so forth may be read in the sense that signals provided from such a device may be received as inputs to the mobile phone 15, for example, to indicate a known direction, Fig. 9, par [0119], see Abrahamsson) that the relative orientation of the headphones has been established (At step 95 the absolute direction toward which the user is facing may be computed by determining the difference between the facing direction and the information from the GPS, etc. Knowing the absolute direction, then, such information may be used (step 96) for various purposes, Fig. 9, par [0119], see Abrahamsson) in response to at least determining that the headphones are in the stable state (No acceleration signal in the horizontal plane would occur, and, therefore, the acceleration signals would be, for example, at a zero level relative to the y axis 43, see Figs. 5A, 5B, par [0093], see Abrahamsson. Processing apparatus 110 may then use the root mean square of displacement measurements from inertial sensors 116 to detect a "nudge" or "wobble" as wearable electronic device 100 transitions from being in motion (e.g., as wearable electronic device 100 is moved toward the user's head) to being relatively still (e.g., on the user's head), i.e., stable state, Fig. 1, par [0054], see Gui).
Regarding claim 8, Abrahamsson in view of Gui teaches the headphone assembly of claim 1. Abrahamsson in view of Gui, as modified, teaches wherein the at least one motion detector is positioned on the headphones (The audio headset system 10 includes a pair of accelerometers, which are shown schematically at 14R, 14L in FIG. 1 (and shown at 14R, in FIG. 3), par [0071], see Abrahamsson).
Regarding claim 9, Abrahamsson in view of Gui teaches the headphone assembly of claim 1. Abrahamsson in view of Gui, as modified, teaches wherein the switch is positioned on the headphones (For example, the user may face north or some other known reference direction. At step 83 the user may press a reference direction switch of the audio headset system 10, e.g., a switch located on an earpiece, Figs. 2, 8, par [0117], see Abrahamsson).
Regarding claim 10, this claim merely specifies a computer program product embodied in a non-transitory computer readable medium that is programmed for determining the relative orientation of headphones, the computer-program product comprising instructions and being executable by at least one processor for operating the headphone assembly of Claim 1 and is therefore interpreted and rejected for the same reasons. It is noted that Abrahamsson in view of Gui teaches the operations, methods, and processes disclosed herein may be embodied as code and/or data, which may be stored on a non-transitory computer-readable storage medium for use by a computer system (par [0034], see Gui).
Regarding Claim 16, this claim has similar limitations as claim 7. Therefore, it is interpreted and rejected under Abrahamsson in view of Gui for the same reasons.
10. Claims 2-3, 6, 11-12, and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Abrahamsson et al. U.S. Patent Application Publication 20120114132 (hereinafter, “Abrahamsson”) in view of Gui et al. U.S. Patent Application Publication 20200174558 (hereinafter, “Gui”), and further in view of Tu et al. U.S. Patent Application Publication 20220103965 (hereinafter, “Tu”).
Regarding claim 2, Abrahamsson in view of Gui teaches the headphone assembly of claim 1. Abrahamsson in view of Gui, as modified, teaches that, according to a further aspect, the processor is configured to process acceleration information to determine the amount and/or direction of angular motion relative to a reference direction, and that the accelerometers provide acceleration information indicative of the reference direction (par [0014], see Abrahamsson).
However, Abrahamsson in view of Gui does not explicitly disclose wherein the at least one processor is further programmed to trigger a delay to enable a user to reach a natural head orientation after receiving the first signal from the switch.
Tu teaches adaptive audio centering for head tracking in spatial audio application (see Title) including headset 101 (Fig. 1A, par [0044], see Tu). FIG. 11 is a conceptual block diagram of headset software/hardware architecture 1110 implementing the features and operations described in reference to FIGS. 1-9. In an embodiment, architecture 1100 can include system-on-chip (SoC) 1101, stereo loudspeakers 1102a, 1102b (e.g., ear buds, headphones, earphones), and pushbuttons 1112 (Fig. 11, par [0082], see Tu). The re-centering (e.g., a natural head orientation) is implemented using a bleed-to-zero (BTZ) process triggered after detecting an uninterrupted period of static or correlated mutual quiescence. The rate at which BTZ re-centers the spatial audio adapts with the size of the boresight correction angle, such that smaller correction angles are corrected more slowly (i.e., delay) than larger correction angles to avoid disorienting the user. In some scenarios, the user may rotate their head to look off to the side (not at the screen) temporarily. For such a short quiescence period, BTZ is not performed because the spatial audio would be perceived by the user as off-center (par [0027], see Tu).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the adaptive audio centering for head tracking in spatial audio applications taught by Tu with the headphone assembly of Abrahamsson in view of Gui to obtain wherein the at least one processor is further programmed to trigger a delay to enable a user to reach a natural head orientation after receiving the first signal from the switch, for the purpose of providing accurate tracking of how the true boresight vector changes over time, as suggested by Tu in paragraph [0048].
Regarding claim 3, Abrahamsson in view of Gui in view of Tu teaches the headphone assembly of claim 2. Abrahamsson in view of Gui in view of Tu, as modified, teaches wherein the at least one processor is further programmed to execute a first timer to determine whether the headphones are in the stable state (The timers are used to determine when a BTZ horizon time is reached to invoke the BTZ process to re-center the spatial audio in the 3D virtual auditory space, Fig. 5, par [0053], see Tu; Once the boresight is moved to the center position during the BTZ process, the boresight is kept at the center position as long as mutual quiescence continues (corresponds to stable state), par [0055], see Tu) prior to the first timer expiring (When static or correlated mutual quiescence is detected the static mutual quiescence timer or the correlated mutual quiescence timer begin to accumulate time. Assuming correlated mutual quiescence (used when correlated motion is detected), if no disturbance is detected, the correlated mutual quiescence timer continues to accumulate time until a threshold time τ (BTZ horizon time) is exceeded at which time BTZ is invoked, Fig. 5, par [0054], see Tu).
Regarding claim 6, Abrahamsson in view of Gui in view of Tu teaches the headphone assembly of claim 3. Abrahamsson in view of Gui in view of Tu, as modified, teaches wherein the at least one processor is further programmed to establish the relative orientation of the headphones (The acceleration signal portions 55 alternatively may represent a bit of extra motion, e.g., acceleration/deceleration to bring the shaft 44s to a desired orientation relative to the forward facing direction. Thus, it will be appreciated that the acceleration signals 50 represent rotation from a start position represented at 56 along the time axis 42, a deceleration in the general area 57, a reversal in the area 58, and a stopping in the area 59, see Figs. 5A, 5B, par [0096], see Abrahamsson. One earbud and the accelerometer thereof may be oriented with respect to an ear differently from the orientation, i.e., relative orientation, of the earbud and accelerometer positioned with respect to the other ear of the user 11, Figs. 7A-7E, par [0101], see Abrahamsson) while positioned on the user's head in response to the first timer expiring (When static or correlated mutual quiescence is detected the static mutual quiescence timer or the correlated mutual quiescence timer begin to accumulate time. Assuming correlated mutual quiescence (used when correlated motion is detected), if no disturbance is detected, the correlated mutual quiescence timer continues to accumulate time until a threshold time τ (BTZ horizon time) is exceeded at which time BTZ is invoked, Fig. 5, par [0054], see Tu) and the headphones being in the stable state (The timers are used to determine when a BTZ horizon time is reached to invoke the BTZ process to re-center the spatial audio (corresponds to stable state) in the 3D virtual auditory space, Fig. 5, par [0053], see Tu).
Regarding Claim 11, this claim has similar limitations as claim 2. Therefore, it is interpreted and rejected under Abrahamsson in view of Gui in view of Tu for the same reasons.
Regarding Claim 12, this claim has similar limitations as claim 3. Therefore, it is interpreted and rejected under Abrahamsson in view of Gui in view of Tu for the same reasons.
Regarding Claim 15, this claim has similar limitations as claim 6. Therefore, it is interpreted and rejected under Abrahamsson in view of Gui in view of Tu for the same reasons.
11. Claims 4-5 and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Abrahamsson et al. U.S. Patent Application Publication 20120114132 (hereinafter, “Abrahamsson”) in view of Gui et al. U.S. Patent Application Publication 20200174558 (hereinafter, “Gui”), in view of Tu et al. U.S. Patent Application Publication 20220103965 (hereinafter, “Tu”), and further in view of Goldman U.S. Patent Application Publication 20220225887.
Regarding claim 4, Abrahamsson in view of Gui in view of Tu teaches the headphone assembly of claim 3. Abrahamsson in view of Gui in view of Tu, as modified, teaches the headphones being detected to be moving (the re-centering is implemented using a bleed-to-zero (BTZ) process triggered after detecting an uninterrupted period of static or correlated mutual quiescence. The rate at which BTZ re-centers the spatial audio adapts with the size of the boresight correction angle, such that smaller correction angles are corrected more slowly than larger correction angles to avoid disorienting the user. In some scenarios, the user may rotate their head to look off to the side (not at the screen) temporarily. For such a short quiescence period, BTZ is not performed because the spatial audio would be perceived by the user as off-center, par [0027], see Tu).
However, Abrahamsson in view of Gui in view of Tu does not explicitly disclose wherein the headphones are programmed to transmit an alert to the user to remain still in response to the first timer expiring and the headphones being detected to be moving.
Goldman teaches heart rate measurements using hearing device and app (see Title) in which for obtaining reliable first physiologic sensor data and second physiologic sensor data, the user should sit or lie still, i.e. the user should not be moving, when the first physiologic sensor data and the second physiologic sensor data are obtained or measured. Thus, if the first physiologic sensor data does not match the second physiologic sensor data it may be because the user is not sitting still but is moving. Thus, the notification 58 may comprise instructions to the user to sit or lie still (Fig. 5c, par [0125], see Goldman).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the heart rate measurements using a hearing device and app taught by Goldman with the headphone assembly of Abrahamsson in view of Gui in view of Tu to obtain wherein the headphones are programmed to transmit an alert to the user to remain still in response to the first timer expiring and the headphones being detected to be moving, for the purpose of providing a notification if the comparison does not result in a match, as suggested by Goldman in the Abstract.
Regarding claim 5, Abrahamsson in view of Gui in view of Tu in view of Goldman teaches the headphone assembly of claim 4. Abrahamsson in view of Gui in view of Tu in view of Goldman, as modified, teaches wherein the alert corresponds to an audio cue to notify the user to remain still (The notification may be visual, audio and/or tactile. The notification may comprise visual, audio and/or tactile notifications, par [0040], see Goldman).
Regarding Claim 13, this claim has similar limitations as claim 4. Therefore, it is interpreted and rejected under Abrahamsson in view of Gui in view of Tu in view of Goldman for the same reasons.
Regarding Claim 14, this claim has similar limitations as claim 5. Therefore, it is interpreted and rejected under Abrahamsson in view of Gui in view of Tu in view of Goldman for the same reasons.
12. Claims 17 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Abrahamsson et al. U.S. Patent Application Publication 20120114132 (hereinafter, “Abrahamsson”) in view of Tu et al. U.S. Patent Application Publication 20220103965 (hereinafter, “Tu”).
Regarding claim 17, Abrahamsson teaches a headphone assembly (Headphones, also sometimes referred to as earphones, are a type of headset (also referred to as listening device) that have been used to listen to audio content or material, e.g., sounds, such as music, lectures and so on. Headphones typically have used speakers that are positioned over the ears of a user to convey audio content to the respective ears and a support bar on which the speakers are mounted, par [0004]. In Figs. 1 and 2, an audio headset system 10 is illustrated in position with respect to a user 11, who may listen to sounds provided by the audio headset system, par [0069], see Abrahamsson) comprising:
headphones (The audio headset system 10 includes a pair of earpieces 12R, 12L that are illustrated in position with respect to respective ears 13R, 13L of the user 11 to provide sounds to those ears, par [0070], see Abrahamsson);
at least one motion detector configured to transmit a first signal indicative of movement of the headphones (The audio headset system 10 includes a pair of accelerometers, which are shown schematically at 14R, 14L in FIG. 1 (and shown at 14R, in FIG. 3). The accelerometers are configured to provide acceleration information representative of acceleration of the respective earpieces, par [0071], see Abrahamsson); and
at least one processor (According to a further aspect, the processor is configured to process acceleration information to determine amount and/or direction of angular motion relative to a reference direction, and wherein the accelerometers provide acceleration information indicative of the reference direction, par [0014], see Abrahamsson. The acceleration signal portions 55 alternatively may represent a bit of extra motion, e.g., acceleration/deceleration to bring the shaft 44s to a desired orientation relative to the forward facing direction. Thus, it will be appreciated that the acceleration signals 50 represent rotation from a start position represented at 56 along the time axis 42, a deceleration in the general area 57, a reversal in the area 58, and a stopping in the area 59, see Figs. 5A, 5B, par [0096], see Abrahamsson) programmed to:
determine that the headphones are on a user's head (the audio headset system 10 includes a pair of earpieces 12R, 12L that are illustrated (e.g., determine) in position with respect to respective ears 13R, 13L of the user 11 to provide sounds to those ears, Figs. 1, 2, par [0071], see Abrahamsson. A user should be confident that the ear pieces 12 are appropriately in position in his ears 13. Various detectors are available to detect that an ear piece, such as an earbud, is properly in position in a user's ear. Capacitive sensors and infrared proximity sensors have been used in the past for this purpose, Fig. 1, par [0154], see Abrahamsson); and
establish a relative orientation of the headphones while positioned on the user's head (The initial positioning of the shaft 44s and the accelerometers 14L, 14R on the shaft is representative of the accelerometers 14L, 14R of the earpieces 12L, 12R illustrated in FIG. 2. Therefore relative to the forward facing direction 25a, the shaft 44s initially is generally perpendicular to that direction and is perpendicular to the axis 24a (par [0090], Fig. 2, see Abrahamsson). FIG. 8 is a flowchart or a logic diagram 80 representing, for example, steps for setting a reference direction for the audio headset system 10. At step 81 the user may determine that it is intended to set a reference direction. At step 82 the user may face a reference direction. For example, the user may face north or some other known reference direction. At step 83 the user may press a reference direction switch of the audio headset system 10, e.g., a switch located on an earpiece, Figs. 2, 8, par [0117], see Abrahamsson).
However, Abrahamsson does not explicitly disclose “as a frame of reference.”
Tu teaches adaptive audio centering for head tracking in spatial audio applications (see Title), including headset 101 (Fig. 1A, par [0044], see Tu). FIG. 11 is a conceptual block diagram of headset software/hardware architecture 1100 implementing the features and operations described in reference to FIGS. 1-9. In an embodiment, architecture 1100 can include system-on-chip (SoC) 1101, stereo loudspeakers 1102a, 1102b (e.g., ear buds, headphones, earphones), pushbuttons 1112 (Fig. 11, par [0082], see Tu). To maintain the desired 3D spatial audio effect, it is desired that the center channel (C) be aligned with a boresight vector 104. The boresight vector 104 originates from a headset reference frame and terminates at a source device reference frame. When the virtual auditory environment 103 is first initialized, the center channel (C) is aligned with boresight vector 104 by rotating a reference frame for the ambience bed 105 (X.sub.A, Y.sub.A, Z.sub.A) to align the center channel (C) with boresight vector 104, as shown in FIG. 1B. This alignment process causes the spatial audio to be “centered.” When the spatial audio is centered, the user perceives audio from the center channel (e.g., spoken dialogue) as coming directly from the display of source device 102 (Fig. 1B, par [0047], see Tu).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the adaptive audio centering for head tracking in spatial audio applications taught by Tu into the headphone assembly of Abrahamsson to obtain “as a frame of reference,” for the purpose of providing accurate tracking of how the true boresight vector changes over time, as suggested by Tu in paragraph [0048].
Abrahamsson in view of Tu, as modified, teaches for subsequent relative orientation measurements (The acceleration signal portions 55 alternatively may represent a bit of extra motion, e.g., acceleration/deceleration to bring the shaft 44s to a desired orientation relative to the forward facing direction. Thus, it will be appreciated that the acceleration signals 50 represent rotation from a start position represented at 56 along the time axis 42, a deceleration in the general area 57, a reversal in the area 58, and a stopping in the area 59 (see Figs. 5A, 5B, par [0096], see Abrahamsson) in response to at least determining that the headphones are in a stable state while being worn by the user (At location 49 along the time axis 42, the shaft 44s has come to a stop. No acceleration signal in the horizontal plane would occur, and, therefore, the acceleration signals would be, for example, at a zero level relative to the y axis 43 (see Figs. 5A, 5B, par [0093], see Abrahamsson) and after the determining that the headphones are on the user's head (The initial positioning of the shaft 44s and the accelerometers 14L, 14R on the shaft is representative of the accelerometers 14L, 14R of the earpieces 12L, 12R illustrated in FIG. 2. Therefore relative to the forward facing direction 25a, the shaft 44s initially is generally perpendicular to that direction and is perpendicular to the axis 24a (par [0090], Fig. 2, see Abrahamsson). FIG. 8 is a flowchart or a logic diagram 80 representing, for example, steps for setting a reference direction for the audio headset system 10. At step 81 the user may determine that it is intended to set a reference direction. At step 82 the user may face a reference direction. For example, the user may face north or some other known reference direction. At step 83 the user may press a reference direction switch of the audio headset system 10, e.g., a switch located on an earpiece, Figs. 2, 8, par [0117], see Abrahamsson). 
The motivation is to provide accurate tracking of how the true boresight vector changes over time, as suggested by Tu in paragraph [0048].
Regarding claim 20, Abrahamsson in view of Tu teaches the headphone assembly of claim 17. Abrahamsson in view of Tu, as modified, further teaches wherein the at least one motion detector is positioned on the headphones (For example, the user may face north or some other known reference direction. At step 83 the user may press a reference direction switch of the audio headset system 10, e.g., a switch located on an earpiece, Figs. 2, 8, par [0117], see Abrahamsson).
13. Claim 18 is rejected under 35 U.S.C. 103 as being unpatentable over Abrahamsson et al. U.S. Patent Application Publication 20120114132 (hereinafter, “Abrahamsson”) in view of Tu et al. U.S. Patent Application Publication 20220103965 (hereinafter, “Tu”), and further in view of Graham et al. U.S. Patent 11153687 (hereinafter, “Graham”).
Regarding claim 18, Abrahamsson in view of Tu teaches the headphone assembly of claim 17. Abrahamsson in view of Tu, as modified, further teaches wherein the at least one processor is further programmed to determine that the headphones had moved (Each of the earpieces 12R and 12L in the headset 10 contains an accelerometer 14. As the earpieces move in relation to each other, the accelerometers 14R, 14L will give information about the rotation, e.g., angular motion, of the user's 11 head 11h, Fig. 1, par [0083], see Abrahamsson), and establishing the relative orientation of the head (The acceleration signal portions 55 alternatively may represent a bit of extra motion, e.g., acceleration/deceleration to bring the shaft 44s to a desired orientation relative to the forward facing direction, see Figs. 5A, 5B, par [0096], see Abrahamsson).
However, Abrahamsson in view of Tu does not explicitly disclose “from a surface positioned off of the user's head prior to establishing the relative orientation of the head.”
Graham teaches wireless headphone interactions (see Title) in which FIG. 10A indicates that headphones 1000 were previously resting on the table in front of the user and that the user picks up headphones 1000 and places them on his ears. Headphones 1000 (or an electronic device in communication with headphones 1000) detect the motion. In response to detecting the motion, headphones 1000 are operated in an audio output state that is based on the characteristics of the detected motion. For example, headphones 1000 determine that the user picked up headphones 1000 from the surface of the table, and in accordance with this determination, headphones 1000 are powered on (if headphones were in a powered off state prior to being picked up) or transition from a standby operating state to a normal operating state. In some embodiments, the standby operating state includes the sensors for detecting motion of headphones 1000 being powered on (so that headphones 1000 can detect motion) without speaker drivers in headphones 1000 being powered on, while the normal operating state includes the sensors and the speaker drivers being powered on. As such, the operating state of the headphones optionally automatically changes based on the motion of headphones 1000. In some embodiments, headphones 1000 are determined to be picked up based on headphones 1000 being raised, re-oriented from a horizontal orientation to a vertical orientation, placed on a user's ears (e.g., the device detects the user's ears), or contacted by the user, or some combination thereof (Fig. 10A, col. 54, last paragraph – col. 55, first paragraph, see Graham).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the wireless headphone interactions taught by Graham into the headphone assembly of Abrahamsson in view of Tu to obtain “from a surface positioned off of the user's head prior to establishing the relative orientation of the head,” for the purpose of providing improved and easy-to-use control options that enhance the operability of the device, as suggested by Graham in column 43, lines 51-52.
14. Claim 19 is rejected under 35 U.S.C. 103 as being unpatentable over Abrahamsson et al. U.S. Patent Application Publication 20120114132 (hereinafter, “Abrahamsson”) in view of Tu et al. U.S. Patent Application Publication 20220103965 (hereinafter, “Tu”), and further in view of Goldman U.S. Patent Application Publication 20220225887.
Regarding claim 19, Abrahamsson in view of Tu teaches the headphone assembly of claim 17. Abrahamsson in view of Tu, as modified, further teaches the headphones are not in the stable state while being worn by the user (the re-centering is implemented using a bleed-to-zero (BTZ) process triggered after detecting an uninterrupted period of static or correlated mutual quiescence. The rate at which BTZ re-centers the spatial audio adapts with the size of the boresight correction angle, such that smaller correction angles are corrected more slowly than larger correction angles to avoid disorienting the user. In some scenarios, the user may rotate their head to look off to the side (not at the screen) temporarily. For such a short quiescence period, BTZ is not performed because the spatial audio would be perceived by the user as off-center, i.e., not in the stable state (par [0027], see Tu).
However, Abrahamsson in view of Tu does not explicitly disclose wherein the at least one processor is further programmed to transmit an alert to the user to remain still in response to determining that the headphones are not in the stable state while being worn by the user.
Goldman teaches heart rate measurements using hearing device and app (see Title) in which for obtaining reliable first physiologic sensor data and second physiologic sensor data, the user should sit or lie still, i.e. the user should not be moving, when the first physiologic sensor data and the second physiologic sensor data are obtained or measured. Thus, if the first physiologic sensor data does not match the second physiologic sensor data it may be because the user is not sitting still but is moving. Thus, the notification 58 may comprise instructions to the user to sit or lie still (Fig. 5c, par [0125], see Goldman).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the heart rate measurements using a hearing device and app taught by Goldman into the headphone assembly of Abrahamsson in view of Tu to obtain wherein the at least one processor is further programmed to transmit an alert to the user to remain still in response to determining that the headphones are not in the stable state while being worn by the user, for the purpose of providing a notification if the comparison does not result in a match, as suggested by Goldman in the Abstract.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CON P TRAN whose telephone number is (571) 272-7532. The examiner can normally be reached M-F (08:30 AM- 05:00 PM) ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, VIVIAN C. CHIN can be reached at 571-272-7848. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/C.P.T/Examiner, Art Unit 2695
/VIVIAN C CHIN/Supervisory Patent Examiner, Art Unit 2695