Notice of Pre-AIA or AIA Status
The present application is being examined under the pre-AIA first to invent provisions.
DETAILED ACTION
2. The request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant’s submission filed 9/18/2025 has been entered. An action on the RCE follows.
Summary of claims
3. Claims 1-18 are pending;
Claims 1, 16, and 17 are amended;
Claim 18 is newly added;
Claims 1, 16, and 17 are independent claims; and
Claims 1-18 are rejected.
Remarks
4. Applicant’s arguments, see Remarks, filed on 9/18/2025, with respect to the rejection of claims 1-18 under 35 U.S.C. 103 have been fully considered and are not persuasive.
Applicant argued on pages 8-10 that the cited references, including Chu and Tuli, do not teach the newly amended features, such as "while playing the piece of music: detecting a first input event at a location corresponding to the representation of the piece of music; in response to detecting the first input event at the location corresponding to the representation of the piece of music, providing tactile feedback that corresponds to at least a subset of beats of the piece of music," as recited in claim 1. Examiner respectfully disagrees and submits that Chu discloses playing music and other sounds such as speech, and providing haptic sensations in response to user input (Chu: [0065], [0069]). Specifically, Chu discloses that a music editing program may play a sample of music (Chu: [0072]). Further, Chu discloses that the user physically contacts the device to provide input (Chu: [0023]) and that a graphical user interface (GUI) is utilized to present options to a user and receive input from the user (Chu: [0028]). Furthermore, Chu discloses positioning a cursor at a desired location in the sound data stream for further editing of the sound data (Chu: [0063]). Please note that Chu discloses providing a haptic effect in response to the received user input at the desired location, that the desired location may be a displayed element, such as a representation of the music, and that Chu provides functions of playing music.
In addition, Tuli teaches using a pulse gesture in a music player application to play a song (Tuli: [0073]), and Tuli clearly discloses "a detection of pressure on the screen ...over a given area or the detection of a change ...at a particular location...the data corresponding to the touch event (e.g., location of touch)" (Tuli: [0049]). That is, Tuli teaches detecting pressure on the touch screen above a particular pressure threshold over a given area and providing haptic feedback in response to the received user input at a particular location on the display while providing the function of playing music. Accordingly, Chu and Tuli still read on the cited features of claim 1.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103(a) which forms the basis for all obviousness rejections set forth in this Office action:
(a) A patent may not be obtained though the invention is not identically disclosed or described as set forth in section 102 of this title, if the differences between the subject matter sought to be patented and the prior art are such that the subject matter as a whole would have been obvious at the time the invention was made to a person having ordinary skill in the art to which said subject matter pertains. Patentability shall not be negatived by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims under pre-AIA 35 U.S.C. 103(a), the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were made absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and invention dates of each claim that was not commonly owned at the time a later invention was made in order for the examiner to consider the applicability of 35 U.S.C. 103(c) and potential pre-AIA 35 U.S.C. 102(e), (f) or (g) prior art under pre-AIA 35 U.S.C. 103(a).
5. Claims 1-5, 7, 10, and 14-18 are rejected under 35 U.S.C. 103(a) as being unpatentable over Lonny Chu (US Publication 20030068053 A1, hereinafter Chu) in view of Apaar Tuli (US Publication 20120242584 A1, hereinafter Tuli).
As for independent claim 1, Chu discloses: A method, comprising: at an electronic device having a display and a touch-sensitive surface (Chu: [0031], Immersion Touchsense Processor; [0040], The sensors 42 can sense the position or motion of the housing or manipulandum 60, for example, touchpad): displaying a representation of a piece of music on the display of the electronic device that has the touch-sensitive surface (Chu: [0063], application program is a music composition/editing program that allows the user to playback music based on input from the user via the haptic device; please note that a representation of a piece of music may be displayed, such as an icon of a song); while playing the piece of music (Chu: [0065], [0069], playing music and other sounds such as speech, and providing haptic sensations in response to user input; [0072], a music editing program may play a sample of music): detecting a first input event at a location corresponding to the representation of the piece of music (Chu: Abstract, Haptic commands are generated based on the sound data and are used to output haptic sensations to the user by a haptic feedback device manipulated by the user. The haptic sensations correspond to one or more characteristics of the sound data to assist the user in discerning features of the sound data during the navigation through and editing of the sound data; [0023], the user physically contacts the device to provide input); in response to detecting the first input event at the location corresponding to the representation of the piece of music, providing tactile feedback that corresponds to at least a subset of beats of the piece of music (Chu: Abstract, Haptic commands are generated based on the sound data and are used to output haptic sensations to the user by a haptic feedback device manipulated by the user. The haptic sensations correspond to one or more characteristics of the sound data to assist the user in discerning features of the sound data during the navigation through and editing of the sound data; [0006], playing of the sound is controlled by user input received by the computer from a user; [0028], utilizing a graphical user interface (GUI) to present options to a user and receive input from the user; [0063], positioning a cursor at a desired location in the sound data stream for further editing of the sound data; please note that Chu discloses providing a haptic effect in response to the received user input at the desired location, and the desired location may be a displayed element, such as a representation of the music);
Chu discloses providing haptic sensations to the user in response to user input but does not disclose ceasing to provide the tactile feedback in response to a second input. Tuli discloses: after providing the tactile feedback, detecting a second input event at the location corresponding to the representation of the piece of music; and in response to detecting the second input event at the location corresponding to the representation of the piece of music, ceasing to provide the tactile feedback that corresponds to the beats of the piece of music (Tuli: [0049], a detection of pressure on the screen ...over a given area or the detection of a change ...at a particular location...the data corresponding to the touch event (e.g., location of touch), that is, Tuli teaches providing haptic feedback in response to the received user input at a particular location on the display; [0053], if the user simply lifts one or more fingers, the pulse gesture may be terminated at any time).
Chu and Tuli are analogous art because they are in the same field of endeavor, providing haptic feedback to the user in response to user input. Therefore, it would have been obvious to one of ordinary skill in the art at the time of the invention to modify the invention of Chu using the teachings of Tuli to include ceasing to play the music when the user lifts one or more fingers. This would provide Chu’s method with the enhanced capability of allowing the user to more easily perform manipulation tasks in a music application.
As for claim 2, Chu-Tuli discloses: the first input event comprises a contact on the touch-sensitive surface; and the tactile feedback is provided by generating tactile outputs on the touch-sensitive surface (Chu: [0023], these interface devices can provide physical sensations which are felt by the user contacting the device or manipulating a user manipulandum of the device. For example, the device 12 can be a knob, a mouse, a trackball, a joystick, or other device which the user moves in provided degrees of freedom to input direction, value, magnitude, etc. While the user physically contacts the device 12 to provide input, he or she also can experience haptic sensations output by the haptic device).
As for claim 3, Chu-Tuli discloses: wherein detecting the second input event at the location corresponding to the representation of the piece of music comprises detecting movement of a contact away from the location corresponding to the representation of the piece of music (Tuli: [0053], if the user simply lifts one or more fingers, the pulse gesture may be terminated at any time).
As for claim 4, Chu-Tuli discloses: the piece of music is currently being played in a media player application; and the representation of the piece of music is a graphical representation of the piece of music (Chu: [0023], the haptic sensations are related to the editing and other sound manipulation features occurring in the application program of the host computer and allow the user to more easily perform the manipulation tasks and work with the sound data).
As for claim 5, Chu-Tuli discloses: the representation of the piece of music is displayed in a media player application; and the tactile feedback includes a plurality of tactile outputs generated when corresponding beats in the subset of beats are played by the media player application (Chu: [0056], The sound data represents a sound waveform 222 and may have short peaks 224 of sound corresponding to short or abrupt features in the sound, such as drumbeats or other percussion sounds in music; [0067], A haptic sensation is thus immediately output by the haptic device to the user simultaneously with output of the associated sound feature. For example, a drum beat sound that is output is simultaneously coordinated with the output of a force pulse or jolt from the haptic device to the user).
As for claim 7, Chu-Tuli discloses: wherein the tactile feedback includes a plurality of tactile outputs generated when an input moves over representations of corresponding beats in the subset of beats in the representation of the piece of music (Chu: [0023], While the user physically contacts the device 12 to provide input, he or she also can experience haptic sensations output by the haptic device; [0065], in a position control mode, the user can rotate a knob continually clockwise to continue to hear music playback, and can adjust the speed of that rotation to adjust the speed of playback. When the user stops rotating the knob, the sound playback stops).
As for claim 10, Chu-Tuli discloses: wherein providing tactile feedback that corresponds to at least a subset of beats of the piece of music comprises providing the tactile feedback corresponding to the subset of beats of the piece of music, without playing the piece of music (Chu: [0056], The sound data represents a sound waveform 222 and may have short peaks 224 of sound corresponding to short or abrupt features in the sound, such as drumbeats or other percussion sounds in music; [0067], A haptic sensation is thus immediately output by the haptic device to the user simultaneously with output of the associated sound feature. For example, a drum beat sound that is output is simultaneously coordinated with the output of a force pulse or jolt from the haptic device to the user).
As for claim 14, Chu-Tuli discloses: detecting the first input event at the location corresponding to the representation of the piece of music includes detecting movement of a focus selector over the representation of the piece of music (Chu: [0023], While the user physically contacts the device 12 to provide input, he or she also can experience haptic sensations output by the haptic device).
As for claim 15, Chu-Tuli discloses: wherein detecting the second input event at the location corresponding to the representation of the piece of music includes detecting movement of a focus selector away from the representation of the piece of music (Tuli: [0053], if the user simply lifts one or more fingers, the pulse gesture may be terminated at any time).
As per claim 16, it recites features that are substantially the same as those claimed in claim 1; thus, the rationale for rejecting claim 1 is incorporated herein.
As per claim 17, it recites features that are substantially the same as those claimed in claim 1; thus, the rationale for rejecting claim 1 is incorporated herein.
As for claim 18, Chu-Tuli discloses: wherein the touch-sensitive surface is a touch-sensitive surface of the display, and the first input event corresponds to a physical contact on the touch-sensitive surface at the location corresponding to the representation of the piece of music (Chu: [0023], the user physically contacts the device to provide input; [0031], touchsense; [0040], touch pad; Tuli: [0049], a detection of pressure on the screen ...over a given area or the detection of a change ...at a particular location...the data corresponding to the touch event (e.g., location of touch); [0075], devices may also have other touch sensitive surfaces where the pulse gesture or other touch inputs may be employed).
6. Claims 6, 8, 9, and 11-13 are rejected under 35 U.S.C. 103(a) as being unpatentable over Chu and Tuli as applied to claim 1 above, and further in view of Mark Patrick Egan (US Publication 20090114079 A1, hereinafter Egan).
As for claim 6, Chu-Tuli discloses a music composing application (Chu: [0052], an application program, such as a sound composition/editing program) but does not clearly disclose displaying a musical score. Egan discloses: the representation of the piece of music is displayed as a musical score in a music composing application (Egan: [0101], a user interface that enables a user to enter and display the musical score, a database that stores a data structure which supports graphical symbols for musical characters in the musical score and performance generation data that is derived from the graphical symbols).
Chu, Tuli, and Egan are analogous art because they are in the same field of endeavor, providing haptic feedback to the user in response to user input. Therefore, it would have been obvious to one of ordinary skill in the art at the time of the invention to modify the invention of Chu using the teachings of Egan to include displaying a musical score in the user interface. This would provide Chu’s method with the enhanced capability of allowing the user to more easily perform manipulation tasks in a music application.
As for claim 8, Chu-Tuli discloses a music composing application including generating beats (Chu: [0052], an application program, such as a sound composition/editing program; [0067], A haptic sensation is thus immediately output by the haptic device to the user simultaneously with output of the associated sound feature. For example, a drum beat sound that is output is simultaneously coordinated with the output of a force pulse or jolt from the haptic device to the user) but does not clearly disclose stressed beats. Egan discloses: wherein the subset of the beats includes stressed beats of the piece of music (Egan: [0915], Pulse of Music=In music, a pulse is a series of identical, yet distinct periodic short-duration stimuli perceived as points in time. A pulse that is regularly accented is a meter. The Meter is then the measurement of a musical line into measures of stressed and unstressed beats, indicated in conventional music notation by a symbol called a time signature).
Chu, Tuli, and Egan are analogous art because they are in the same field of endeavor, providing haptic feedback to the user in response to user input. Therefore, it would have been obvious to one of ordinary skill in the art at the time of the invention to modify the invention of Chu using the teachings of Egan to include the stressed beats.
As for claim 9, Chu-Tuli-Egan discloses: wherein the subset of the beats excludes unstressed beats of the piece of music (Egan: [0915], Pulse of Music=In music, a pulse is a series of identical, yet distinct periodic short-duration stimuli perceived as points in time. A pulse that is regularly accented is a meter. The Meter is then the measurement of a musical line into measures of stressed and unstressed beats, indicated in conventional music notation by a symbol called a time signature).
As for claim 11, Chu-Tuli-Egan discloses: the subset of beats includes one or more stressed beats and one or more unstressed beats; the tactile feedback includes a plurality of tactile outputs for corresponding beats in the subset of beats; a first tactile output is generated for stressed beats; and a second tactile output, different from the first tactile output, is generated for non-stressed beats (Egan: [0915], Pulse of Music=In music, a pulse is a series of identical, yet distinct periodic short-duration stimuli perceived as points in time. A pulse that is regularly accented is a meter. The Meter is then the measurement of a musical line into measures of stressed and unstressed beats, indicated in conventional music notation by a symbol called a time signature).
As for claim 12, Chu-Tuli-Egan discloses: the first tactile output is generated by movement of the touch-sensitive surface that includes a first dominant movement component; the second tactile output is generated by movement of the touch-sensitive surface that includes a second dominant movement component; and the first dominant movement component and the second dominant movement component have a same amplitude and different movement profiles (Chu: [0007], The haptic sensations can be continuously output during the playing of the sound data and have a magnitude based on an amplitude of the sound data currently being played; or, the haptic sensations can be output only when features of the sound data having predetermined characteristics are played; [0073], All the user preferences or settings can be used in helping determine if a sound characteristic is present which is to be mapped to a haptic sensation, and the particular haptic sensation that is to be mapped. In other embodiments, such as where haptic sensation magnitude continuously follows sound amplitude; [0074], sound waveform amplitude 330 is similarly shown, but the haptic sensation amplitude 336 can be continuously inversely varied from the sound amplitude to provide a different haptic experience to the user. Other continuous haptic sensation mappings or sound characteristic haptic mappings can also be used; Tuli: [0077], similar graphical elements may be ranked based on predefined criteria or a value such as an amplitude or other indicator of degree or frequency of occurrence of an event based on various thresholds and/or ranges corresponding to different potential values that may be assigned to each graphical element; Egan: [0002], A visual representation of the sound data is typically displayed as one or more time vs. amplitude graphs which the user can customize to a desired scale; [0007], The haptic sensations can be continuously output during the playing of the sound data and have a magnitude based on an amplitude of the sound data currently being played; or, the haptic sensations can be output only when features of the sound data having predetermined characteristics are played).
As for claim 13, Chu-Tuli-Egan discloses: the first tactile output is generated by movement of the touch-sensitive surface that includes a first dominant movement component; the second tactile output is generated by movement of the touch-sensitive surface that includes a second dominant movement component; and the first dominant movement component and the second dominant movement component have a same movement profile and different amplitudes (Chu: [0007], The haptic sensations can be continuously output during the playing of the sound data and have a magnitude based on an amplitude of the sound data currently being played; or, the haptic sensations can be output only when features of the sound data having predetermined characteristics are played; [0073], All the user preferences or settings can be used in helping determine if a sound characteristic is present which is to be mapped to a haptic sensation, and the particular haptic sensation that is to be mapped. In other embodiments, such as where haptic sensation magnitude continuously follows sound amplitude; [0074], sound waveform amplitude 330 is similarly shown, but the haptic sensation amplitude 336 can be continuously inversely varied from the sound amplitude to provide a different haptic experience to the user. Other continuous haptic sensation mappings or sound characteristic haptic mappings can also be used; Tuli: [0077], similar graphical elements may be ranked based on predefined criteria or a value such as an amplitude or other indicator of degree or frequency of occurrence of an event based on various thresholds and/or ranges corresponding to different potential values that may be assigned to each graphical element; Egan: [0002], A visual representation of the sound data is typically displayed as one or more time vs. amplitude graphs which the user can customize to a desired scale; [0007], The haptic sensations can be continuously output during the playing of the sound data and have a magnitude based on an amplitude of the sound data currently being played; or, the haptic sensations can be output only when features of the sound data having predetermined characteristics are played).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Hua Lu whose telephone number is 571-270-1410 and fax number is 571-270-2410. The examiner can normally be reached on Mon-Fri 7:30 am to 5:00 pm EST. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Scott Baderman can be reached on 571-272-3644. The fax phone number for the organization where this application or proceeding is assigned is 703-273-8300.
The prior art made of record and not relied upon is considered pertinent to applicant’s disclosure. Applicants are required under 37 C.F.R. § 1.111(c) to consider these references fully when responding to this action. It is noted that any citation to specific pages, columns, lines, or figures in the prior art references and any interpretation of the references should not be considered to be limiting in any way. A reference is relevant for all it contains and may be relied upon for all that it would have reasonably suggested to one having ordinary skill in the art. In re Heck, 699 F.2d 1331, 1332-33, 216 U.S.P.Q. 1038, 1039 (Fed. Cir. 1983) (quoting In re Lemelson, 397 F.2d 1006, 1009, 158 U.S.P.Q. 275, 277 (C.C.P.A. 1968)).
/HUA LU/
Primary Examiner, Art Unit 2118