DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 1/16/2023 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Claim Objections
Claim 1 is objected to because of the following informality: in line 2, "the inputs signal encoding" should read "the input signal encoding." Appropriate correction is required.
Claim 7 is objected to because of the following informality: in line 4, "combination notes" should read "combination of notes." Appropriate correction is required.
Claim 16 is objected to because of the following informalities: in line 1, "system according claim 15" should read "system according to claim 15"; in line 5, "combination notes" should read "combination of notes." Appropriate correction is required.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 3, 5, 8-10, and 13-15 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Riopelle et al. (US 20100107855 A1, May 6, 2010), hereinafter Riopelle.
Regarding claim 1, Riopelle discloses a method for performing a piece of music comprising the steps of: receiving an input signal from a musical instrument, the inputs signal encoding the notes played on the instrument (Riopelle ¶0106: "In one embodiment, a MIDI keyboard may be connected to a MIDI input port on a computer wherein the musical sounds can be mapped to be triggered by selected keys on the keyboard. The composer may map notes to specific beams that may then be triggered by the selected keys."), matching a note or combination of notes in the input signal to a respective trigger (Riopelle ¶0106: "In one embodiment, a MIDI keyboard may be connected to a MIDI input port on a computer wherein the musical sounds can be mapped to be triggered by selected keys on the keyboard. The composer may map notes to specific beams that may then be triggered by the selected keys.") in a predefined set of triggers stored in a memory (Riopelle ¶0097: "The program creation user interface also allows the composer to associate trigger configurations with the program (buttons 716, 718, 720, 722, 724, 726, 728, and 730), and to assign the trigger configurations to specific triggers (drop-down boxes 732, 734, 736, 738, 740, 742, and 744)."), each trigger being associated with a respective fragment of music that makes up a part of the piece of music (Riopelle ¶0149: "Each section of sample song 1902 is stored into memory 1204 by Instrument with each Instrument consisting of various Music Clips. Each Song Section 1904 lasts for a predetermined number of bars and is paired to a corresponding Music Clip for each Instrument. In addition, each Instrument is paired with a specific Trigger. While sample song 1902 is playing, the Beamz Studio knows, based on the number of bars of music played, what section 1904 the song is currently in. When the user triggers the Beam Trigger, the corresponding music clip for that song section is played."), each fragment having a predefined length (Riopelle ¶0175: "In order to create a new song section, Right/Click on the name of any song section. Select New Section on the menu. A new song section will be created and inserted in the matrix. Edit the new section to name it and set its length in Bars: Beats. (4 Bars=4:0). All Music Clips for the new section will be empty.") and starting from a predefined position relative to the start of a bar (Riopelle ¶0208: "For example, if the instrument is set up to play a part that is meant to be played on the downbeat of a measure, a Start 2214 value of a Whole Note could be used. The performer can either play these parts directly at the proper moment, or pre-trigger them by playing slightly ahead of the downbeat and they will play on the next downbeat the metronome reaches."), and at least one of the fragments being more complex than the associated trigger (Riopelle ¶0125: "In Beamz Studio, music clip means the notes that are available to be played during the current section of the song. When the notes are actually played is determined by the musician who triggers a Beamz Instrument to play its active music clip. Music clips are part of a Beamz Instrument's definition, and how they actually play their sounds is determined by way the Instrument has been defined."), characterised in that when a note or combination of notes that matches a trigger is played by the user the method outputs at least part of the matched fragment (Riopelle ¶0149: "When the user triggers the Beam Trigger, the corresponding music clip for that song section is played.") that starts at the time that the note or combination of notes is played (Riopelle ¶0206: "The normal default Start 2214 value is None which provides immediate response when a Beam is triggered. If the user chooses to use them, the Start 2214 options are specified as musical note values. The note value selected here becomes the start boundary for the instrument.").
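For illustration only (not drawn from the record of either the application or the cited art), the mapping recited in claim 1 — matching a played note or combination of notes against a stored set of triggers and outputting the associated fragment from the current position in the bar — can be sketched as follows; all identifiers, note numbers, and fragment names are hypothetical:

```python
# Hypothetical sketch of trigger matching. A trigger is a set of MIDI note
# numbers; each trigger maps to a stored fragment of predefined length that
# starts from a predefined position relative to the start of a bar.

TRIGGERS = {
    frozenset({60}): "bass_fragment",            # middle C alone
    frozenset({60, 64, 67}): "chord_fragment",   # C major triad
}

FRAGMENTS = {
    "bass_fragment": {"length_beats": 4, "start_offset_beats": 0},
    "chord_fragment": {"length_beats": 4, "start_offset_beats": 0},
}

def match_and_start(notes_played, beat_in_bar):
    """Match played notes to a trigger; return (fragment id, playback offset),
    or None when no trigger matches the combination."""
    fragment_id = TRIGGERS.get(frozenset(notes_played))
    if fragment_id is None:
        return None
    frag = FRAGMENTS[fragment_id]
    # Output begins at the point within the fragment corresponding to the
    # time within the bar at which the note or combination was played.
    offset = (beat_in_bar - frag["start_offset_beats"]) % frag["length_beats"]
    return fragment_id, offset
```

For example, playing the triad two beats into the bar would return the chord fragment with a two-beat playback offset.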
Regarding claim 3, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further discloses that the step of outputting a fragment from the point where it is triggered comprises passing the fragment to an audio playback device from that point (Riopelle ¶0226: "Beamz Studio relies on the Microsoft GS Synthesizer, which is a part of Windows. It works only with the Microsoft WDM audio stream protocol, which is the Windows standard. Practically all factory installed sound cards for Windows computers use the protocol. However, some advanced add-on sound cards offer a selection of other protocols that used. The card used for Beamz playback must be using the WDM protocol.").
Regarding claim 5, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further discloses that a fragment is associated with only one bar or selected bars in the piece of music (Riopelle ¶0149: "Each Song Section 1904 lasts for a predetermined number of bars and is paired to a corresponding Music Clip for each Instrument In addition, each Instrument is paired with a specific Trigger.") so that it will only be output if the correct trigger is played at that time or within the duration of the fragment (Riopelle ¶0149: "When the user triggers the Beam Trigger, the corresponding music clip for that song section is played.").
Regarding claim 8, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further discloses providing multiple sets of fragments, each set corresponding to one part of the piece of music (Riopelle ¶0149: "As illustrated in FIG. 19 from the software perspective, Beamz songs are stored in memory 1204 using Beamz Studio. Each section of sample song 1902 is stored into memory 1204 by Instrument with each Instrument consisting of various Music Clips. Each Song Section 1904 lasts for a predetermined number of bars and is paired to a corresponding Music Clip for each Instrument. In addition, each Instrument is paired with a specific Trigger. While sample song 1902 is playing, the Beamz Studio knows, based on the number of bars of music played, what section 1904 the song is currently in. When the user triggers the Beam Trigger, the corresponding music clip for that song section is played.").
Regarding claim 9, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further discloses processing more than one trigger sequence associated with each set of fragments (Riopelle ¶0133: "The Beamz System supports 2 Beamz units, so there are a total of 16 possible Beam Triggers that can be assigned to Beamz instruments. More than one Instrument can be assigned to a single Beam enabling them all to be triggered at once by the same Beam Trigger, so there are often more than 16 instruments in a song.").
Regarding claim 10, Riopelle discloses a method for performing a piece of music comprising the features of claim 9 as discussed above.
Riopelle further discloses permitting the user to define the specific instrument from a type of instruments to be selected (Riopelle ¶0178: "As illustrated in FIG. 20, Instruments are listed by their name and can be moved to any order by dragging them up/down in the matrix. In order to edit an instrument, left click on its name in the matrix. The Edit pane below will show the properties for it") and thereby choose which set of fragments they will trigger (Riopelle ¶0127: "As the song progresses from one section to another, different music clips become available for each instrument—should the performer choose to play them by triggering a Beam Trigger that has been assigned to the instrument.").
Regarding claim 13, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further discloses providing cues to the user to aid them in playing the correct note or correct sequence of notes at the correct time (Riopelle ¶0128: "The Rhythm Master has a special property that makes it the Master controller for the song. It not only starts and stops the song, but it also serves as the master metronome for the song as well." Metronomes are cues that aid a user in playing the correct note at the correct time.).
Regarding claim 14, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further discloses receiving from a user a selection of a piece of music (Riopelle ¶0068: "A region 930 is a set of one or more content segments that are presented when a corresponding song section is selected. A trigger can contain a set of regions 930, one for each section within the song."), and selecting the appropriate set of triggers and fragments that enable that piece of music to be played (Riopelle ¶0127: "As the song progresses from one section to another, different music clips become available for each instrument—should the performer choose to play them by triggering a Beam Trigger that has been assigned to the instrument.").
Regarding claim 15, Riopelle discloses a system for assisting a user in playing along to a piece of music, the system comprising: a processing circuit having access to information stored in at least one storage device (Riopelle ¶0114: "As illustrated in FIG. 12, Beamz Studio is configured to include a computer 1200 comprising a processor 1202, memory 1204, peripheral inputs 1206, and a graphical user interface (“GUI”) 1208."), an input device for receiving from a user an electronic signal encoding a note or combination of notes played by the user (Riopelle ¶0106: "In one embodiment, a MIDI keyboard may be connected to a MIDI input port on a computer wherein the musical sounds can be mapped to be triggered by selected keys on the keyboard. The composer may map notes to specific beams that may then be triggered by the selected keys."), a set of stored trigger signals (Riopelle ¶0149: "Each section of sample song 1902 is stored into memory 1204 by Instrument with each Instrument consisting of various Music Clips. Each Song Section 1904 lasts for a predetermined number of bars and is paired to a corresponding Music Clip for each Instrument. In addition, each Instrument is paired with a specific Trigger."), a set of stored fragments of music (Riopelle ¶0149: "Each section of sample song 1902 is stored into memory 1204 by Instrument with each Instrument consisting of various Music Clips."), each fragment having a predefined length (Riopelle ¶0175: "In order to create a new song section, Right/Click on the name of any song section. Select New Section on the menu. A new song section will be created and inserted in the matrix. Edit the new section to name it and set its length in Bars: Beats. (4 Bars=4:0). All Music Clips for the new section will be empty.") and starting from a predefined position relative to the start of a bar (Riopelle ¶0208: "For example, if the instrument is set up to play a part that is meant to be played on the downbeat of a measure, a Start 2214 value of a Whole Note could be used. The performer can either play these parts directly at the proper moment, or pre-trigger them by playing slightly ahead of the downbeat and they will play on the next downbeat the metronome reaches."), and an output device (Riopelle ¶0226: "Beamz Studio relies on the Microsoft GS Synthesizer, which is a part of Windows. It works only with the Microsoft WDM audio stream protocol, which is the Windows standard. Practically all factory installed sound cards for Windows computers use the protocol. However, some advanced add-on sound cards offer a selection of other protocols that used. The card used for Beamz playback must be using the WDM protocol."), in which the processing circuit is configured to match the input signal to a respective trigger (Riopelle ¶0149: "When the user triggers the Beam Trigger, the corresponding music clip for that song section is played."), each trigger being associated with one of the stored fragments (Riopelle ¶0149: "Each Song Section 1904 lasts for a predetermined number of bars and is paired to a corresponding Music Clip for each Instrument. In addition, each Instrument is paired with a specific Trigger."), and in which the processing circuit is further configured to cause the output device to play audibly or record the matched fragment (Riopelle ¶0227: "When a sound file plays, the sound it produces begins along a path thru the Beamz software and ultimately ends up at the final destination: the computer's sound card, where it can be heard.") from the point in the piece of music corresponding to the time within the bar that the note or combination of notes of the input signal is played (Riopelle ¶0206: "The normal default Start 2214 value is None which provides immediate response when a Beam is triggered. If the user chooses to use them, the Start 2214 options are specified as musical note values. The note value selected here becomes the start boundary for the instrument.").
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 2, 4, and 6 are rejected under 35 U.S.C. 103 as being unpatentable over Riopelle in view of Watanabe et al. (US 20120278358 A1, November 1, 2012), hereinafter Watanabe.
Regarding claim 2, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle does not explicitly disclose setting a preset threshold time for determining if the user has played a combination of notes simultaneously and to take each note as part of the trigger.
However, Watanabe suggests setting a preset threshold time for determining if the user has played a combination of notes simultaneously and to take each note as part of the trigger (Watanabe ¶0153: "For example, if a difference between an ON-set time of one of the controls included in the bass tone inputting keyboard range 11 and an ON-set time of another of the controls included in the bass tone inputting keyboard range 11 falls within a predetermined time period, then the part identification section 213 determines that these controls have been operated at the same time point.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the method of Riopelle by adding the threshold of Watanabe in order to output a result close to the user's intention (Watanabe ¶0153).
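For illustration only, the simultaneity determination described in Watanabe ¶0153 — treating two note onsets as a single combination when their ON-set times fall within a predetermined period — can be sketched as follows; the 30 ms threshold and all identifiers are hypothetical, not taken from Watanabe:

```python
# Hypothetical sketch: group (time, note) onset events whose ON-set times
# fall within a preset threshold, so each group is treated as one
# simultaneously played combination of notes.

THRESHOLD_SECONDS = 0.030  # illustrative value, not from the reference

def group_simultaneous(onsets):
    """Return the played combinations, each as a frozenset of note numbers."""
    groups = []
    for time, note in sorted(onsets):
        # Compare against the first onset in the current group: if within
        # the threshold, the notes count as played at the same time point.
        if groups and time - groups[-1][0][0] <= THRESHOLD_SECONDS:
            groups[-1].append((time, note))
        else:
            groups.append([(time, note)])
    return [frozenset(note for _, note in group) for group in groups]
```

For example, two onsets 10 ms apart form one combination, while an onset half a second later starts a new one.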
Regarding claim 4, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further teaches that each of the fragments corresponds to a portion of the piece of music being played (Riopelle ¶0149: "As illustrated in FIG. 19 from the software perspective, Beamz songs are stored in memory 1204 using Beamz Studio. Each section of sample song 1902 is stored into memory 1204 by Instrument with each Instrument consisting of various Music Clips. Each Song Section 1904 lasts for a predetermined number of bars and is paired to a corresponding Music Clip for each Instrument In addition, each Instrument is paired with a specific Trigger. While sample song 1902 is playing, the Beamz Studio knows, based on the number of bars of music played, what section 1904 the song is currently in.").
Riopelle does not explicitly disclose a portion of the piece of music that is being played that has a duration equal to one bar or a duration that is less than a bar.
However, Watanabe suggests a portion of the piece of music that is being played that has a duration equal to one bar or a duration that is less than a bar (Watanabe ¶0045: "“rhythm pattern data” includes a data file having recorded therein sound generation start times of individual component notes of a phrase constituting one measure; for example, the data file is a text file having the sound generation start times of the individual component notes described therein. In a later-described matching operation, the sound generation start times in the rhythm pattern data are associated with trigger data included in an input rhythm pattern (query pattern input for a search purpose which will hereinafter be referred to also as “searching query pattern”) and indicating that performance operation has been executed by the user. Here, the sound generation start time of each of the component notes is normalized in advance using the length of one measure as a value “1”. Namely, the sound generation start time of each of the component notes described in the rhythm pattern data takes a value in the range from '0' to '1'.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the method of Riopelle by adding the fragment durations of Watanabe to improve synchronization and improve output sound quality (Watanabe ¶0117).
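For illustration only, the normalization described in Watanabe ¶0045 — expressing each sound generation start time as a fraction of one measure, so every onset takes a value in the range from 0 to 1 — can be sketched as follows; the function name and example values are hypothetical:

```python
# Hypothetical sketch of Watanabe's measure-relative normalization:
# each onset time within a measure is divided by the measure length,
# so the resulting values lie in the range [0, 1).

def normalize_onsets(onset_times, measure_length):
    """Scale onset times within one measure so that the measure spans 0 to 1."""
    return [t / measure_length for t in onset_times]
```

For example, quarter-note onsets in a four-beat measure normalize to 0, 0.25, 0.5, and 0.75.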
Regarding claim 6, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further suggests that all fragments start on the first beat of a bar (Riopelle ¶0208: "For example, if the instrument is set up to play a part that is meant to be played on the downbeat of a measure, a Start 2214 value of a Whole Note could be used. The performer can either play these parts directly at the proper moment, or pre-trigger them by playing slightly ahead of the downbeat and they will play on the next downbeat the metronome reaches.").
Riopelle does not explicitly disclose that at least one fragment includes an initial period of padding with no notes if the notes are not to output until later in the bar.
However, Watanabe suggests that at least one fragment includes an initial period of padding with no notes if the notes are not to output until later in the bar (Watanabe ¶0045: "Namely, the sound generation start time of each of the component notes described in the rhythm pattern data takes a value in the range from '0' to '1'.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the method of Riopelle by adding the fragment start times of Watanabe to improve synchronization and improve output sound quality (Watanabe ¶0117).
Claims 7 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Riopelle in view of Challinor et al. (US 9358456 B1, June 7, 2016), hereinafter Challinor.
Regarding claim 7, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle further teaches playing each fragment on mute on a loop (Riopelle ¶0223: "Beamz software has a special internal trigger called Autoplay that is automatically triggered one time whenever the Free Run section becomes active. Instruments used with the Autoplay trigger are either Start/Stop or One-Shot trigger types. Typically, an Autoplay instrument is a silent loop that runs in the background to establish a metronome for instruments set up to trigger on a specific Start value. Sound files can also be assigned to an Autoplay instrument.") synchronised to the predefined start time for each fragment (Riopelle ¶0075: "The trigger synchronizes content presentation with a song performance. This causes the trigger to always pick segments from a region that correspond with the currently active section in the song. If the trigger is operating in one of the Play or Momentary modes, this also forces the trigger to synchronize its playback with the current position in the section.").
Riopelle does not explicitly disclose that the step of outputting a part of a fragment associated with a matched trigger comprises turning off the mute from that point in time within the fragment when the note or combination of notes associated with the trigger is played so that the rest of the fragment can be heard until the end of the fragment, whereupon it continues to play on mute.
However, Challinor suggests that the step of outputting a part of a fragment associated with a matched trigger comprises turning off the mute from that point in time within the fragment when the note or combination of notes associated with the trigger is played (Challinor col. 30, lines 30-34: "As the game is played in a multi-player mode, when an MP_BRIDGE_START text event can be encountered on the timeline of multiplayer_markers, the original audio track or tracks are muted and the bridge_audio track is unmuted.") so that the rest of the fragment can be heard until the end of the fragment, whereupon it continues to play on mute (Challinor col. 30, lines 37-43: "Playback continues until the transition point itself, which can be indicated by the MP_END text event. At this point, the “current time” can be set to the target of the transition, marked by the MP_START text event, and the bridge audio track continues. When the MIDI MP_BRIDGE_END event is encountered, the original audio track or tracks are unmuted, and the bridge audio track is muted.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the method of Riopelle by adding the muting of Challinor to bridge audio segments with a single concept of current time for the audio (Challinor col. 30, lines 47-54).
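For illustration only, the claim 7 arrangement at issue — every fragment looping continuously on mute, a matched trigger unmuting the loop from its current position, and the mute returning when the fragment ends — can be sketched as follows; the class and its fields are hypothetical, not taken from either reference:

```python
# Hypothetical sketch: a fragment that loops silently in the background.
# Triggering it unmutes playback from the current loop position; when the
# audible pass reaches the end of the fragment, it resumes muted looping.

class LoopingFragment:
    def __init__(self, length_beats):
        self.length_beats = length_beats
        self.muted = True        # fragments start muted
        self.position = 0.0      # current playback position within the loop

    def advance(self, beats):
        """Advance the loop clock; re-mute when an audible pass completes."""
        new_position = self.position + beats
        if new_position >= self.length_beats and not self.muted:
            # The audible portion has reached the end of the fragment:
            # return to muted looping.
            self.muted = True
        self.position = new_position % self.length_beats

    def trigger(self):
        """A matched trigger unmutes the fragment from its current position."""
        self.muted = False
```

In this sketch, triggering mid-bar lets the remainder of the fragment be heard, after which looping continues silently.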
Regarding claim 16, Riopelle discloses a system comprising the features of claim 15 as discussed above.
Riopelle further teaches that the processing circuit is further configured to cause the output device to play each fragment on mute on a loop (Riopelle ¶0223: "Beamz software has a special internal trigger called Autoplay that is automatically triggered one time whenever the Free Run section becomes active. Instruments used with the Autoplay trigger are either Start/Stop or One-Shot trigger types. Typically, an Autoplay instrument is a silent loop that runs in the background to establish a metronome for instruments set up to trigger on a specific Start value. Sound files can also be assigned to an Autoplay instrument.") synchronised to the predefined start time for each fragment (Riopelle ¶0075: "The trigger synchronizes content presentation with a song performance. This causes the trigger to always pick segments from a region that correspond with the currently active section in the song. If the trigger is operating in one of the Play or Momentary modes, this also forces the trigger to synchronize its playback with the current position in the section.").
Riopelle does not explicitly disclose that the processing circuit is further configured to cause the output device to turn off the mute from that point in time within the fragment when the note or combination of notes associated with the trigger is played so that the rest of the fragment can be heard until the end of the fragment, whereupon it continues to play on mute.
However, Challinor suggests that the processing circuit is further configured to cause the output device to turn off the mute from that point in time within the fragment when the note or combination of notes associated with the trigger is played (Challinor col. 30, lines 30-34: "As the game is played in a multi-player mode, when an MP_BRIDGE_START text event can be encountered on the timeline of multiplayer_markers, the original audio track or tracks are muted and the bridge_audio track is unmuted.") so that the rest of the fragment can be heard until the end of the fragment, whereupon it continues to play on mute (Challinor col. 30, lines 37-43: "Playback continues until the transition point itself, which can be indicated by the MP_END text event. At this point, the “current time” can be set to the target of the transition, marked by the MP_START text event, and the bridge audio track continues. When the MIDI MP_BRIDGE_END event is encountered, the original audio track or tracks are unmuted, and the bridge audio track is muted.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the system of Riopelle by adding the muting of Challinor to bridge audio segments with a single concept of current time for the audio (Challinor col. 30, lines 47-54).
Claims 11, 12, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Riopelle in view of Kay et al. (US 20090104956 A1, April 23, 2009), hereinafter Kay.
Regarding claim 11, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle does not explicitly disclose providing two or more sets of trigger signals, each associated with a different level of difficulty, and assigning a difficulty level to each user, thereafter only matching the input note or combinations of notes to the chosen set of triggers.
However, Kay suggests providing two or more sets of trigger signals, each associated with a different level of difficulty (Kay ¶0277: "Thus, for a given song, the level data may comprise a plurality of cues, each cue specifying a time and an action to be performed. A collection of level data may comprise any set of level data. For example, a collection of level data might comprise level data for each of 15 songs. Or for example, a collection of level data might comprise level data for each of 15 songs, each at four different difficulty levels (that is, each difficulty level of a song may have a unique set of cues)."), and assigning a difficulty level to each user (Kay ¶0340: "In some modes, players with a guitar controller are required to choose whether they want to play a song's guitar part or bass part before they enter the matchmaking screen. In those cases, guitarists are only matched up with other guitarists who chose the same part that they did. In other modes, users are required to choose a difficulty level before they are matched. In those cases, only players that have chosen the same difficulty will be grouped together."), thereafter only matching the input note or combinations of notes to the chosen set of triggers (Kay ¶0027: "detecting, by a game executing on a game console, that a first simulated musical instrument type of a plurality of simulated musical instrument types is connected to the game console; selecting, by the game from a plurality of collections of level data, each collection corresponding to a different simulated musical instrument type, a first collection of level data corresponding to the first simulated musical instrument type; and providing, by the game, a session of a rhythm-action game with the selected collection of level data." Kay ¶0277: "a collection of level data might comprise level data for each of 15 songs, each at four different difficulty levels (that is, each difficulty level of a song may have a unique set of cues).").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the method of Riopelle by adding the difficulty levels of Kay to vary the user experience according to difficulty level (Kay ¶0228).
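For illustration only, the per-difficulty trigger sets at issue in claim 11 — each difficulty level carrying its own set of triggers, with input notes matched only against the set for the user's assigned level — can be sketched as follows; the level names, note numbers, and function name are hypothetical, not taken from Kay:

```python
# Hypothetical sketch: each difficulty level has a unique set of triggers
# (compare Kay's per-difficulty cue sets). Input notes are matched only
# against the trigger set for the user's assigned difficulty level.

TRIGGER_SETS = {
    "easy": {frozenset({60})},                          # single notes only
    "hard": {frozenset({60}), frozenset({60, 64, 67})}, # chords as well
}

def matches_trigger(notes_played, user_difficulty):
    """Return True when the played combination is a trigger at this level."""
    return frozenset(notes_played) in TRIGGER_SETS[user_difficulty]
```

In this sketch, the triad matches a trigger only for a user assigned the harder level.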
Regarding claim 12, Riopelle discloses a method for performing a piece of music comprising the features of claim 1 as discussed above.
Riopelle does not explicitly disclose providing additional feedback to a user to indicate their performance compared to an ideal performance.
However, Kay suggests providing additional feedback to a user to indicate their performance compared to an ideal performance (Kay ¶0091: "In some embodiments, rather than competing for a high score, players or teams may compete for the best crowd rating, longest consecutive correct note streak, highest accuracy, or any other performance metric." A consecutive correct note streak indicates a user's performance compared to an ideal performance.).
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the method of Riopelle by adding the additional feedback of Kay to introduce competitive play features (Kay ¶0091).
Regarding claim 17, Riopelle discloses a system comprising the features of claim 15 as discussed above.
Riopelle does not explicitly disclose that the processing circuit is further configured to provide two or more sets of trigger signals, each associated with a different level of difficulty, and assign a difficulty level to each user, thereafter only matching the input note or combinations of notes to the chosen set of triggers.
However, Kay suggests that the processing circuit is further configured to provide two or more sets of trigger signals, each associated with a different level of difficulty (Kay ¶0277: "Thus, for a given song, the level data may comprise a plurality of cues, each cue specifying a time and an action to be performed. A collection of level data may comprise any set of level data. For example, a collection of level data might comprise level data for each of 15 songs. Or for example, a collection of level data might comprise level data for each of 15 songs, each at four different difficulty levels (that is, each difficulty level of a song may have a unique set of cues)."), and assign a difficulty level to each user (Kay ¶0340: "In some modes, players with a guitar controller are required to choose whether they want to play a song's guitar part or bass part before they enter the matchmaking screen. In those cases, guitarists are only matched up with other guitarists who chose the same part that they did. In other modes, users are required to choose a difficulty level before they are matched. In those cases, only players that have chosen the same difficulty will be grouped together."), thereafter only matching the input note or combinations of notes to the chosen set of triggers (Kay ¶0027: "detecting, by a game executing on a game console, that a first simulated musical instrument type of a plurality of simulated musical instrument types is connected to the game console; selecting, by the game from a plurality of collections of level data, each collection corresponding to a different simulated musical instrument type, a first collection of level data corresponding to the first simulated musical instrument type; and providing, by the game, a session of a rhythm-action game with the selected collection of level data." Kay ¶0277: "a collection of level data might comprise level data for each of 15 songs, each at four different difficulty levels (that is, each difficulty level of a song may have a unique set of cues).").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the system of Riopelle by adding the difficulty levels of Kay to vary the user experience according to difficulty level (Kay ¶0228).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PHILIP SCOLES whose telephone number is (703)756-1831. The examiner can normally be reached Monday-Friday 8:30-4:30 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Dedei Hammond, can be reached at 571-270-7938. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PHILIP G SCOLES/
Examiner, Art Unit 2837
/DEDEI K HAMMOND/Supervisory Patent Examiner, Art Unit 2837