DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Receipt is acknowledged of certified copies of papers required by 37 CFR 1.55.
Information Disclosure Statement
The information disclosure statements (IDSs) submitted on 5/19/2022, 5/23/2022, and 7/11/2024 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Objections
Claim 20 is objected to because of the following informality: “point of interests” in line 1 should read, “points of interest.” Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1-16 and 18-20 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 1, 6, 11, 12, and 20 contain the conjunction “and/or.” The intended scope of this dual conjunction is unclear. In the interest of advancing prosecution, each occurrence of this dual conjunction will be interpreted as “or.”
Claims 2-5, 7-10, 13-16, and 18-19 are likewise rejected for depending, directly or indirectly, from one of the rejected claims.
Claim 2 recites the limitation, “the first signal” in lines 1-2. There is insufficient antecedent basis for this limitation in the claim. In the interest of advancing prosecution, this limitation will be interpreted as “the signal.” Alternatively, this rejection may be obviated by amending the recitation of “a signal” in claim 1, to “a first signal.”
Claim 3 is likewise rejected for depending, directly or indirectly, from claim 2.
Claim 4 recites the limitation, "each touch cell CT” in line 2. There is insufficient antecedent basis for this limitation in the claim. In the interest of advancing prosecution, this limitation will be interpreted as “each touch cell.”
Claim 5 is likewise rejected for depending, directly or indirectly, from claim 4.
Claim 13 recites the limitation, "the first sound signal” in line 2. There is insufficient antecedent basis for this limitation in the claim. Examiner further notes that “A first signal” is recited in claim 11 only in the context of disjunctive lists. In the interest of advancing prosecution, this limitation will be interpreted in claim 13 simply as “a sound signal.”
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-7, 9-16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over McMillen et al. (US 20130239787 A1, September 19, 2013), hereinafter McMillen, in view of Bedikian et al. (US 20140201666 A1, July 17, 2014), hereinafter Bedikian, to the extent understood.
Regarding claim 1, McMillen discloses a system for generating a signal (McMillen ¶0003: "According to a first class of implementations, a controller is provided that includes N control pads and one or more processors, where N is an integer greater than zero. Each control pad has sensor circuitry associated therewith configured to generate one or more sensor signals representing corresponding touch events on a corresponding surface of the control pad.") comprising: a touchpad comprising a plurality of touch cells and a touch detection device for detecting the location and intensity of at least one pressure exerted on said touchpad (McMillen ¶0003: "According to a first class of implementations, a controller is provided that includes N control pads and one or more processors, where N is an integer greater than zero. Each control pad has sensor circuitry associated therewith configured to generate one or more sensor signals representing corresponding touch events on a corresponding surface of the control pad. The one or more sensor signals associated with each control pad also represent corresponding locations of the touch events on the surface of the control pad." McMillen ¶0004: "the one or more sensor signals associated with each control pad also represent pressure of the touch events on the surface of the control pad, and the pressure of the touch events is reflected in the control information."); a first calculator configured to generate at least one first setpoint based on the location and the intensity of said at least one pressure (McMillen ¶0060-0061: "One is a 4×4 array of drum pads in which sixteen different drum sounds are mapped 1:1 to the sixteen pads. 
Hitting a pad causes the corresponding drum sound to be produced… an array of sixteen 4-quadrant QuNeo sensors may be configured to operate as either a 4×4 sensor array (e.g., array 1102) or an 8×8 sensor array (e.g., array 1104)"); and a signal generator for producing a second signal based on: the first setpoint, or a first signal extracted from the first setpoint to which a special effect extracted from the second setpoint is applied, or the second setpoint, or a first signal extracted from the second setpoint to which a special effect extracted from the first setpoint is applied (McMillen ¶0051: "changes in position and/or pressure on a particular sensor may be used to modulate control information around one or more quantized states associated with the sensor.").
McMillen does not explicitly disclose an optical detection device for detecting a motion of a hand and a position comprising at least one optics for capturing images; and a second calculator for determining at least one motion parameter based on the rotational motion of the wrist or of at least one finger, or on the direction of translation of a translational hand, or finger movement gesture, based on the captured images by the optical detection device and for generating a second setpoint based on said at least one motion parameter.
However, Bedikian teaches an optical detection device for detecting a motion of a hand and a position comprising at least one optics for capturing images (Bedikian ¶0006: "Aspects of the system and methods described herein provide for improved machine interface and/or control by interpreting the positions, configurations, and/or motions of one or more control objects (or portions thereof) in free space within a field of view of an image-capture device. The control object(s) may be or include a user's body part(s) such as, e.g., the user's hand(s), finger(s), thumb(s), head, etc."); and a second calculator (Bedikian abstract: "computationally analyzing the images to recognize a gesture performed by the user") for determining at least one motion parameter based on the rotational motion of the wrist or of at least one finger, or on the direction of translation of a translational hand, or finger movement gesture based on the captured images by the optical detection device (Bedikian ¶0006: "Aspects of the system and methods described herein provide for improved machine interface and/or control by interpreting the positions, configurations, and/or motions of one or more control objects (or portions thereof) in free space within a field of view of an image-capture device. The control object(s) may be or include a user's body part(s) such as, e.g., the user's hand(s), finger(s), thumb(s), head, etc.") and for generating a second setpoint based on said at least one motion parameter (Bedikian ¶0111: "The location of the virtual surface construct can, in some embodiments, be set by the user, e.g., by means of a particular gesture recognized by the motion-capture system. To give just one example, the user may, with her index finger stretched out, have her thumb and middle finger touch so as to pin the virtual surface construct at a certain location relative to the current position of the index-finger-tip.
Once set in this manner, the virtual surface construct may be stationary until reset by the user via performance of the same gesture in a different location.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the system for generating a signal of McMillen by adding the optical detection device of Bedikian to control a display based on dynamic user interactions (Bedikian abstract).
Regarding claim 2, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 1 as discussed above.
McMillen further teaches that the signal and the second signal are sound signals (McMillen ¶0051: "For example, changes in position and/or pressure on a particular sensor may be used to modulate control information around one or more quantized states associated with the sensor. A particular example in the context of a musical effects controller is when a sensor is configured as a piano or synthesizer key. In such a case, changes in position and/or pressure, e.g., when the user wiggles or slides his finger back and forth, can be used to distort or modulate the pitch of the primary note associated with the key, e.g., to effect note bending or vibrato.").
Regarding claim 3, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 2 as discussed above.
McMillen further suggests that the signal generator is configured to produce the second signal as a third setpoint (McMillen ¶0051: "In such a case, changes in position and/or pressure, e.g., when the user wiggles or slides his finger back and forth, can be used to distort or modulate the pitch of the primary note associated with the key, e.g., to effect note bending or vibrato." Modulating the pitch of a sound signal can comprise a third setpoint such as MIDI control message (McMillen ¶0036)).
Regarding claim 4, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 1 as discussed above.
McMillen further discloses that each touch cell comprises: a first layer comprising at least one force sensing resistor (McMillen ¶0037: "The array overlaid with a conductive pressure sensitive compressible material that may be a piezoresistive material such as, for example, germanium, polycrystalline silicon, amorphous silicon, non-woven impregnated fabric and single crystal silicon."); and a second layer comprising a detection cell adapted to detect a variation in the resistivity of the force sensing resistor (McMillen ¶0010: "First one of the conductive elements are connected to a voltage reference, and second ones of the conductive elements are configured to receive sequential drive signals. At least some of the first and second conductive elements are connected to a resistive element. The sensor circuitry also includes a conductive material configured to make contact with at least some of the first and second conductive elements at locations associated with the touch events, thereby forming one or more voltage dividers when the second conductive elements with which contact by the conductive material is made are driven by a corresponding one of the sequential drive signals.").
Regarding claim 5, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 4 as discussed above.
McMillen further teaches that each detection cell comprises a printed circuit comprising at least a first portion and a second portion (McMillen ¶0037: "the sensors (e.g., the sliders and rotary sensors) are implemented using an array of driven or scanned conductive printed circuit board (PCB) elements alternating with conductive elements connected to a voltage reference, e.g., ground, through a resistor. The array overlaid with a conductive pressure sensitive compressible material that may be a piezoresistive material such as, for example, germanium, polycrystalline silicon, amorphous silicon, non-woven impregnated fabric and single crystal silicon. FIG. 3 shows a configuration of a such a sensor") connected to each other through the force sensing resistor of the first layer (McMillen ¶0010: "First one of the conductive elements are connected to a voltage reference, and second ones of the conductive elements are configured to receive sequential drive signals. At least some of the first and second conductive elements are connected to a resistive element. The sensor circuitry also includes a conductive material configured to make contact with at least some of the first and second conductive elements at locations associated with the touch events, thereby forming one or more voltage dividers when the second conductive elements with which contact by the conductive material is made are driven by a corresponding one of the sequential drive signals.").
Regarding claim 6, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 1 as discussed above.
Bedikian further discloses that the motion parameter is determined based on an amplitude, speed of the hand, or direction of a finger of the hand (Bedikian ¶0074: "The user then performs another gesture (e.g., moving her hand in an 'up' or “down” direction). The gesture-recognition module 116 detects and identifies the gesture and a scale associated therewith, and transmits this data to the electronic device 104; the device 104, in turn, interprets this information as an input parameter.").
Regarding claim 7, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 1 as discussed above.
Bedikian further discloses that the optical detection device for detecting a motion of a hand comprises a stereo camera (Bedikian ¶0132: "Other vision-based approaches that can be used in embodiments include, without limitation, stereo imaging.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the system for generating a signal of McMillen by adding the stereo camera of Bedikian to better discriminate foreground objects from background objects (Bedikian ¶0133).
Regarding claim 9, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 1 as discussed above.
McMillen further discloses that said system is a musical instrument (McMillen ¶0016: "a music controller to operate with one or more of a plurality of music applications").
Furthermore, integrating the touchpad and optical detection devices into a single case represents an unpatentable design choice. See MPEP § 2144.04(V)(B).
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the system for generating a signal of McMillen (in view of Bedikian) by integrating the touchpad and optical detection devices into a single case to emulate a wide variety of conventional control mechanisms in a highly configurable controller (McMillen abstract).
Regarding claim 10, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 1 as discussed above.
McMillen further suggests that each touch cell comprises a lighting source for producing a light signal (McMillen ¶0055: "A particular implementation of a controller (e.g., QuNeo 100 of FIG. 1) employs an array of 251 LEDs as shown in FIG. 9 to provide visual feedback to the user. LED array 900 is a PCB layout in which various of the LEDs may be selectively illuminated, e.g., under control of a microprocessor (not shown).") when a pressure is exerted on said touch cell (McMillen abstract: "pressure and location sensitive sensors that generate high-density control information which may be mapped to the controls of a wide variety of devices and software.").
Regarding claim 11, McMillen discloses a method for generating a signal (McMillen ¶0003: "According to a first class of implementations, a controller is provided that includes N control pads and one or more processors, where N is an integer greater than zero. Each control pad has sensor circuitry associated therewith configured to generate one or more sensor signals representing corresponding touch events on a corresponding surface of the control pad.") comprising: acquiring a location (McMillen ¶0003: "Each control pad has sensor circuitry associated therewith configured to generate one or more sensor signals representing corresponding touch events on a corresponding surface of the control pad. The one or more sensor signals associated with each control pad also represent corresponding locations of the touch events on the surface of the control pad.") and intensity of a pressure on a touchpad (McMillen ¶0004: "the one or more sensor signals associated with each control pad also represent pressure of the touch events on the surface of the control pad, and the pressure of the touch events is reflected in the control information.") having a plurality of touch cells (McMillen ¶0003: "According to a first class of implementations, a controller is provided that includes N control pads and one or more processors, where N is an integer greater than zero."); producing a first setpoint based on the acquired location and intensity (McMillen ¶0060-0061: "One is a 4×4 array of drum pads in which sixteen different drum sounds are mapped 1:1 to the sixteen pads.
Hitting a pad causes the corresponding drum sound to be produced… an array of sixteen 4-quadrant QuNeo sensors may be configured to operate as either a 4×4 sensor array (e.g., array 1102) or an 8×8 sensor array (e.g., array 1104)"); and generating a second signal based on: the first setpoint or a first signal associated with the first setpoint to which a special effect extracted from the second setpoint is applied; or the second setpoint or a first signal associated with the second setpoint to which a special effect extracted from the first setpoint is applied (McMillen ¶0051: "For example, changes in position and/or pressure on a particular sensor may be used to modulate control information around one or more quantized states associated with the sensor. A particular example in the context of a musical effects controller is when a sensor is configured as a piano or synthesizer key. In such a case, changes in position and/or pressure, e.g., when the user wiggles or slides his finger back and forth, can be used to distort or modulate the pitch of the primary note associated with the key, e.g., to effect note bending or vibrato.").
McMillen does not explicitly disclose acquiring at least one image by at least one optics; determining at least one motion parameter comprising the detection of a rotational motion of the wrist or of at least one finger or on the direction of translation of a translational hand or finger movement gesture based on the acquired at least one image; and generating a second setpoint based on the motion parameter.
However, Bedikian discloses acquiring at least one image by at least one optics (Bedikian ¶0006: "interpreting the positions, configurations, and/or motions of one or more control objects (or portions thereof) in free space within a field of view of an image-capture device."); determining at least one motion parameter comprising the detection of a rotational motion of the wrist or of at least one finger or on the direction of translation of a translational hand or finger movement gesture based on the acquired at least one image (Bedikian ¶0006: "Aspects of the system and methods described herein provide for improved machine interface and/or control by interpreting the positions, configurations, and/or motions of one or more control objects (or portions thereof) in free space within a field of view of an image-capture device. The control object(s) may be or include a user's body part(s) such as, e.g., the user's hand(s), finger(s), thumb(s), head, etc."); and generating a second setpoint based on the motion parameter (Bedikian ¶0111: "The location of the virtual surface construct can, in some embodiments, be set by the user, e.g., by means of a particular gesture recognized by the motion-capture system. To give just one example, the user may, with her index finger stretched out, have her thumb and middle finger touch so as to pin the virtual surface construct at a certain location relative to the current position of the index-finger-tip. Once set in this manner, the virtual surface construct may be stationary until reset by the user via performance of the same gesture in a different location.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the method for generating a signal of McMillen by adding the optical detection device of Bedikian to control a display based on dynamic user interactions (Bedikian abstract).
Regarding claim 12, McMillen (in view of Bedikian) discloses a method for generating a signal comprising the features of claim 11 as discussed above.
Bedikian further discloses that the motion parameter is also determined based on an amplitude, speed or direction of a hand or a finger of the hand (Bedikian ¶0074: "The user then performs another gesture (e.g., moving her hand in an 'up' or “down” direction). The gesture-recognition module 116 detects and identifies the gesture and a scale associated therewith, and transmits this data to the electronic device 104; the device 104, in turn, interprets this information as an input parameter.").
Regarding claim 13, McMillen (in view of Bedikian) discloses a method for generating a signal comprising the features of claim 11 as discussed above, including that a sound signal corresponds to a musical note.
Regarding claim 14, McMillen (in view of Bedikian) discloses a method for generating a signal comprising the features of claim 11 as discussed above.
Bedikian further discloses that determining at least one motion parameter comprises detecting points of interest (Bedikian ¶0111: "To give just one example, the user may, with her index finger stretched out, have her thumb and middle finger touch so as to pin the virtual surface construct at a certain location relative to the current position of the index-finger-tip.").
Regarding claim 15, McMillen (in view of Bedikian) discloses a method for generating a signal comprising the features of claim 11 as discussed above.
Bedikian further discloses that determining at least one motion parameter based on the acquired images comprises generating a depth map, said motion parameter being determined also depending on said depth map (Bedikian ¶0106: "further movements of the control object may serve to move graphical components across the screen (e.g., drag an icon, shift a scroll bar, etc.), change perceived “depth” of the object to the viewer (e.g., resize and/or change shape of objects displayed on the screen in connection, alone, or coupled with other visual effects) to create perception of “pulling” objects into the foreground of the display or “pushing” objects into the background of the display").
Regarding claim 16, McMillen (in view of Bedikian) discloses a method for generating a signal comprising the features of claim 11 as discussed above, including that said special effect comprises one or more of the elements listed below: a reverberation, an echo, a distortion, a sustain, a wha-wha, a vibrato, a phase shift.
Regarding claim 18, McMillen (in view of Bedikian) discloses a method for generating a signal comprising the features of claim 11 as discussed above.
McMillen further discloses a non-transitory computer-readable data storage medium having recorded thereon a computer program comprising program code instructions for implementing the method of claim 11 (McMillen ¶0073: "the computer program instructions with which embodiments of the invention may be… stored in any type of volatile or nonvolatile, non-transitory computer-readable storage medium or memory device, and may be executed according to a variety of computing models").
Regarding claim 19, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 7 as discussed above.
Bedikian further suggests that the stereo camera is an infrared stereo camera (Bedikian ¶0133: "For example, the light sources 912 may be LEDs that emit IR light, and the cameras 900, 902 may capture IR light that is reflected off the control object and/or objects in the background.").
Regarding claim 20, McMillen (in view of Bedikian) discloses a method for generating a signal comprising the features of claim 11 as discussed above.
Bedikian further discloses that the points of interest are finger tips, the center of mass, or a deflection point (Bedikian ¶0111: "To give just one example, the user may, with her index finger stretched out, have her thumb and middle finger touch so as to pin the virtual surface construct at a certain location relative to the current position of the index-finger-tip.").
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over McMillen in view of Bedikian, and further in view of Sajda et al. (US 20190101985 A1, April 4, 2019), hereinafter Sajda, to the extent understood.
Regarding claim 8, McMillen (in view of Bedikian) discloses a system for generating a signal comprising the features of claim 1 as discussed above.
McMillen (in view of Bedikian) does not explicitly disclose a user interface for providing the second calculator with a feedback data and wherein the second calculator comprises a reinforcement learning algorithm, configured to modify a mode of generation of the second setpoint depending on the feedback data by iteration.
However, Sajda suggests a user interface for providing the second calculator with a feedback data and wherein the second calculator comprises a reinforcement learning algorithm, configured to modify a mode of generation of the second setpoint depending on the feedback data by iteration (Sajda ¶0007: "The memory is operatively coupled to the processor and stores instructions that, when executed by the processor, cause the computer system to analyze the environment including objects, events, or actions therein; collect sensory information correlated to a state of the user or the environment from the at least one sensor; identify, via the processor, one or more reinforcement signals of the user or the environment based on the collected sensory information; alter the artificial intelligence agent to respond to the one or more reinforcement signals; and alter the environment based on the identification of the one or more reinforcement signals.").
It would have been prima facie obvious to one of ordinary skill in the art prior to the effective filing date of the claimed invention to have modified the system for generating a signal of McMillen (in view of Bedikian) by adding the reinforcement learning algorithm of Sajda to implicitly adapt the system to subjective human preferences (Sajda ¶0004).
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PHILIP SCOLES whose telephone number is (703)756-1831. The examiner can normally be reached Monday-Friday 8:30-4:30 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Dedei Hammond can be reached on 571-270-7938. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PHILIP G SCOLES/
Examiner, Art Unit 2837
/JEFFREY DONELS/Primary Examiner, Art Unit 2837