DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 3 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
As per Claim 3, the claim recites “wherein a smaller difference between the first jog wheel position and the second jog wheel position causes a greater difference between the first timeline position and the second timeline position, and wherein a greater difference between the first jog wheel position and the second jog wheel position causes a smaller difference between the first timeline position and the second timeline position.” The claim recites an inverse relationship between the jog wheel movement and the timeline movement. Claim 2, however, recites a direct relationship wherein a greater movement of the jog wheel produces a greater movement along the timeline, and vice versa. The specification merely repeats the language of the claim at ¶149 and does not elaborate on how such inverse functionality can occur. The claimed functionality is indefinite because the disclosure provides no basis for how it can occur while the complete opposite effect also simultaneously occurs in this application. Correction is required.
The following is a quotation of 35 U.S.C. 112(d):
(d) REFERENCE IN DEPENDENT FORMS.—Subject to subsection (e), a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
The following is a quotation of pre-AIA 35 U.S.C. 112, fourth paragraph:
Subject to the following paragraph [i.e., the fifth paragraph of pre-AIA 35 U.S.C. 112], a claim in dependent form shall contain a reference to a claim previously set forth and then specify a further limitation of the subject matter claimed. A claim in dependent form shall be construed to incorporate by reference all the limitations of the claim to which it refers.
Claim 7 is rejected under 35 U.S.C. 112(d) or pre-AIA 35 U.S.C. 112, 4th paragraph, as being of improper dependent form for failing to further limit the subject matter of the claim upon which it depends, or for failing to include all the limitations of the claim upon which it depends. Claim 7 recites the same language as the last limitation of claim 1 with only a minor variation, “corresponds” in claim 1 versus “proportional” in claim 7, wherein the claimed subject matter is entirely the same. Applicant may cancel the claim(s), amend the claim(s) to place the claim(s) in proper dependent form, rewrite the claim(s) in independent form, or present a sufficient showing that the dependent claim(s) complies with the statutory requirements.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4, 7, 14-18, 23 & 24 are rejected under 35 U.S.C. 103 as being unpatentable over Jones et al. (U.S. Patent 9,870,114), hereinafter Jones, in view of Wherry (U.S. Pub 2007/023), hereinafter Wherry.
As per Claim 1, Jones teaches A method comprising: displaying at least a partial view of a jog wheel user interface (UI) element on a graphical user interface (GUI); (Fig. 2, col. 4 lines 47-55 the virtual jog wheel 210 may have any suitable shape, such as, for example, the substantially circular shape (e.g., a dial) shown in the example illustrated in FIG. 2. The virtual jog wheel 210 is configured to virtually rotate or move in response to a movement of a touch object (e.g., a finger) relative to the virtual jog wheel)
concurrently with displaying the partial view of the jog wheel UI element: displaying a first media content item in a first timeline position relative to a timeline; (Fig. 1, Fig. 2, col. 4 lines 56-67, col. 5 lines 1-34 the slider bar 240 is mapped to a linear control of a current playback position (indicated by the time scrubber 242) of one or more media elements being played back. The slider bar 240 may include a line representing a total time for playing one or more media elements and the actual total time 246 for playing the one or more media elements (e.g., total time “N.NN” shown in FIG. 2) wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”) wherein the multiple media elements may be concatenated into a composite set of media elements for navigation by a user by the graphical user interface generator 106 or the virtual jog wheel controller 108 of FIG. 1.)
receiving a touch input for controlling the jog wheel UI element; responsive to the touch input: a rotation of the jog wheel UI element from a first jog wheel position to a second jog wheel position, the rotation being based on the touch input; and (Fig. 2, col. 5 lines 5-20 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)).)
concurrently with the rotation of the jog wheel UI element from the first jog wheel position to the second jog wheel position, animating a movement of the first media content item relative to the timeline from the first timeline position to a second timeline position, (Fig. 3, Fig. 4, col. 7 lines 42-56 if the user operates a touch object from an initial position (P3) to a new position (P4) (e.g., over a same period of time but a bigger distance) such that the rotational velocity exceeds the mode change threshold, then the virtual jog wheel 310 shifts into a second mode which enables the user to switch from one media element to another (also referred to as a “choose media element” mode). In an implementation, in the choose media element mode, the time display (shown in FIG. 2) is replaced by a listing of images or thumbnails 450 each representing a media element within the set of media elements 430. The current or selected media element may be graphically indicated, such as, for example, by a time scrubber 440 and/or the current media element may be highlighted with a darker border)
wherein a difference between the first timeline position and the second timeline position corresponds to a difference between the first jog wheel position and the second jog wheel position. (Fig. 2, Fig. 4, col. 5 lines 5-20, col. 6 lines 29-56 wherein the virtual jog wheel 210 is configured such that the time interval moved on the time slider 240 for each “click” or passage of a division boundary 330 may increase as a function of the rotational rate (e.g., velocity) of the virtual jog wheel 210. The rotational rate of movement of the virtual jog wheel 210 may be calculated by the number of divisions 320 traversed per unit of time (e.g., millisecond). In an implementation, the virtual jog wheel controller 108 is configured to calculate the rotational velocity of the virtual jog wheel 210 and compare the rotational velocity to an acceleration threshold. If the rotational velocity exceeds the acceleration threshold, the time interval corresponding to movement of the time scrubber 240 per each division 320 of the virtual jog wheel 210 increases by an acceleration amount. For example, the time interval of the media element that is moved for each division of the virtual jog wheel may be 1 second when the virtual jog wheel is operated at a first rotational velocity. The time interval spanned may increase to 5 seconds per division if the rotational velocity is increased and exceeds the acceleration threshold; wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”).
The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)).)
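Purely as an illustration of the acceleration-threshold behavior Jones describes at col. 6 lines 29-56 (not part of the record), the mapping can be sketched as follows; the function name and the threshold value are hypothetical, while the 1-second and 5-second intervals are taken from Jones's example:

```python
# Hypothetical sketch of the behavior Jones describes: the time interval
# the scrubber advances per jog-wheel division increases once the wheel's
# rotational velocity exceeds an acceleration threshold.

def scrub_delta_seconds(divisions_moved: int,
                        rotational_velocity: float,
                        accel_threshold: float = 10.0,  # assumed value
                        base_interval_s: float = 1.0,   # Jones's 1 s example
                        accel_interval_s: float = 5.0   # Jones's 5 s example
                        ) -> float:
    """Seconds the time scrubber moves for a given wheel movement."""
    interval = (accel_interval_s
                if rotational_velocity > accel_threshold
                else base_interval_s)
    return divisions_moved * interval
```

Under this sketch, the same three-division movement spans 3 seconds at a slow rotation and 15 seconds once the threshold is exceeded, consistent with Jones's 1-second/5-second example.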
Jones teaches rotation of a jog wheel and a resulting change in jog wheel position, movement of media, and corresponding changes in the timeline; however, Jones does not explicitly teach animating these movements.
Wherry teaches animating a rotation of the jog wheel UI element (Fig. 4B, ¶31, ¶107, ¶132 wherein the touch screen interface provides scrolling and visual feedback in response to substantially circular scrolling motions on the touch screen interface wherein a visual clue, such as dots, could appear in control bar 1890 to indicate that the system is scrolling. The visual clue can also provide additional information such as the direction and speed of scrolling. For example, a series of moving dots of different sizes can indicate scrolling speed using motion and the direction of scroll using the sizes. The "swiping" type of scrolling can also simulate turning a knob or wheel that has friction and inertia by responding with faster scrolling at first (such as by scrolling more lines at the beginning), gradually slowing down (such as by scrolling fewer and fewer lines per unit time), and eventually coming to a stop wherein graphical scroll wheels can be implemented to respond to the scrolling speed. For example, the graphical scroll wheel may become more opaque (darker) when the user causes scrolling to occur very quickly and more transparent (lighter) when the user causes scrolling to occur slowly, or vice versa)
concurrently with the rotation of the jog wheel UI element from the first jog wheel position to the second jog wheel position, animating a movement of the first media content item from the first position to a second position, (Fig. 8A, Fig. 8B, ¶97, ¶98 wherein the user can move input object 850 (shown as a finger) in a clockwise or counterclockwise direction along the path to scroll up or down the list. The response scrolling action can be indicated by the touch screen interface 810 holding the image of highlighter 820 still at the top of the displayed portion of the list and moving the list (i.e. causing items in the list to appear at one end of the list and disappear at another end of the list, and the rest of the items to shift in position appropriately). In this case, as shown by comparison of FIGS. 8A and 8B, the input object 850 moves in a clockwise manner along the path, and this type of subsequent object motion causes the song list to move upwards while highlighter 820 holds still at the top of the displayed portion of the list wherein in FIG. 8A, the song list is arranged alphabetically, and the item "Ernest Was (Rain)" is highlighted and "Hard Wood (Worlds Apart)" is in the middle of the list. In FIG. 8B, the clockwise motion of input object 850 about 180 degrees around the path indicated by graphical scroll wheel 840 has resulted in the items from "Ernest Was (Rain)" to "Green Acres (Rendering)" to be removed from the displayed set, "Hard Wood (Worlds Apart)" to move to the top of the list and be highlighted by highlighter 820, other items to move upwards on the display, and items "Look Elsewhere (Run Away)" to "Pepper (Bubble Toys)" to appear in the displayed set.)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of the graphical scroll wheel of Wherry with the teaching of media element navigation using a virtual jog wheel of Jones because Wherry teaches improved techniques for facilitating scrolling with touch sensor devices by providing a display screen, a touch sensor device, and a processor coupled to the display screen and the touch sensor. The touch sensor device is adapted to sense object motion in a sensing region that overlaps at least part of the display screen. The processor is adapted to cause a scroll wheel that indicates a scrolling path to appear on the display screen selectively, such as in response to the touch sensor sensing object motion that corresponds to a scrolling initiation gesture. The processor is further adapted to cause scrolling on a display screen selectively, such as in response to the touch sensor sensing subsequent object motion along the scrolling path after the touch sensor has sensed the object motion corresponding to the scrolling initiation gesture. The present invention thus allows the display screen to provide a more versatile graphical user interface (GUI), a benefit resulting from having available additional space when the scroll wheel is not shown. The present invention also enables the electronic system to allow the user to scroll in an intuitive manner even as it reduces the chances of accidental scrolling (¶9, ¶11).
As per Claim 2, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein a greater difference between the first jog wheel position and the second jog wheel position causes a greater difference between the first timeline position and the second timeline position, and (Fig. 2, col. 6 lines 14-28 wherein the virtual jog wheel 210 is configured such that the time interval moved on the time slider 240 for each “click” or passage of a division boundary 330 may increase as a function of the rotational rate (e.g., velocity) of the virtual jog wheel 210. The rotational rate of movement of the virtual jog wheel 210 may be calculated by the number of divisions 320 traversed per unit of time (e.g., millisecond). In an implementation, the virtual jog wheel controller 108 is configured to calculate the rotational velocity of the virtual jog wheel 210 and compare the rotational velocity to an acceleration threshold (e.g., a predefined rotational velocity value). If the rotational velocity exceeds the acceleration threshold, the time interval corresponding to movement of the time scrubber 240 per each division 320 of the virtual jog wheel 210 increases by an acceleration amount. For example, the time interval of the media element that is moved for each division of the virtual jog wheel may be 1 second when the virtual jog wheel is operated at a first rotational velocity. The time interval spanned may increase to 5 seconds per division if the rotational velocity is increased and exceeds the acceleration threshold. It is noted that the virtual jog wheel 210 and virtual jog wheel controller 108 may be configured to operate in accordance with any number of acceleration thresholds and associated threshold values. It is further noted that the acceleration amount may be varied programmatically. For example, the speed of the time scrubber 240 may be doubled each time an acceleration threshold is crossed; as taught by Jones)
wherein a smaller difference between the first jog wheel position and the second jog wheel position causes a smaller difference between the first timeline position and the second timeline position. (Fig. 2, col. 6 lines 57-67 & col. 7 lines 1-14 wherein the time interval moved on the time slider 240 for each “click” or passage of a division boundary 330 may decrease as a function of the rotational velocity of the virtual jog wheel 210. The virtual jog wheel controller 108 may be configured to compare the rotational velocity of the virtual jog wheel 210 to a deceleration threshold (e.g., a predefined rotational velocity value). If the rotational velocity falls below the deceleration threshold, the time interval corresponding to movement of the time scrubber 240 per each division 320 of the virtual jog wheel 210 decreases by a deceleration amount. For example, the time interval of the media element that is moved for each division of the virtual jog wheel may be 5 seconds when the virtual jog wheel is operated at a first rotational velocity. The time interval spanned may decrease to 1 second per division if the rotational velocity is decreased and falls below the deceleration threshold. It is noted that the virtual jog wheel 210 and virtual jog wheel controller 108 may be configured to operate in accordance with any number of deceleration thresholds and associated threshold values. It is further noted that the deceleration amount may be varied programmatically. For example, the speed of the time scrubber 240 may be cut in half each time a deceleration threshold is crossed; as taught by Jones)
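Purely as an illustration of the combined acceleration/deceleration behavior cited above for claim 2 (not part of the record), the per-division interval can be sketched as a monotonic mapping; the function name and threshold values below are hypothetical, while the 1-second and 5-second intervals come from Jones's examples:

```python
# Hypothetical sketch: the per-division time interval grows above an
# acceleration threshold and shrinks below a deceleration threshold, so a
# greater (faster) wheel movement yields a greater timeline movement and
# a smaller (slower) one yields a smaller timeline movement.

def interval_per_division(velocity: float,
                          decel_threshold: float = 2.0,   # assumed value
                          accel_threshold: float = 10.0   # assumed value
                          ) -> float:
    """Seconds of timeline moved per jog-wheel division at a velocity."""
    if velocity > accel_threshold:
        return 5.0  # accelerated interval (Jones's 5 s example)
    if velocity < decel_threshold:
        return 1.0  # decelerated interval (Jones's 1 s example)
    return 2.5      # nominal interval (assumed)
```

This sketch only models the direct relationship of claim 2; it does not supply the inverse relationship recited in claim 3.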
As per Claim 3, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein a smaller difference between the first jog wheel position and the second jog wheel position causes a greater difference between the first timeline position and the second timeline position, and wherein a greater difference between the first jog wheel position and the second jog wheel position causes a smaller difference between the first timeline position and the second timeline position. (Fig. 2, col. 7 lines 20-29 wherein movement of the virtual jog wheel 210 in a clockwise direction increases the time scrubber 242 along the time display 240 (e.g., the playback position of the media element is moved forward). In addition, in this implementation, movement of the virtual jog wheel 210 in a counterclockwise direction decreases the time scrubber 242 along the time display 240 (e.g., the playback position of the media element is moved backward or rewound); as taught by Jones)
As per Claim 4, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches further comprising, responsive to the movement of the first media content item to the second timeline position: detecting a second media content item located along the timeline at least partially between the first timeline position to the second timeline position; and moving the second media content item along the timeline in a same direction as the movement of the first media content item. (Fig. 3, Fig. 4, col. 7 lines 42-56 if the user operates a touch object from an initial position (P3) to a new position (P4) (e.g., over a same period of time but a bigger distance) such that the rotational velocity exceeds the mode change threshold, then the virtual jog wheel 310 shifts into a second mode which enables the user to switch from one media element to another (also referred to as a “choose media element” mode). In an implementation, in the choose media element mode, the time display (shown in FIG. 2) is replaced by a listing of images or thumbnails 450 each representing a media element within the set of media elements 430. The current or selected media element may be graphically indicated, such as, for example, by a time scrubber 440 and/or the current media element may be highlighted with a darker border; as taught by Jones; Fig. 8A, Fig. 8B, ¶97, ¶98 wherein the user can move input object 850 (shown as a finger) in a clockwise or counterclockwise direction along the path to scroll up or down the list. The response scrolling action can be indicated by the touch screen interface 810 holding the image of highlighter 820 still at the top of the displayed portion of the list and moving the list (i.e. causing items in the list to appear at one end of the list and disappear at another end of the list, and the rest of the items to shift in position appropriately). In this case, as shown by comparison of FIGS.
8A and 8B, the input object 850 moves in a clockwise manner along the path, and this type of subsequent object motion causes the song list to move upwards while highlighter 820 holds still at the top of the displayed portion of the list; as taught by Wherry)
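Purely as an illustration of the list-shifting behavior Wherry describes (not part of the record), where items leave one end of the display and enter the other while the highlighter stays fixed, the mechanism can be sketched as a sliding window; all names below are hypothetical:

```python
# Hypothetical sketch of Wherry's scrolling: the visible window slides
# over the full song list by `steps` items; items shift through the
# window in the same direction while the highlighter stays at its top.

def scroll_window(full_list, window_start, window_size, steps):
    """Return (new_start, visible_items) after scrolling by `steps`."""
    # Clamp so the window never runs past either end of the list.
    new_start = max(0, min(window_start + steps,
                           len(full_list) - window_size))
    return new_start, full_list[new_start:new_start + window_size]
```

For example, scrolling a three-item window over a five-song list by two steps removes the first two songs from view and brings the last two into view, analogous to the transition Wherry shows between FIGS. 8A and 8B.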
As per Claim 7, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein the difference between the first timeline position and the second timeline position is proportional to the difference between the first jog wheel position and the second jog wheel position. (Fig. 2, Fig. 4, col. 5 lines 5-20, col. 6 lines 29-56 wherein the virtual jog wheel 210 is configured such that the time interval moved on the time slider 240 for each “click” or passage of a division boundary 330 may increase as a function of the rotational rate (e.g., velocity) of the virtual jog wheel 210. The rotational rate of movement of the virtual jog wheel 210 may be calculated by the number of divisions 320 traversed per unit of time (e.g., millisecond). In an implementation, the virtual jog wheel controller 108 is configured to calculate the rotational velocity of the virtual jog wheel 210 and compare the rotational velocity to an acceleration threshold. If the rotational velocity exceeds the acceleration threshold, the time interval corresponding to movement of the time scrubber 240 per each division 320 of the virtual jog wheel 210 increases by an acceleration amount. For example, the time interval of the media element that is moved for each division of the virtual jog wheel may be 1 second when the virtual jog wheel is operated at a first rotational velocity. The time interval spanned may increase to 5 seconds per division if the rotational velocity is increased and exceeds the acceleration threshold; wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”).
The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)).; as taught by Jones)
As per Claim 14, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein one or more characteristics of the jog wheel UI element are user adjustable, the one or more characteristics of the jog wheel UI element being selected from a group consisting of: a size of the jog wheel UI element, a position of the jog wheel UI element on the GUI, an input response direction of the jog wheel UI element, and a portion of the jog wheel UI element to display on the GUI. (Fig. 11, ¶54 wherein the graphical scroll wheel can also be used with user-configurable interfaces wherein processor 140 may be adapted such that users can set and change one or more of the following: characteristics of scroll initiation gesture(s); scroll wheel size, transparency, or visual appearance; characteristics of scrolling termination gesture(s), scroll amount, speed, or ballistics; durations associated with the appearance and disappearance of the scroll wheel; timings associated with scroll wheel response; if precursor images are used and their characteristics if so; if the scroll wheel can disappear while still retaining a scroll function and its characteristics if so; what is scrolled; or any other aspect of the scroll wheel function; as taught by Wherry)
As per Claim 18, Jones teaches A method comprising: displaying at least a partial view of a jog wheel user interface (UI) element on a graphical user interface (GUI); (Fig. 2, col. 4 lines 47-55 the virtual jog wheel 210 may have any suitable shape, such as, for example, the substantially circular shape (e.g., a dial) shown in the example illustrated in FIG. 2. The virtual jog wheel 210 is configured to virtually rotate or move in response to a movement of a touch object (e.g., a finger) relative to the virtual jog wheel)
concurrently with displaying the partial view of the jog wheel UI element: displaying a position indicator for selecting positions within one or more media items; (Fig. 1, Fig. 2, col. 4 lines 56-67, col. 5 lines 1-34 the slider bar 240 is mapped to a linear control of a current playback position (indicated by the time scrubber 242) of one or more media elements being played back. The slider bar 240 may include a line representing a total time for playing one or more media elements and the actual total time 246 for playing the one or more media elements (e.g., total time “N.NN” shown in FIG. 2) wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”) wherein the multiple media elements may be concatenated into a composite set of media elements for navigation by a user by the graphical user interface generator 106 or the virtual jog wheel controller 108 of FIG. 1. wherein the GUI 200 displays an indication of a portion of the one or more media elements has been played (e.g., an indication of the amount of time of the media element that has been played) and an indication of how much of the one or more media elements are contained within the cache or buffer of the user device 101 (e.g., a bar or other indication which indicates how much of the media element has been downloaded and is available for playback))
receiving a touch input for controlling the jog wheel UI element; responsive to the touch input: a rotation of the jog wheel UI element from a first jog wheel position to a second jog wheel position, the rotation being based on the touch input; and (Fig. 2, col. 5 lines 5-20 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)).)
concurrently with the rotation of the jog wheel UI element from the first jog wheel position to the second jog wheel position, animating a movement of the position indicator relative to the one or more media content items from a first clip position to a second clip position, (Fig. 3, Fig. 4, col. 7 lines 42-67, col. 8 lines 1-5 if the user operates a touch object from an initial position (P3) to a new position (P4) (e.g., over a same period of time but a bigger distance) such that the rotational velocity exceeds the mode change threshold, then the virtual jog wheel 310 shifts into a second mode which enables the user to switch from one media element to another (also referred to as a “choose media element” mode). In an implementation, in the choose media element mode, the time display (shown in FIG. 2) is replaced by a listing of images or thumbnails 450 each representing a media element within the set of media elements 430. The current or selected media element may be graphically indicated, such as, for example, by a time scrubber 440 and/or the current media element may be highlighted with a darker border wherein the rate of rotation of the virtual jog wheel 410 determines the rate of switching between the media elements. In an implementation, as the user decelerates or slows down the rate of rotation of the touch object relative to the virtual jog wheel 410, the virtual jog wheel controller switches back into the single media element scrub mode for scrubbing/navigating within a selected media element. In an implementation, if the selected media element is the original media element (e.g., the media element that the user was navigating at the time of the switch to the choose media element mode), then the playback resumes at the same playback point as it was when the mode switch occurred. 
If the selected media element is different than the initial media element (e.g., the media element prior to the mode switch), then the time scrubber is placed at the beginning of the selected media element)
wherein a difference between the first timeline position and the second timeline position corresponds to a difference between the first jog wheel position and the second jog wheel position. (Fig. 2, Fig. 4, col. 5 lines 5-20, col. 6 lines 29-56 wherein the virtual jog wheel 210 is configured such that the time interval moved on the time slider 240 for each “click” or passage of a division boundary 330 may increase as a function of the rotational rate (e.g., velocity) of the virtual jog wheel 210. The rotational rate of movement of the virtual jog wheel 210 may be calculated by the number of divisions 320 traversed per unit of time (e.g., millisecond). In an implementation, the virtual jog wheel controller 108 is configured to calculate the rotational velocity of the virtual jog wheel 210 and compare the rotational velocity to an acceleration threshold. If the rotational velocity exceeds the acceleration threshold, the time interval corresponding to movement of the time scrubber 240 per each division 320 of the virtual jog wheel 210 increases by an acceleration amount. For example, the time interval of the media element that is moved for each division of the virtual jog wheel may be 1 second when the virtual jog wheel is operated at a first rotational velocity. The time interval spanned may increase to 5 seconds per division if the rotational velocity is increased and exceeds the acceleration threshold; wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”).
The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)).)
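For illustration only, the velocity-dependent scrubbing behavior that Jones describes in the cited passages (a rotational rate computed as divisions traversed per unit time, compared against an acceleration threshold that switches the per-division time interval from 1 second to 5 seconds) can be sketched as follows; the function name, units, and threshold value below are hypothetical and are not taken from Jones:

```python
# Illustrative sketch (hypothetical names and values; not Jones's code).
BASE_INTERVAL_S = 1.0    # seconds of timeline per wheel division (slow rotation)
ACCEL_INTERVAL_S = 5.0   # seconds per division once the threshold is exceeded
ACCEL_THRESHOLD = 0.02   # rotational rate threshold, in divisions per millisecond

def scrub_delta(divisions_crossed: int, elapsed_ms: float) -> float:
    """Timeline movement (seconds) produced by one jog wheel gesture."""
    if elapsed_ms <= 0:
        return 0.0
    velocity = divisions_crossed / elapsed_ms  # divisions traversed per ms
    interval = ACCEL_INTERVAL_S if velocity > ACCEL_THRESHOLD else BASE_INTERVAL_S
    return divisions_crossed * interval
```

Under these assumed values, a slow gesture of 4 divisions over 400 ms moves the scrubber 4 seconds, while a fast gesture of 10 divisions over 200 ms exceeds the threshold and moves it 50 seconds.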
Jones teaches a rotation and change in the position of a jog wheel and a corresponding movement of media and changes in the timeline; however, Jones does not explicitly teach the concept of animation.
Wherry teaches animating a rotation of the jog wheel UI element (Fig. 4B, ¶31, ¶107, ¶132 wherein touch screen interface provides scrolling and visual feedback in response to substantially circular scrolling motions on the touch screen interface wherein a visual clue, such as dots, could appear in control bar 1890 to indicate that the system is scrolling. The visual clue can also provide additional information such as the direction and speed of scrolling. For example, a series of moving dots of different sizes can indicate scrolling speed using motion and the direction of scroll using the sizes. The "swiping" type of scrolling can also simulate turning a knob or wheel that has friction and inertia by responding with faster scrolling at first (such as by scrolling more lines at the beginning), gradually slowing down (such as by scrolling fewer and fewer lines per unit time), and eventually coming to a stop wherein graphical scroll wheels can be implemented to respond to the scrolling speed. For example, the graphical scroll wheel may become more opaque (darker) when the user causes scrolling to occur very quickly and more transparent (lighter) when the user causes scrolling to occur slowly, or vice versa)
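The friction-and-inertia behavior Wherry describes (faster scrolling at first, fewer and fewer lines per unit time, eventually coming to a stop) can be sketched as a simple decay loop; the function name, decay factor, and stopping point below are hypothetical assumptions, not Wherry's disclosure:

```python
# Illustrative sketch (hypothetical names and values; not Wherry's code).
def inertial_scroll_steps(initial_lines_per_tick: float,
                          friction: float = 0.5,
                          min_step: float = 1.0) -> list:
    """Lines scrolled on each successive tick until the wheel comes to a stop."""
    steps = []
    step = initial_lines_per_tick
    while step >= min_step:
        steps.append(int(step))  # scroll fewer and fewer lines per tick
        step *= friction         # friction decays the scroll rate
    return steps
```

A swipe starting at 8 lines per tick would, under these assumed values, scroll 8, 4, 2, and 1 lines on successive ticks before stopping.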
concurrently with the rotation of the jog wheel UI element from the first jog wheel position to the second jog wheel position, animating a movement of the first media content item from the first position to a second position, (Fig. 8A, Fig. 8B, ¶97, ¶98 wherein the user can move input object 850 (shown as a finger) in a clockwise or counterclockwise direction along the path to scroll up or down the list. The response scrolling action can be indicated by the touch screen interface 810 holding the image of highlighter 820 still at the top of the displayed portion of the list and moving the list (i.e. causing items in the list to appear at one end of the list and disappear at another end of the list, and the rest of the items to shift in position appropriately). In this case, as shown by comparison of FIGS. 8A and 8B, the input object 850 moves in a clockwise manner along the path, and this type of subsequent object motion causes the song list to move upwards while highlighter 820 holds still at the top of the displayed portion of the list wherein in FIG. 8A, the song list is arranged alphabetically, and the item "Ernest Was (Rain)" is highlighted and "Hard Wood (Worlds Apart)" is in the middle of the list. In FIG. 8B, the clockwise motion of input object 850 about 180 degrees around the path indicated by graphical scroll wheel 840 has resulted in the items from "Ernest Was (Rain)" to "Green Acres (Rendering)" to be removed from the displayed set, "Hard Wood (Worlds Apart)" to move to the top of the list and be highlighted by highlighter 820, other items to move upwards on the display, and items "Look Elsewhere (Run Away)" to "Pepper (Bubble Toys)" to appear in the displayed set.)
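The circular-motion scrolling of Wherry's Figs. 8A-8B (about 180 degrees of clockwise motion advancing the visible window of the song list while the highlighter stays at the top) can be sketched as an angle-to-offset mapping; the items-per-revolution value, window size, and names below are hypothetical:

```python
# Illustrative sketch (hypothetical names and values; not Wherry's code).
ITEMS_PER_REVOLUTION = 16  # assumed: a full 360-degree sweep scrolls 16 items

def visible_window(items, degrees_swept, window_size=5):
    """Visible slice after clockwise motion; the first visible item is highlighted."""
    offset = int((degrees_swept / 360.0) * ITEMS_PER_REVOLUTION)
    offset = max(0, min(offset, len(items) - window_size))  # clamp to list bounds
    return items[offset:offset + window_size]
```

With these assumptions, a 180-degree sweep removes the first eight items from view and highlights the item that scrolled to the top, mirroring the Fig. 8A to Fig. 8B transition.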
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of graphical scroll wheel of Wherry with the teaching of media element navigation using a virtual jog wheel of Jones because Wherry teaches improved techniques for facilitating scrolling with touch sensor devices by providing a display screen, a touch sensor device, and a processor coupled to the display screen and the touch sensor. The touch sensor device is adapted to sense object motion in a sensing region that overlaps at least part of the display screen. The processor is adapted to cause a scroll wheel that indicates a scrolling path to appear on the display screen selectively, such as in response to the touch sensor sensing object motion that corresponds to a scrolling initiation gesture. The processor is further adapted to cause scrolling on a display screen selectively, such as in response to the touch sensor sensing subsequent object motion along the scrolling path after the touch sensor has sensed the object motion corresponding to the scrolling initiation gesture. The present invention thus allows the display screen to provide a more versatile graphical user interface (GUI), a benefit resulting from having available additional space when the scroll wheel is not shown. The present invention also enables the electronic system to allow the user to scroll in an intuitive manner even as it reduces the chances of accidental scrolling (¶9, ¶11)
Claim 16 is similar in scope to Claim 1; therefore, Claim 16 is rejected under the same rationale as Claim 1.
Claim 17 is similar in scope to Claim 1; therefore, Claim 17 is rejected under the same rationale as Claim 1.
Claim 23 is similar in scope to Claim 18; therefore, Claim 23 is rejected under the same rationale as Claim 18.
Claim 24 is similar in scope to Claim 18; therefore, Claim 24 is rejected under the same rationale as Claim 18.
Claim(s) 5, 6 & 19 is/are rejected under 35 U.S.C. 103 as being unpatentable over Jones in view of Wherry as applied to claims 1, 18 above, and further in view of DANDOKO et al. (U.S. Pub 2023/0393729) hereinafter Dan.
As per Claim 5, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches further comprising, responsive to the movement of the first media content item to the second timeline position: detecting a second media content item located along the timeline at least partially between the first timeline position to the second timeline position; and (Fig. 3, Fig. 4, col. 7 lines 42-56 if the user operates a touch object from an initial position (P3) to a new position (P4) such that the rotational velocity exceeds the mode change threshold, then the virtual jog wheel 310 shifts into a second mode which enables the user to switch from one media element to another. The current or selected media element may be graphically indicated, such as, for example, by a time scrubber 440 and/or the current media element may be highlighted with a darker border; as taught by Jones; Fig. 8A, Fig. 8B, ¶97, ¶98 wherein the user can move input object 850 in a clockwise or counterclockwise direction along the path to scroll up or down the list. The response scrolling action can be indicated by the touch screen interface 810 holding the image of highlighter 820 still at the top of the displayed portion of the list and moving the list (i.e. causing items in the list to appear at one end of the list and disappear at another end of the list, and the rest of the items to shift in position appropriately).; as taught by Wherry)
However, Jones as modified does not explicitly teach clipping a portion of the second media content item that is overlapped due to the movement of the first media content item to the second position.
Dan teaches clipping a portion of the second media content item that is overlapped due to the movement of the first media content item to the second position. (Fig. 3B, ¶37, ¶47 wherein the operation screen D1 includes a scrollable display region A1, a menu display region A2, and a scroll bar SB1 for scrolling the image displayed in the scrollable display region A1. The display controller 101 displays the scroll bar SB1 at a position other than the display regions A1 and A2, wherein although the operation button B17 is displayed at the uppermost position of the scrollable display region A1, the operation button B17 is only partially displayed)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of display menu item indicating name of group including item to be set displayed at uppermost position of scrollable display of Dan with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Dan teaches an improved control device acting as a display controller that causes the display device to display an operation screen, including a scrollable display region for displaying a list of items to be set, in which a plurality of items to be set are classified into groups and listed along a scroll direction, and a menu display region for displaying menu items each indicating a name of the group, to scroll, when a scroll instruction is made on the scrollable display region, the list of items to be set according to the scroll instruction, and to display a first menu item indicating a name of a first group that includes a first item to be set displayed at an uppermost position of the scrollable display region, in a first display style visually different from other menu items. (Abstract)
As per Claim 6, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches further comprising, responsive to the movement of the first media content item to the second timeline position: detecting a second media content item located along the timeline at least partially between the first timeline position to the second timeline position; and (Fig. 3, Fig. 4, col. 7 lines 42-56 if the user operates a touch object from an initial position (P3) to a new position (P4) such that the rotational velocity exceeds the mode change threshold, then the virtual jog wheel 310 shifts into a second mode which enables the user to switch from one media element to another. The current or selected media element may be graphically indicated, such as, for example, by a time scrubber 440 and/or the current media element may be highlighted with a darker border; as taught by Jones; Fig. 8A, Fig. 8B, ¶97, ¶98 wherein the user can move input object 850 in a clockwise or counterclockwise direction along the path to scroll up or down the list. The response scrolling action can be indicated by the touch screen interface 810 holding the image of highlighter 820 still at the top of the displayed portion of the list and moving the list (i.e. causing items in the list to appear at one end of the list and disappear at another end of the list, and the rest of the items to shift in position appropriately).; as taught by Wherry)
However, Jones as modified does not explicitly teach visibly reversing at least a portion of the movement of the first media content item to a third position along the timeline that positions the first media content item adjacent the second media content item to avoid overlap of the first media content item with the second media content item.
Dan teaches visibly reversing at least a portion of the movement of the first media content item to a third position along the timeline that positions the first media content item adjacent the second media content item to avoid overlap of the first media content item with the second media content item. (Fig. 5A, Fig. 5B, ¶51, ¶52, ¶53 wherein as a result of the upward scroll, the operation button B30 located at the tail end of the group G3 is partially hidden as shown in FIG. 5A, and the operation button B31 marked as “Density”, located at the head of the group G4, is regarded as the operation button displayed at the uppermost position of the scrollable display region A1 wherein as a result of the downward scroll, the operation button B17 marked as “Org. Manual Feed(DP)” and located at the tail end of the group G2 entirely appears as shown in FIG. 5B, so that the operation button B17 included in the group G2 is displayed at the uppermost position of the scrollable display region A1. In other words, the group corresponding to the operation button displayed at the uppermost position of the scrollable display region A1 is switched from the group G3 to the group G2)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of display menu item indicating name of group including item to be set displayed at uppermost position of scrollable display of Dan with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Dan teaches an improved control device acting as a display controller that causes the display device to display an operation screen, including a scrollable display region for displaying a list of items to be set, in which a plurality of items to be set are classified into groups and listed along a scroll direction, and a menu display region for displaying menu items each indicating a name of the group, to scroll, when a scroll instruction is made on the scrollable display region, the list of items to be set according to the scroll instruction, and to display a first menu item indicating a name of a first group that includes a first item to be set displayed at an uppermost position of the scrollable display region, in a first display style visually different from other menu items. (Abstract)
As per Claim 19, the rejection of claim 18 is hereby incorporated by reference; Jones as modified previously taught one or more media content items (e.g., a second clip). However, Jones as modified does not explicitly teach receiving a second touch input; and responsive to receiving the second touch input: clipping a portion of the one or more media content items.
Dan teaches receiving a second touch input; and responsive to receiving the second touch input: clipping a portion of the one or more media content items. (Fig. 3B, ¶37, ¶47 wherein the operation screen D1 includes a scrollable display region A1, a menu display region A2, and a scroll bar SB1 for scrolling the image displayed in the scrollable display region A1. The display controller 101 displays the scroll bar SB1 at a position other than the display regions A1 and A2, wherein although the operation button B17 is displayed at the uppermost position of the scrollable display region A1, the operation button B17 is only partially displayed)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of display menu item indicating name of group including item to be set displayed at uppermost position of scrollable display of Dan with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Dan teaches an improved control device acting as a display controller that causes the display device to display an operation screen, including a scrollable display region for displaying a list of items to be set, in which a plurality of items to be set are classified into groups and listed along a scroll direction, and a menu display region for displaying menu items each indicating a name of the group, to scroll, when a scroll instruction is made on the scrollable display region, the list of items to be set according to the scroll instruction, and to display a first menu item indicating a name of a first group that includes a first item to be set displayed at an uppermost position of the scrollable display region, in a first display style visually different from other menu items. (Abstract)
Claim(s) 8-13 & 20 is/are rejected under 35 U.S.C. 103 as being unpatentable over Jones in view of Wherry as applied to claims 1 & 18 above, and further in view of Snyder et al. (U.S. Pub 2017/0147160) hereinafter Snyder.
As per Claim 8, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein a portion of the jog wheel UI element is displayed on the GUI, wherein the timeline is displayed horizontally with time increasing rightward, wherein an approximately downward swipe touch input causes rightward movement of the first media content item relative to the timeline, and (Fig. 2, col. 5 lines 6-20 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)).; as taught by Jones)
wherein an approximately upward swipe touch input causes leftward movement of the first media content item relative to the timeline. (Fig. 2, col. 7 lines 20-28 wherein movement of the virtual jog wheel 210 in a counterclockwise direction decreases the time scrubber 242 along the time display 240 (e.g., the playback position of the media element is moved backward or rewound).; as taught by Jones)
However, Jones does not explicitly teach wherein a portion of the jog wheel UI element is displayed left of center on the GUI,
Snyder teaches wherein a portion of the jog wheel UI element is displayed left of center on the GUI, (Fig. 3C-3E, ¶40 wherein virtual wheel 111 displayed in a left handed configuration with the virtual wheel 111 defined by the outer curved border 109 and inner curved border 107)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of touchscreen interface menu with virtual wheel of Snyder with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Snyder teaches an improved user interface that could be operated in a user friendly manner and was adaptable and flexible for providing interaction with various applications and operating systems to accomplish a range of functions by providing a touchscreen menu that includes a virtual wheel for a user interface display. The menu and virtual wheel embodiments provide an intuitive and efficient way of displaying output and/or receiving input at a user interface display, for example on a touch screen of a mobile device. (¶3, ¶5)
As per Claim 9, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein the partial view of the jog wheel UI element is displayed on the GUI, wherein the timeline is displayed horizontally with time increasing rightward, wherein an approximately downward swipe touch input causes leftward movement of the first media content item relative to the timeline, and (Fig. 2, col. 5 lines 6-35 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)) wherein the multiple media elements of a playlist (e.g., Media Element 1, Media Element 2, Media Element 3, and Media Element 4 presented to the user in a playlist area 220) may be concatenated to form a composite navigatable set of media elements 230. In an implementation, the multiple media elements may be concatenated into a composite set of media elements for navigation by a user by the graphical user interface generator 106 or the virtual jog wheel controller 108 of FIG. 1. By generating a composite set of media elements, the user is able to navigate the composite set as a single entity, and may navigate from a first position within a first media element of the set of media elements to a second position in a second media element of the set of media elements using a single time slider 240 and a single time scrubber 242.; as taught by Jones)
wherein an approximately upward swipe touch input causes rightward movement of the first media content item relative to the timeline. (Fig. 2, col. 7 lines 20-28, col. 8 lines 12-15 wherein movement of the virtual jog wheel 210 in a counterclockwise direction decreases the time scrubber 242 along the time display 240 (e.g., the playback position of the media element is moved backward or rewound) wherein if the virtual jog wheel is rotated counterclockwise, the current media element will change to a previous media element.; as taught by Jones)
However, Jones does not explicitly teach wherein a portion of the jog wheel UI element is displayed left of center on the GUI,
Snyder teaches wherein a portion of the jog wheel UI element is displayed left of center on the GUI, (Fig. 3C-3E, ¶40 wherein virtual wheel 111 displayed in a left handed configuration with the virtual wheel 111 defined by the outer curved border 109 and inner curved border 107)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of touchscreen interface menu with virtual wheel of Snyder with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Snyder teaches an improved user interface that could be operated in a user friendly manner and was adaptable and flexible for providing interaction with various applications and operating systems to accomplish a range of functions by providing a touchscreen menu that includes a virtual wheel for a user interface display. The menu and virtual wheel embodiments provide an intuitive and efficient way of displaying output and/or receiving input at a user interface display, for example on a touch screen of a mobile device. (¶3, ¶5)
As per Claim 10, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein the view of the jog wheel UI element is displayed right of center on the GUI, wherein the timeline is displayed horizontally with time increasing rightward, wherein an approximately downward swipe touch input causes rightward movement of the first media content item relative to the timeline, and (Fig. 2, col. 5 lines 6-20 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)).; as taught by Jones)
wherein an approximately upward swipe touch input causes leftward movement of the first media content item relative to the timeline. (Fig. 2, col. 7 lines 20-28 wherein movement of the virtual jog wheel 210 in a counterclockwise direction decreases the time scrubber 242 along the time display 240 (e.g., the playback position of the media element is moved backward or rewound).; as taught by Jones)
However, Jones does not explicitly teach wherein the partial view of the jog wheel UI element is displayed right of center on the GUI,
Snyder teaches wherein the partial view of the jog wheel UI element is displayed right of center on the GUI, (Fig. 2A-2C, ¶38 wherein a menu having a virtual wheel that is configured for a right-handed user wherein when a user is operating on virtual wheel 110a, virtual wheel 110a may be slid into a reduced portion or minimized position by applying a touchscreen gesture on the wheel 110a in the direction shown by arrow 134 to slide or laterally reposition virtual wheel 110a into right side border 115d. The display may then appear as shown in FIG. 2A with virtual wheel 110a reduced in size or minimized from view. Touchscreen 104 may display relevant items 116 or selection list 118 not shown on the virtual wheel. The virtual wheel 110a may be brought back into view in the increased portion size or maximized position of FIG. 2B by pressing button 132 to cause the virtual wheel to slide or laterally reposition out from right side border 115d in the direction shown by arrow 136 shown on FIG. 2A)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of touchscreen interface menu with virtual wheel of Snyder with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Snyder teaches an improved user interface that could be operated in a user friendly manner and was adaptable and flexible for providing interaction with various applications and operating systems to accomplish a range of functions by providing a touchscreen menu that includes a virtual wheel for a user interface display. The menu and virtual wheel embodiments provide an intuitive and efficient way of displaying output and/or receiving input at a user interface display, for example on a touch screen of a mobile device. (¶3, ¶5)
As per Claim 11, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein the partial view of the jog wheel UI element is displayed right of center on the GUI, wherein the timeline is displayed horizontally with time increasing rightward, wherein an approximately downward swipe touch input causes leftward movement of the first media content item relative to the timeline, and (Fig. 2, col. 5 lines 6-35 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)) wherein the multiple media elements of a playlist (e.g., Media Element 1, Media Element 2, Media Element 3, and Media Element 4 presented to the user in a playlist area 220) may be concatenated to form a composite navigatable set of media elements 230. In an implementation, the multiple media elements may be concatenated into a composite set of media elements for navigation by a user by the graphical user interface generator 106 or the virtual jog wheel controller 108 of FIG. 1. 
By generating a composite set of media elements, the user is able to navigate the composite set as a single entity, and may navigate from a first position within a first media element of the set of media elements to a second position in a second media element of the set of media elements using a single time slider 240 and a single time scrubber 242.; as taught by Jones)
wherein an approximately upward swipe touch input causes rightward movement of the first media content item relative to the timeline. (Fig. 2, col. 7 lines 20-28, col. 8 lines 12-15 wherein movement of the virtual jog wheel 210 in a counterclockwise direction decreases the time scrubber 242 along the time display 240 (e.g., the playback position of the media element is moved backward or rewound) wherein if the virtual jog wheel is rotated counterclockwise, the current media element will change to a previous media element.; as taught by Jones)
However, Jones does not explicitly teach wherein the partial view of the jog wheel UI element is displayed right of center on the GUI,
Snyder teaches wherein the partial view of the jog wheel UI element is displayed right of center on the GUI, (Fig. 2A-2C, ¶38 wherein a menu having a virtual wheel that is configured for a right-handed user wherein when a user is operating on virtual wheel 110a, virtual wheel 110a may be slid into a reduced portion or minimized position by applying a touchscreen gesture on the wheel 110a in the direction shown by arrow 134 to slide or laterally reposition virtual wheel 110a into right side border 115d. The display may then appear as shown in FIG. 2A with virtual wheel 110a reduced in size or minimized from view. Touchscreen 104 may display relevant items 116 or selection list 118 not shown on the virtual wheel. The virtual wheel 110a may be brought back into view in the increased portion size or maximized position of FIG. 2B by pressing button 132 to cause the virtual wheel to slide or laterally reposition out from right side border 115d in the direction shown by arrow 136 shown on FIG. 2A)
It would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to utilize the teaching of touchscreen interface menu with virtual wheel of Snyder with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Snyder teaches an improved user interface that could be operated in a user friendly manner and was adaptable and flexible for providing interaction with various applications and operating systems to accomplish a range of functions by providing a touchscreen menu that includes a virtual wheel for a user interface display. The menu and virtual wheel embodiments provide an intuitive and efficient way of displaying output and/or receiving input at a user interface display, for example on a touch screen of a mobile device. (¶3, ¶5)
As per Claim 12, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches configuring the jog wheel UI element to respond to a subsequent touch input based on one or more characteristics of the subsequent touch input. (Fig. 3, Fig. 4, col. 7 lines 42-56 if the user operates a touch object from an initial position (P3) to a new position (P4) (e.g., over a same period of time but a bigger distance) such that the rotational velocity exceeds the mode change threshold, then the virtual jog wheel 310 shifts into a second mode which enables the user to switch from one media element to another (also referred to as a “choose media element” mode). In an implementation, in the choose media element mode, the time display (shown in FIG. 2) is replaced by a listing of images or thumbnails 450 each representing a media element within the set of media elements 430. The current or selected media element may be graphically indicated, such as, for example, by a time scrubber 440 and/or the current media element may be highlighted with a darker border; as taught by Jones)
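The mode-change behavior in the passage above (rotational velocity exceeding a mode change threshold shifts the wheel from scrubbing within a media element into the "choose media element" mode) can be sketched as follows; the threshold value and names below are hypothetical, not taken from Jones:

```python
# Illustrative sketch (hypothetical names and values; not Jones's code).
MODE_CHANGE_THRESHOLD = 0.05  # divisions per millisecond (assumed value)

def wheel_mode(divisions_crossed: int, elapsed_ms: float) -> str:
    """Select the wheel's operating mode from the gesture's rotational velocity."""
    velocity = divisions_crossed / elapsed_ms  # rotational rate of the gesture
    return "choose_media_element" if velocity > MODE_CHANGE_THRESHOLD else "scrub"
```

Under these assumptions, a gesture covering the same arc in a shorter time (a higher rotational velocity) triggers the mode switch, consistent with the P3-to-P4 example cited from Jones.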
However, Jones as modified does not explicitly teach further comprising, prior to receiving the touch input for controlling the jog wheel UI element: receiving a first user input positioning the jog wheel UI element to a first location on the GUI; and
Snyder teaches further comprising, prior to receiving the touch input for controlling the jog wheel UI element: receiving a first user input positioning the jog wheel UI element to a first location on the GUI; and (Fig. 3A-3C, ¶40 wherein [referring] now to FIG. 3B, therein is illustrated an example menu for changing from the right handed selection list 118 and virtual wheel of FIG. 3A to a left handed selection list and virtual wheel display configuration. Upon tapping or selecting handedness switch button 135, while in the minimized position of FIG. 3A, a menu is presented on the touchscreen display 104 prompting a user to choose right side 141 or left side 143. When left side is chosen the selection list 118 and virtual wheel 111 will be displayed in a mirror image of the right sided view and in a position more convenient to left-handed users. FIGS. 3C-3E illustrate example operations on selection list 118 and virtual wheel 111 displayed in a left handed configuration with the virtual wheel 111 defined by the outer curved border 109 and inner curved border 107)
configuring the jog wheel UI element to respond to a subsequent touch input based on the first location and one or more characteristics of the subsequent touch input. (¶46 wherein the virtual wheel may be configured to display items of a set number of items that may disappear and reappear in view again as the virtual wheel is rotated continually in one direction or back and forth. Alternately, in other implementations, the virtual wheel may be configured to display items from a set of any number of items, for example, hundreds or thousands or a set of indefinite number, that appear and disappear from view as the wheel is rotated, but that may or may not reappear on the virtual wheel depending on the desired implementation. Examiner interprets the desired implementation as relating to where the wheel is positioned, whether in a left or right position)
It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of a touchscreen interface menu with a virtual wheel of Snyder with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Snyder teaches an improved user interface that can be operated in a user-friendly manner and is adaptable and flexible, providing interaction with various applications and operating systems to accomplish a range of functions through a touchscreen menu that includes a virtual wheel for a user interface display. The menu and virtual wheel embodiments provide an intuitive and efficient way of displaying output and/or receiving input at a user interface display, for example on a touch screen of a mobile device. (¶3, ¶5)
As per Claim 13, the rejection of claim 12 is hereby incorporated by reference; Jones as modified further teaches wherein the one or more characteristics of the subsequent touch input include at least one of: a pressure of the touch input, a speed of movement of the touch input, a length of the touch input, an approximate direction of the touch input, and presence of one or more elements associated with the touch input that are displayed on the GUI. (Fig. 3, Fig. 4, col. 7 lines 42-56 wherein if the user operates a touch object from an initial position (P3) to a new position (P4) (e.g., over a same period of time but a bigger distance) such that the rotational velocity exceeds the mode change threshold, then the virtual jog wheel 310 shifts into a second mode which enables the user to switch from one media element to another (also referred to as a “choose media element” mode); as taught by Jones; ¶46 wherein the virtual wheel may be configured to display items of a set number of items that may disappear and reappear in view again as the virtual wheel is rotated continually in one direction or back and forth. Alternately, in other implementations, the virtual wheel may be configured to display items from a set of any number of items, for example, hundreds or thousands or a set of indefinite number, that appear and disappear from view as the wheel is rotated, but that may or may not reappear on the virtual wheel depending on the desired implementation. Examiner interprets the desired implementation as relating to where the wheel is positioned, whether in a left or right position; as taught by Snyder)
As per Claim 20, the rejection of claim 1 is hereby incorporated by reference; Jones as modified previously taught one or more media content items, second clip. However, Jones as modified does not explicitly teach further comprising: receiving a second touch input; and responsive to receiving the second touch input: expanding the one or more media content items to add additional media content of the one or more media content items.
Snyder teaches further comprising: receiving a second touch input; and responsive to receiving the second touch input: expanding the one or more media content items to add additional media content of the one or more media content items. (Fig. 2A-2C, ¶35 wherein the virtual wheel 110a may be brought back into view in the increased portion size or maximized position of FIG. 2B by pressing button 132 to cause the virtual wheel to slide or laterally reposition out from right side border 115d in the direction shown by arrow 136 shown on FIG. 2A. In embodiments, each of the virtual wheels at multiple selection levels may be moved between minimized and maximized, or reduced and increased, size portion positions in this manner to show and hide the wheel)
It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of a touchscreen interface menu with a virtual wheel of Snyder with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Snyder teaches an improved user interface that can be operated in a user-friendly manner and is adaptable and flexible, providing interaction with various applications and operating systems to accomplish a range of functions through a touchscreen menu that includes a virtual wheel for a user interface display. The menu and virtual wheel embodiments provide an intuitive and efficient way of displaying output and/or receiving input at a user interface display, for example on a touch screen of a mobile device. (¶3, ¶5)
Claim(s) 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Jones in view of Wherry as applied to claim 1 above, and further in view of Rottler et al. (U.S. Pub 2011/0173539) hereinafter Rottler.
As per Claim 15, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches further comprising: concurrently with animating the rotation of the jog wheel UI element, outputting feedback at a frequency that is proportional to a rotational speed of the jog wheel UI element. (Fig. 13A-13C, ¶114 wherein the user can initiate the scroll wheel function and then cause scrolling by placing an input object 1350 (shown as a finger) in navigation control region 1386, which can cause that region 1386 to highlight in response to the input to provide visual feedback. The user sliding the input object 1350 in a clockwise direction after placing the input object 1350 in navigation control region 1386 can cause scrolling down through the song list, and the touch screen interface 1310 can provide visual feedback by moving and shifting the list or the highlighted list item as appropriate. The visual appearance of the graphical scroll wheel 1340 can also change to indicate scrolling and to reflect the object motion, wherein region 1382 highlights to provide feedback to a user of the activation of the play function; as taught by Wherry)
However, Jones as modified does not explicitly teach outputting audio clicks at a frequency that is proportional to a rotational speed of the jog wheel UI element.
Rottler teaches outputting audio clicks at a frequency that is proportional to a rotational speed of the jog wheel UI element. (¶9, ¶119 wherein audio feedback techniques may be applied as the user navigates through a listing of items (e.g., using a scroll wheel) without a corresponding visual interface, which encompasses non-verbal types of audio feedback, such as tones, clicks, beeps, chirps, etc.)
It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of an adaptive audio feedback system of Rottler with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Rottler teaches improving the user experience with respect to audio user interfaces in electronic devices by providing techniques for adaptively varying audio feedback provided by an audio user interface on an electronic device. In accordance with one embodiment, an audio user interface may be configured to devolve or evolve the verbosity of audio feedback in response to user interface events based at least partially upon the verbosity level of audio feedback provided during previous occurrences of the user interface event. As discussed in Rottler, the term "verbosity" refers to the "wordiness" of the audio information provided by the audio feedback, and may also encompass non-verbal types of audio feedback, such as tones, clicks, beeps, chirps, etc. For instance, if a subsequent occurrence of the user interface event occurs in relatively close proximity to a previous occurrence of the user interface event, the audio user interface may devolve the audio feedback (e.g., by reducing verbosity), such as to avoid overwhelming a user with repetitive and highly verbose information. (¶7, ¶9)
Claim(s) 21 & 22 is/are rejected under 35 U.S.C. 103 as being unpatentable over Jones in view of Wherry as applied to claim 18 above, and further in view of Ishii et al. (U.S. Pat 6,546,188) hereinafter Ishii.
As per Claim 21, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein the rotation of the jog wheel UI element from the first jog wheel position to the second jog wheel position based on the touch input causes the movement of the position indicator on the timeline to increment. (Fig. 2, col. 5 lines 5-20 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)); as taught by Jones)
However, Jones does not explicitly teach on a frame-by-frame basis.
Ishii teaches on a frame-by-frame basis. (Fig. 7, col. 68 lines 14-22 wherein this specification is performed by turning the search button 40m of the time-line display area 40 shown in FIG. 7, or the scroll buttons 40i and 40j, on, and operating the mark-IN button 27c and the mark-OUT button 27f while watching the image displayed in increments of frames on the replay video screen 23a.)
It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of editing of video content of Ishii with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Ishii teaches an improved system in which video signals input to a hybrid recorder are output to the main unit of a computer as video signals. The hybrid recorder records the video signals in a built-in hard disk, and also replays the signals from the hard disk, the signals being output to a picture effects device as video signals. The picture effects device applies certain effects to the video signals, which are then output to the main unit as video signals. The main unit displays the image of the video signals on a monitor. Thus, editing can be performed easily and speedily, while adding effects. (Abstract)
As per Claim 22, the rejection of claim 1 is hereby incorporated by reference; Jones as modified further teaches wherein the rotation of the jog wheel UI element from the first jog wheel position to the second jog wheel position based on the touch input causes the movement of the position indicator on the timeline to increment. (Fig. 2, col. 5 lines 5-20 wherein the movement of the time scrubber 242 is controlled by the virtual jog wheel 210. For example, a touch object (not shown) may make an initial detectable contact (or other interaction) with the virtual jog wheel 210 at a first position (“P1”) and slide, swipe or move along the virtual jog wheel 210 in a clockwise direction (denoted by the dashed action arrow in FIG. 2) to a second position (“P2”). The movement by the touch object relative to the virtual jog wheel 210 represents a command from the user which is processed by the virtual jog wheel controller and translated into a corresponding movement of the time scrubber 242 from a first position (denoted by the time scrubber 242 (P1)) to a second position (denoted by the dashed-line time scrubber 242 (P2)); as taught by Jones)
However, Jones does not explicitly teach on a subframe basis.
Ishii teaches on a subframe basis. (Fig. 5, col. 10 lines 40-53 wherein the 1520 pixel by 960 pixel video data stored in the frame memory 11c is read out based on read control from the processor controller 11a. The 1520 pixel by 960 pixel video data read out from the frame memory 11c is video data which has been pruned of data amount, so that it is 350 pixel by 240 pixel video data instead of the 1520 pixel by 960 pixel full-pixel video data. The process of pruning here involves simply reducing the sampling rate of reading the video data from the frame memory 11c to 1/4, so that the amount of read video data is reduced. The 350 pixel by 240 pixel video data thus read is sent to the display controller 13 via the image data bus 5a (this image is then displayed on monitor 2b on the recording video screen 21a shown in FIG. 5, as described later).)
It would have been obvious to one having ordinary skill in the art at the time the invention was filed to utilize the teaching of editing of video content of Ishii with the teaching of media element navigation using a virtual jog wheel of Jones as modified because Ishii teaches an improved system in which video signals input to a hybrid recorder are output to the main unit of a computer as video signals. The hybrid recorder records the video signals in a built-in hard disk, and also replays the signals from the hard disk, the signals being output to a picture effects device as video signals. The picture effects device applies certain effects to the video signals, which are then output to the main unit as video signals. The main unit displays the image of the video signals on a monitor. Thus, editing can be performed easily and speedily, while adding effects. (Abstract)
Related Art
Related art not relied upon: Kwahk et al. (U.S. Pub 2012/0079430), which teaches providing a GUI for searching and selecting desired content which displays a GUI item including a specific figure, in which each of one or more categories including contents stored in a selected storage medium occupies a predetermined range along a border region thereof, selecting a category and a content according to a manipulation input by the user, and a device applying the same.
Inquiry
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANGIE BADAWI whose telephone number is (571)270-7590. The examiner can normally be reached Monday through Wednesday, 9:00am - 5:00pm EST, with Thursdays and Fridays off.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Fred Ehichioya can be reached at (571) 272-4034. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANGIE BADAWI/ Primary Examiner, Art Unit 2179