DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-19 have been canceled and claims 20-36 have been added. Claims 20-36 are present for examination.
Claim Objections
Claim 24 is objected to because of the following informalities: “claim 4” should be “claim 23” because claim 24 recites “the second control group” and claim 23 recites “a second control group” while claim 4 has been canceled. Also, “the control” in line 3 should be “the controls”.
Claim 27 is objected to because of the following informalities: “wherein the” in line 1 should be “wherein”.
Claim 33 is objected to because of the following informalities: “the control” in lines 3-4 should be “the controls”.
Appropriate correction is required.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 26-29 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 26 recites the limitation "the second control group" in line 5. There is insufficient antecedent basis for this limitation in the claim.
Claims 27-29 depend from claim 26 but fail to cure the deficiencies of claim 26.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 20-22, 25, 30, and 34-36 are rejected under 35 U.S.C. 103 as being unpatentable over US Patent Publication 20190370030 A1 to Xiao et al. in view of US Patent Publication 20120056889 A1 to Carter et al.
Regarding claim 20, Xiao discloses an animation effect display method, applied to an electronic device (Xiao, para. [0002], disclosing implementing user interfaces and animation of graphical elements on electronic devices), wherein the method comprises:
determining, by a user interface (UI) thread of an application (Xiao, para. [0044], disclosing the UI framework includes a root process to manage an event in connection with a UI, indicating the UI framework can correspond to an application having a user interface (UI) thread), and based on information about a first animation effect of a first control group that is set by the application (Xiao, FIGs. 3A and 3B, showing moving block A to block C to block B, para. [0049], disclosing the UI framework includes an animation module that provides information related to an animation, providing animation-related information to update views and receiving information related to the passage of time, an animation may be animated based on a timing curve controlling the progression of the animation over time, para. [0050], disclosing animation may be linked with gestures, para. [0051], disclosing animation for moving a horizontal location of a graphical UI element on a screen, the graphical UI element can correspond to a first control group, and the animation for moving the graphical UI element can correspond to a first animation effect set by the UI framework as the application), a duration of the first animation effect, a start time of the first animation effect, and description information about an end frame of the first animation effect (Xiao, para. [0049], disclosing an animation may be animated based on a timing curve which defines a function that controls the progression of the animation over time, the timing functions can be used to define the pacing of an animation over its duration, para. [0050], disclosing animation may be expressed in code at a high level indicating an initial state and a destination state, which may include information about a particular view associated with the animation, para. [0054], disclosing the animation module creates an animation record storing properties for animation including a start time for calculating an elapsed time for the animation, para. [0070], disclosing receiving information related to an animation including initial state, destination state, and an animation function, which can include a start time for animation, the animation module may determine an elapsed time based on a current time and the start time, para. [0071], disclosing generating the animation record including information for the initial state, the destination state, and the animation function, para. [0076], disclosing the animation may be defined to complete within a specified time, so the animation can be completed within the specified amount of time. The timing function to define the pacing of an animation over its duration indicates the duration of the animation can be determined, the specified time to complete the animation can correspond to the duration, the start time can be determined from the received information, and the destination state and the animation function can correspond to description information about an end frame of the first animation effect), wherein the first control group comprises one or more controls (Xiao, para. [0034], disclosing the graphical elements can correspond to UI elements (e.g., windows, dialogs, labels, images, buttons, menus, text fields, pickers, sliders, switches, etc.), para. [0051], disclosing the graphical UI element can be an image or another type of graphical object, indicating the graphical UI elements in the first control group can correspond to one or more controls), and an animation trigger event triggers display of the first animation effect (Xiao, para. [0076], disclosing an input gesture may be an event to trigger the start of the animation);
determining description information about a target frame of the first animation effect based on the start value of the first animation effect, the duration of the first animation effect, a value corresponding to the target frame, and the description information about the end frame of the first animation effect, when generating display data of the target frame (Xiao, para. [0052], disclosing the animation module generates synthetic data to represent an intermediate state between the initial state and the destination state, information related to the intermediate state can be stored as an animation record, para. [0053], disclosing making a copy of the destination state to represent the intermediate state and injects values into the intermediate state to represent a state between the initial state and the destination state, in an example, the initial state has a value of x=1, the destination state has a value of x=11, the intermediate state has a value of 4, which is injected in the intermediate state to update a node in the view tree. The intermediate state can correspond to a target frame, the value injected to the intermediate state and the copy of the destination state can correspond to the description information about the target frame determined based on the start value of the first animation effect, a value corresponding to the target frame, and the description information about the end frame of the first animation effect (the copy of the destination state)); and
generating, by a rendering thread of the application or a rendering process, the display data of the target frame based on the description information about the target frame (Xiao, para. [0073], disclosing providing a copy of the destinations state that includes the value related to the intermediate state for rendering the animation, an interpolation curve may be applied to blend the animation from the initial state to the destination state based at least in part of the intermediate state).
However, Xiao does not expressly disclose determining description information about a target frame of the first animation effect based on the start time of the first animation effect, the duration of the first animation effect, and a time corresponding to the target frame.
On the other hand, Carter discloses determining description information about a target frame of the first animation effect based on the start time of the first animation effect, the duration of the first animation effect, a time corresponding to the target frame (Carter, para. [0053], disclosing key frame animation has a timeline that starts at zero, may include one or more key frames at intermediate times relative to the start, and ends at the time defined by the last key frame, the duration of the animation is the difference between the time defined by the last key frame and the time defined by the first key frame, by interpolation, a value for the target object property can be determined for any time within the duration of the timeline, para. [0054], disclosing key frames define values of a target object property for different input values from a source, indicating the value for a target object property at a time within the duration of the timeline can correspond to description information about a target frame of the animation corresponding to the first animation effect, which can be determined based on time defined by the first key frame as the start time of the first animation effect, the duration of the animation corresponding to the duration of the first animation effect, and the time within the duration of the timeline when the value for the target object property is determined as a time corresponding to the target frame).
Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine Xiao and Carter. The suggestion/motivation would have been to provide key frame animation for UI elements, as suggested by Carter (see Carter, para. [0059]).
Regarding claim 21, Xiao in view of Carter discloses the method according to claim 20, wherein the information about the first animation effect is configured by the application through an animation interface provided by a system (Xiao, para. [0034], disclosing APIs to design user interfaces and to handle operations including animations and layout of graphical elements, para. [0036], disclosing a user interface (UI) API to create UIs that animate and interact with gesture events, para. [0049], disclosing the UI framework includes an animation module that provides information related to an animation).
Regarding claim 22, Xiao in view of Carter discloses the method according to claim 20, wherein the determining description information about a target frame based on the start time of the first animation effect, the duration of the first animation effect, the time corresponding to the target frame, and the description information about the end frame of the first animation effect when generating display data of the target frame in the duration of the first animation effect is performed by the UI thread, the rendering thread, or the rendering process (Xiao, para. [0052], disclosing the animation module generates synthetic data to represent an intermediate state between the initial state and the destination state, information related to the intermediate state can be stored as an animation record, para. [0053], disclosing making a copy of the destination state to represent the intermediate state and injecting values into the intermediate state to represent a state between the initial state and the destination state, in an example, the initial state has a value of x=1, the destination state has a value of x=11, and the intermediate state has a value of 4, which is injected in the intermediate state to update a node in the view tree, Carter, para. [0040], disclosing the rendering library implements a programming interface to create and receive a source variable for a key frame animation, para. [0054], disclosing key frames define values of a target object property for different input values from a source, indicating the determining description information about a target frame can be performed by the animation module in Xiao or the programming interface in the rendering library as the rendering thread or rendering process). Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to combine Xiao and Carter. The suggestion/motivation would have been to provide key frame animation for UI elements, as suggested by Carter (see Carter, para. [0059]).
Regarding claim 25, Xiao in view of Carter discloses the method according to claim 20, wherein the generating, by the rendering thread or the rendering process, the display data of the target frame based on the description information about the target frame specifically comprises: updating, by the rendering thread or the rendering process, a first render tree based on the description information about the target frame (Xiao, para. [0053], disclosing the animation module makes a copy of the destination state to represent the intermediate state and injects values into the intermediate state to represent a state between the initial state and the destination state, the intermediate state has a value of 4, which is injected in the intermediate state to update a node in its view tree, the animation module then instructs the view tree of the intermediate state to update itself in order to show the graphical element at a different position based on the updated value of the node, the view tree can correspond to the first render tree), and generating, by the rendering thread or the rendering process, the display data of the target frame based on the updated first render tree (Xiao, para. [0053], disclosing the animation module instructing the view tree of the intermediate state to update itself in order to show the graphical element at a different position based on the updated value of the node, the view tree can correspond to the first render tree, the showing the graphical element at a different position indicates the display data of the intermediate state corresponding to the target frame is generated based on the updated view tree with the injected value).
Regarding claim 30, Xiao in view of Carter discloses the method according to claim 25, wherein the first render tree is a render tree corresponding to the end frame of the first animation effect; or the first render tree is a render tree corresponding to an interface shown before the first animation effect starts (Xiao, para. [0053], disclosing the animation module instructing the view tree of the intermediate state to update itself in order to show the graphical element at a different position based on the updated value of the node, the view tree before updating can correspond to the first render tree corresponding to an interface shown before the first animation effect starts, and the view tree after updating can correspond to a render tree corresponding to the end frame of the first animation effect because the intermediate state reaches the destination state, which corresponds to the end frame of the first animation effect).
Regarding claim 34, Xiao in view of Carter discloses the method according to claim 20, wherein the animation trigger event comprises at least one of a user interaction, a network status change, and a message sent by another application on the electronic device to the application (Xiao, para. [0076], disclosing an input gesture may be an event to trigger the start of the animation, indicating the input gesture can correspond to a user interaction).
Regarding claim 35, it recites limitations similar to those of claim 20 but in an electronic device form. The rationale of the claim 20 rejection is applied to reject claim 35. In addition, Xiao discloses an electronic device comprising one or more processors and a memory (see Xiao, FIG. 28).
Regarding claim 36, it recites limitations similar to those of claim 20 but in a non-transitory computer-readable storage medium form. The rationale of the claim 20 rejection is applied to reject claim 36. In addition, Xiao discloses an electronic device comprising one or more processors, a memory, and a storage (see Xiao, FIG. 28).
Allowable Subject Matter
Claims 23, 24, and 31-33 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Claims 26-29 would be allowable if rewritten to overcome the rejection(s) under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, set forth in this Office action and to include all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
Regarding claim 23, none of the prior art references on the record discloses determining, by the UI thread, properties of controls in the end frame of the first animation effect based on the description information about the end frame of the first animation effect; and comparing, by the UI thread, the properties of the controls in the end frame of the first animation effect with properties of controls shown before the first animation effect starts, to determine a second control group, wherein the second control group is controls whose properties change in the duration of the first animation effect, and the second control group comprises the first control group; and the first control group is displayed through the first animation effect, a control in the second control group but not in the first control group is displayed through a second animation effect, and the animation trigger event further triggers display of the second animation effect.
Claim 24 should depend from claim 23 (see the Claim Objections above) and recites additional limitations.
Regarding claim 26, none of the prior art references on the record discloses wherein the updating, by the rendering thread or the rendering process, the first render tree based on the description information about the target frame specifically comprises: updating, by the rendering thread or the rendering process, the first render tree based on the properties of the controls in the second control group in the target frame.
Claims 27-29 depend from claim 26 with respective additional limitations.
Regarding claim 31, none of the prior art references on the record discloses wherein after the determining, by the UI thread of the application, and based on information about the first animation effect of the first control group that is set by the application, the duration of the first animation effect, the start time of the first animation effect, and the description information about the end frame of the first animation effect, the method further comprises: deregistering, by the UI thread, an animation callback of the first animation effect, wherein the animation callback triggers the UI thread to modify a property of the control in the first control group.
Regarding claim 32, none of the prior art references on the record discloses during the duration of the first animation effect, receiving, by the UI thread, a vertical synchronization signal at a first moment, and generating, by the rendering thread or the rendering process, the display data of the target frame at a second moment based on the description information about the target frame, wherein the second moment is after first duration of the first moment, and the first duration is preset.
Regarding claim 33, none of the prior art references on the record discloses determining, by the rendering thread or the rendering process, a second parameter in the duration of the first animation effect, wherein the second parameter comprises both a size and a location of the control in the first control group; sending, by the rendering thread or the rendering process, the second parameter to the UI thread; and determining, by the UI thread, the size and the location of the control in the first control group based on the second parameter.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to HAIXIA DU whose telephone number is (571)270-5646. The examiner can normally be reached Monday - Friday 8:00 am-4:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kee Tung can be reached at 571-272-7794. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HAIXIA DU/Primary Examiner, Art Unit 2611