DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. This action is made final.
Claims 1-10, 12-15, 18, 20, and 21 are pending in the case. Claims 1, 15, and 21 are independent claims. Claims 11, 16, 17, and 19 have been canceled.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 3-7, 12, 13, 15, 18, 20, and 21 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Alonso Ruiz et al. (US 2020/0301556 A1).
Regarding claim 1, Alonso Ruiz teaches a method comprising:
outputting, for display by a display device, a graphical user interface of an application executing at a computing device (multifunction device 100 of FIG. 1A and [0063], FIG. 23L and [0330-0334], [0208]: see user interface 502 of an application executing at a computing device/multifunction device 100 which includes touch-sensitive display system 112); and
responsive to receiving, by the computing device, an indication of a start of a user input swipe gesture (FIGS. 23L-N and [0330-0334]: see start of a user input swipe gesture represented by contact 2322-a and movement 2324 as seen in FIGS. 23L-N):
outputting, for display by the display device, a scaled down version of the graphical user interface of the application, wherein the scaled down version is reduced in size by a scaling factor ([0208], FIGS. 23L-M and [0330-0334]: as seen in FIG. 23M, a scaled down version of the GUI is displayed as represented by representation 508);
adjusting the scaled down version based on dynamic adjustment of the scaling factor according to an amount of displacement of the user input swipe gesture ([0208], FIGS. 23M-N, and [0330-0334]: as seen in FIG. 23N, the scaled down version of FIG. 23M is adjusted according to an amount of displacement of the user input as the user continues swiping to the right. In this example, the scaling factor dynamically increases as the user increases an amount of displacement);
outputting, for display by the display device, and at least partially concealed by the scaled down version of the graphical user interface of the application, a visual indication of a result of the user input swipe gesture (FIGS. 23M-N and [0330-0334], [0208]: a visual indication of a result is represented by at least representation 510, which is at least partially concealed by the scaled down version as seen in either FIG. 23M or FIG. 23N); and
responsive to receiving, by the computing device, an indication of a commitment of the user input swipe gesture, outputting, for display by the display device, a graphical user interface that corresponds to the result of the user input swipe gesture (FIGS. 23L-O in light of FIGS. 23I-K and [0330-0334]: as seen in FIGS. 23J and 23O, the user swipes past boundary 2312 and boundary 2320, respectively, these boundaries corresponding to each other. Release of the movement past boundary 2320/2312 results in display of the graphical user interface seen in FIG. 23K).
Regarding claim 3, Alonso Ruiz further teaches the method of claim 1, wherein the graphical user interface of the application comprises a home page of the application, and wherein the graphical user interface that corresponds to the result of the user input swipe gesture comprises a home page of an operating system of the computing device (FIG. 5Q and [0240], [0340], [0358]: see example home screen card 554. As stated in [0358], “In some embodiments, the stack includes (1026) user interface representations for a home screen (e.g., representations of any of one or more user interfaces accessible immediately after the startup of the device, such as a notification center, a search UI, or a springboard or dashboard showing applications available on the device, such as representation 554 of user interface 552 of a home screen in FIG. 5Q)”. Thus, the graphical user interface of the application comprises a home page of the application within the stack, and the subsequent graphical user interface that corresponds to the result comprises a home page of an operating system of the computing device, like representation 554).
Regarding claim 4, Alonso Ruiz further teaches the method of claim 1, wherein receiving the indication of the start of the user input swipe gesture comprises:
receiving an indication of a swipe gesture originating at an edge of the display device, the swipe gesture having at least a displacement in a direction perpendicular to the edge (FIGS. 23L-N and [0330-0334]: see the swipe gesture having displacement in a horizontal direction which is perpendicular to the left vertical edge).
Regarding claim 5, Alonso Ruiz further teaches the method of claim 4, wherein the edge is a vertical edge of the display device in an orientation of the display device at a time at which the indication of the start of the user input swipe gesture was received (FIGS. 23L-N and [0330-0334]: see the swipe gesture having displacement in a horizontal direction which is perpendicular to the left vertical edge).
Regarding claim 6, Alonso Ruiz further teaches the method of claim 4, further comprising:
determining whether the displacement of the swipe gesture in the direction perpendicular to the edge is greater than a commitment threshold (FIGS. 23L-N, in light of FIGS. 23I-K, and [0330-0334]: crossing boundary 2320, which corresponds to boundary 2312, is a commitment threshold),
wherein receiving the indication of the commitment of the user input swipe gesture comprises receiving, by the computing device, an indication that the user input swipe gesture has been released while the displacement of the swipe gesture in the direction perpendicular to the edge is greater than the commitment threshold (FIG. 23K and [0330-0334]: the user input swipe gesture has been released while the displacement is greater than the commitment threshold/boundary 2320, which corresponds to boundary 2312).
Regarding claim 7, Alonso Ruiz further teaches the method of claim 6, further comprising:
generating, by the computing device, haptic feedback that indicates when the displacement of the swipe gesture in the direction perpendicular to the edge crosses the commitment threshold ([0346] and [0541]: crossing the commitment threshold causes generation of haptic feedback).
Regarding claim 12, Alonso Ruiz further teaches the method of claim 1, further comprising:
responsive to receiving, by the computing device, an indication of a non-commitment of the user input swipe gesture, outputting, for display by the display device, an unscaled version of the graphical user interface of the application (FIGS. 23L-R and [0330-0334]: movement back to the left of boundary 2320 and then to the left of boundary 2318 reverses the visual feedback, resulting in display of the unscaled version of the graphical user interface seen in FIG. 23L).
Regarding claim 13, Alonso Ruiz further teaches the method of claim 1, wherein the user input swipe gesture originates at an edge of the display device, wherein the displacement is measured in a direction perpendicular to the edge ([0208], FIGS. 23L-N and [0330-0334]: the user input swipe gesture originates at an edge, as seen in FIG. 23L. As the user continues to swipe to the right, displacement is measured in a horizontal direction which is perpendicular to the left vertical edge).
Regarding claims 15, 18, and 20, the claims recite a computing device comprising: a display device; one or more processors; and a memory that stores instructions that, when executed by the one or more processors, cause the one or more processors (multifunction device 100 of FIG. 1A and [0063]: a computing device/multifunction device 100 includes touch-sensitive display system 112, processors 120, and memory 102) to perform operations corresponding to the method of claims 1, 6, and 12, respectively, and are therefore rejected on the same premises.
Regarding claim 21, the claim recites a non-transitory computer-readable storage medium storing instructions that, when executed by one or more processors of a computing device (multifunction device 100 of FIG. 1A and [0063], [0008], and claim 12), cause the one or more processors to perform operations corresponding to the method of claim 1 and is therefore rejected on the same premise.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 2 is rejected under 35 U.S.C. 103 as being unpatentable over Alonso Ruiz et al. (US 2020/0301556 A1), in view of Chen (US 2023/0229462 A1).
Regarding claim 2, Alonso Ruiz teaches the method of claim 1. Alonso Ruiz further teaches wherein the graphical user interface of the application comprises a current page of the application (FIG. 23L and [0208]: user interface 502 comprises a current page of the application).
Alonso Ruiz does not explicitly teach wherein the graphical user interface that corresponds to the result of the user input swipe gesture comprises a previous page of the application.
Chen teaches wherein the graphical user interface that corresponds to the result of the user input swipe gesture comprises a previous page of the application (Table 1 and [0077-0078]: for example, sliding right from the left edge allows navigation to a previous page of the application; see this in the context of FIGS. 8A-B, which show sliding in a different direction).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Alonso Ruiz to incorporate the teachings of Chen so as to have wherein the graphical user interface that corresponds to the result of the user input swipe gesture comprises a previous page of the application. Doing so would allow the user to navigate at a more granular, application-specific level, more efficiently traversing pages within the single application most relevant to the user at that time rather than being limited to navigating between different applications.
Claims 8-10 are rejected under 35 U.S.C. 103 as being unpatentable over Alonso Ruiz et al. (US 2020/0301556 A1), in view of Wang (US 2016/0041702 A1).
Regarding claim 8, Alonso Ruiz teaches the method of claim 4. Alonso Ruiz does not explicitly teach further comprising: responsive to receiving, by the computing device, the indication of the start of the user input swipe gesture: outputting, for display by the display device and proximate to the edge, a graphical element indicating that a back gesture is being recognized.
Wang teaches responsive to receiving, by the computing device, the indication of the start of the user input swipe gesture (FIGS. 8-12 and [0043-0050]: as seen in the example of FIG. 9, the indication of the start corresponds to at least the user performing a pull gesture from position 20 to position 21):
outputting, for display by the display device and proximate to the edge, a graphical element indicating that a back gesture is being recognized (FIGS. 8-12 and [0043-0050]: continuing the example of FIG. 9, a graphical element indicating a back gesture is being recognized is represented by menu item 25).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Alonso Ruiz to incorporate the teachings of Wang so as to include responsive to receiving, by the computing device, the indication of the start of the user input swipe gesture: outputting, for display by the display device and proximate to the edge, a graphical element indicating that a back gesture is being recognized. Doing so would allow the user to confirm that they are performing the intended gesture to navigate to the correct content. This would prevent the waste of processing resources by helping prevent the user from mistakenly accessing irrelevant content.
Regarding claim 9, Alonso Ruiz in view of Wang teaches the method of claim 8.
Wang further teaches wherein outputting the graphical element indicating that the back gesture is being recognized comprises: adjusting, based on whether release of the user input swipe gesture will commit, an appearance of the graphical element (FIGS. 8-12 and [0043-0050]: continuing the example of FIG. 9, if the user swipes left, then, as seen in FIG. 10, the graphical element/menu item 29 changes in appearance, indicating that release of the user input swipe gesture will commit; see the rationale provided for the rejection of claim 8).
Regarding claim 10, Alonso Ruiz in view of Wang teaches the method of claim 9. Alonso Ruiz further teaches wherein determining that release of the user input swipe gesture will commit comprises determining that the displacement of the swipe gesture in the direction perpendicular to the edge is greater than a commitment threshold (FIG. 23K and [0330-0334]: the user input swipe gesture has been released while the displacement is greater than the commitment threshold/boundary 2320, which corresponds to boundary 2312).
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Alonso Ruiz et al. (US 2020/0301556 A1), in view of Kim et al. (US 2013/0268883 A1).
Regarding claim 14, Alonso Ruiz teaches the method of claim 13.
Alonso Ruiz does not explicitly teach wherein determining the scaling factor comprises: determining, as a non-linear function of the displacement of the swipe gesture in the direction perpendicular to the edge, the scaling factor.
Kim teaches wherein determining the scaling factor comprises: determining, as a non-linear function of the displacement of the swipe gesture in the direction perpendicular to the edge, the scaling factor ([0017], [0019], FIGS. 6A-C and [0126-0133]: the scaling factor, as a non-linear function, is determined).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Alonso Ruiz to incorporate the teachings of Kim and have wherein determining the scaling factor comprises: determining, as a non-linear function of the displacement of the swipe gesture in the direction perpendicular to the edge, the scaling factor. Doing so would provide a visually appealing scaling effect to alert the user of a change in the displayed content so as to encourage more precise and effective navigation to the desired content.
Response to Arguments
Applicant's arguments filed 12/10/2025 have been fully considered but they are not persuasive.
In Remarks, Applicant argues:
As for amended independent claim 1, Ruiz does not teach the newly added limitations (p. 8 of Remarks).
The Examiner respectfully disagrees.
Regarding this argument, Applicant’s amendments have changed the scope of claim 1. Nevertheless, Alonso Ruiz still anticipates the claim. Alonso Ruiz discloses that a user may perform swipe gestures with a virtual stack of cards, including representations 508, 510, 526, 534, 540, and 552 ([0208]). The newly amended limitations are clearly taught by Alonso Ruiz’s FIGS. 23L-M and [0330-0334]. Starting from FIG. 23L, a user input swipe gesture originates from a left vertical edge.
[media_image1.png, 568×437, greyscale: reproduced figure from Alonso Ruiz]
As the gesture continues horizontally to the right, a scaled down version of the GUI is displayed as represented by representation 508 seen in FIG. 23M. As the gesture further continues to the right, the scaled down version is adjusted according to an amount of displacement of the user input as seen in FIG. 23N.
[media_image2.png, 572×877, greyscale: reproduced figure from Alonso Ruiz]
In this example, the scaling factor is dynamically increased as the user increases an amount of displacement, resulting in the further scaling down of representation 508 seen in FIG. 23N. As such, Alonso Ruiz clearly teaches all limitations of the amended claim.
In conclusion, amended independent claims 1, 15, and 21 are properly rejected under 35 U.S.C. 102(a)(1) as being anticipated by Alonso Ruiz et al. (US 2020/0301556 A1). The dependent claims accordingly remain rejected.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to KENNY NGUYEN whose telephone number is (571)272-4980. The examiner can normally be reached M-Th 7AM to 5PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KIEU D VU can be reached on (571)272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/KENNY NGUYEN/Primary Examiner, Art Unit 2171