DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
This Office action is responsive to the Request for Continued Examination (RCE) filed under 37 CFR § 1.114 for the instant application on March 6, 2026. Applicant has properly filed the RCE, which has been entered into the application, and an examination on the merits follows herewith.
Claims 1, 13 and 15 are amended; claims 2 and 16-18 are canceled; claim 23 is newly added; and claims 1, 3-15 and 19-23 are pending and have been considered below.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 4-9, 12, 15 and 20-23 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Plain et al. (U.S. Patent No. 9,741,257).
With regard to claim 1, Plain teaches a screen projection method ([abstract] A system and method for collaborative online desktop), comprising:
invoking, by a first electronic device (Fig. 1A, teacher computing device 104), a first module of a first application to run a first desktop, wherein the first desktop is associated with a first display area ([abstract] a teacher can view the teacher's own desktop; FIG. 5A is a screenshot of a teacher's computer screen according to one embodiment of the present disclosure);
displaying, by the first electronic device, first display content based on the first display area, wherein the first display content comprises the first desktop ([abstract] a teacher can view the teacher's own desktop; FIG. 5A is a screenshot of a teacher's computer screen according to one embodiment of the present disclosure; [col. 2, lines 10-40] The present disclosure solves these and other issues by providing coordinated shared collaboration tools that allow a program to be launched, shared, and displayed so that a teacher and a student can both see the graphical elements displayed on the screen of the other's computing devices (or portions of those graphical elements) simultaneously)
invoking, by the first electronic device in response to a first user operation, a second module of the first application to run a second desktop ([col. 2, lines 10-40] These tools allow for sharing of desktop applications in such a way that both presenter and participant(s), who are physically located in separate places—can see the display on the screen of each other's computing devices (whether the entire display or a portion thereof, e.g. an active window or a particular application that is running on the computer device), and can receive indications of mouse clicks and/or other user inputs, such that a teacher can teach a student how to perform a task using the teacher's computing device and then, on the same device, watch the student perform the task), wherein the second desktop is associated with a second display area (Fig. 1A, student computing device 120; Fig. 2A; FIG. 6 is a screenshot of the screen of a student's computing device; [col. 14, lines 31-49] The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops);
sending, by the first electronic device, second display content corresponding to the second display area to a second electronic device ([col. 14, lines 40-49] a computing device used in a collaborative session simultaneously sends information corresponding to a first set of graphical elements via a network interface and receives information corresponding to a second set of graphical elements via the network interface (step 904). The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops), wherein the second display content comprises the second desktop (Fig. 1A, student computing device 120; Fig. 2A; FIG. 6 is a screenshot of the screen of a student's computing device; [abstract] student(s) can see their own desktop), and the second display content at least adds a function that is unavailable in a first application window of the first electronic device or shields a function of the first electronic device that is not applicable to the second electronic device ([col. 5, lines 1-14] In some embodiments, certain session parameters are determined by each presenter/participant. Such parameters may include, for example, whether the screen of a given computing device will be visible to one or more other participants or presenters in the collaborative session; [col. 10, lines 1-20] the coordination application 116 may be configured so that selection of the share desktop button 524 will result only in the sharing of the desktop of the user (teacher or student) who selected the button);
displaying, by the first electronic device in response to a second user operation performed on the first display content, third display content based on a first task stack that is of the first application and that runs in the first display area (Fig. 5B; [claim 11] a third data set corresponding to a third plurality of graphical elements displayed by a third computing device via a third graphical user interface, the third data set sufficient to enable the first computing device to display the third plurality of graphical elements via the first graphical user interface);
determining, by the first electronic device in response to a third user operation performed on the second display content displayed by the second electronic device ([claim 11] simultaneously displaying, on separate portions of the graphical user interface, the first plurality of graphical elements, the second plurality of graphical elements, and the third plurality of graphical elements), that display content corresponding to the second display area is fourth display content based on a second task stack that is of the first application and that runs in the second display area ([col. 10, lines 27-53] FIG. 5B shows a screenshot 506 of a teacher computing device 104 running a coordination application 116 during a one-to-many collaborative session… FIG. 5B includes student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating in the one-to-many collaborative session), wherein the second task stack is different from the first task stack ([col. 10, lines 27-53] Fig. 5B, student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating in the one-to-many collaborative session); and
sending, by the first electronic device, the fourth display content to the second electronic device (Fig. 5B, student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating).
With regard to claim 4, the limitations are addressed above and Plain teaches wherein before the displaying, by the first electronic device, first display content based on the first display area ([abstract] a teacher can view the teacher's own desktop; FIG. 5A is a screenshot of a teacher's computer screen according to one embodiment of the present disclosure), the screen projection method further comprises:
invoking, by the first electronic device, a fifth module of a third application to run a first display object of a first variable ([col. 11, lines 5-15] the collaboration application 116 may be configured to provide a visual indication to students participating in a collaborative session whenever the teacher in the session clicks her mouse. The visual indication may be a color change (e.g. changing an edge color of window, or the background color of a window or screen), a change in relative size of one graphical element to another, an animation, a flashing screen or flashing cursor, or any other visual indication), wherein the first variable is associated with the first display area ([col. 11, lines 5-15] the collaboration application 116 may be configured to provide a visual indication to students participating in a collaborative session whenever the teacher in the session clicks her mouse), and the first display content comprises the first display object ([col. 11, lines 5-15] The visual indication may be a color change (e.g. changing an edge color of window, or the background color of a window or screen), a change in relative size of one graphical element to another, an animation, a flashing screen or flashing cursor, or any other visual indication); and the first variable is associated with the second display area ([col. 11, lines 5-15] the collaboration application 116 may be configured to provide a visual indication to students participating in a collaborative session), and the second display content comprises the first display object ([col. 11, lines 5-15] The visual indication may be a color change (e.g. changing an edge color of window, or the background color of a window or screen), a change in relative size of one graphical element to another, an animation, a flashing screen or flashing cursor, or any other visual indication).
With regard to claim 5, the limitations are addressed above and Plain teaches further comprising:
in response to a fourth user operation performed on the first display content, invoking, by the first electronic device, the fifth module of the third application to modify the first display object of the first variable to a second display object ([col. 12, lines 17-21] highlighting a student's screen may include one or more of changing the size of the student's screen (on the visual display 716) relative to its original or default size and/or relative to the size of other student screens appearing on the visual display 716; placing or enhancing a border around the student's screen; changing the color of some or all of the student's screen or of an edge of a window within the student's screen or of the student's screen itself);
updating, by the first electronic device, display content corresponding to the first display area to fifth display content ([col. 9, lines 1-7] if a teacher/presenter and/or a student/participant desires to establish an additional video communication channel (e.g. providing video signals corresponding to images other than those displayed on the screen of a computing device 304 of a collaborative session, such as images of a teacher or student using a computing device 304) as part of the collaborative session), wherein the fifth display content comprises the second display object (Fig. 5B, student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating);
updating, by the first electronic device, the display content corresponding to the second display area to sixth display content ([col. 9, lines 1-7] if a teacher/presenter and/or a student/participant desires to establish an additional video communication channel (e.g. providing video signals corresponding to images other than those displayed on the screen of a computing device 304 of a collaborative session, such as images of a teacher or student using a computing device 304) as part of the collaborative session); and
sending the sixth display content to the second electronic device, wherein the sixth display content comprises the second display object (Fig. 5B, student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating).
With regard to claim 6, the limitations are addressed above and Plain teaches wherein the first variable indicates a display object of a wallpaper (Fig. 5A; [col. 2, lines 10-25] providing coordinated shared collaboration tools that allow a program to be launched, shared, and displayed so that a teacher and a student can both see the graphical elements displayed on the screen of the other's computing devices (or portions of those graphical elements) simultaneously. These tools allow for sharing of desktop applications; [col. 7, lines 40-46] the graphical user interface 348 displays the screenshots depicted in FIGS. 5A through 8B), the display object of the wallpaper comprises at least one of a static picture or a dynamic picture ([col. 7, lines 40-46] the graphical user interface 348 displays the screenshots depicted in FIGS. 5A through 8B), and the wallpaper comprises at least one of the following: a lock-screen wallpaper used when a screen is locked, or a desktop wallpaper used when the screen is not locked (Fig. 5A, displays an unlocked image of the desktop wallpaper).
With regard to claim 7, the limitations are addressed above and Plain teaches wherein:
a plurality of themes are preset in the first electronic device (Figs. 5A-5B; [col. 12, lines 17-21] the collaboration application 116 may be configured to provide a visual indication to students participating in a collaborative session), and each one of the plurality of themes indicates at least one of a desktop layout style ([col. 12, lines 17-21] any other visual indication), an icon display style ([col. 12, lines 17-21] it may be important for a teacher or student to determine when a student or teacher, respectively, is clicking his or her mouse, a coordination application 116 may be configured to provide one or more indications of mouse clicks by a teacher and/or one or more students in a collaborative session), or an interface color ([col. 12, lines 17-21] The visual indication may be a color change (e.g. changing an edge color of window, or the background color of a window or screen), a change in relative size of one graphical element to another, an animation, a flashing screen or flashing cursor, or any other visual indication); and
the first variable indicates a display object of a theme (Figs. 5A-5B; [col. 12, lines 17-21] the collaboration application 116 may be configured to provide a visual indication to students participating in a collaborative session), and the display object of the theme is display content corresponding to one of the plurality of themes (Figs. 5A-5B; [col. 12, lines 17-21] the collaboration application 116 may be configured to provide a visual indication to students participating in a collaborative session).
With regard to claim 8, the limitations are addressed above and Plain teaches wherein:
the first module of the first application comprises a first common class ([col. 5, lines 10-15] Such parameters may include, for example, whether the screen of a given computing device will be visible to one or more other participants or presenters in the collaborative session. In other embodiments, however, control of such parameters may be provided to the teacher/presenter, e.g. to allow the teacher/presenter to enforce educational policies), a first user interface (UI) control class ([col. 4, lines 55-67] A computing device 104 or 120 that initiates a collaborative session may specify the parameters of the session, including who may participate, whether there is a cap on the number of participants, who will be the presenter, and so forth. The specified parameters may then be provided to each computing device 104, 120 that joins the collaborative session), and a desktop task stack of the first desktop that are used for creating and running the first desktop ([col. 9, lines 55-64] Using the coordination application 516, the teacher can watch Student 1's cursor 544 in real time to ensure that Student 1 is properly completing assigned tasks); and
the second module of the first application comprises a second common class ([col. 14, lines 40-49] second sets of graphical elements), a second UI control class ([col. 14, lines 40-49] the graphical elements on the teacher's and student's desktops), and a desktop task stack of the second desktop that are used for creating and running the second desktop ([col. 9, lines 55-64] Using the coordination application 516, the teacher can watch Student 1's cursor 544 in real time to ensure that Student 1 is properly completing assigned tasks),
wherein some or all classes in the second common class are inherited from the first common class, and some or all classes in the second UI control class are inherited from the first UI control class ([col. 14, lines 40-49] a computing device used in a collaborative session simultaneously sends information corresponding to a first set of graphical elements via a network interface and receives information corresponding to a second set of graphical elements via the network interface (step 904). The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops).
With regard to claim 9, the limitations are addressed above and Plain teaches wherein:
the second common class comprises one or more of the following: a desktop launcher provider, a database assistant, a desktop launcher setting class, a desktop launcher constant class, a personal computer (PC) layout configuration ([col. 10, lines 1-7] The coordination application 116 may be configured so that selection of a share desktop button 524 will automatically cause the teacher's desktop to be shared with any students participating in a collaborative session and will also automatically cause the desktop of participating students to be shared with the teacher), a PC device file, a PC cell counter, a PC desktop launcher policy, a PC desktop launcher model, or a PC loading task; and
the second UI control class comprises one or more of the following: a PC drag layer, a PC desktop workspace ([abstract] collaborative online desktop sharing), a PC cell layout, a PC program dock view, a PC folder (Fig. 5A, desktop folder), or a PC folder icon (Fig. 5A, desktop folder; [col. 9, lines 50-55] The visual display 516 also shows the desktop 548 of Student 1 in a student window 536).
With regard to claim 12, the limitations are addressed above and Plain teaches wherein:
an identity (ID) of a display area associated with the second module is an ID of the second display area (Fig. 5A, “Application XYZ” and “Student 1’s Screen”); and
the invoking, by the first electronic device in response to a first user operation, a second module of the first application to run a second desktop ([col. 14, lines 40-49] a computing device used in a collaborative session simultaneously sends information corresponding to a first set of graphical elements via a network interface and receives information corresponding to a second set of graphical elements via the network interface (step 904). The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops), wherein the second desktop is associated with a second display area (Fig. 1A, student computing device 120; Fig. 2A; FIG. 6 is a screenshot of the screen of a student's computing device; [abstract] student(s) can see their own desktop), comprises:
receiving, by a PC management service in response to the first user operation, a mode switching instruction ([col. 13, lines 40-45] This may be a useful option particularly when the instruction or presentation being given requires switching around between windows or programs on a computer desktop), wherein the instruction instructs to switch a current non-projection mode to a projection mode ([col. 13, lines 40-55] This may be a useful option particularly when the instruction or presentation being given requires switching around between windows or programs on a computer desktop . The ability to share less than all of a user's desktop—which effectively allows users to multi-task without the knowledge of other users—may be well-suited for environments in which participants are not obligated to participate or for environments in which privacy is of greater concern than educational outcome (e.g. a collaborative session between participants from two different companies));
invoking, by the PC management service in response to the instruction, a PC desktop service ([abstract] A system and method for collaborative online desktop), wherein the PC desktop service invokes an activity management service ([col. 3, lines 45-60] where the one-to-one collaborative session is facilitated by a communication management server 112 running a coordination service 114. The teacher computing device 104 and the student computing device 120 may each be running a coordination application 116. In some embodiments, the coordination application 116 may be hosted in the cloud and provided through a Software-as-a-Service (SaaS) platform. The communication management server 112 (processing instructions from the coordination service 114) can receive a request from the teacher computing device 104 or from the student computing device 120 to establish a collaboration session), and the activity management service invokes an activity task manager to start the second module of the first application ([col. 3, lines 45-60] where the one-to-one collaborative session is facilitated by a communication management server 112 running a coordination service 114. The teacher computing device 104 and the student computing device 120 may each be running a coordination application 116);
invoking a root activity container to determine the ID of the display area associated with the second module (Fig. 5A; [col. 9, lines 25-30] coordination application 116 provides a visual display 516 on the graphical user interface of the teacher computing device 104);
when the ID of the display area associated with the second module is the ID of the second display area, querying and using an Activity of the second desktop as an Activity of a to-be-launched desktop; or
when the ID of the display area associated with the second module is an ID of the first display area (Fig. 5A, “Application XYZ” and “Student 1’s Screen”), querying and using an Activity of the first desktop as an Activity of a to-be-launched desktop ([abstract] wherein a teacher can view the teacher's own desktop as well as the desktop(s) of the teacher's student(s); [col. 2, lines 10-25] providing coordinated shared collaboration tools that allow a program to be launched, shared, and displayed so that a teacher and a student can both see the graphical elements displayed on the screen of the other's computing devices (or portions of those graphical elements) simultaneously); and
invoking, by an activity start controller, an activity starter to start the second desktop (Fig. 1A, student computing device 120; Fig. 2A; FIG. 6 is a screenshot of the screen of a student's computing device; [abstract] student(s) can see their own desktop).
With regard to claim 15, Plain teaches an electronic device ([abstract] A system and method for collaborative online desktop), comprising: a touchscreen ([col. 7, lines 25-30] A graphical user interface 348 as used in embodiments of the present disclosure may be or include hardware (such as a computer monitor, television screen, laptop screen, tablet screen, smart phone screen, and the like, any one of which may be a resistive, capacitive, surface acoustic wave, or infrared touch screen, an LCD screen, an LED screen, a plasma screen, or a CRT screen)); at least one processor ([col. 6, lines 37-45] A processor 308 as used in embodiments of the present disclosure may correspond to one or many microprocessors that are contained within a common housing, circuit board, or blade with the computer readable storage medium 320), and
one or more memories coupled to the at least one processor ([col. 3, lines 20-30] a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM)) and storing programming instructions for execution by the at least one processor to:
invoke a first module of a first application to run a first desktop (Fig. 1A, teacher computing device 104), wherein the first desktop is associated with a first display area ([abstract] a teacher can view the teacher's own desktop; FIG. 5A is a screenshot of a teacher's computer screen according to one embodiment of the present disclosure);
display first display content based on the first display area, wherein the first display content comprises the first desktop ([abstract] a teacher can view the teacher's own desktop; FIG. 5A is a screenshot of a teacher's computer screen according to one embodiment of the present disclosure; [col. 2, lines 10-40] The present disclosure solves these and other issues by providing coordinated shared collaboration tools that allow a program to be launched, shared, and displayed so that a teacher and a student can both see the graphical elements displayed on the screen of the other's computing devices (or portions of those graphical elements) simultaneously);
invoke, in response to a first user operation, a second module of the first application to run a second desktop ([col. 2, lines 10-40] These tools allow for sharing of desktop applications in such a way that both presenter and participant(s), who are physically located in separate places—can see the display on the screen of each other's computing devices (whether the entire display or a portion thereof, e.g. an active window or a particular application that is running on the computer device), and can receive indications of mouse clicks and/or other user inputs, such that a teacher can teach a student how to perform a task using the teacher's computing device and then, on the same device, watch the student perform the task), wherein the second desktop is associated with a second display area (Fig. 1A, student computing device 120; Fig. 2A; FIG. 6 is a screenshot of the screen of a student's computing device; [col. 14, lines 31-49] The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops);
send second display content corresponding to the second display area to a second electronic device ([col. 14, lines 40-49] a computing device used in a collaborative session simultaneously sends information corresponding to a first set of graphical elements via a network interface and receives information corresponding to a second set of graphical elements via the network interface (step 904). The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops), wherein the second display content comprises the second desktop (Fig. 1A, student computing device 120; Fig. 2A; FIG. 6 is a screenshot of the screen of a student's computing device; [abstract] student(s) can see their own desktop), and the second display content at least adds a function that is unavailable in a first application window of the electronic device or shields a function of the electronic device that is not applicable to the second electronic device ([col. 5, lines 1-14] In some embodiments, certain session parameters are determined by each presenter/participant. Such parameters may include, for example, whether the screen of a given computing device will be visible to one or more other participants or presenters in the collaborative session; [col. 10, lines 1-20] the coordination application 116 may be configured so that selection of the share desktop button 524 will result only in the sharing of the desktop of the user (teacher or student) who selected the button);
display, in response to a second user operation performed on the first display content, third display content based on a first task stack that is of the first application and that runs in the first display area (Fig. 5B; [claim 11] a third data set corresponding to a third plurality of graphical elements displayed by a third computing device via a third graphical user interface, the third data set sufficient to enable the first computing device to display the third plurality of graphical elements via the first graphical user interface);
determine, in response to a third user operation performed on the second display content displayed by the second electronic device ([claim 11] simultaneously displaying, on separate portions of the graphical user interface, the first plurality of graphical elements, the second plurality of graphical elements, and the third plurality of graphical elements), that display content corresponding to the second display area is fourth display content based on a second task stack that is of the first application and that runs in the second display area ([col. 10, lines 27-53] FIG. 5B shows a screenshot 506 of a teacher computing device 104 running a coordination application 116 during a one-to-many collaborative session… FIG. 5B includes student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating in the one-to-many collaborative session), wherein the second task stack is different from the first task stack ([col. 10, lines 27-53] Fig. 5B, student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating in the one-to-many collaborative session); and
send the fourth display content to the second electronic device (Fig. 5B, student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating).
With regard to claim 20, the device claim corresponds to method claim 4 and is therefore rejected with the same rationale.
With regard to claim 21, the device claim corresponds to method claim 5 and is therefore rejected with the same rationale.
With regard to claim 22, the device claim corresponds to method claim 6 and is therefore rejected with the same rationale.
With regard to claim 23, the limitations are addressed above and Plain teaches further comprising:
obtaining, by the first electronic device, device information of the second electronic device ([col. 14, lines 40-49] a computing device used in a collaborative session simultaneously sends information corresponding to a first set of graphical elements via a network interface and receives information corresponding to a second set of graphical elements via the network interface (step 904). The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops); and
performing, by the first electronic device and based on the device information of the second electronic device, function optimization ([col. 14, lines 40-49] a computing device used in a collaborative session simultaneously sends information corresponding to a first set of graphical elements via a network interface and receives information corresponding to a second set of graphical elements via the network interface (step 904). The first and second sets of graphical elements may be, for example, the graphical elements on the teacher's and student's desktops, respectively, or they may be, as another example, the graphical elements of one window on the teacher's and student's desktops), wherein the function optimization comprises at least one of adding the function that is unavailable in the first application window of the first electronic device, or shielding the function of the first electronic device that is not applicable to the second electronic device ([col. 5, lines 1-14] In some embodiments, certain session parameters are determined by each presenter/participant. Such parameters may include, for example, whether the screen of a given computing device will be visible to one or more other participants or presenters in the collaborative session; [col. 10, lines 1-20] the coordination application 116 may be configured so that selection of the share desktop button 524 will result only in the sharing of the desktop of the user (teacher or student) who selected the button).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 3, 10-11, 13-14 and 19 are rejected under 35 U.S.C. 103 as being unpatentable over Plain et al. (U.S. Patent No. 9,741,257) in view of Reeves et al. (U.S. 2020/0166968).
With regard to claim 3, the limitations are addressed above and Plain teaches wherein:
before the displaying, by the first electronic device, first display content based on the first display area ([abstract] a teacher can view the teacher's own desktop; FIG. 5A is a screenshot of a teacher's computer screen according to one embodiment of the present disclosure), the screen projection method further comprises:
invoking, by the first electronic device, a third module of a second application ([claim 11] simultaneously displaying, on separate portions of the graphical user interface, the first plurality of graphical elements, the second plurality of graphical elements, and the third plurality of graphical elements); and
the screen projection method further comprises:
invoking, by the first electronic device in response to the first user operation, a fourth module of the second application (Fig. 5B, the teacher can watch the desktops 548a, b, c, d, . . . n, and particularly the cursors 544a, b, c, d, . . . n, of the participating students to ensure that each is properly completing assigned tasks). However, Plain does not specifically teach:
- to run a first status bar, wherein the first status bar is associated with the first display area, and the first display content comprises the first status bar
- to run a second status bar, wherein the second status bar is associated with the second display area, and the second display content comprises the second status bar
Reeves teaches that devices for selectively presenting a user interface or “desktop” across two devices are provided [abstract]. Reeves also teaches to run a first status bar (Fig. 15; [0235] A user interface 1308 with other status indicators is shown in FIG. 15. User interface 1308 is associated with the device 100. The user interface 1308 may also include a portion 1500 that displays status indicators; [0240] The processor 204 of the device 100 can receive a selection of a status indicator, in step 1708. A selection can be a user interface action conducted on a status indicator. For example, a user may hover a pointer over status indicator 1420), wherein the first status bar is associated with the first display area ([0201] The first portion 732 may have a display identifier 816 for the first display while the second portion 736 may have a display identifier 816 for the second display 114), and the first display content comprises the first status bar ([0235] A user interface 1308 with other status indicators is shown in FIG. 15. User interface 1308 is associated with the device 100. The user interface 1308 may also include a portion 1500 that displays status indicators; [0240] The processor 204 of the device 100 can receive a selection of a status indicator, in step 1708. A selection can be a user interface action conducted on a status indicator. For example, a user may hover a pointer over status indicator 1420). Additionally, Reeves teaches to run a second status bar ([0192] A window stack 700, 728 may have various arrangements or organizational structures. In the embodiment shown in FIG. 7A, the device 100 includes a first stack 760 associated with a first touch sensitive display 110 and a second stack associated with a second touch sensitive display 114), wherein the second status bar is associated with the second display area ([0192] the second stack 764 can be arranged from a first window 708 to a next window 712 to a last window 716, and finally to a desktop 718, which, in embodiments, is a single desktop area, with desktop 722, under all the windows in both window stack 760 and window stack 764), and the second display content comprises the second status bar ([0192] A window stack 700, 728 may have various arrangements or organizational structures. In the embodiment shown in FIG. 7A, the device 100 includes a first stack 760 associated with a first touch sensitive display 110 and a second stack associated with a second touch sensitive display 114). Therefore, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which said subject matter pertains to have modified the videoconferencing system taught by Plain, with the multi-screen device having a first and a second status bar as taught by Reeves, to have achieved an efficient system and method of collaborative online desktop sharing.
With regard to claim 10, the limitations are addressed above and Plain teaches wherein:
the third module of the second application comprises a first component, a first dependency control class, and a third UI control class that are used for creating and running ([claim 11] simultaneously displaying, on separate portions of the graphical user interface, the first plurality of graphical elements, the second plurality of graphical elements, and the third plurality of graphical elements); and
the second module of the first application comprises a second component, a second dependency control class, and a fourth UI control class that are used for creating and running ([col. 10, lines 27-53] FIG. 5B shows a screenshot 506 of a teacher computing device 104 running a coordination application 116 during a one-to-many collaborative session… FIG. 5B includes student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating in the one-to-many collaborative session),
wherein some or all components in the second component are inherited from the first component (Fig. 5A, “Application XYZ” and “Student 1’s Screen”), some or all classes in the second dependency control class are inherited from the first dependency control class ([col. 9, lines 55-64] Using the coordination application 516, the teacher can watch Student 1's cursor 544 in real time to ensure that Student 1 is properly completing assigned tasks), and some or all classes in the fourth UI control class are inherited from the third UI control class ([col. 10, lines 27-53] FIG. 5B shows a screenshot 506 of a teacher computing device 104 running a coordination application 116 during a one-to-many collaborative session… FIG. 5B includes student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating in the one-to-many collaborative session). However, Plain does not specifically teach:
- the first status bar
- the second status bar
Reeves teaches that devices for selectively presenting a user interface or “desktop” across two devices are provided [abstract]. Reeves also teaches a first status bar (Fig. 15; [0235] A user interface 1308 with other status indicators is shown in FIG. 15. User interface 1308 is associated with the device 100. The user interface 1308 may also include a portion 1500 that displays status indicators; [0240] The processor 204 of the device 100 can receive a selection of a status indicator, in step 1708. A selection can be a user interface action conducted on a status indicator. For example, a user may hover a pointer over status indicator 1420), and a second status bar ([0192] A window stack 700, 728 may have various arrangements or organizational structures. In the embodiment shown in FIG. 7A, the device 100 includes a first stack 760 associated with a first touch sensitive display 110 and a second stack associated with a second touch sensitive display 114), wherein the second status bar is associated with the second display area ([0192] the second stack 764 can be arranged from a first window 708 to a next window 712 to a last window 716, and finally to a desktop 718, which, in embodiments, is a single desktop area, with desktop 722, under all the windows in both window stack 760 and window stack 764), and the second display content comprises the second status bar ([0192] A window stack 700, 728 may have various arrangements or organizational structures. In the embodiment shown in FIG. 7A, the device 100 includes a first stack 760 associated with a first touch sensitive display 110 and a second stack associated with a second touch sensitive display 114).
Therefore, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which said subject matter pertains to have modified the videoconferencing system taught by Plain, with the multi-screen device having a first and a second status bar as taught by Reeves, to have achieved an efficient system and method of collaborative online desktop sharing.
With regard to claim 11, the limitations are addressed above and Plain teaches wherein:
the second component comprises one or more of the following: a PC dependency class, a PC system provider ([col. 3, lines 40-45] the coordination application 116 may be hosted in the cloud and provided through a Software-as-a-Service (SaaS) platform; [col. 6, lines 45-66] The processor 308 may be or include, without limitation, any one or more of a Qualcomm® Snapdragon® 800 and 801, Qualcomm® Snapdragon® 610 and 615 with 4G LTE Integration and 64-bit computing, Apple® A7 processor with 64-bit architecture, Apple® M7 motion coprocessors, Samsung® Exynos® series,...), a PC system bar, or the second status bar;
the second dependency control class comprises one or more of the following: a PC status bar window control class, a screen control class, a lock-screen control class, or a remote control class ([col. 9, lines 20-24] The server 404 may be located (physically) remotely from any teacher or student computing device, provided that the server 404 is connected to a communication network such as communication network 108); and
the fourth UI control class comprises one or more of the following: a PC status bar window view, a PC notification panel view ([col. 13, lines 6-20] other notification options are within the scope of the present disclosure. For example, notifications may involve playing a sound (which may or may not identify the student or students meeting the notification criteria), updating or displaying a message (e.g. periodically updating a list of students who meet the notification criteria)), a PC quick setting fragment, a PC status bar fragment, or a PC status bar view.
With regard to claim 13, the limitations are addressed above and Plain teaches wherein the invoking, by the first electronic device in response to the first user operation, a fourth module of the second application to run a second (Fig. 5B, the teacher can watch the desktops 548a, b, c, d, . . . n, and particularly the cursors 544a, b, c, d, . . . n, of the participating students to ensure that each is properly completing assigned tasks), associated with the second display area (Fig. 1A, student computing device 120; Fig. 2A; FIG. 6 is a screenshot of the screen of a student's computing device; [abstract] student(s) can see their own desktop), comprises:
receiving, by a PC management service in response to the first user operation, a mode switching instruction ([col. 13, lines 40-45] This may be a useful option particularly when the instruction or presentation being given requires switching around between windows or programs on a computer desktop), wherein the instruction instructs to switch a current non-projection mode to a projection mode ([col. 13, lines 40-55] This may be a useful option particularly when the instruction or presentation being given requires switching around between windows or programs on a computer desktop . The ability to share less than all of a user's desktop—which effectively allows users to multi-task without the knowledge of other users—may be well-suited for environments in which participants are not obligated to participate or for environments in which privacy is of greater concern than educational outcome (e.g. a collaborative session between participants from two different companies));
starting, by the PC management service in response to the instruction, a productivity service, wherein the productivity service invokes a system ([col. 10, lines 1-7] The coordination application 116 may be configured so that selection of a share desktop button 524 will automatically cause the teacher's desktop to be shared with any students participating in a collaborative session and will also automatically cause the desktop of participating students to be shared with the teacher), and the system creates based on a configuration file ([col. 8, lines 5-20] The network interface 324 may be configured to facilitate a connection between the computing device 304 and the communication network 108 and may further be configured to encode and decode communications (e.g., packets) according to a protocol utilized by the communication network 108);
invoking, a callback interface of a command queue to add a callback ([col. 12, lines 30-34] until the teacher provides an input (e.g. clicking on the student screen 736) that causes the student screen 736 to return to its original size and formatting, or for a given period of time);
initializing, a layout (Figs. 5A-5B);
registering an object corresponding to the second area (Figs. 5A-5B);
creating, a PC status window view (Fig. 5A, desktop folder; [col. 9, lines 50-55] The visual display 516 also shows the desktop 548 of Student 1 in a student window 536);
adding, the PC status window view to a status window control class (Figs. 5A-5B); and
invoking, by the window control class, a window management interface to add the second area to a window management service (Fig. 5B; [col. 10, lines 27-53] FIG. 5B shows a screenshot 506 of a teacher computing device 104 running a coordination application 116 during a one-to-many collaborative session… FIG. 5B includes student windows 536a, b, c, d, . . . n showing the desktops 548a, b, c, d, . . . n of each of the students participating in the one-to-many collaborative session), wherein is added to the second display area (Fig. 5B). However, Plain does not specifically teach:
- a second status bar, wherein the second status bar is associated with the second display area
- a system bar to start the second status bar,
- an IstatusBar object
- second status bar to a status bar management service;
Reeves teaches a second status bar ([0192] A window stack 700, 728 may have various arrangements or organizational structures. In the embodiment shown in FIG. 7A, the device 100 includes a first stack 760 associated with a first touch sensitive display 110 and a second stack associated with a second touch sensitive display 114), wherein the second status bar is associated with the second display area ([0192] the second stack 764 can be arranged from a first window 708 to a next window 712 to a last window 716, and finally to a desktop 718, which, in embodiments, is a single desktop area, with desktop 722, under all the windows in both window stack 760 and window stack 764), a system bar to start the second status bar and an IstatusBar object ([0192] A window stack 700, 728 may have various arrangements or organizational structures. In the embodiment shown in FIG. 7A, the device 100 includes a first stack 760 associated with a first touch sensitive display 110 and a second stack associated with a second touch sensitive display 114), and second status bar to a status bar management service ([0192] the second stack 764 can be arranged from a first window 708 to a next window 712 to a last window 716, and finally to a desktop 718, which, in embodiments, is a single desktop area, with desktop 722, under all the windows in both window stack 760 and window stack 764). Therefore, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which said subject matter pertains to have modified the videoconferencing system taught by Plain, with the multi-screens device having a first and a second status bar as taught by Reeves, to have achieved an efficient system and method of collaborative online desktop sharing.
With regard to claim 14, the limitations are addressed above. However, Plain does not specifically teach:
- wherein in the non-projection mode, the command queue supports a first status bar associated with the first display area; and in the projection mode, the command queue supports both the first status bar associated with the first display area and the second status bar associated with the second display area
Reeves teaches wherein in the non-projection mode, the command queue supports a first status bar associated with the first display area (Fig. 15; [0201] The first portion 732 may have a display identifier 816 for the first display while the second portion 736 may have a display identifier 816 for the second display 114; [0235] A user interface 1308 with other status indicators is shown in FIG. 15. User interface 1308 is associated with the device 100. The user interface 1308 may also include a portion 1500 that displays status indicators; [0240] The processor 204 of the device 100 can receive a selection of a status indicator, in step 1708. A selection can be a user interface action conducted on a status indicator. For example, a user may hover a pointer over status indicator 1420); and in the projection mode, the command queue supports both the first status bar associated with the first display area and the second status bar associated with the second display area ([0192] the second stack 764 can be arranged from a first window 708 to a next window 712 to a last window 716, and finally to a desktop 718, which, in embodiments, is a single desktop area, with desktop 722, under all the windows in both window stack 760 and window stack 764…A window stack 700, 728 may have various arrangements or organizational structures. In the embodiment shown in FIG. 7A, the device 100 includes a first stack 760 associated with a first touch sensitive display 110 and a second stack associated with a second touch sensitive display 114). Therefore, it would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which said subject matter pertains to have modified the videoconferencing system taught by Plain, with the multi-screens device having a first and a second status bar as taught by Reeves, to have achieved an efficient system and method of collaborative online desktop sharing.
With regard to claim 19, the device claim corresponds to method claim 3 and is therefore rejected with the same rationale.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANDREA C. LEGGETT whose telephone number is (571)270-7700. The examiner can normally be reached M-F 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kieu Vu can be reached at 571-272-4057. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ANDREA C LEGGETT/Primary Examiner, Art Unit 2171