Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
Claims 1-13, 15-18, and 20-22 are pending. As an initial matter, the rejection of claim 19 under 35 U.S.C. 112, second paragraph, has been withdrawn because claim 19 has been cancelled, rendering the rejection moot.
Response to Arguments
Applicant’s arguments with respect to claims 1-13, 15-18, and 20-22 have been considered but are moot because the arguments are directed to the newly amended independent claims, which change the scope of the claims as a whole and are subject to new grounds of rejection and interpretation.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-13, 15-18, and 20-22 are rejected under 35 U.S.C. 103 as being unpatentable over Essiounine et al. (US 20190255446, hereinafter “Essi”) in view of Kavallierou (US 20200346109, hereinafter “Kava”).
Re claim 1, Essi teaches a rendering method comprising:
obtaining, by a physical engine, a physical simulation result based on an operation instruction, wherein the operation instruction affects at least one three-dimensional model in a target scene and wherein the physical simulation result comprises physical information of the at least one three-dimensional model based on the operation instruction ([0073] It may be noted that hosting the characters (or characters and game) on a common server may allow for greater efficiency in the use of computing resources. For example, game logic executing on the common server may be shared between multiple characters. For instance, a character hits a wall and the server uses the physics engine to calculate physics commands indicative of the physical interaction of the character with the wall. If another character hits the same wall in the same or similar manner, the physics commands that were previously calculated may be cached and reused for the other character's collision with the wall. In other implementations, the characters may not be hosted on a common server) and ([0032] In implementations, collaboration platform 120 may include a creator module 126. In implementations, creator module 126 may allow users to become creators to design or create environments in an existing game 122 or create new games or create new game objects within games or environments. In some implementations, a game 122 may have a common set of rules or common goal, and the environments of a game 122 share the common set of rules or common goal. In implementations, different games may have different rules or goals from one another. In some implementations, games may have one or more environments (also referred to as “gaming environments” or “virtual environment” herein) where multiple environments may be linked. An example of an environment may be a three-dimensional (3D) environment. 
The one or more environments of a game 122 may be collectively referred to a “world” or “gaming world” or “virtual world” or “universe” herein. An example of a world may be a 3D world of a game 122. For example, a user may build a virtual environment that is linked to another virtual environment created by another user. A character of the virtual game may cross the virtual border to enter the adjacent virtual environment. In implementations, game objects (e.g., also referred to as “item(s)” or “objects” or “virtual game item(s)” herein) may refer to objects that are used, created, shared or otherwise depicted in games 122 of the collaboration platform 120. For example, game objects may include a part, model, character, tools, weapons, clothing, buildings, vehicles, currency, flora, fauna, components of the aforementioned (e.g., windows of a building), and so forth. It may be noted that 3D environments or 3D worlds use graphics that use a three-dimensional representation of geometric data representative of game content (or at least present game content to appear as 3D content whether or not 3D representation of geometric data is used). 2D environments or 2D worlds use graphics that use two-dimensional representation of geometric data representative of game content).
sending, by the physical engine, the physical simulation result to a first rendering engine to render a first rendered image based on the physical simulation result and sending, by the physical engine, the physical simulation result to a second rendering engine to render a second rendered image based on the physical simulation result (see Fig. 1, including game engine, collaboration application, and group gameplay module in both a first client device 110a and a second client device 110b), (see Fig. 2, wherein a plurality of secondary client devices and a primary client device are taught with each secondary client device having a display and a primary client having a primary display), ([0073] It may be noted that hosting the characters (or characters and game) on a common server may allow for greater efficiency in the use of computing resources. For example, game logic executing on the common server may be shared between multiple characters. For instance, a character hits a wall and the server uses the physics engine to calculate physics commands indicative of the physical interaction of the character with the wall. If another character hits the same wall in the same or similar manner, the physics commands that were previously calculated may be cached and reused for the other character's collision with the wall. In other implementations, the characters may not be hosted on a common server), and ([0087] FIG. 4C illustrates view 460 of the gameplay of a multiplayer game on the primary display 250 and view 470 of the gameplay of the same multiplayer game on the display 112 of the secondary client device 210 at a third point in time, in accordance with implementations of the disclosure. View 460 shows a split-screen view of the gameplay as shown on primary display 250. Each of the sections of the split-screen shows a view from a third-person perspective relative to the particular character.
For example, the top left split-screen shows the view from the third-person perspective of character 1. In other implementations, the split-screen view may show the views from the first-person perspective of each of the characters. View 470 shows the view from the third-person perspective with respect to character 1. It may be noted that view 460 and view 470 show gameplay for the same point in time but from different views).
Essi does not explicitly teach wherein the first rendering engine is deployed on a remote computing platform, wherein the second rendering engine is deployed on the remote computing platform, sending, by the first rendering engine, the first rendered image to a first terminal device, and sending, by the second rendering engine, the second rendered image to a second terminal device.
However, Kava teaches wherein the first rendering engine is deployed on a remote computing platform and wherein the second rendering engine is deployed on the remote computing platform (abstract: Methods and apparatus for controlling the rendering of a video game instance includes obtaining game state information and player inputs from a plurality of client devices participating in a video game session. Based on the obtained game state information, a client device is identified as being likely to render a video game instance with a quality that is less than a threshold quality. A cloud game client is allocated to the identified client device. The player inputs associated with the identified client device and the obtained game state information are provided to the cloud game client, which renders the video game instance for the identified client device. The video game instance rendered at the cloud device is then transmitted to the identified client device for display at an associated display), ([0030] In the present disclosure, each video game playing device provides a respective participant of a video game session with access to an online multiplayer game. For this reason, each computing device is referred to as a client device. The video game session may correspond to e.g. a sports or combat match, such as a battle-royale mode combat match. Each client device is in communication with a central game server, via a communications network. The game server receives and processes each player's input and generates an authoritative source of events occurring within the video game. The game server transmits data about its internal state to the connected client devices, enabling each client device to maintain their own accurate version of the virtual world for display to a corresponding player. 
In this way, each player is able to concurrently populate a common virtual environment), and ([0074] In some examples, it may be that multiple instances of the cloud game client 608 are installed at one or more cloud devices, with each cloud game client 608 serving different players or groups of players. The monitoring unit 606 may be used to control which cloud game client 608 instances are used by the different players. For example, in response to detecting that the game requirements for a given player has lowered to the point that the local device can or is likely to be able render at the full framerate (e.g. 60 fps), the monitoring unit 606 may instruct the corresponding cloud game client 608 instance to cease rendering the video game instance for that player. This may result in the cloud game client 608 instance being returned to the pool of available cloud game client 608 instances, for use by another game client, as requested by the monitoring unit 606. In some examples, the cloud game client 608 instance may be re-allocated to a different client device (for which the e.g. GPU requirements are still expected to exceed the capabilities of the local device)). Kava thus teaches wherein the first rendering engine is deployed on a remote computing platform and wherein the second rendering engine is deployed on the remote computing platform (multiple instances of the cloud game client serve as first and second rendering engines, rendering the video game instance for the identified client devices).
sending, by the first rendering engine, the first rendered image to a first terminal device, and sending, by the second rendering engine, the second rendered image to a second terminal device (abstract: Methods and apparatus for controlling the rendering of a video game instance includes obtaining game state information and player inputs from a plurality of client devices participating in a video game session. Based on the obtained game state information, a client device is identified as being likely to render a video game instance with a quality that is less than a threshold quality. A cloud game client is allocated to the identified client device. The player inputs associated with the identified client device and the obtained game state information are provided to the cloud game client, which renders the video game instance for the identified client device. The video game instance rendered at the cloud device is then transmitted to the identified client device for display at an associated display), ([0030] In the present disclosure, each video game playing device provides a respective participant of a video game session with access to an online multiplayer game. For this reason, each computing device is referred to as a client device. The video game session may correspond to e.g. a sports or combat match, such as a battle-royale mode combat match. Each client device is in communication with a central game server, via a communications network. The game server receives and processes each player's input and generates an authoritative source of events occurring within the video game. The game server transmits data about its internal state to the connected client devices, enabling each client device to maintain their own accurate version of the virtual world for display to a corresponding player.
In this way, each player is able to concurrently populate a common virtual environment), and ([0074] In some examples, it may be that multiple instances of the cloud game client 608 are installed at one or more cloud devices, with each cloud game client 608 serving different players or groups of players. The monitoring unit 606 may be used to control which cloud game client 608 instances are used by the different players. For example, in response to detecting that the game requirements for a given player has lowered to the point that the local device can or is likely to be able render at the full framerate (e.g. 60 fps), the monitoring unit 606 may instruct the corresponding cloud game client 608 instance to cease rendering the video game instance for that player. This may result in the cloud game client 608 instance being returned to the pool of available cloud game client 608 instances, for use by another game client, as requested by the monitoring unit 606. In some examples, the cloud game client 608 instance may be re-allocated to a different client device (for which the e.g. GPU requirements are still expected to exceed the capabilities of the local device)). Kava thus teaches sending, by the first rendering engine, the first rendered image to a first terminal device, and sending, by the second rendering engine, the second rendered image to a second terminal device (multiple instances of the cloud game client serve as first and second rendering engines, rendering the video game instance for the identified client devices of multiple players).
Essi and Kava teach claim 1. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Essi’s method to include sending, by the first rendering engine, the first rendered image to a first terminal device, and sending, by the second rendering engine, the second rendered image to a second terminal device, as taught by Kava, as the references are in the analogous art of physical engine-based systems in communication with multiple terminals. An advantage of the modification is that it uses a separate cloud client to help render the graphics for display on identified devices that are unable to render them properly, including multiple client devices.
Re claim 2, Essi and Kava teach claim 1. Furthermore, Essi teaches wherein the target scene comprises a first virtual character corresponding to the first rendering engine and a second virtual character corresponding to the second rendering engine (see Fig. 4C, item 460, wherein a plurality of virtual characters are rendered, and item 470, wherein a view for the particular character 1 is rendered on display 112 of the secondary client device 210) and ([0087] FIG. 4C illustrates view 460 of the gameplay of a multiplayer game on the primary display 250 and view 470 of the gameplay of the same multiplayer game on the display 112 of the secondary client device 210 at a third point in time, in accordance with implementations of the disclosure. View 460 shows a split-screen view of the gameplay as shown on primary display 250. Each of the sections of the split-screen shows a view from a third-person perspective relative to the particular character. For example, the top left split-screen shows the view from the third-person perspective of character 1. In other implementations, the split-screen view may show the views from the first-person perspective of each of the characters. View 470 shows the view from the third-person perspective with respect to character 1. It may be noted that view 460 and view 470 show gameplay for the same point in time but from different views) and ([0014] In some implementations, responsive to the control instructions received from the secondary client devices (control instructions that bypassed the primary client device), the server computer sends second gameplay instruction directly to the secondary client devices (e.g., second gameplay instructions that bypass the primary client device).
In some implementations, the secondary client devices may render a presentation of second views of the gameplay of the multiplayer game on the displays of the secondary client devices using the second gameplay instructions from the server computer (e.g., collaboration platform), rather than using instructions from the primary client device).
Re claim 3, Essi and Kava teach claim 1. Furthermore, Essi teaches wherein the physical information comprises one or more of coordinates of a point on the at least one three-dimensional model, a movement speed of the point, a movement acceleration of the point, a movement angle of the point, or a movement direction of the point ([0029] In one implementation, collaboration platform 120 may consolidate the game content from the client devices 110 and transmit the consolidated game content (e.g., gaming video, rendering commands, user input, graphics library commands, position and velocity information of the characters of a game, etc.) to each of the client devices 110 to display interactions of the multiple users in a multiplayer gaming environment. In another implementation, collaboration platform 120 may transmit the game content from one or more client devices 110 to another client device for the other client device to consolidate and display the game content. In another implementation, the collaboration platform 120 may receive the game content (e.g., first user transmitting user input via client device 110A and second user transmitting user input via client device 110B), generate game results (e.g., first user beats second user), and transmit the game results to the client devices 110).
Re claim 4, Essi and Kava teach claim 1. Furthermore, Kava teaches wherein the remote computing platform comprises the physical engine (abstract: Methods and apparatus for controlling the rendering of a video game instance includes obtaining game state information and player inputs from a plurality of client devices participating in a video game session. Based on the obtained game state information, a client device is identified as being likely to render a video game instance with a quality that is less than a threshold quality. A cloud game client is allocated to the identified client device. The player inputs associated with the identified client device and the obtained game state information are provided to the cloud game client, which renders the video game instance for the identified client device. The video game instance rendered at the cloud device is then transmitted to the identified client device for display at an associated display), ([0030] In the present disclosure, each video game playing device provides a respective participant of a video game session with access to an online multiplayer game. For this reason, each computing device is referred to as a client device. The video game session may correspond to e.g. a sports or combat match, such as a battle-royale mode combat match. Each client device is in communication with a central game server, via a communications network. The game server receives and processes each player's input and generates an authoritative source of events occurring within the video game. The game server transmits data about its internal state to the connected client devices, enabling each client device to maintain their own accurate version of the virtual world for display to a corresponding player.
In this way, each player is able to concurrently populate a common virtual environment), ([0074] In some examples, it may be that multiple instances of the cloud game client 608 are installed at one or more cloud devices, with each cloud game client 608 serving different players or groups of players. The monitoring unit 606 may be used to control which cloud game client 608 instances are used by the different players. For example, in response to detecting that the game requirements for a given player has lowered to the point that the local device can or is likely to be able render at the full framerate (e.g. 60 fps), the monitoring unit 606 may instruct the corresponding cloud game client 608 instance to cease rendering the video game instance for that player. This may result in the cloud game client 608 instance being returned to the pool of available cloud game client 608 instances, for use by another game client, as requested by the monitoring unit 606. In some examples, the cloud game client 608 instance may be re-allocated to a different client device (for which the e.g. GPU requirements are still expected to exceed the capabilities of the local device)). Kava thus teaches wherein the remote computing platform comprises the physical engine (the cloud devices host multiple instances of the cloud game client, which render the video game instance for the identified client devices) and ([0049] At a fourth step S504, a cloud game client (CGC) is allocated to the identified client device. The cloud game client may correspond to a duplicate of a video game client implemented at the client devices. The cloud game client may comprise a pool of an instance of hardware available on the network (forming part of 'the cloud') having more powerful CPU and GPU capabilities than the client device.
Hence, allocating the cloud game client may involve identifying one or more cloud devices (forming the cloud gaming service) having greater CPU and or GPU resources available for rendering the video game instance than the client device(s) identified at step S503. The cloud game client may be already installed at the identified cloud devices, which simply wait to receive the synchronization status of a given player, so that a corresponding view of the virtual environment can be rendered for that player). For motivation, see claim 1.
Re claim 5, Essi and Kava teach claim 1. Furthermore, Kava teaches wherein the physical engine is a shared physical engine configured to obtain and share a shareable physical simulation result with the first terminal device and the second terminal device (abstract: Methods and apparatus for controlling the rendering of a video game instance includes obtaining game state information and player inputs from a plurality of client devices participating in a video game session. Based on the obtained game state information, a client device is identified as being likely to render a video game instance with a quality that is less than a threshold quality. A cloud game client is allocated to the identified client device. The player inputs associated with the identified client device and the obtained game state information are provided to the cloud game client, which renders the video game instance for the identified client device. The video game instance rendered at the cloud device is then transmitted to the identified client device for display at an associated display), ([0030] In the present disclosure, each video game playing device provides a respective participant of a video game session with access to an online multiplayer game. For this reason, each computing device is referred to as a client device. The video game session may correspond to e.g. a sports or combat match, such as a battle-royale mode combat match. Each client device is in communication with a central game server, via a communications network. The game server receives and processes each player's input and generates an authoritative source of events occurring within the video game. The game server transmits data about its internal state to the connected client devices, enabling each client device to maintain their own accurate version of the virtual world for display to a corresponding player.
In this way, each player is able to concurrently populate a common virtual environment), ([0074] In some examples, it may be that multiple instances of the cloud game client 608 are installed at one or more cloud devices, with each cloud game client 608 serving different players or groups of players. The monitoring unit 606 may be used to control which cloud game client 608 instances are used by the different players. For example, in response to detecting that the game requirements for a given player has lowered to the point that the local device can or is likely to be able render at the full framerate (e.g. 60 fps), the monitoring unit 606 may instruct the corresponding cloud game client 608 instance to cease rendering the video game instance for that player. This may result in the cloud game client 608 instance being returned to the pool of available cloud game client 608 instances, for use by another game client, as requested by the monitoring unit 606. In some examples, the cloud game client 608 instance may be re-allocated to a different client device (for which the e.g. GPU requirements are still expected to exceed the capabilities of the local device)). Kava thus teaches wherein the physical engine is a shared physical engine that obtains and shares a shareable physical simulation result with the first terminal device and the second terminal device (the game server generates an authoritative source of events and transmits data about its internal state to each connected client device), and ([0049] At a fourth step S504, a cloud game client (CGC) is allocated to the identified client device. The cloud game client may correspond to a duplicate of a video game client implemented at the client devices. The cloud game client may comprise a pool of an instance of hardware available on the network (forming part of 'the cloud') having more powerful CPU and GPU capabilities than the client device.
Hence, allocating the cloud game client may involve identifying one or more cloud devices (forming the cloud gaming service) having greater CPU and or GPU resources available for rendering the video game instance than the client device(s) identified at step S503. The cloud game client may be already installed at the identified cloud devices, which simply wait to receive the synchronization status of a given player, so that a corresponding view of the virtual environment can be rendered for that player). For motivation, see claim 1.
Re claim 6, Essi and Kava teach claim 4. Furthermore, Kava teaches wherein at least one of the first terminal device or the second terminal device has a limited capability for rendering images in comparison to the physical engine (abstract: Methods and apparatus for controlling the rendering of a video game instance includes obtaining game state information and player inputs from a plurality of client devices participating in a video game session. Based on the obtained game state information, a client device is identified as being likely to render a video game instance with a quality that is less than a threshold quality. A cloud game client is allocated to the identified client device. The player inputs associated with the identified client device and the obtained game state information are provided to the cloud game client, which renders the video game instance for the identified client device. The video game instance rendered at the cloud device is then transmitted to the identified client device for display at an associated display). For motivation, see claim 1.
Re claim 7, Essi and Kava teach claim 1. Furthermore, Essi teaches wherein the first rendered image and the second rendered image of the same target scene are rendered based on different viewing angles (see Fig. 2, wherein a plurality of secondary client devices and a primary client device are taught with each secondary client device having a display and a primary client having a primary display) and ([0087] FIG. 4C illustrates view 460 of the gameplay of a multiplayer game on the primary display 250 and view 470 of the gameplay of the same multiplayer game on the display 112 of the secondary client device 210 at a third point in time, in accordance with implementations of the disclosure. View 460 shows a split-screen view of the gameplay as shown on primary display 250. Each of the sections of the split-screen shows a view from a third-person perspective relative to the particular character. For example, the top left split-screen shows the view from the third-person perspective of character 1. In other implementations, the split-screen view may show the views from the first-person perspective of each of the characters. View 470 shows the view from the third-person perspective with respect to character 1. It may be noted that view 460 and view 470 show gameplay for the same point in time but from different views).
Re claim 8, Essi and Kava teach claim 7. Furthermore, Essi teaches wherein the different viewing angles correspond to different users to observe the target scene ([0052] In implementations, a view (also referred to as “field of view” herein) may refer to the extent of the observable game world that may be seen at any given moment from the perspective of the game camera and that is presented in the display of a client device. For example, the view of the game camera may be from a first-person perspective or a third-person perspective or some combination thereof), ([0055] In implementations, system 200 includes primary display 250. In some implementations, primary display 250 may be coupled to primary client device 245. In implementations, primary display 250 may receive display instructions from primary client device 245 and present a view of the gameplay of a game, such as a multiplayer game, on the primary display 250 for consumption by the users. In some implementations, the primary display 250 may be a television or a projector, for example. In implementations, the view displayed from the primary display 250 may be from a perspective (e.g., third-person perspective) so that all the playing users that are proximately located (e.g., same room) may watch the in-game actions of the playing users' respective characters).
Re claim 9, Essi teaches a method comprising:
receiving, separately by a first rendering engine and a second rendering engine, a physical simulation result from a physical engine, wherein the physical simulation result is based on an operation instruction ([0073] It may be noted that hosting the characters (or characters and game) on a common server may allow for greater efficiency in the use of computing resources. For example, game logic executing on the common server may be shared between multiple characters. For instance, a character hits a wall and the server uses the physics engine to calculate physics commands indicative of the physical interaction of the character with the wall. If another character hits the same wall in the same or similar manner, the physics commands that were previously calculated may be cached and reused for the other character's collision with the wall. In other implementations, the characters may not be hosted on a common server) and ([0032] In implementations, collaboration platform 120 may include a creator module 126. In implementations, creator module 126 may allow users to become creators to design or create environments in an existing game 122 or create new games or create new game objects within games or environments. In some implementations, a game 122 may have a common set of rules or common goal, and the environments of a game 122 share the common set of rules or common goal. In implementations, different games may have different rules or goals from one another. In some implementations, games may have one or more environments (also referred to as “gaming environments” or “virtual environment” herein) where multiple environments may be linked. An example of an environment may be a three-dimensional (3D) environment. The one or more environments of a game 122 may be collectively referred to a “world” or “gaming world” or “virtual world” or “universe” herein. An example of a world may be a 3D world of a game 122. 
For example, a user may build a virtual environment that is linked to another virtual environment created by another user. A character of the virtual game may cross the virtual border to enter the adjacent virtual environment. In implementations, game objects (e.g., also referred to as “item(s)” or “objects” or “virtual game item(s)” herein) may refer to objects that are used, created, shared or otherwise depicted in games 122 of the collaboration platform 120. For example, game objects may include a part, model, character, tools, weapons, clothing, buildings, vehicles, currency, flora, fauna, components of the aforementioned (e.g., windows of a building), and so forth. It may be noted that 3D environments or 3D worlds use graphics that use a three-dimensional representation of geometric data representative of game content (or at least present game content to appear as 3D content whether or not 3D representation of geometric data is used). 2D environments or 2D worlds use graphics that use two-dimensional representation of geometric data representative of game content) and (see Fig. 1, including a game engine, collaboration application, and group gameplay module in both a first client device 110a and a second client device 110b), (see Fig. 2, wherein a plurality of secondary client devices and a primary client device are taught, with each secondary client device having a display and the primary client device having a primary display for displaying rendered images),
wherein the operation instruction affects at least one three-dimensional model in a target scene, and wherein the physical simulation result comprises physical information of the at least one three-dimensional model based on the operation instruction ([0073] It may be noted that hosting the characters (or characters and game) on a common server may allow for greater efficiency in the use of computing resources. For example, game logic executing on the common server may be shared between multiple characters. For instance, a character hits a wall and the server uses the physics engine to calculate physics commands indicative of the physical interaction of the character with the wall. If another character hits the same wall in the same or similar manner, the physics commands that were previously calculated may be cached and reused for the other character's collision with the wall. In other implementations, the characters may not be hosted on a common server) and ([0032] In implementations, collaboration platform 120 may include a creator module 126. In implementations, creator module 126 may allow users to become creators to design or create environments in an existing game 122 or create new games or create new game objects within games or environments. In some implementations, a game 122 may have a common set of rules or common goal, and the environments of a game 122 share the common set of rules or common goal. In implementations, different games may have different rules or goals from one another. In some implementations, games may have one or more environments (also referred to as “gaming environments” or “virtual environment” herein) where multiple environments may be linked. An example of an environment may be a three-dimensional (3D) environment. The one or more environments of a game 122 may be collectively referred to a “world” or “gaming world” or “virtual world” or “universe” herein. An example of a world may be a 3D world of a game 122. 
For example, a user may build a virtual environment that is linked to another virtual environment created by another user. A character of the virtual game may cross the virtual border to enter the adjacent virtual environment. In implementations, game objects (e.g., also referred to as “item(s)” or “objects” or “virtual game item(s)” herein) may refer to objects that are used, created, shared or otherwise depicted in games 122 of the collaboration platform 120. For example, game objects may include a part, model, character, tools, weapons, clothing, buildings, vehicles, currency, flora, fauna, components of the aforementioned (e.g., windows of a building), and so forth. It may be noted that 3D environments or 3D worlds use graphics that use a three-dimensional representation of geometric data representative of game content (or at least present game content to appear as 3D content whether or not 3D representation of geometric data is used). 2D environments or 2D worlds use graphics that use two-dimensional representation of geometric data representative of game content)
performing, by the first rendering engine, first rendering based on the physical simulation result to obtain a first rendered image; and performing, by the second rendering engine, second rendering based on the physical simulation result to obtain a second rendered image (see Fig. 4C, item 460, wherein a plurality of virtual characters are rendered, and item 470, wherein display 112 of the secondary client device 210 is rendered for particular character 1) and ([0087] FIG. 4C illustrates view 460 of the gameplay of a multiplayer game on the primary display 250 and view 470 of the gameplay of the same multiplayer game on the display 112 of the secondary client device 210 at a third point in time, in accordance with implementations of the disclosure. View 460 shows a split-screen view of the gameplay as shown on primary display 250. Each of the sections of the split-screen shows a view from a third-person perspective relative to the particular character. For example, the top left split-screen shows the view from the third-person perspective of character 1. In other implementations, the split-screen view may show the views from the first-person perspective of each of the characters. View 470 shows the view from the third-person perspective with respect to character 1. It may be noted that view 460 and view 470 show gameplay for the same point in time but from different views) and ([0014] In some implementations, responsive to the control instructions received from the secondary client devices (control instructions that bypassed the primary client device), the server computer sends second gameplay instruction directly to the secondary client devices (e.g., second gameplay instructions that bypass the primary client device).
In some implementations, the secondary client devices may render a presentation of second views of the gameplay of the multiplayer game on the displays of the secondary client devices using the second gameplay instructions from the server computer (e.g., collaboration platform), rather than using instructions from the primary client device).
Essi does not explicitly teach receiving, separately by a first rendering engine and a second rendering engine deployed on a remote computing platform, a physical simulation result from a physical engine; sending, by the first rendering engine, the first rendered image to a first terminal device; and sending, by the second rendering engine, the second rendered image to a second terminal device.
However, Kava teaches receiving, separately by a first rendering engine and a second rendering engine deployed on a remote computing platform, a physical simulation result from a physical engine; sending, by the first rendering engine, the first rendered image to a first terminal device; and sending, by the second rendering engine, the second rendered image to a second terminal device (abstract: Methods and apparatus for controlling the rendering of a video game instance includes obtaining game state information and player inputs from a plurality of client devices participating in a video game session. Based on the obtained game state information, a client device is identified as being likely to render a video game instance with a quality that is less than a threshold quality. A cloud game client is allocated to the identified client device. The player inputs associated with the identified client device and the obtained game state information are provided to the cloud game client, which renders the video game instance for the identified client device. The video game instance rendered at the cloud device is then transmitted to the identified client device for display at an associated display), ([0030] In the present disclosure, each video game playing device provides a respective participant of a video game session with access to an online multiplayer game. For this reason, each computing device is referred to as a client device. The video game session may correspond to e.g. a sports or combat match, such as a battle-royale mode combat match. Each client device is in communication with a central game server, via a communications network. The game server receives and processes each player's input and generates an authoritative source of events occurring within the video game.
The game server transmits data about its internal state to the connected client devices, enabling each client device to maintain their own accurate version of the virtual world for display to a corresponding player. In this way, each player is able to concurrently populate a common virtual environment), and ([0074] In some examples, it may be that multiple instances of the cloud game client 608 are installed at one or more cloud devices, with each cloud game client 608 serving different players or groups of players. The monitoring unit 606 may be used to control which cloud game client 608 instances are used by the different players. For example, in response to detecting that the game requirements for a given player has lowered to the point that the local device can or is likely to be able render at the full framerate (e.g. 60 fps), the monitoring unit 606 may instruct the corresponding cloud game client 608 instance to cease rendering the video game instance for that player. This may result in the cloud game client 608 instance being returned to the pool of available cloud game client 608 instances, for use by another game client, as requested by the monitoring unit 606. In some examples, the cloud game client 608 instance may be re-allocated to a different client device (for which the e.g. GPU requirements are still expected to exceed the capabilities of the local device). Kava thus teaches receiving, separately by a first rendering engine and a second rendering engine deployed on a remote computing platform, a physical simulation result from a physical engine (multiple instances of the cloud game client serve multiple player client devices); sending, by the first rendering engine, the first rendered image to a first terminal device; and sending, by the second rendering engine, the second rendered image to a second terminal device (each of the multiple players, with their own player client device, has rendered gaming results sent to that device).
Thus, Essi in view of Kava teaches all the limitations of claim 9. For the motivation to combine, see the rejection of claim 1 above.
Re claim 10, Essi and Kava teach claim 9. Furthermore, Essi teaches wherein performing, by the first rendering engine, the first rendering comprises:
obtaining, by the first rendering engine, a first viewing angle from which a first user observes the target scene; performing the first rendering based on the first viewing angle and the physical simulation result to obtain the first rendered image, and wherein performing, by the second rendering engine, the second rendering comprises: obtaining, by the second rendering engine, a second viewing angle from which a second user observes the target scene; and performing the second rendering based on the second viewing angle and the physical simulation result to obtain the second rendered image ([0052] In implementations, a view (also referred to as “field of view” herein) may refer to the extent of the observable game world that may be seen at any given moment from the perspective of the game camera and that is presented in the display of a client device. For example, the view of the game camera may be from a first-person perspective or a third-person perspective or some combination thereof), ([0055] In implementations, system 200 includes primary display 250. In some implementations, primary display 250 may be coupled to primary client device 245. In implementations, primary display 250 may receive display instructions from primary client device 245 and present a view of the gameplay of a game, such as a multiplayer game, on the primary display 250 for consumption by the users. In some implementations, the primary display 250 may be a television or a projector, for example. In implementations, the view displayed from the primary display 250 may be from a perspective (e.g., third-person perspective) so that all the playing users that are proximately located (e.g., same room) may watch the in-game actions of the playing users' respective characters).
Claim 11 recites limitations similar in scope to claim 2 and is rejected for at least the reasons above.
Claim 12 recites limitations similar in scope to claim 3 and is rejected for at least the reasons above.
Claim 13 recites limitations similar in scope to claim 4 and is rejected for at least the reasons above.
Re claim 15, Essi teaches an apparatus for a remote computing platform comprising:
a memory configured to store instructions; and a rendering engine comprising one or more processors coupled to the memory and configured to execute the instructions to cause the apparatus to: (see Fig. 1, wherein a remote computing platform comprises a plurality of terminal processing devices and a data store for processing and rendering information on a remote computing platform via a network).
receive a physical simulation result from a physical engine, wherein the physical simulation result is based on an operation instruction, wherein the operation instruction affects at least one three-dimensional model in a target scene, and wherein the physical simulation result comprises physical information of the at least one three-dimensional model based on the operation instruction ([0073] It may be noted that hosting the characters (or characters and game) on a common server may allow for greater efficiency in the use of computing resources. For example, game logic executing on the common server may be shared between multiple characters. For instance, a character hits a wall and the server uses the physics engine to calculate physics commands indicative of the physical interaction of the character with the wall. If another character hits the same wall in the same or similar manner, the physics commands that were previously calculated may be cached and reused for the other character's collision with the wall. In other implementations, the characters may not be hosted on a common server) and ([0032] In implementations, collaboration platform 120 may include a creator module 126. In implementations, creator module 126 may allow users to become creators to design or create environments in an existing game 122 or create new games or create new game objects within games or environments. In some implementations, a game 122 may have a common set of rules or common goal, and the environments of a game 122 share the common set of rules or common goal. In implementations, different games may have different rules or goals from one another. In some implementations, games may have one or more environments (also referred to as “gaming environments” or “virtual environment” herein) where multiple environments may be linked. An example of an environment may be a three-dimensional (3D) environment. 
The one or more environments of a game 122 may be collectively referred to a “world” or “gaming world” or “virtual world” or “universe” herein. An example of a world may be a 3D world of a game 122. For example, a user may build a virtual environment that is linked to another virtual environment created by another user. A character of the virtual game may cross the virtual border to enter the adjacent virtual environment. In implementations, game objects (e.g., also referred to as “item(s)” or “objects” or “virtual game item(s)” herein) may refer to objects that are used, created, shared or otherwise depicted in games 122 of the collaboration platform 120. For example, game objects may include a part, model, character, tools, weapons, clothing, buildings, vehicles, currency, flora, fauna, components of the aforementioned (e.g., windows of a building), and so forth. It may be noted that 3D environments or 3D worlds use graphics that use a three-dimensional representation of geometric data representative of game content (or at least present game content to appear as 3D content whether or not 3D representation of geometric data is used). 2D environments or 2D worlds use graphics that use two-dimensional representation of geometric data representative of game content).
perform first rendering based on a first viewing angle from which a first user observes the target scene and the physical simulation result to obtain a first rendered image; and
perform second rendering based on a second viewing angle from which a second user observes the target scene and the physical simulation result to obtain a second rendered image (see Fig. 1, including a game engine, collaboration application, and group gameplay module in both a first client device 110a and a second client device 110b), (see Fig. 2, wherein a plurality of secondary client devices and a primary client device are taught, with each secondary client device having a display and the primary client device having a primary display), ([0073] It may be noted that hosting the characters (or characters and game) on a common server may allow for greater efficiency in the use of computing resources. For example, game logic executing on the common server may be shared between multiple characters. For instance, a character hits a wall and the server uses the physics engine to calculate physics commands indicative of the physical interaction of the character with the wall. If another character hits the same wall in the same or similar manner, the physics commands that were previously calculated may be cached and reused for the other character's collision with the wall. In other implementations, the characters may not be hosted on a common server), and ([0087] FIG. 4C illustrates view 460 of the gameplay of a multiplayer game on the primary display 250 and view 470 of the gameplay of the same multiplayer game on the display 112 of the secondary client device 210 at a third point in time, in accordance with implementations of the disclosure. View 460 shows a split-screen view of the gameplay as shown on primary display 250. Each of the sections of the split-screen shows a view from a third-person perspective relative to the particular character. For example, the top left split-screen shows the view from the third-person perspective of character 1. In other implementations, the split-screen view may show the views from the first-person perspective of each of the characters.
View 470 shows the view from the third-person perspective with respect to character 1. It may be noted that view 460 and view 470 show gameplay for the same point in time but from different views).
Essi does not explicitly teach send the first rendered image to a first terminal device and send the second rendered image to a second terminal device.
However, Kava teaches send the first rendered image to a first terminal device and send the second rendered image to a second terminal device (abstract: Methods and apparatus for controlling the rendering of a video game instance includes obtaining game state information and player inputs from a plurality of client devices participating in a video game session. Based on the obtained game state information, a client device is identified as being likely to render a video game instance with a quality that is less than a threshold quality. A cloud game client is allocated to the identified client device. The player inputs associated with the identified client device and the obtained game state information are provided to the cloud game client, which renders the video game instance for the identified client device. The video game instance rendered at the cloud device is then transmitted to the identified client device for display at an associated display), ([0030] In the present disclosure, each video game playing device provides a respective participant of a video game session with access to an online multiplayer game. For this reason, each computing device is referred to as a client device. The video game session may correspond to e.g. a sports or combat match, such as a battle-royale mode combat match. Each client device is in communication with a central game server, via a communications network. The game server receives and processes each player's input and generates an authoritative source of events occurring within the video game. The game server transmits data about its internal state to the connected client devices, enabling each client device to maintain their own accurate version of the virtual world for display to a corresponding player. 
In this way, each player is able to concurrently populate a common virtual environment), and ([0074] In some examples, it may be that multiple instances of the cloud game client 608 are installed at one or more cloud devices, with each cloud game client 608 serving different players or groups of players. The monitoring unit 606 may be used to control which cloud game client 608 instances are used by the different players. For example, in response to detecting that the game requirements for a given player has lowered to the point that the local device can or is likely to be able render at the full framerate (e.g. 60 fps), the monitoring unit 606 may instruct the corresponding cloud game client 608 instance to cease rendering the video game instance for that player. This may result in the cloud game client 608 instance being returned to the pool of available cloud game client 608 instances, for use by another game client, as requested by the monitoring unit 606. In some examples, the cloud game client 608 instance may be re-allocated to a different client device (for which the e.g. GPU requirements are still expected to exceed the capabilities of the local device). Kava thus teaches send the first rendered image to a first terminal device and send the second rendered image to a second terminal device (multiple instances of the cloud game client act as rendering engines, each rendering the video game instance for an identified client device, including the first and second terminal devices of given players). Thus, Essi in view of Kava teaches all the limitations of claim 15. For the motivation to combine, see the rejection of claim 1 above.
Re claim 16, Essi and Kava teach claim 15. Furthermore, Essi teaches wherein the target scene comprises a first virtual character corresponding to the rendering engine, and wherein the first virtual character is in the target scene (see Fig. 4C, item 460, wherein a plurality of virtual characters are rendered, and item 470, wherein display 112 of the secondary client device 210 is rendered for particular character 1) and ([0087] FIG. 4C illustrates view 460 of the gameplay of a multiplayer game on the primary display 250 and view 470 of the gameplay of the same multiplayer game on the display 112 of the secondary client device 210 at a third point in time, in accordance with implementations of the disclosure. View 460 shows a split-screen view of the gameplay as shown on primary display 250. Each of the sections of the split-screen shows a view from a third-person perspective relative to the particular character. For example, the top left split-screen shows the view from the third-person perspective of character 1. In other implementations, the split-screen view may show the views from the first-person perspective of each of the characters. View 470 shows the view from the third-person perspective with respect to character 1. It may be noted that view 460 and view 470 show gameplay for the same point in time but from different views) and ([0014] In some implementations, responsive to the control instructions received from the secondary client devices (control instructions that bypassed the primary client device), the server computer sends second gameplay instruction directly to the secondary client devices (e.g., second gameplay instructions that bypass the primary client device).
In some implementations, the secondary client devices may render a presentation of second views of the gameplay of the multiplayer game on the displays of the secondary client devices using the second gameplay instructions from the server computer (e.g., collaboration platform), rather than using instructions from the primary client device).
Claim 17 recites limitations similar in scope to claim 3 and is rejected for at least the reasons above.
Claim 18 recites limitations similar in scope to claim 4 and is rejected for at least the reasons above.
Re claim 20, Essi and Kava teach claim 15. Furthermore, Essi teaches wherein the physical engine is deployed on the first terminal device or the second terminal device (see Fig. 1, including a game engine, collaboration application, and group gameplay module in both a first client device 110a and a second client device 110b), (see Fig. 2, wherein a plurality of secondary client devices and a primary client device are taught, with each secondary client device having a display and the primary client device having a primary display), ([0073] It may be noted that hosting the characters (or characters and game) on a common server may allow for greater efficiency in the use of computing resources. For example, game logic executing on the common server may be shared between multiple characters. For instance, a character hits a wall and the server uses the physics engine to calculate physics commands indicative of the physical interaction of the character with the wall. If another character hits the same wall in the same or similar manner, the physics commands that were previously calculated may be cached and reused for the other character's collision with the wall. In other implementations, the characters may not be hosted on a common server), and ([0087] FIG. 4C illustrates view 460 of the gameplay of a multiplayer game on the primary display 250 and view 470 of the gameplay of the same multiplayer game on the display 112 of the secondary client device 210 at a third point in time, in accordance with implementations of the disclosure. View 460 shows a split-screen view of the gameplay as shown on primary display 250. Each of the sections of the split-screen shows a view from a third-person perspective relative to the particular character. For example, the top left split-screen shows the view from the third-person perspective of character 1. In other implementations, the split-screen view may show the views from the first-person perspective of each of the characters.
View 470 shows the view from the third-person perspective with respect to character 1. It may be noted that view 460 and view 470 show gameplay for the same point in time but from different views). Essi thus teaches game engines deployed on both the first and second client devices.
Claims 21-22 recite limitations similar in scope to claim 20 and are rejected for at least the reasons above.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Peter Hoang whose telephone number is (571)270-1346. The examiner can normally be reached Monday-Friday 8:00 am - 5:00 pm PST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Hajnik F. Daniel can be reached at (571) 272-7642. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PETER HOANG/ Primary Examiner, Art Unit 2616