DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on August 7, 2025 and January 22, 2026 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Response to Arguments
This is in response to applicant’s amendment/response filed on January 22, 2026, which has been entered and made of record.
Applicant’s arguments regarding claim objections for claim 16 are persuasive. Claim objections for claim 16 have been withdrawn.
Applicant’s arguments regarding claim rejections under 35 U.S.C. 103 have been fully considered but they are not persuasive.
Applicant argues Claim 1 has been amended to include recitations not previously presented in the claims. These recitations have not been examined or cited in a rejection. Their addition, then, overcomes the present rejection.
For example, Applicant has reviewed the references and cannot find any disclosure, teaching, or suggestion of "user interactivity and item interactivity," "complexity data associated with rendering complexity," "network category information associated with network locations including edges, near-edges, mid-edges, and far-edges of one or more networks," "utilizing different network locations to render different portions of the computer-altered reality data based on the priorities," and "collecting the rendered computer-altered reality data from the network locations and aggregating the rendered computer-altered reality data." The primary reference, Kurabayashi, generally discusses a "mixed-reality system [that] is to provide the user with a seamless mixed-reality space, in which an object in a virtual space (virtual object) is effectively in accurate contact with an object such as the ground or a building in real space (real object)." Kurabayashi, para. [0012]. Kurabayashi is silent, however, regarding performing these operations using the steps recited in claim 1, as amended.
Furthermore, Zhang and Guim Bernat fail to remedy the deficiencies of Kurabayashi regarding amended claim 1, and the Office makes no assertions to that effect.
Examiner respectfully disagrees. Kurabayashi teaches an MR system for generating virtual objects and ensuring said virtual objects are positioned and rendered consistently throughout the environment (Para. 0016). To ensure the virtual objects are consistent, the position and movement of the virtual objects (Para. 0083) and the user (location of the HMD or the user's visual field, Para. 0087-0088 and 0092) are tracked. According to the broadest reasonable interpretation (BRI), characteristics data (data from the MR sensors, Para. 0016) for the initial computer-altered reality data (MR environment, Para. 0043) encompasses any data associated with the MR environment (virtual object data, user data, three-dimensional space data, rendering data, etc., Para. 0020-0022). Interacting with a user or item (virtual object) requires tracking or modifying the position or movement of said user or item, as an interaction can be moving the user or item. Thus, Kurabayashi teaches activity data associated with user interactivity and item interactivity.
Kurabayashi teaches rendering virtual objects (Rendering Part 13, Para. 0094) and that the workload is shared between the network and the local device (Server 100 and HMD 200, Para. 0096). To render virtual objects, their associated data is used; this data includes the user environment (Para. 0019), textures, reflections/illumination (Para. 0130), position/orientation (Para. 0095), meshes, colors/shades (Para. 0130), polygons/voxels/pixels, etc. According to BRI, complexity data associated with rendering complexity is any rendering data that causes the rendering to be complex. Therefore, virtual objects that have large quantities of associated data (multiple textures, reflections, polygons, colors, etc.) can cause complexity when rendering. Thus, Kurabayashi teaches complexity data associated with rendering complexity.
Guim Bernat teaches utilizing edge computing (Para. 0171-0172) for AR/VR workloads (Para. 0195 and 0305-0312). Edge computing is used to reduce latency, traffic, and energy consumption and to improve service capabilities (Para. 0002) by utilizing multiple edges. In edge computing, an aggregation edge layer can be utilized; this layer can be next to the access edge layer, which is closest to the end user or device (Para. 0101). The aggregation edge layer is used by edge aggregation nodes (Para. 0118, 0121, and 0123) and aggregation points (Para. 0141 and 0143) to aggregate traffic, requests, and data. Therefore, AR/VR data (AR/VR applications, Para. 0149; AR/VR application data can include render data) sent to the edges can be aggregated before being sent back to the user’s device. Thus, Guim Bernat teaches transmitting the rendered computer-altered reality data to the user device by collecting the rendered computer-altered reality data from the network locations and aggregating the rendered computer-altered reality data.
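For illustration only, the collect-and-aggregate flow described above can be sketched as follows. All function names, variable names, and values are hypothetical and are not drawn from Guim Bernat or any other cited reference:

```python
# Hypothetical sketch: portions of a scene are rendered at different network
# locations, then collected and aggregated before transmission to the device.

def render_at(location, portion):
    # Stand-in for remote rendering of one portion at one edge node.
    return {"location": location, "portion": portion, "frame": f"rendered:{portion}"}

def collect_and_aggregate(assignments):
    # Collect the rendered results from each network location and merge them
    # into a single payload destined for the user device.
    rendered = [render_at(loc, portion) for portion, loc in assignments.items()]
    return {r["portion"]: r["frame"] for r in rendered}

# Each portion of the computer-altered reality data is assigned a location.
assignments = {"avatar": "near-edge", "background": "far-edge"}
payload = collect_and_aggregate(assignments)
# payload holds one rendered frame per portion, ready for transmission
```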
Guim Bernat teaches classifying network layers based on latency, distance, and timing (Para. 0109). These layers can be considered near edge, middle edge, and far edge (Para. 0109 and 0112). To identify and determine which edges or networks to use, the workload of the application is considered, since workloads contain specific services, objectives, and requirements (Para. 0134 and 0142) that need to be met. Thus, Guim Bernat teaches network category information associated with network locations including edges, near-edges, mid-edges, and far-edges of one or more networks.
Zhang teaches utilizing edge computing for a VR game (VR-MMOGs, Section: Abstract and 1 Introduction), where the edge cloud used is based on the type of in-game event (Section: 4.1 Flow of Gaming in EC+ and 4.2 Edge Cloud Migration). Two of the in-game events are local view change events and global game events (Section: 3.1 View Change Events vs. Game Events, Para. 1-2). These events can involve different aspects of the VR game (Fig. 1 and Section: 3.1 View Change Events vs. Game Events, Para. 1-3) and have their own requirements (tolerable latency, event size, frequency; Table 1) to maintain a seamless gaming experience (Section: 3.1 View Change Events vs. Game Events, Para. 6-7). Therefore, different events are prioritized based on their requirements and have edges selected based on meeting these requirements (Section 5 Edge Cloud Selection of User Mobility). Thus, Zhang teaches utilizing different network locations to render different portions of the computer-altered reality data based on the priorities.
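For illustration only, the event-driven edge selection described above can be sketched as follows. The tier names, latency figures, and event requirements are hypothetical and are not values taken from Zhang:

```python
# Hypothetical sketch: each event type carries a tolerable-latency requirement,
# and an edge tier is selected so that the requirement is met.

EDGE_LATENCY_MS = {"near-edge": 10, "mid-edge": 40, "far-edge": 100}

def select_edge(tolerable_latency_ms):
    # Choose the farthest tier that still satisfies the event's latency
    # requirement; fall back to the nearest tier otherwise.
    for tier in ("far-edge", "mid-edge", "near-edge"):
        if EDGE_LATENCY_MS[tier] <= tolerable_latency_ms:
            return tier
    return "near-edge"

# Tolerable latency (ms) per event type: latency-sensitive view changes vs.
# less time-critical global game events.
events = {"view-change": 20, "global-game-event": 120}
placement = {name: select_edge(latency) for name, latency in events.items()}
```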
Regarding the remaining arguments, applicant argues with respect to the amended claim language, which is fully addressed in the prior art rejections set forth below.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 1 and 16 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 1 and 16 recite the following limitation: “the characteristics data including at least one of user selection data or activity data associated with user interactivity and item interactivity”. Examiner is unsure whether “user interactivity” and “item interactivity” are associated only with “activity data” or also with “user selection data”. Therefore, examiner is interpreting “user interactivity” and “item interactivity” to be associated only with “activity data”. Thus, the claim will be examined as best understood by the Examiner.
Claims 17-20 inherit their indefiniteness from claim 16, from which they depend, and claims 2-6 and 21 inherit their indefiniteness from claim 1, from which they depend.
Claims 1-6 and 17-21 will also be examined as best understood by the Examiner.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-4 and 6-20 are rejected under 35 U.S.C. 103 as being unpatentable over Kurabayashi, U.S. Patent Application Publication No. 2019/0311471 A1 (hereinafter Kurabayashi), in view of Guim Bernat et al., U.S. Patent Application Publication No. 2021/0144517 A1 (hereinafter Guim Bernat), in further view of "Towards Efficient Edge Cloud Augmentation for Virtual Reality MMOGs" by Wuyang Zhang, Jiachen Chen, Yanyong Zhang, and Dipankar Raychaudhuri (hereinafter Zhang).
Regarding Claim 1, Kurabayashi teaches identifying initial computer-altered reality data (MR Environment, Para. 0043) associated with a computer-altered reality item (Virtual Object, Para. 0045, or Real Objects, Para. 0073), the initial computer-altered reality data including at least one of computer-generated data or environment data (User-Environment Determining Part 12), the environment data being received from a user device (HMD 200, Para. 0087);
receiving characteristics data (Data from Sensors) for the initial computer-altered reality data (MR Environment, Para. 0043), the characteristics data (Data from Sensors) including at least one of user selection data or activity data (High-Precision Tracking of User, Para. 0087-0088, or Position and Movement of Virtual Object, Para. 0083) associated with user interactivity (Change in Position/Movement of User, Para. 0087-0088 and 0092) and item interactivity (Change in Position/Movement of Virtual Object, Para. 0083). As stated above, Kurabayashi teaches an MR system for generating virtual objects and ensuring said virtual objects are positioned and rendered consistently throughout the environment (Para. 0016). To ensure the virtual objects are consistent, the position and movement of the virtual objects (Para. 0083) and the user (location of the HMD or the user's visual field, Para. 0087-0088 and 0092) are tracked. According to the broadest reasonable interpretation (BRI), characteristics data (data from the MR sensors, Para. 0016) for the initial computer-altered reality data (MR environment, Para. 0043) encompasses any data associated with the MR environment (virtual object data, user data, three-dimensional space data, rendering data, etc., Para. 0020-0022). Interacting with a user or item (virtual object) requires tracking or modifying the position or movement of said user or item, as an interaction can be moving the user or item. Thus, Kurabayashi teaches activity data associated with user interactivity and item interactivity.
analyzing the characteristics data (Data from Sensors) to identify characteristics data metrics information (Position of Real/Virtual Objects) that includes at least one characteristics data metric (Position or Movement of Virtual Object, Para. 0083, or Point Cloud of Real Objects in Virtual Space, Para. 0073-0075) associated with the computer-altered reality item (Virtual Object, Para. 0045, or Real Objects, Para. 0073), the at least one characteristics data metric (Position or Movement of Virtual Object, Para. 0083, or Point Cloud of Real Objects in Virtual Space, Para. 0073-0075) including at least one of an interaction metric or a motion metric (Position or Movement of Virtual Object, Para. 0083, or Point Cloud of Real Objects in Virtual Space, Para. 0073-0075) and complexity data (Virtual Object Rendering Data containing reflections/illumination, Para. 0130; position/orientation, Para. 0095; and colors/shades, Para. 0130) associated with rendering complexity. As stated above, Kurabayashi teaches rendering virtual objects (Rendering Part 13, Para. 0094) and that the workload is shared between the network and the local device (Server 100 and HMD 200, Para. 0096). To render virtual objects, their associated data is used; this data includes the user environment (Para. 0019), textures, reflections/illumination (Para. 0130), position/orientation (Para. 0095), meshes, colors/shades (Para. 0130), polygons/voxels/pixels, etc. According to BRI, complexity data associated with rendering complexity is any rendering data that causes the rendering to be complex. Therefore, virtual objects that have large quantities of associated data (multiple textures, reflections, polygons, colors, etc.) can cause complexity when rendering. Thus, Kurabayashi teaches complexity data associated with rendering complexity.
and transmitting the rendered (Rendering Part 13) computer-altered reality data (MR Environment, Para. 0043, containing Virtual Objects, Para. 0095) to the user device (HMD 200 and Display Unit 202, Para. 0095) by collecting the rendered computer-altered reality data (MR Environment, Para. 0043, containing Virtual Objects, Para. 0095) from the network (Server 100, Para. 0096).
However, Kurabayashi fails to teach:
identifying priority information associated with the characteristics data based at least in part on the characteristics data metrics information to identify priorities associated with different portions of the computer-altered reality data;
identifying rendering location information based on the priority information and network category information associated with network locations including edges, near- edges, mid-edges, and far-edges of one or more networks;
causing rendering of the initial computer-altered reality data to generate rendered computer-altered reality data based on the rendering location information by utilizing different network locations to render different portions of the computer-altered reality data based on the priorities;
collecting the rendered computer-altered reality data from the network locations and aggregating the rendered computer-altered reality data.
Kurabayashi and Guim Bernat are considered analogous to the claimed invention because both are in the same field of XR/VR/AR/MR rendering data utilizing networks.
Guim Bernat teaches:
identifying rendering location information (Edge Cloud Network) based on the priority information (Workload services, objectives, and requirements, Para. 0091, 0134, and 0142) and network category information (Latency, Distance, and Timing, Para. 0109) associated with network locations including edges, near-edges, mid-edges, and far-edges of one or more networks (Para. 0109 and 0112). As stated above, Guim Bernat teaches classifying network layers based on latency, distance, and timing (Para. 0109). These layers can be considered near edge, middle edge, and far edge (Para. 0109 and 0112). To identify and determine which edges or networks to use, the workload of the application is considered, since workloads contain specific services, objectives, and requirements (Para. 0134 and 0142) that need to be met. Thus, Guim Bernat teaches network category information associated with network locations including edges, near-edges, mid-edges, and far-edges of one or more networks.
collecting the rendered computer-altered reality data (AR/VR Data Sent to Network/Edges, Para. 0195 and 0305-0312) from the network locations (Network/Edges) and aggregating the rendered computer-altered reality data (Aggregating Data on Aggregation Nodes, Para. 0118, 0121, and 0123, and Aggregation Points, Para. 0141 and 0143). As stated above, Guim Bernat teaches utilizing edge computing (Para. 0171-0172) for AR/VR workloads (Para. 0195 and 0305-0312). Edge computing is used to reduce latency, traffic, and energy consumption and to improve service capabilities (Para. 0002) by utilizing multiple edges. In edge computing, an aggregation edge layer can be utilized; this layer can be next to the access edge layer, which is closest to the end user or device (Para. 0101). The aggregation edge layer is used by edge aggregation nodes (Para. 0118, 0121, and 0123) and aggregation points (Para. 0141 and 0143) to aggregate traffic, requests, and data. Therefore, AR/VR data (AR/VR applications, Para. 0149; AR/VR application data can include render data) sent to the edges can be aggregated before being sent back to the user’s device. Thus, Guim Bernat teaches transmitting the rendered computer-altered reality data to the user device by collecting the rendered computer-altered reality data from the network locations and aggregating the rendered computer-altered reality data.
Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi’s characteristics data to be prioritized by Guim Bernat’s workload requirements and to utilize edge computing. Since AR and VR systems are high-resource-demanding services (Guim Bernat, Para. 0611), prioritizing data in such services would be beneficial to guarantee Quality of Service, and utilizing edge computing offers orchestration and management for the services (Guim Bernat, Para. 0004).
Kurabayashi and Guim Bernat fail to explicitly teach:
identifying priority information associated with the characteristics data based at least in part on the characteristics data metrics information to identify priorities associated with different portions of the computer-altered reality data;
causing rendering of the initial computer-altered reality data to generate rendered computer-altered reality data based on the rendering location information by utilizing different network locations to render different portions of the computer-altered reality data based on the priorities;
Kurabayashi, Guim Bernat, and Zhang are considered analogous to the claimed invention because all three of them are in the same field of XR/VR/AR/MR rendering data using networks. Additionally, Guim Bernat and Zhang are in the same field of using edge computing.
Zhang teaches:
identifying priority information (Reduce Latency, Higher Frequency, Event Size, Table 1) associated with the characteristics data (View Change Events or Game Events) based at least in part on the characteristics data metrics information (Movement/Interaction with Game Objects or Players) to identify priorities associated with different portions (Different Game Objects, Different Players, etc.) of the computer-altered reality data (VR-MMOGs Game Data). Zhang teaches that VR-MMOGs are essentially large-scale event-driven systems (Section: 3 A Closer Look at VR-MMOGs, Page 4) that have two fundamental types of user events: local view change events and global game events (Section: 3.1 View Change Events vs. Game Events, Para. 1-2). Such events encompass a wide variety of items (game objects) in the game that can move or be interacted with by the user (Fig. 1, “Movement of NPC” or “Shooting fire”), including the user itself (player data). For a user to interact with, modify, or update said items or themselves, the state, location, and properties of said item/user need to be known to the user or game world (Section: 3.1 View Change Events vs. Game Events, Para. 3). Thus, Zhang teaches that user events contain items (game objects) with associated interaction and motion metrics tied to them.
Zhang also teaches that an important metric associated with the user events is the feedback delay a player experiences when an event occurs (Section: 3.1 View Change Events vs. Game Events, Para. 4-5). Depending on the event, certain levels of feedback delay are acceptable to the player’s experience (Table 1). To ensure events have suitable feedback delay, certain events are prioritized (Section: 3.1 View Change Events vs. Game Events, Para. 6-7). Thus, Zhang teaches that events are prioritized based on a metric, which encompasses the items (game objects) in said events.
causing the rendering (Renderer & Encoder) of the initial computer-altered reality data (VR-MMOGs Game Data) to generate rendered computer-altered reality data (Renderer & Encoder, Fig. 2C) based on the rendering location information (Edge Cloud Network, Section 5.1 Modeling Edge Selection Problem Using MDP, Para. 1) by utilizing different network locations (Different Edges) to render different portions (View Change Events or Game Events) of the computer-altered reality data (VR-MMOGs Game Data) based on the priorities (Reduce Latency, Higher Frequency, Event Size, Table 1). As stated above, Zhang teaches utilizing edge computing for a VR game (VR-MMOGs, Section: Abstract and 1 Introduction), where the edge cloud used is based on the type of in-game event (Section: 4.1 Flow of Gaming in EC+ and 4.2 Edge Cloud Migration). Two of the in-game events are local view change events and global game events (Section: 3.1 View Change Events vs. Game Events, Para. 1-2). These events can involve different aspects of the VR game (Fig. 1 and Section: 3.1 View Change Events vs. Game Events, Para. 1-3) and have their own requirements (tolerable latency, event size, frequency; Table 1) to maintain a seamless gaming experience (Section: 3.1 View Change Events vs. Game Events, Para. 6-7). Therefore, different events are prioritized based on their requirements and have edges selected based on meeting these requirements (Section 5 Edge Cloud Selection of User Mobility). Thus, Zhang teaches utilizing different network locations to render different portions of the computer-altered reality data based on the priorities.
Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi’s computer-altered reality data, prioritized by Guim Bernat to utilize edge computing, to incorporate Zhang’s prioritization based on movement/interaction of virtual game elements. In AR/VR applications, various virtual elements can be interacted with or moved. Utilizing Zhang’s prioritization of events ensures a pleasant user experience (Section: 3.1 View Change Events vs. Game Events, Para. 6-7) by prioritizing events that happen frequently, have lower latency thresholds, and have larger event sizes.
Regarding claim 2, Kurabayashi teaches wherein the initial computer-altered reality data includes initial extended reality (XR) data (Mixed-Reality System. Para. 0041), and the rendered computer-altered reality data includes rendered XR data (Rendering Part 13, Para. 0094).
However, Kurabayashi and Guim Bernat fail to teach causing the rendering of the initial computer-altered reality data further comprises causing rendering of the initial XR data to generate the rendered XR data based on the rendering location information.
Zhang teaches causing the rendering of the initial computer-altered reality data further comprises causing rendering (Fig. 2C, Renderer & Encoder) of the initial XR data (VR-MMOGs Game Data) to generate the rendered XR data (Rendered VR-MMOGs Game Data) based on the rendering location information (Edge Cloud Network; Section 5.1 Modeling Edge Selection Problem Using MDP, Para. 1). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's rendered MR data to be based on Zhang's location of the edge cloud network. Kurabayashi uses GPS, Bluetooth, or Wi-Fi in the User-Environment Determining Part 12 to determine a provisional user environment (Para. 0090), and in a mixed-reality system users can change location. Therefore, it would be beneficial to render the MR data based on rendering location information from edge computing, since the user’s environment can change.
Regarding claim 3, Kurabayashi fails to teach wherein identifying the rendering location information further comprises: identifying network category information, the network category information including at least one of an edge category identifier, a near-edge category identifier, a mid-edge category identifier, or a far-edge category identifier; and identifying, as an identified network category identifier in the rendering location information, the edge category identifier, the near-edge category identifier, the mid-edge category identifier, or the far-edge category identifier.
However, Guim Bernat teaches identifying network category information (Edge Computing System, Para. 0093), the network category information including at least one of an edge category identifier (Classifying Edge Layers, Para. 0109), a near-edge category identifier, a mid-edge category identifier, or a far-edge category identifier; and identifying, as an identified network category identifier in the rendering location information (Edge Cloud Network - Distance, Latency, and Timing Characteristics, Para. 0109), the edge category identifier, the near-edge category identifier, the mid-edge category identifier, or the far-edge category identifier (Para. 0109). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi’s rendering of data to incorporate Guim Bernat’s edge computing system that classifies a network’s edge layers. Since doing so would provide the benefit of distinguishing between edge layers based on characteristics desirable to the service being provided. Such as, if latency is prioritized for communication, choosing edge layers classified as “near edge” or “close edge” would be ideal to reduce latency (Para. 0112).
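For illustration only, the classification of network locations into edge-category identifiers described above can be sketched as follows. The latency thresholds and location names are hypothetical and are not values taken from Guim Bernat:

```python
# Hypothetical sketch: network locations are assigned an edge-category
# identifier (edge, near-edge, mid-edge, or far-edge) based on latency.

def categorize(latency_ms):
    # Smaller latency implies a location closer to the end user or device.
    if latency_ms < 5:
        return "edge"
    if latency_ms < 20:
        return "near-edge"
    if latency_ms < 60:
        return "mid-edge"
    return "far-edge"

# Example network locations with measured round-trip latencies (ms).
locations = {"base-station": 3, "metro-pop": 15, "regional-dc": 45, "core-cloud": 90}
categories = {name: categorize(ms) for name, ms in locations.items()}
```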
Regarding claim 4, Kurabayashi and Guim Bernat teach a network and server used for rendering initial computer-altered reality data (Kurabayashi, Network 50 and Server 100, Para. 0047).
However, Kurabayashi and Guim Bernat fail to teach wherein causing the rendering of the initial computer-altered reality data further comprises: causing at least one server associated with a network category to render, as the rendered computer-altered reality data, the initial computer-altered reality data based on network category information in the rendering location information.
[Zhang, Fig. 2c: media_image1.png, 313 × 242, greyscale]
Zhang teaches causing at least one server (Game Server) associated with a network category (Edge) to render (Renderer & Encoder) data (see Zhang’s Fig. 2c above) based on rendering location information (Edge Location, Section 5.1 Modeling Edge Selection Problem Using MDP, Para. 1). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi and Guim Bernat’s network/server to incorporate Zhang’s game server that is associated with an edge network to render data based on an edge’s location. Determining which edge is used for rendering based on the edge’s location allows networks to fulfill QoS requirements such as latency by choosing edges closer to the user to render data that needs to be frequently updated (Zhang, Section 4 EC+: A VR-MMOG Architecture Augmented by Edge Clouds, and Guim Bernat, Para. 0305).
Regarding claim 6, Kurabayashi teaches a network and server used to transmit data to a user’s display (Kurabayashi, Network 50, Server 100, and Display Unit 202, Para. 0047).
However, Kurabayashi fails to teach transmitting, to at least one server associated with a network location, the initial computer-altered reality data, the network location being indicated, by a network location identifier in the rendering location information, as an edge, a near-edge, a mid-edge, or a far-edge.
Guim Bernat teaches a server (Para. 0095) associated with a network location (Edge Location) that indicates a network location identifier (Network Edge Layers, Para. 0109) based on rendering location information (Edge Cloud Network: distance, latency, and timing characteristics, Para. 0109). The network location identifier identifies network edge layers as “near edge”, “close edge”, “local edge”, “middle edge”, and “far edge” (Para. 0109). Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi’s network/server to incorporate Guim Bernat’s identifying of network edge layers based on rendering location information. Doing so would provide the benefit of distinguishing between edge layers based on characteristics desirable to the service being provided; for example, if latency is prioritized for communication, choosing edge layers classified as “near edge” or “close edge” would be ideal to reduce latency (Para. 0112).
Regarding claim 7, Kurabayashi teaches a system comprising: at least one processor (Processing Unit 101, Para. 0051); and non-transitory memory storing instructions (Storage Unit 104) that, when executed by the at least one processor, cause the at least one processor to perform operations (Para. 0054) comprising:
receiving characteristics data (Data from Sensors) associated with computer-altered reality data (MR Environment, Para. 0043), the characteristics data (Data from Sensors) including at least activity data (High-Precision Tracking of User, Para. 0087-0088, or Position and Movement of Virtual Object, Para. 0083) associated with user interactivity (Change in Position/Movement of User, Para. 0087-0088 and 0092) and item interactivity (Change in Position/Movement of Virtual Object, Para. 0083). As stated above, Kurabayashi teaches an MR system for generating virtual objects and ensuring said virtual objects are positioned and rendered consistently throughout the environment (Para. 0016). To ensure the virtual objects are consistent, the position and movement of the virtual objects (Para. 0083) and the user (location of the HMD or the user's visual field, Para. 0087-0088 and 0092) are tracked. According to the broadest reasonable interpretation (BRI), characteristics data (data from the MR sensors, Para. 0016) for the initial computer-altered reality data (MR environment, Para. 0043) encompasses any data associated with the MR environment (virtual object data, user data, three-dimensional space data, rendering data, etc., Para. 0020-0022). Interacting with a user or item (virtual object) requires tracking or modifying the position or movement of said user or item, as an interaction can be moving the user or item. Thus, Kurabayashi teaches activity data associated with user interactivity and item interactivity.
analyzing the characteristics data (Data from Sensors) to identify characteristics data metrics information (Position of Real/Virtual Objects) that includes at least one characteristics data metric (Position or Movement of Virtual Object, Para. 0083, or Point Cloud of Real Objects in Virtual Space, Para. 0073-0075) associated with the computer-altered reality item (Virtual Object, Para. 0045, or Real Objects, Para. 0073), the at least one characteristics data metric (Position or Movement of Virtual Object, Para. 0083, or Point Cloud of Real Objects in Virtual Space, Para. 0073-0075) including at least one of an interaction metric or a motion metric (Position or Movement of Virtual Object, Para. 0083, or Point Cloud of Real Objects in Virtual Space, Para. 0073-0075) and complexity data (Virtual Object Rendering Data containing reflections/illumination, Para. 0130; position/orientation, Para. 0095; and colors/shades, Para. 0130) associated with rendering complexity. As stated above, Kurabayashi teaches rendering virtual objects (Rendering Part 13, Para. 0094) and that the workload is shared between the network and the local device (Server 100 and HMD 200, Para. 0096). To render virtual objects, their associated data is used; this data includes the user environment (Para. 0019), textures, reflections/illumination (Para. 0130), position/orientation (Para. 0095), meshes, colors/shades (Para. 0130), polygons/voxels/pixels, etc. According to BRI, complexity data associated with rendering complexity is any rendering data that causes the rendering to be complex. Therefore, virtual objects that have large quantities of associated data (multiple textures, reflections, polygons, colors, etc.) can cause complexity when rendering. Thus, Kurabayashi teaches complexity data associated with rendering complexity.
and transmitting the rendered (Rendering Part 13) computer-altered reality data (MR Environment Para. 0043 containing Virtual Objects Para. 0095) to the user device (HMD 200 and Display Unit 202, Para. 0095) by collecting the rendered computer-altered reality data (MR Environment Para. 0043 containing Virtual Objects Para. 0095) from the network (Server 100, Para. 0096).
However, Kurabayashi fails to teach:
identifying computer-altered reality data associated with a user profile of a user associated with a user device;
identifying priority information associated with the characteristics data based at least in part on the characteristics data metrics information to identify priorities associated with different portions of the computer-altered reality data;
identifying rendering location information based on the characteristics data and the priority information and network category information associated with network locations including edges, near-edges, mid-edges, and far-edges of one or more networks;
causing generation of rendered computer-altered reality data based on the rendering location information by utilizing different network locations to render different portions of the computer-altered reality data based on the priorities;
collecting the rendered computer-altered reality data from the network locations and aggregating the rendered computer-altered reality data.
Guim Bernat teaches:
identifying rendering location information (Edge Cloud Network) based on the priority information (Workload services, objectives, and requirements, Para. 0091, 0134 and 0142) and network category information (Latency, Distance, and Timing Para. 0109) associated with network locations including edges, near-edges, mid-edges, and far-edges of one or more networks (Para. 0109 and 0112); As stated above, Guim Bernat teaches classifying network layers based on latency, distance, and timing (Para. 0109). These layers can be considered near edge, middle edge, and far edge (Para. 0109 and 0112). To identify and determine which edges or networks to use, the workload of the application is considered, since workloads contain specific services, objectives, and requirements (Para. 0134 and 0142) that need to be met. Thus, Guim Bernat teaches network category information associated with network locations including edges, near-edges, mid-edges, and far-edges of one or more networks.
collecting the rendered computer-altered reality data (AR/VR Data Sent to Network/Edges, Para. 0195 and 0305-0312) from the network locations (Network/Edges) and aggregating the rendered computer-altered reality data (Aggregating Data on Aggregating Nodes Para. 0118, 0121, and 0123 and aggregation points Para. 0141 and 0143). As stated above, Guim Bernat teaches utilizing edge computing (Para. 0171-0172) for AR/VR workloads (Para. 0195 and 0305-0312). Edge computing is used to reduce latency, traffic, and energy consumption, and to improve service capabilities (Para. 0002) by utilizing multiple edges. In edge computing, an aggregation edge layer can be utilized; this layer can be next to the access edge layer, which is closest to the end user or device (Para. 0101). The aggregation edge layer is used by edge aggregation nodes (Para. 0118, 0121, and 0123) and aggregation points (Para. 0141 and 0143) to aggregate traffic, requests, and data. Therefore, AR/VR data (AR/VR Applications Para. 0149; AR/VR application data can include render data) sent to the edges can be aggregated before being sent back to the user's device. Thus, Guim Bernat teaches transmitting the rendered computer-altered reality data to the user device by collecting the rendered computer-altered reality data from the network locations and aggregating the rendered computer-altered reality data.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's characteristics data to be prioritized by Guim Bernat's workload requirements and to utilize edge computing. Since AR and VR systems are high-resource-demanding services (Guim Bernat, Para. 0611), prioritizing data in such services would be beneficial to guarantee quality of service, and utilizing edge computing offers orchestration and management for the services (Guim Bernat, Para. 0004).
Guim Bernat fails to explicitly teach:
identifying computer-altered reality data associated with a user profile of a user associated with a user device;
identifying priority information associated with the characteristics data based at least in part on the characteristics data metrics information to identify priorities associated with different portions of the computer-altered reality data;
causing generation of rendered computer-altered reality data based on the rendering location information by utilizing different network locations to render different portions of the computer-altered reality data based on the priorities;
Zhang teaches:
identifying computer-altered reality data (VR-MMOGS Game Data) associated with a user profile (See Fig. 2C Above) of a user associated with a user device (Section 3.2.2 Cloud-Centric Video Streaming Gaming, Para. 1).
identifying priority information (Reduce Latency, Higher Frequency, Event Size, Table 1) associated with the characteristics data (View Change Events or Game Events) based at least in part on the characteristics data metrics information (Movement/Interaction with Game Objects or Players) to identify priorities associated with different portions (Different Game Objects, Different Players, etc.) of the computer-altered reality data (VR-MMOGS Game Data); Zhang teaches that VR-MMOGs are essentially large-scale event-driven systems (Section: 3 A Closer Look at VR-MMOGS, Page 4) that have two fundamental types of user events, local view change events and global game events (Section: 3.1 View Change Events vs. Game Events, Para. 1-2). Such events encompass a wide variety of items (Game Objects) in the game that can move or be interacted with by the user (Fig. 1, "Movement of NPC" or "Shooting fire"), including the user itself (Player Data). For a user to interact with, modify, or update said items or themselves, the state, location, and properties of said item/user need to be known to the user or game world (Section: 3.1 View Change Events vs. Game Events, Para. 3). Thus, Zhang teaches that user events contain items (game objects) with associated interaction and motion metrics tied to them.
Zhang also teaches that an important metric associated with the user events is the feedback delay a player experiences when an event occurs (Section: 3.1 View Change Events vs. Game Events, Para. 4-5). Depending on the event, certain levels of feedback delay are acceptable to the player's experience (Table 1). To ensure events have suitable feedback delay, certain events are prioritized (Section: 3.1 View Change Events vs. Game Events, Para. 6-7). Thus, Zhang teaches that events are prioritized based on a metric, which encompasses the items (Game Objects) in said events.
causing generation of rendered (Renderer & Encoder) computer-altered reality data (VR-MMOGS Game Data) based on the rendering location information (Edge Cloud Network, Section 5.1 Modeling Edge Selection Problem Using MDP, Para. 1) by utilizing different network locations (Different Edges) to render different portions (View Change Events or Game Events) of the computer-altered reality data (VR-MMOGS Game Data) based on the priorities (Reduce Latency, Higher Frequency, Event Size, Table 1); As stated above, Zhang teaches utilizing edge computing for a VR game (VR-MMOGs, Section: Abstract and 1 Introduction) where the edge cloud used is based on the type of in-game event (Section: 4.1 Flow of Gaming in EC+ and 4.2 Edge Cloud Migration). Two of the in-game events are local view change events and global game events (Section: 3.1 View Change Events vs. Game Events, Para. 1-2). These events can involve different aspects of the VR game (Fig. 1 and Section: 3.1 View Change Events vs. Game Events, Para. 1-3) and have their own requirements (Tolerable Latency, Event Size, Frequency, Table 1) to maintain a seamless gaming experience (Section: 3.1 View Change Events vs. Game Events, Para. 6-7). Therefore, different events are prioritized based on their requirements and have edges selected based on meeting these requirements (Section 5 Edge Cloud Selection of User Mobility). Thus, Zhang teaches utilizing different network locations to render different portions of the computer-altered reality data based on the priorities.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's computer-altered reality data, as prioritized by Guim Bernat to utilize edge computing, to incorporate Zhang's prioritization based on movement/interaction of virtual game elements. In AR/VR applications, various virtual elements can be interacted with or moved. Utilizing Zhang's prioritization of events ensures a pleasant user experience (Section: 3.1 View Change Events vs. Game Events, Para. 6-7) by prioritizing events that happen frequently, have lower latency thresholds, and have larger event sizes.
Regarding claim 8, it has similar limitations as claims 4 and 5; therefore, it is rejected under the same rationale as claims 4 and 5.
Regarding claim 9, Kurabayashi and Guim Bernat fail to teach wherein causing the generation of rendered computer-altered reality data further comprises: causing rendering of the computer-altered reality data at a network location based on an interaction metric, the interaction metric corresponding to a level of interaction associated with a computer-altered reality data item, the computer-altered reality data item being indicated by a computer-altered reality data item identifier in the characteristics data.
However, Zhang teaches rendering data (Renderer & Encoder) at a network location (See Fig. 2C Above) based on an interaction metric (Action) that corresponds to a level of interaction (Frequency, Table 1) associated with an item (User, NPC, or Game Object) that has been identified from characteristics data (View Change Events or Game Events). View change events and game events contain various actions that can be interactions with items (Section: 3.1 View Change Events vs. Game Events), such as interacting with a user's character to change their field of view (Fig. 1). These events are then categorized by latency, size, and frequency to prioritize them (Table 1). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's data, as prioritized by Guim Bernat's method, to incorporate prioritizing data based on an item's characteristic, such as interaction, as taught by Zhang. This allows data to be prioritized such that items/users that are interacted with frequently are prioritized more to ensure low latency, since high levels of latency are detrimental in high-resource-demanding systems like AR/VR (Guim Bernat, Para. 0611) and can cause individuals to experience dizziness (Zhang, Section: 3.1 View Change Events vs. Game Events, Para. 6).
Regarding claim 10, Kurabayashi and Guim Bernat fail to teach wherein causing the generation of rendered computer-altered reality data further comprises: causing rendering of the computer-altered reality data at a network location based on a motion metric, the motion metric corresponding to a level of motion associated with a computer-altered reality data item, the computer-altered reality data item being indicated by a computer-altered reality data item identifier in the characteristics data.
However, Zhang teaches rendering data (Renderer & Encoder) at a network location (See Fig. 2C Above) based on a motion metric (Action or Movement) that corresponds to a level of motion (Frequency, Table 1) associated with an item (User, NPC, or Game Object) that has been identified from characteristics data (View Change Events or Game Events). View change events and game events (Section: 3.1 View Change Events vs. Game Events) contain various actions that relate to motion, such as movement of an item, NPC, or user (Fig. 1). These events are then categorized by latency, size, and frequency to prioritize them (Table 1). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's data, as prioritized by Guim Bernat's method, to incorporate prioritizing data based on an item's characteristic, such as motion, as taught by Zhang. This allows data to be prioritized such that items/users that move frequently are prioritized more to ensure low latency, since high levels of latency are detrimental in high-resource-demanding systems like AR/VR (Guim Bernat, Para. 0611) and can cause individuals to experience dizziness (Zhang, Section: 3.1 View Change Events vs. Game Events, Para. 6).
Regarding claim 11, Kurabayashi and Guim Bernat fail to teach wherein causing the generation of rendered computer-altered reality data further comprises:
causing rendering, at a first network location, and based on a first interaction metric and a first motion metric, of a first portion of computer-altered reality data, the first network location being associated with a first distance from the user device;
and causing rendering, at a second network location and based on a second interaction metric and a second motion metric, of a second portion of the computer-altered reality data, the second network location being associated with a second distance from the user device, the second distance being greater than or equal to the first distance, at least one of the second interaction metric or the second motion metric being less than at least one of the first interaction metric or the first motion metric, respectively.
However, Zhang teaches rendering (Renderer & Encoder, See Fig. 2C Above), at a first network location (First Edge Cloud) and based on a first interaction metric and a first motion metric (View Change Event and Game Event), of a first portion of the data, the first network location (Gaming Service) being associated with a first distance (Player's Location) from the user device (Section 5.1 Modeling Edge Selection Problem Using MDP). Zhang further teaches rendering (Renderer & Encoder, See Fig. 2C Above), at a second network location (Destination Edge Cloud) and based on a second interaction metric and a second motion metric (View Change Event and Game Event), of a second portion of the data, the second network location (Gaming Service) being associated with a second distance (Change in Player's Location) from the user device, the second distance (Player's Changed Location) being greater than or equal (Network Transmission Cost) to the first distance (Player's Initial Location), and at least one of the second interaction metric or second motion metric (View Change Event and Game Event) being less than (Change in Workload) one of the first interaction metric or first motion metric (View Change Event and Game Event) (Section 5.1 Modeling Edge Selection Problem Using MDP). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's and Guim Bernat's network location to be determined by data that compares interaction/motion metrics and the distance between the network and the user device, as taught by Zhang, since doing so would provide the benefit of prioritizing data that is interacted with or moved frequently to be on a network closer to the user device to reduce latency. When defining a compute edge for AR/VR, choosing an edge closer to the device brings compute resources closer to workload data (Guim Bernat, Para. 0108, 0263, 0614, and 1136).
Regarding claim 12, Kurabayashi teaches wherein the computer-altered reality data includes extended reality (XR) data (Mixed-Reality System. Para. 0041), and the rendered computer-altered reality data includes rendered XR data (Rendering Part 13, Para. 0094).
However, Kurabayashi fails to teach wherein causing the generation of the rendered computer-altered reality data further comprises:
causing, based on a first portion of the XR data, at least one first server associated with a network edge location to generate a first rendered portion of the rendered XR data;
causing, based on a second portion of the XR data, at least one second server associated with a network near-edge location to generate a second rendered portion of the rendered XR data;
causing, based on a third portion of the XR data, at least one third server associated with a network mid-edge location to generate a third rendered portion of the rendered XR data;
and causing, based on a fourth portion of the XR data, at least one fourth server associated with a network far-edge location to generate a fourth rendered portion of the rendered XR data.
Guim Bernat teaches four portions of XR data (Fig. 1 and Fig. 2) associated with four servers at network edges ("near edge", "local edge", "middle edge", "far edge") that render the portions of XR data (Para. 0109-0112). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's single network to incorporate Guim Bernat's edge layers, since doing so would provide the benefit of computing data across several servers. By computing data that requires low latency at the closer edges and data that does not at the farther edges, the workload of the system is spread out (Para. 0110, 1136).
Regarding claim 13, it has similar limitations as claims 2 and 3, with the additional limitation of an edge location of a network to which the user device is communicatively coupled; therefore, it is rejected under the same rationale as claims 2 and 3, with the additional limitation rejected below.
Kurabayashi fails to teach an edge location of a network to which the user device is communicatively coupled.
However, Guim Bernat teaches an edge location of a network to which the user device is communicatively coupled (Guim Bernat, Para. 0712). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's network to incorporate Guim Bernat's edge network that is communicatively coupled to the user's device, since doing so allows the user's device and the edge network to exchange data with each other.
Regarding claim 14, Kurabayashi teaches wherein the computer-altered reality data includes extended reality (XR) data (Mixed-Reality System. Para. 0041), and the rendered computer-altered reality data includes rendered XR data (Rendering Part 13, Para. 0094).
However, Kurabayashi fails to teach wherein causing the generation of the rendered computer-altered reality data further comprises:
causing, based on a first portion of the XR data having a first priority, at least one first server associated with a network edge location or a service provider network cloud location to generate a first rendered portion of the rendered XR data;
and causing, based on a second portion of the XR data having a second priority that is less than the first priority, at least one second server associated with a decentralized overlay network location or a public cloud overlay network location to generate a second rendered portion of the rendered XR data.
Guim Bernat teaches a first and second portion of XR data (AR/VR, Para. 0305) that have a first and second priority (Priority Data or QoS constraints, Para. 0091 and 0265), each associated with a first or second server that renders the portion of XR data. The first server is associated with a network edge location or service provider network cloud location (Para. 0109-0112). The second server is associated with a decentralized overlay network location or public cloud overlay network location (Para. 0120 and Fig. 1 and 2). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's network/server to incorporate Guim Bernat's overlay network, which has multiple edge layers that prioritize data to determine which edge layer is used, since doing so provides the benefit of using edge layers to reduce latency by sending higher-priority data to the closer edges and pushing lower-priority data to the farther edges (Para. 0091, 0614, 1136).
Regarding claim 15, Kurabayashi fails to teach wherein causing the generation of the rendered computer-altered reality data further comprises:
causing, based on a first item of the computer-altered reality data having a first priority, at least one first server associated with service provider network to generate a first rendered portion of the rendered computer-altered reality data;
and causing, based on a second item of the computer-altered reality data having a second priority that is less than the first priority, at least one second server associated with a public network to generate a second rendered portion of the rendered computer-altered reality data.
Guim Bernat teaches data (AR/VR, Para. 0305) that has a first and second priority (Priority Data or QoS constraints, Para. 0091 and 0265), each associated with a first or second server that renders the data. The first server is associated with a service provider network and the second server with a public network (Para. 0097-0098 and Fig. 1). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's network/server to incorporate Guim Bernat's edge layers containing servers that receive data based on priorities or constraints, since doing so provides the benefit of using edge layers to reduce latency by sending higher-priority data to the closer edges and pushing lower-priority data to the farther edges (Para. 0091, 0614, 1136).
However, Kurabayashi and Guim Bernat fail to teach:
causing, based on a first item of the computer-altered reality data having a first priority, at least one first server associated with service provider network to generate a first rendered portion of the rendered computer-altered reality data;
and causing, based on a second item of the computer-altered reality data having a second priority that is less than the first priority, at least one second server associated with a public network to generate a second rendered portion of the rendered computer-altered reality data.
Zhang teaches causing, based on a first item (NPC, User, Game Environment, Fig. 1) of the computer-altered reality data (VR-MMOGS Game Data) having a first priority (Latency, Frequency, Size, Table 1), at least one first server associated with a service provider network to generate a first rendered (Renderer & Encoder) portion of the rendered computer-altered reality data (See Fig. 2C Above); and causing, based on a second item (NPC, User, Game Environment, Fig. 1) of the computer-altered reality data (VR-MMOGS Game Data) having a second priority (Latency, Frequency, Size, Table 1) that is less than the first priority, at least one second server associated with a public network to generate a second rendered portion (Renderer & Encoder, See Fig. 2C Above) of the rendered computer-altered reality data (Section: 3.1 View Change Events Vs Game Events, Fig. 1). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's network/server and Guim Bernat's edge layers, which utilize multiple servers that prioritize data, to incorporate Zhang's item-based prioritization. AR/VR environments include online gaming, which is a top priority for edge computing (Guim Bernat, Para. 0265). Most online gaming data consists of items that need updating, such as players, NPCs, and objects (Zhang, Section 3.1 View Change Events Vs. Game Events). Hence, it would make sense for an AR/VR system to prioritize data based on item characteristics to determine edge layers for edge computing.
Regarding claim 16, server claim 16 is drawn to the server corresponding to the method of claim 7; therefore, server claim 16 is rejected for the same reasons of obviousness as stated above.
Regarding claim 17, Kurabayashi fails to teach wherein causing rendering of the computer-altered reality data further comprises:
causing rendering, at a first network location associated with a first distance from the user device, of a first item of the computer-altered reality data;
and causing rendering, at a second network location associated with a second distance from the user device, of a second item of the computer-altered reality data, the second distance being greater than or equal to the first distance based on a first priority associated with the first item greater than a second priority associated with the second item.
Guim Bernat teaches a first and second network location (Edge Cloud Layers) associated with a first and second distance from a user's device (Para. 0100 and 0109, Fig. 1), the second distance being greater than or equal to the first distance based on a first and second priority (Latency, Para. 0198, 0112, and 0265). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's network/server to incorporate Guim Bernat's multiple edge layers that prioritize data based on distance and priority, since doing so provides the benefit of using edge layers to reduce latency by sending higher-priority data to the closer edges and pushing lower-priority data to the farther edges (Para. 0091, 0614, 1136).
However, Kurabayashi and Guim Bernat fail to further teach:
causing rendering, at a first network location associated with a first distance from the user device, of a first item of the computer-altered reality data;
and causing rendering, at a second network location associated with a second distance from the user device, of a second item of the computer-altered reality data, the second distance being greater than or equal to the first distance based on a first priority associated with the first item greater than a second priority associated with the second item.
Zhang teaches causing rendering, at a first network location associated with a first distance from the user device, of a first item (NPC, User, Game Environment, Fig. 1) of the computer-altered reality data; and causing rendering, at a second network location (Edge Cloud Network) associated with a second distance (Player's Location) from the user device (Section 5.1 Modeling Edge Selection Problem Using MDP), of a second item (NPC, User, Game Environment, Fig. 1) of the computer-altered reality data, the second distance being greater than or equal to the first distance based on a first priority (Latency, Frequency, Size, Table 1) associated with the first item being greater than a second priority (Latency, Frequency, Size, Table 1) associated with the second item (Section: 3.1 View Change Events Vs Game Events; Section 5.1 Modeling Edge Selection Problem Using MDP). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's network/server and Guim Bernat's edge layers, which contain servers that prioritize data based on priorities and distance, to incorporate Zhang's item-based prioritization. AR/VR environments include online gaming, which is a top priority for edge computing (Guim Bernat, Para. 0265). Most online gaming data consists of items that need updating, such as players, NPCs, and objects (Zhang, Section 3.1 View Change Events Vs. Game Events). Hence, it would make sense for an AR/VR system to prioritize data based on item characteristics to determine edge layers for edge computing.
Regarding claim 18, it has similar limitations as claims 11 and 17; therefore, it is rejected under the same rationale as claims 11 and 17.
Regarding claim 19, it has similar limitations as claim 13; therefore, it is rejected under the same rationale as claim 13.
Regarding claim 20, it has similar limitations as claim 14; therefore, it is rejected under the same rationale as claim 14.
Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Kurabayashi, U.S. Patent Application No. 20190311471 A1 (hereinafter Kurabayashi), in view of Guim Bernat et al., U.S. Patent Application No. 20210144517 A1 (hereinafter Guim Bernat), and "Towards Efficient Edge Cloud Augmentation for Virtual Reality MMOGs" by Wuyang Zhang, Jiachen Chen, Yanyong Zhang, and Dipankar Raychaudhuri (hereinafter Zhang), in further view of McHugh et al., U.S. Patent No. 11145096 B2 (hereinafter McHugh).
Regarding claim 21, Kurabayashi, Guim Bernat, and Zhang fail to explicitly teach the method of claim 1, wherein the motion data includes multiple levels of motion each associated with a respective priority level, wherein a greater amount of motion is associated with a higher priority.
Kurabayashi, Guim Bernat, Zhang, and McHugh are considered analogous to the claimed invention because all four of them are in the same field of XR/VR/AR/MR rendering data.
McHugh teaches the method of claim 1, wherein the motion (Motion Sensor, Col. 27 Lines 39-56) data (Motion/Interaction of Virtual Object) includes multiple levels of motion (Col. 18 Lines 16-54) each associated with a respective priority level, wherein a greater amount of motion is associated with a higher priority (Col. 27 Lines 58-67 and Col. 28 Lines 1-8). McHugh's priority system allows priority levels to be assigned to virtual objects based on criteria, which can be the movement or interaction of said virtual objects. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kurabayashi's computer-altered reality data, as modified by Guim Bernat and rendered by Zhang, to incorporate McHugh's motion priority system, since doing so would provide the benefit of enabling a rendering engine to identify rules and relations between virtual objects (McHugh, Col. 14 Lines 63-67 and Col. 15 Lines 1-11), which allows the rendering engine to modify virtual objects to provide more flexibility to the system.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIANNA R COCHRAN whose telephone number is (571)272-4671. The examiner can normally be reached Mon-Fri. 7:30am - 5:00pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington can be reached at (571) 272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/BRIANNA RENAE COCHRAN/Examiner, Art Unit 2615
/ALICIA M HARRINGTON/Supervisory Patent Examiner, Art Unit 2615