DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
2. The information disclosure statements (IDS) submitted on 02/20/2025 and 06/20/2025 have been received. The submissions are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements are being considered by the examiner.
Claim Rejections - 35 USC § 103
3. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
6. Claims 21-24, 28-35, 37, 39, and 40 are rejected under 35 U.S.C. 103 as being unpatentable over Todasco (US 2017/0123750 A1) in view of Hackett et al. (US 2016/0370971 A1).
7. With reference to claim 21, Todasco teaches A tangible, non-transitory, machine-readable medium storing instructions that, when executed, effectuate operations (“a non-transitory machine readable medium has stored thereon machine readable instructions executable to cause a machine to perform operations such as creating a virtual object that is displayable on one or more electronic displays when detected,” [0020]) Todasco also teaches obtaining, with a computer system, a three-dimensional model in a three-dimensional virtual space, the virtual space having a world-space coordinate system, (“the system may implement the virtual object request at process 101 in a virtual world and/or map for interaction. In some examples, the server may store the existence of the virtual object in a database with associated settings and information for rendering by user devices. For example, the virtual object may be associated with a 3-D model that may be rendered for viewing on a display. The 3-D model may be created by a user, selected from a predetermined set of 3-D models, and/or a combination of user created elements (such as text, color, lines, drawings, etc.) applied to a predetermined 3-D model. The system may give the virtual object a location in space in a virtual world and/or map. In some embodiments, the virtual object may be assigned global positioning system (GPS) coordinates. In some embodiments, the virtual object may be assigned a coordinate location within a virtual world coordinate system. 
In some examples, the coordinate location in the virtual world may be Cartesian coordinates.” [0030] “a device that includes computer system 700 may comprise a personal computing device (e.g., a smart or mobile phone, a computing tablet, a personal computer, laptop, wearable device, PDA, Bluetooth device, key FOB, badge, etc.).” [0085]) Todasco further teaches receiving, with the computer system, a first adjustment of a position of the model in the world-space coordinate system; determining, with the computer system, in response, moving the model in the world-space coordinate system (“the virtual object may be assigned a coordinate location within a virtual world coordinate system. In some examples, the coordinate location in the virtual world may be Cartesian coordinates. In some embodiments, the virtual world coordinate system may be translatable, transformed, and/or mapped to a GPS coordinate system along with other real world positional information, such as elevation and orientation. In some embodiments, the 3-D model may have one or more vectors to indicate the orientation of the object and/or the direction the object is facing in relation to one or more coordinate systems.” [0030] “the user device may process the user input and accordingly adjust the virtual object, such as changing the coordinate location of the virtual object, changing a setting on the virtual object, playing an animation related to the virtual object, changing possession of the virtual object, changing accounts and/or user device that may view/display or not view/display the virtual object, causing an action (e.g. transferring currency, transferring ownership of goods, providing a ticket for access to a service, displaying an animation, etc.), and/or the like.
… the user device may provide the system and/or server orchestrating the virtual world updates to the changes and/or adjustments of the virtual object that occurs at process 207.” [0044-0045] “a device that includes computer system 700 may comprise a personal computing device (e.g., a smart or mobile phone, a computing tablet, a personal computer, laptop, wearable device, PDA, Bluetooth device, key FOB, badge, etc.).” [0085])
[Greyscale figure reproduced from Todasco: media_image1.png]
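Todasco's assignment of a Cartesian location in a virtual-world coordinate system that "may be translatable, transformed, and/or mapped to a GPS coordinate system" ([0030], quoted above) can be sketched in outline. The following is a hypothetical illustration only: the function name, the local linear approximation, and the meters-per-degree constant are assumptions for illustration, not Todasco's implementation.

```python
import math

METERS_PER_DEG_LAT = 111_320.0  # rough meters per degree of latitude (assumed)

def virtual_to_gps(xy_meters, origin_lat_lon):
    """Map a Cartesian virtual-world location (meters from an origin)
    to an approximate (lat, lon) pair via a local linear approximation."""
    x, y = xy_meters
    lat0, lon0 = origin_lat_lon
    lat = lat0 + y / METERS_PER_DEG_LAT
    lon = lon0 + x / (METERS_PER_DEG_LAT * math.cos(math.radians(lat0)))
    return (lat, lon)

# Anchor the virtual world's origin at an arbitrary GPS fix:
origin = (40.0, -75.0)
assert virtual_to_gps((0.0, 0.0), origin) == origin
lat, lon = virtual_to_gps((0.0, 111_320.0), origin)
assert abs(lat - 41.0) < 1e-9  # one degree of latitude north
```

A linear approximation of this kind is only locally accurate; any production mapping would use a proper geodetic projection.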
Todasco does not explicitly teach the model having a material thereon; the material is unlocked relative to the position of the model and without moving the material in the world-space coordinate system; receiving, with the computer system, a request to lock the material to the position of the model and, in response to receiving a second adjustment of the position of the model, moving the material with the model in the world-space coordinate system. Hackett teaches these limitations. Hackett teaches the model having a material thereon; (“As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. The normal of one or both of the triangles that define the quad can be used to define a forward vector. That is, the normal of the pointer object represents the normal of a first triangle in the quad. Similarly, the normal of the pointer object in the current position represents the normal of the second triangle. A right vector can be obtained by performing the cross product of the two normals. Each movement the user makes can be used to generate quads, and each quad can be stitched or appended together to generate a smooth brushstroke (e.g., ribbon of color, texture, line drawing, or other object or artifact representing user movement when generating 3D drawing content in the VR space). The look of a quad can be defined by the texture, material, color, and shade or luminance.” [0050-0051]) Hackett also teaches the material is unlocked relative to the position of the model and without moving the material in the world-space coordinate system; receiving, with the computer system, a request to lock the material to the position of the model and, in response to receiving a second adjustment of the position of the model, moving the material with the model in the world-space coordinate system.
(“The systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content, the x-y plane or y-z plane (or other coordinate system) that the content is being generated in. … the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “the devices 102, 103, 104, 106, and 108 can be laptop or desktop computers, smartphones, personal digital assistants, portable media players, tablet computers, gaming devices, or other appropriate computing devices that can communicate, using the network 101, with other computing devices or computer systems.” [0044] “the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. 
The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0052] “In operation of VR drawing system 108, the user is in control of an input device (e.g., such as a pointer object in a graphical user interface). When the pointer object is activated, the system 108 can record the pointer object position. As the pointer object moves, the system 108 can measure the difference from a previously recorded pointer object position and generate a new quad in response to the pointer object being moved by the user.” [0060]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide virtual object interactions that translate well into a virtual reality environment.
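The claimed lock/unlock behavior addressed in this rejection can be sketched in outline. This is a hypothetical illustration of the claim language only, not an implementation from either reference; all names (Model, Material, material_locked) are assumed.

```python
# Hypothetical sketch: a material whose world-space anchoring can be
# locked or unlocked relative to a model's position.
from dataclasses import dataclass, field

@dataclass
class Material:
    world_offset: tuple = (0.0, 0.0, 0.0)  # material's anchor in world space

@dataclass
class Model:
    position: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    material: Material = field(default_factory=Material)
    material_locked: bool = False  # material starts unlocked, per the claim

    def move(self, dx, dy, dz):
        """Adjust the model's position in the world-space coordinate system."""
        self.position = [self.position[0] + dx,
                         self.position[1] + dy,
                         self.position[2] + dz]
        if self.material_locked:
            # Locked: the material moves with the model in world space.
            ox, oy, oz = self.material.world_offset
            self.material.world_offset = (ox + dx, oy + dy, oz + dz)
        # Unlocked: the material's world-space anchor is left untouched,
        # so the model slides "through" the material.

m = Model()
m.move(1.0, 0.0, 0.0)            # first adjustment: material is not moved
assert m.material.world_offset == (0.0, 0.0, 0.0)
m.material_locked = True          # request to lock the material to the model
m.move(0.0, 2.0, 0.0)            # second adjustment: material follows
assert m.material.world_offset == (0.0, 2.0, 0.0)
```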
8. With reference to claim 22, Todasco does not explicitly teach the model comprises a mesh having polygons that define a shape of the model; and the material is mapped to one or more of the polygons, wherein the mapping is based on a position of the one or more of the polygons in the world-space coordinate system. These are what Hackett teaches. Hackett teaches the model comprises a mesh having polygons that define a shape of the model; (“Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space. The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032]) Hackett also teaches the material is mapped to one or more of the polygons, wherein the mapping is based on a position of the one or more of the polygons in the world-space coordinate system. (“The systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content, the x-y plane or y-z plane (or other coordinate system) that the content is being generated in. … the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. 
The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. The normal of one or both of the triangles that define the quad can be used to define a forward vector. That is, the normal of the pointer object represents the normal of a first triangle in the quad. Similarly, the normal of the pointer object in the current position represents the normal of the second triangle.” [0050] “the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. 
In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0052]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide virtual object interactions that translate well into a virtual reality environment.
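The mapping recited in claim 22, in which the material's mapping to a polygon depends on the world-space position of that polygon, can be sketched with a simple planar projection. The projection choice and the function name are illustrative assumptions, not taken from either reference.

```python
# Hypothetical sketch: derive a polygon's texture coordinates from the
# world-space positions of its vertices (a minimal planar projection).
def world_space_uv(vertex, scale=1.0):
    """Project a world-space vertex (x, y, z) onto the material's UV
    plane by dropping the z axis and scaling x/y."""
    x, y, _ = vertex
    return (x * scale, y * scale)

# One triangle of the model's mesh, in world-space coordinates:
triangle = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
uvs = [world_space_uv(v) for v in triangle]
assert uvs == [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
```

Because the UVs come from world-space positions rather than per-vertex attributes, a polygon in a different world location samples a different portion of the material.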
9. With reference to claim 23, Todasco does not explicitly teach moving the model without moving the material comprises: receiving, with the computer system, new positional information of the model after moving the model; determining, with the computer system, based on the new positional information for the model, an updated mapping of the material to one or more polygons defining the shape of the model, the one or more polygons bounding a different portion of the material; and rendering, with the computer system, the model with the different portion of the material. This is what Hackett teaches (“The systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content, the x-y plane or y-z plane (or other coordinate system) that the content is being generated in. … the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space. The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. 
In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032] “the devices 102, 103, 104, 106, and 108 can be laptop or desktop computers, smartphones, personal digital assistants, portable media players, tablet computers, gaming devices, or other appropriate computing devices that can communicate, using the network 101, with other computing devices or computer systems.” [0044] “The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. The normal of one or both of the triangles that define the quad can be used to define a forward vector. That is, the normal of the pointer object represents the normal of a first triangle in the quad. Similarly, the normal of the pointer object in the current position represents the normal of the second triangle.” [0050] “the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. 
In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0052] “In operation of VR drawing system 108, the user is in control of an input device (e.g., such as a pointer object in a graphical user interface). When the pointer object is activated, the system 108 can record the pointer object position. As the pointer object moves, the system 108 can measure the difference from a previously recorded pointer object position and generate a new quad in response to the pointer object being moved by the user.” [0060]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide virtual object interactions that translate well into a virtual reality environment.
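Claim 23's recomputation of the mapping after the model moves, so that the same polygons bound a different portion of the material, can be sketched as follows. The names and the planar world-space projection are illustrative assumptions only.

```python
# Hypothetical sketch: after the model moves, the material mapping is
# recomputed from the polygons' new world-space positions.
def remap(vertices):
    """Recompute planar world-space UVs (drop z) for the given vertices."""
    return [(x, y) for (x, y, _) in vertices]

triangle = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
before = remap(triangle)

# Move the model (and hence the polygon) +2 along x, then remap:
moved = [(x + 2.0, y, z) for (x, y, z) in triangle]
after = remap(moved)

assert before == [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
assert after == [(2.0, 0.0), (3.0, 0.0), (2.0, 1.0)]  # different portion
```

Rendering with the updated mapping then shows the model wearing the portion of the material its polygons now occupy in world space.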
10. With reference to claim 24, Todasco does not explicitly teach moving the material with the model comprises: moving a mapping of the material to one or more polygons defining the shape of the model with the model; and
rendering, with the computer system, the model with the material. This is what Hackett teaches (“Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space. The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032] “the devices 102, 103, 104, 106, and 108 can be laptop or desktop computers, smartphones, personal digital assistants, portable media players, tablet computers, gaming devices, or other appropriate computing devices that can communicate, using the network 101, with other computing devices or computer systems.” [0044] “The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. The normal of one or both of the triangles that define the quad can be used to define a forward vector. That is, the normal of the pointer object represents the normal of a first triangle in the quad. 
Similarly, the normal of the pointer object in the current position represents the normal of the second triangle.” [0050] “the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. 
That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0052]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide virtual object interactions that translate well into a virtual reality environment.
11. With reference to claim 28, Todasco does not explicitly teach the operations further comprising: independently locking and unlocking scale and position of the material relative to the model. This is what Hackett teaches (“The systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content, the x-y plane or y-z plane (or other coordinate system) that the content is being generated in. … the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space. The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032] “The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. 
The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. The normal of one or both of the triangles that define the quad can be used to define a forward vector. That is, the normal of the pointer object represents the normal of a first triangle in the quad. Similarly, the normal of the pointer object in the current position represents the normal of the second triangle.” [0050] “the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. 
For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0052]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide virtual object interactions that translate well into a virtual reality environment.
12. With reference to claim 29, Todasco does not explicitly teach the scale of the material is locked and the position of the material is unlocked relative to the model, and wherein the operations further comprise: determining in response to adjustment of the scale of the model, adjusting the scale of the material in the world-space coordinate system based on the adjustment to the scale of the model to relatively resize the material to the model; determining after moving the model to a new position, based on new positional information for the model, an updated mapping of the material to one or more polygons defining the shape of the model, the one or more polygons bounding a different portion of the material; and causing the model to be rendered in the virtual world with the material mapped on the model based on the updated mapping and the material sized in the world-space coordinate system according to the adjustment. This is what Hackett teaches (“The systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content, the x-y plane or y-z plane (or other coordinate system) that the content is being generated in. … the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space. 
The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032] “The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. The normal of one or both of the triangles that define the quad can be used to define a forward vector. That is, the normal of the pointer object represents the normal of a first triangle in the quad. Similarly, the normal of the pointer object in the current position represents the normal of the second triangle.” [0050] “the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. 
The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0052] “In operation of VR drawing system 108, the user is in control of an input device (e.g., such as a pointer object in a graphical user interface). When the pointer object is activated, the system 108 can record the pointer object position. As the pointer object moves, the system 108 can measure the difference from a previously recorded pointer object position and generate a new quad in response to the pointer object being moved by the user.” [0060]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
13. With reference to claim 30, Todasco does not explicitly teach the scale of the material is unlocked and the position of the material is locked relative to the model, and wherein the operations further comprise: in response to adjustment of the scale of the model, with the computer system, retaining the scale of the material in the world-space coordinate system regardless of the adjustment to the scale of the model as an original scale of the material; in response to moving the model to a new position, moving a mapping of the material to one or more polygons defining the shape of the model with the model; and causing, with the computer system, the model to be rendered in the virtual space with the material mapped on the model and the material sized based on the original scale in the world-space coordinate system. This is what Hackett teaches (“The systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content, the x-y plane or y-z plane (or other coordinate system) that the content is being generated in. … the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space. The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. 
In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032] “the devices 102, 103, 104, 106, and 108 can be laptop or desktop computers, smartphones, personal digital assistants, portable media players, tablet computers, gaming devices, or other appropriate computing devices that can communicate, using the network 101, with other computing devices or computer systems.” [0044] “The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. The normal of one or both of the triangles that define the quad can be used to define a forward vector. That is, the normal of the pointer object represents the normal of a first triangle in the quad. Similarly, the normal of the pointer object in the current position represents the normal of the second triangle.” [0050] “the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. 
The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0052] “In operation of VR drawing system 108, the user is in control of an input device (e.g., such as a pointer object in a graphical user interface). When the pointer object is activated, the system 108 can record the pointer object position. 
As the pointer object moves, the system 108 can measure the difference from a previously recorded pointer object position and generate a new quad in response to the pointer object being moved by the user.” [0060]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
14. With reference to claim 31, Todasco does not explicitly teach the model comprises a plurality of layers of materials, including the material, that are independently lockable and unlockable. This is what Hackett teaches (“the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space. The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032] “The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. 
… The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). The color of a quad is set per vertex, and defined by the user, as described in detail below. The shade can be applied to the quad using various inputs from the VR space to modify the look of the quad. … the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. 
The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0050-0052] “In operation of VR drawing system 108, the user is in control of an input device (e.g., such as a pointer object in a graphical user interface). When the pointer object is activated, the system 108 can record the pointer object position. As the pointer object moves, the system 108 can measure the difference from a previously recorded pointer object position and generate a new quad in response to the pointer object being moved by the user.” [0060]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
15. With reference to claim 32, Todasco does not explicitly teach the operations further comprise providing means for editing smart materials. This is what Hackett teaches (“The objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object. In some implementations, the user can draw on the imported object to annotate portions of the object. In some implementations, the user can modify the object by shrinking, growing, elongating, moving, turning, tilting, or otherwise manipulating the object and properties associated with object. In some implementations, the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.” [0032] “The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). The color of a quad is set per vertex, and defined by the user, as described in detail below. The shade can be applied to the quad using various inputs from the VR space to modify the look of the quad.” [0051]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
16. With reference to claim 33, Todasco does not explicitly teach the operations further comprise steps for applying the material to the model. This is what Hackett teaches (“The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. … The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). The color of a quad is set per vertex, and defined by the user, as described in detail below. The shade can be applied to the quad using various inputs from the VR space to modify the look of the quad. … the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. 
In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0050-0052]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
17. With reference to claim 34, Todasco does not explicitly teach the operations further comprise steps for manipulating properties of the material. This is what Hackett teaches (“The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. … The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). The color of a quad is set per vertex, and defined by the user, as described in detail below. The shade can be applied to the quad using various inputs from the VR space to modify the look of the quad. … the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. 
In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0050-0052]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
18. With reference to claim 35, Todasco does not explicitly teach the operations further comprise steps for defining the material. This is what Hackett teaches (“The triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor. The positional information can include a beginning pointer location and a current pointer location. As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad. … The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). The color of a quad is set per vertex, and defined by the user, as described in detail below. The shade can be applied to the quad using various inputs from the VR space to modify the look of the quad. … the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. 
In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0050-0052]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
19. With reference to claim 37, Todasco teaches the operations are performed by a server system hosting a distributed developer application. (“FIG. 7 illustrates an exemplary embodiment of a computer system 700 adapted for implementing one or more of the devices and servers of FIG. 6. As shown, a computer system 700 may comprise or implement software components that operate to perform various methodologies in accordance with the described embodiments. Some computer systems may implement one or more operating systems (OS) such as a MICROSOFT® OS, a UNIX® OS, a LINUX® OS, or other suitable OS. It may be appreciated that the system illustrated in FIG. 7 may be deployed in other ways and that the operations performed and/or the services provided by the system may be combined, distributed, and/or separated over several systems over a network for a given implementation and may be performed by any number of systems.” [0084])
20. With reference to claim 39, Todasco does not explicitly teach the material is a two-dimensional material. This is what Hackett teaches (“the tool palettes can include mechanisms to import preexisting files containing 2D or 3D objects including, but not limited to images representing data, art, photographs, models, and/or augmented reality content.” [0028] “The systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content, the x-y plane or y-z plane (or other coordinate system) that the content is being generated in. … the user can add a number of fabric swatches to a dress form object to generate a skirt on the dress form object. The user can then move the dress form object or move around the dress form object to view aspects of the skirt, including, but not limited to, fabric movement, shadows and light generated by the fabric, and the overall drape of the skirt. The user can also add additional drawings, details, or content to the skirt by tilting/moving the dress form object.” [0030] “the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc. Imported images can include any displayable file type including, but not limited to a CAD file, a jpeg file, a png, a bitmap file, or other file type. In some implementations, the user can export images generated, modified, or otherwise changed within the VR space.” [0032]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
21. Claim 40 is similar in scope to claim 21, and thus is rejected under similar rationale.
22. Claims 25-27, 36, and 38 are rejected under 35 U.S.C. 103 as being unpatentable over Todasco (US 2017/0123750 A1) and Hackett et al. (US 2016/0370971 A1), as applied to claim 21 above, and further in view of McCall (US 2020/0098173 A1).
23. With reference to claim 25, Todasco does not explicitly teach rendering the model in a game, wherein the model is rendered based on one or more parameters associated with the material. Hackett, however, teaches the model is rendered based on one or more parameters associated with the material. (“The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). … the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. 
For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0051-0052]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide a system that translates well into a virtual reality environment.
The combination of Todasco and Hackett does not explicitly teach rendering the model in a game. This is what McCall teaches (“in gaming for example, 3D models (alternatively called 3D assets, or 3D content) are pre-loaded in the game application so when a user starts up the game, all of the 3D assets that will be viewed by the user are already on the device. When updates are needed, the application will add new content offline. For example, the game may apply a patch when the game is not running, and when the application is next opened, the new content is installed and ready for use.” [0046] “The wearable system 600 can comprise an avatar processing and rendering system 690. The avatar processing and rendering system 690 can be configured to generate, update, animate, and render an avatar based on contextual information.” [0119]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of McCall into the combination of Todasco and Hackett, in order to provide a system convenient for infrequent, planned updates.
24. With reference to claim 26, Todasco does not explicitly teach the one or more parameters include a shading parameter to render the model with shading effect, and wherein, when the shading parameter is enabled, the model is rendered with a shading layer on the material that causes at least a portion of the model to be lighter or darker than other portions of the model. This is what Hackett teaches (“The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). The color of a quad is set per vertex, and defined by the user, as described in detail below. The shade can be applied to the quad using various inputs from the VR space to modify the look of the quad. Inputs that can affect the shade include color, time, audio input, world space position, model space position, and light/luminance values, … the material as used herein may pertain to fabric and the properties of the fabric (e.g., fabric weight, fabric drape, and fabric shear recovery). The properties can be used to determine the look and movement of particular quads. The movement tracking module 112 includes a fabric movement simulator 114 that can simulate fabric weight, fabric drape, and fabric shear recovery of a fabric in order to simulate realistic 3D movement of the representations of fabric when placed/drawn on the dress form objects described herein. The fabric movement simulator 114 can access a number of fabric databases to access fabric data for application to brushstrokes of fabric that the user can place on or near the dress form object 118 in the VR space. In one example, the fabric movement simulator 114 can simulate fabric movement by obtaining fabric weight information associated with a user-selected fabric. 
The simulator 114 can then obtain, from the at least one input device associated with a user, (1) user movement direction and (2) force information associated with the dress form objects described herein. The fabric movement simulator 114 can then move at least a portion of the fabric at a speed based on the fabric weight and the force. For example, the simulator 114 can calculate typical forces with respect to user movements, fabric properties, and dress form object movements (e.g., pertaining to time and distance) and can select a threshold level of movement for a particular combination. The simulated movements may be generated and displayed in a first direction in response to determining the dress form object is moving in a second and opposite direction. That is, if the user twists the dress form object to the right, the draped fabric may shift or sway left a particular calculable amount.” [0051-0052] “The hues range from lightness/bright or lightness and darkness, which can be depicted as a color, shade, or numerical value. Sliding a hue slider (not shown) up and down can cause the color palette 506a to physically cause movement (i.e., virtual movement) in the VR space, as shown by growing color palette 506b.” [0080]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide material properties that translate well into a virtual reality environment.
25. With reference to claim 27, Todasco does not explicitly teach the shading effect is determined based on a position and orientation of the model and a light source in the virtual space. This is what Hackett teaches (“As a user moves the pointer object around in 3D space, the system 100 can generate a quad and positional information corresponding to the quad…. Each movement the user makes can be used to generate quads, and each quad can be stitched or appended together to generate a smooth brushstroke (e.g., ribbon of color, texture, line drawing, or other object or artifact representing user movement when generating 3D drawing content in the VR space). The look of a quad can be is defined by the texture, material, color, and shade or luminance. The texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene). The color of a quad is set per vertex, and defined by the user, as described in detail below. The shade can be applied to the quad using various inputs from the VR space to modify the look of the quad. Inputs that can affect the shade include color, time, audio input, world space position, model space position, and light/luminance values,” [0050-0051] “The hues range from lightness/bright or lightness and darkness, which can be depicted as a color, shade, or numerical value. Sliding a hue slider (not shown) up and down can cause the color palette 506a to physically cause movement (i.e., virtual movement) in the VR space, as shown by growing color palette 506b. … the cross-sectioned space may be represented as a cross-section of a cube that translates according to a user-selectable hue and then the texture on the cross-section updates color according to the position that it is cross-sectioning the cube. 
A user can select a hue to begin painting a drawing and can reselect additional hues to change colors and begin drawing in the reselected hues, accordingly.” [0080-0081]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of Hackett into Todasco, in order to provide material properties that translate well into a virtual reality environment.
26. With reference to claim 36, the combination of Todasco and Hackett does not explicitly teach that the operations further comprise steps for publishing a game including the model. This is what McCall teaches (“in gaming for example, 3D models (alternatively called 3D assets, or 3D content) are pre-loaded in the game application so when a user starts up the game, all of the 3D assets that will be viewed by the user are already on the device. When updates are needed, the application will add new content offline. For example, the game may apply a patch when the game is not running, and when the application is next opened, the new content is installed and ready for use.” [0046]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of McCall into the combination of Todasco and Hackett, in order to provide a content-update approach convenient for infrequent, planned updates.
27. With reference to claim 38, the combination of Todasco and Hackett does not explicitly teach that the operations further comprise hosting a game including the model. This is what McCall teaches (“in gaming for example, 3D models (alternatively called 3D assets, or 3D content) are pre-loaded in the game application so when a user starts up the game, all of the 3D assets that will be viewed by the user are already on the device. When updates are needed, the application will add new content offline. For example, the game may apply a patch when the game is not running, and when the application is next opened, the new content is installed and ready for use.” [0046]) Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teachings of McCall into the combination of Todasco and Hackett, in order to provide a content-update approach convenient for infrequent, planned updates.
Conclusion
28. Any inquiry concerning this communication or earlier communications from the examiner should be directed to Michelle Chin whose telephone number is (571)270-3697. The examiner can normally be reached on Monday-Friday 8:00 AM-4:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Kent Chang can be reached on (571)272-7667. The fax phone number for the organization where this application or proceeding is assigned is (571)273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHELLE CHIN/
Primary Examiner, Art Unit 2614