Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
This action is in response to the amendment filed on September 29, 2025. Claims 1, 4, 5, 8, 9, 10, and 15 have been amended. The amended claim limitations have been fully considered but are not persuasive. Claims 1-20 remain rejected.
Response to Arguments
Applicant’s arguments regarding Holt failing to disclose generating a model corresponding to the demonstration area have been fully considered but are not persuasive. Holt in view of Forutanpour explicitly teaches generating virtual models corresponding to the demonstration area using sensor data. Please find further details and citations in the Office action below.
Applicant’s arguments regarding Holt failing to apply the digital material to a model surface have been fully considered but are not persuasive. Holt in view of Forutanpour explicitly teaches displaying virtual objects, and applying a digital material to a model surface would have been obvious. Please find further details and citations in the Office action below.
Applicant’s arguments regarding splitting images from the digital image have been fully considered but are not persuasive. Holt explicitly teaches the limitations of splitting images: Holt teaches generating virtual images to which MadMapper projection mapping can be applied. Please find further details and citations in the Office action below.
In response to applicant’s arguments regarding the dependent claims, because the rejections of the independent claims are maintained, the rejections of the dependent claims are likewise maintained.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 2, 8, 15, 18, 19, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024).
Regarding claim 1, Holt discloses a system, comprising: projectors configured to project onto projector zones of a demonstration area (Holt: Col. 9, Lines 30-46 “In the embodiment shown in FIG. 1 (and FIG. 2), three projectors are being utilized. The first projector 151 is positioned at the top, center of the simulated room at the end distal to the third wall structure 103. The first projector 151 is oriented to project an image generally in the direction of the third wall structure 103, recognizing that the image will project an image on more than the third wall structure 103. The second projector 150 is positioned in the center of the ceiling structure 104 and oriented to project an image generally in the direction of the floor structure 105. The third projector 152 is positioned near the second wall structure 102 and oriented such that it projects an image generally in the direction of toward the third wall structure 103 and the first wall structure 101. The projectors 150, 151, and 152 each display coordinated images onto physical wall structures and the physical objects as directed by an electronic device, as described below in connection with FIG. 3.”); and a rendering computing device that: receives user input indicating a selection of a digital material of the digital materials for a demonstration surface of the demonstration area (interpreted as the system gets the customer’s choice of finish)(Holt: Col. 12, Lines 31-36 “The user can view images of selectable products and select one or more different products from different product categories or groups. 
The electronic input device 354 is in communication with electronic device 356 that receives the user selections and communicates such selections through the network 358 to the electronic device 355.”)(teaches the system detects which product the user selects and receives the user’s selection corresponding to the claimed limitation); and in response to the user input, dynamically: generates a dynamic digital model that corresponds to the demonstration area from a viewpoint using data of the demonstration area (interpreted as building a virtual representation of the real demonstration area)(Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects)(Holt: Col. 11, Lines 37-38 “depending on the particular product selected by the user to be displayed in the simulated room.”)(discloses simulated room for products to be displayed which corresponds to the simulation of the demonstration area); applies the digital material to a model surface of the dynamic digital model that corresponds to the demonstration surface (interpreted as wraps the chosen finish onto the right surface in the model)(Holt: Col. 10, Lines 15-18 “the user can select different particular products from a data store and display an image that simulates the particular selected product in the simulated room.”)(teaches displaying the user selected product in the simulated room) (Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects); adjusts the dynamic digital model for a demonstration area condition (interpreted as accounting for real room constraints)(Holt: Col. 12, Lines 43- “be configured to prevent at least one portion of the images of the selectable products ( e.g., the flooring pattern projected onto the floor 105 in FIG. 2) from being projected by projectors 351, 352, 353 onto a physical feature (e.g., onto the toilet 107 in FIG.
2) in the simulated room and allow other portions of the images of the selectable products (e.g., the flooring pattern projected onto the floor 105 in FIG. 2) to be projected onto at least a portion of the simulated room located around the physical feature (e.g., onto the portion of the floor 105 located around the toilet 107 in FIG. 2).”) (teaches selective blocking/masking adapts the model to physical obstructions – an environmental condition) (Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects); generates split images from the dynamic digital model in alignment with corresponding ones of the projector zones, splitting the dynamic digital model into the split images (interpreted as breaking the final frame into slices)(Holt: Col. 12, Lines 56-58 “and/or using projection mapping software ( e.g., commercially available "MadMapper" mapping software).”) (teaches using MadMapper which is an industry standard for slicing a scene into specific outputs); and provides the split images to the projectors that correspond to the projector zones (interpreted as sending each slice to its assigned projector)(Holt: Col. 12, Lines 40-41 “sends instructions to the projectors 351, 352, and 353 for displaying images of the selected products.”) (teaches that controller 355 transmits the prepared per projector imagery) but fails to explicitly disclose a scanner computing device that creates digital materials by scanning physical samples of design materials, the digital materials including images of the physical samples and texture stacks comprising texture maps representing characteristics of the physical samples, sensor.
However, Forutanpour discloses a scanner computing device that creates digital materials by scanning physical samples of design materials, the digital materials including images of the physical samples and texture stacks comprising texture maps representing characteristics of the physical samples (interpreted as a hardware and software scanner turns real world objects into digital textures)(Forutanpour: Col. 6, Lines 61-67 “In FIG. 2, an image capture device 202 ( e.g., a two-dimensional camera, a depth sensing camera, a three dimensional camera, etc.) is moved along a path 204 (e.g., an arc) relative to an object 210, such as a cup as shown in FIG. 2. During the movement of the image capture device 202 along the path 204”)(Forutanpour: Col. 7, Lines 17-19 “3D point cloud of the object 210 may be generated in real time or near-real time based on the camera poses 206A-F and the sequence of image frames.”)(Forutanpour: Col. 10, Lines 65-67 through Col. 11, Lines 1-3 “the texture map generator 328 may generate a translucence map, a translucence depth map, a specular map, an eccentricity map, a bump map, a normal map, a displacement map, a translucency color map, a cosine power map, or a combination thereof.”) (teaches a camera-based scanner captures the physical sample as a point cloud + image frames, then automatically builds a stack of PBR texture maps corresponding to the claimed limitation), sensor (Forutanpour: Col. 3, Line 35 “sensor , a depth - sensing camera”).
Holt and Forutanpour are both considered to be analogous to the claimed invention because they are in the same field of photorealistic, real-time digital visualization of physical materials for customer preview. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt to incorporate Forutanpour’s teachings of creating digital materials by scanning the physical objects. The motivation for such a combination would provide the benefit of physically based renderings, thereby giving customers a more realistic preview and increasing purchase confidence.
Regarding claim 2, Holt discloses the system of claim 1, wherein the scanner computing device: recognizes that a physical sample of the physical samples includes a repeating pattern; and replicates the repeating pattern instead of scanning all of the physical sample (interpreted as detecting a pattern and replicating that pattern)(Holt: Col. 3, Lines 64-67 and Col. 4, Lines 1-4 “The at least one projector can project an image on a surface or surfaces to display a selected design choice or feature in the particular physical structure. For example, the at least one projector can project different colors, patterns, accessories, or other design choices onto the physical structure and onto the at least one physical object to provide a simulated room showing the different design choices.”)(teaches simulating a pattern that the system would have scanned, corresponding to the claim’s limitations).
Claim 8 is a system claim corresponding to system claim 1 above without any additional limitations. Thus, claim 8 is rejected for the same reasons as claim 1.
Regarding claim 15, Holt discloses a system, comprising: a demonstration area (Holt: Col. 2, Line 15 “simulated room”); and a processor that executes the instructions to (Holt: Col. 2, Line 8 “processor”): receive first user input indicating a first selection of a first digital material from multiple digital materials for a first demonstration surface of the demonstration area, the digital materials including images of physical samples (interpreted as the user picks a first material for the first surface)(Holt: Col. 12, Lines 31-35 “The user can view images of selectable products and select one or more different products from different product categories or groups. The electronic input device 354 is in communication with electronic device 356 that receives the user selections and communicates such selections”) (teaches user can view and select a product); receive second user input indicating a second selection of a second digital material from the multiple digital materials for a second demonstration surface of the demonstration area (interpreted as the user is able to make a second selection for a second demonstration surface) (Holt: Col. 10, Lines 5-7 “The user can select different paint colors to be projected on the first wall structure 101, the second wall structure 102, and the third wall structure 103.”)(teaches paint selection for a second and third wall structure, meaning the user can make a second selection destined for another surface); assign projectors to projector zones that each correspond to a portion of the demonstration area (Holt: Col. 9, Lines 30-33 “three projectors are being utilized. The first projector 151 is positioned at the top, center of the simulated room at the end distal to the third wall structure 103.”)(teaches projectors being utilized for specific locations within the simulated room); and in response to the second user input, dynamically: apply the first digital material to a first model surface of a dynamic digital (Holt: Col.
4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects) model that corresponds to the demonstration area, and is generated using data of the demonstration area (Holt: Col. 11, Lines 37-38 “depending on the particular product selected by the user to be displayed in the simulated room.”)(discloses simulated room for products to be displayed which corresponds to the simulation of the demonstration area), the first model surface corresponding to the first demonstration surface (interpreted as after the second pick, the system maps the first material onto the virtual model’s first surface)(Holt: Col. 10, Lines 1-5 “a user can select different features to be projected into simulated room 110'. For example, a flooring pattern and color image 232 can be selected and projected onto floor structure 105 to simulate a particular flooring pattern and color.”)(teaches the system mapping a chosen flooring image onto the virtual floor corresponding to the claimed limitation); apply the second digital material to a second model surface of the dynamic digital model (Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects) that corresponds to the demonstration area, and is generated using data of the demonstration area (Holt: Col. 11, Lines 37-38 “depending on the particular product selected by the user to be displayed in the simulated room.”)(discloses simulated room for products to be displayed which corresponds to the simulation of the demonstration area), the second model surface corresponding to the second demonstration surface (interpreted as it then maps the second material onto the second surface)(Holt: Col.
10, Lines 5-7 “The user can select different paint colors to be projected on the first wall structure 101, the second wall structure 102, and the third wall structure 103.”)(teaches the user can select a second and third surface as well); generate split images from the dynamic digital (Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects) model in alignment with corresponding ones of the projector zones, splitting the dynamic digital model into the split images (interpreted as breaking the final frame into slices)(Holt: Col. 12, Lines 56-58 “and/or using projection mapping software ( e.g., commercially available "MadMapper" projection mapping software).”) (teaches using MadMapper which is an industry standard for slicing a scene into specific outputs); and provide the split images to the projectors that correspond to the projector zones (interpreted as the renderer scene is sliced per projector) (Holt: Col. 12, Lines 54-58 “using optical blocking techniques or by blackening, in the image data or file, pixels that correspond to the location of the features, and/or using projection mapping software ( e.g., commercially available "MadMapper" projection mapping software).”)(MadMapper is an industry standard for frame splitting) but fails to explicitly disclose a non-transitory storage medium that stores instructions; and texture stacks comprising texture maps representing characteristics of the physical samples, sensor, sensor.
However, Forutanpour discloses a non-transitory storage medium that stores instructions (Forutanpour: Col. 2, Lines 25-26 “a non-transitory computer-readable medium includes instructions”); and texture stacks comprising texture maps representing characteristics of the physical samples (Forutanpour: Col. 1, Lines 29-37 “Typically, to generate texture maps accompanying the 3D model of the object, a texture map generator may generate a two-dimensional (2D) texture map that is used for photorealistic rendering of the 3D model. The texture map generator typically determines color information (e.g., RGB color information) associated with the object based on a sequence of image frames, and the 2D texture map is typically generated based on the color information.”), sensor, sensor (Forutanpour: Col. 3, Line 35 “sensor , a depth - sensing camera”).
Holt and Forutanpour are both considered to be analogous to the claimed invention because they are in the same field of photorealistic, real-time digital visualization of physical materials for customer preview. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt to incorporate Forutanpour’s teachings of utilizing texture maps. The motivation for such a combination would provide the benefit of more accurate physically based renderings, thereby giving customers a more realistic preview and increasing purchase confidence.
Regarding claim 18, Holt discloses the system of claim 15, wherein the processor is operable to receive third user input indicating to order a product associated with a respective physical sample associated with the first digital material (interpreted as after the user has previewed the first digital material, the system can accept a further input that tells it the user wants to buy the real world product corresponding to that sample) (Holt: Col. 10, Lines 26-36 “The simultaneous projection of image 220, image 221, and the wall color image 230 in connection with the physical object may permit a more informed, confident, and thought-through visualization, selection and purchase of different features for a room by a consumer. The simulated room 110' shows a three-dimensional simulation of the different product selections (as opposed to systems limited to two-dimensional simulations). The simulated room 110' can provide a more complete visualization of a design choice of a room for a user before selecting and purchasing certain products.”)(teaches the system supports consumer purchase of the products visualized on the demo surfaces; the processor therefore necessarily receives user input that indicates an order for the chosen product corresponding to the digital material previously used).
Regarding claim 19, Holt discloses the system of claim 18, wherein the first digital material includes metadata that specifies a stock keeping unit for the product (Holt: Col. 13, Lines 37-41 “The user can view a product in the store and then scan or otherwise select the product using the electronic device 357 by, for example, using a QR reader or a bar code scanner, or other scanning device.”)(The QR code specifies a stock keeping unit for the product that the user can scan).
Regarding claim 20, Holt discloses the system of claim 15, wherein the physical samples comprise at least one of fabrics, floorings, paints, wood, paneling, stone, brick, carpet, laminates, countertops, cabinets, wallpaper, molding, tiles, paint, and housewares (Holt: Lines 59-63 “the
user can review at least the following categories of products: wall color, wall treatment, floor color, flooring pattern, mirror, light fixtures, cabinet finish, cabinet size, cabinet pulls, and counter top color.”).
Claims 3 and 4 are rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Pytlarz et al. (WO International Publication No. 2018/119161).
Regarding claim 3, Holt in view of Forutanpour disclose the system of claim 1, but fail to explicitly disclose wherein the demonstration area condition includes a lighting condition.
However, Pytlarz discloses wherein the demonstration area condition includes a lighting condition [Pytlarz: 00017 “Example embodiments described herein relate to the display management of images under changing viewing environments (e.g., a change of the ambient light).”].
Holt, Forutanpour, and Pytlarz are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials for viewing in simulated environments. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Pytlarz’s teachings of treating light as an explicit condition for which the rendering engine must account. The motivation for such a combination would provide the benefit of delivering previews whose brightness and color accuracy match the showroom’s real lighting.
Regarding claim 4, Holt in view of Forutanpour disclose the system of claim 3, but fail to explicitly disclose wherein the rendering computing device adjusts for the lighting condition by compensating for the demonstration area having a different lighting than the dynamic digital model.
However, Pytlarz discloses wherein the rendering computing device adjusts for the lighting condition by compensating for the demonstration area having a different lighting than the dynamic digital model [Pytlarz: 00017 “Example embodiments described herein relate to the display management of images under changing viewing environments (e.g., a change of the ambient light)”][Pytlarz: 305; Fig. 3A “Generate Virtual Image”](teaches generating digital images).
Holt, Forutanpour, and Pytlarz are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials for viewing in simulated environments. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Pytlarz’s teachings of treating light as an explicit condition for which the rendering engine must account. The motivation for such a combination would provide the benefit of delivering previews whose brightness and color accuracy match the showroom’s real lighting.
Claim 5 is rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Bakalash et al. (U.S. Patent No. 10,297,068).
Regarding claim 5, Holt in view of Forutanpour disclose the system of claim 1, dynamic digital model (Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects) but fail to explicitly disclose wherein the rendering computing device performs ray tracing to add lighting.
However, Bakalash discloses wherein the rendering computing device performs ray tracing to add lighting to the model (interpreted as the renderer must run a ray tracing algorithm so the virtual room receives physically based lighting/reflection effects)(Bakalash: Col. 6, Lines 28-50 “aspects of the present invention teach how to implement ray tracing at both a reduced computational complexity and a high speed. One aspect of the invention relates to path tracing, which is a high-quality ray tracing based on global illumination. Its superior performance stems from a different technological approach to solving the intersection between rays and scene objects. It is based on dynamically aligned structure (DAS), which is a projection of parallel rays, used to carry secondary rays emitting from existing hit points. The DAS mechanism can be implemented either by a GPU (graphics processing unit) graphics pipeline, or by a CPU (central processing unit). The mechanism can solve ray-triangle intersections by use of the conventional graphics mechanism, replacing the expensive traversals of accelerating structures in the prior art. DAS Mechanism. In one embodiment the DAS mechanism is applied to path tracing, which is based on global illumination. Global illumination (or indirect illumination) takes into account not only the light that comes directly from a light source, but also light reflected by surfaces in the scene, whether specular, diffuse, or semi-reflective.”)(clearly teaches using ray tracing for adding lighting effects).
Holt, Forutanpour, and Bakalash are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Bakalash’s teachings of using ray tracing. The motivation for such a combination would provide the benefit of delivering high quality simulations of the lighting and reflections.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Pesonen et al. (WO International Publication No. 2019/229293).
Regarding claim 6, Holt discloses the system of claim 1, but fails to explicitly disclose wherein the texture maps comprise an alpha texture map, a displacement texture map, a roughness texture map, a metallic texture map, a normal texture map, and a base color texture map.
However, Forutanpour discloses wherein the texture maps comprise an alpha texture map (interpreted as transparent/opacity map), a displacement texture map, a normal texture map (Forutanpour: Col. 6, Lines 17-21 “property texture map include a translucence texture map, a translucence depth texture map, an eccentricity texture map, a normal map, a displacement map, a translucency color texture map, a cosine power map, a specular texture map, a transparency texture map”).
However, Pesonen discloses a roughness texture map, a metallic texture map, and a base color texture map [Pesonen: 0016 “base color texture map; a roughness map; a metalness map”].
Holt, Forutanpour, and Pesonen are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Pesonen’s teachings of using a roughness map, a metalness map, and a base color map. The motivation for such a combination would provide the benefit of a complete texture stack, improving visual realism.
Claim 7 is rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Llola et al. (U.S. Patent Publication No. 2022/0292763).
Regarding claim 7, Holt and Forutanpour disclose the system of claim 1, but fail to explicitly disclose wherein the scanner computing device is operable to perform a metallic/roughness workflow and a specular/glossy workflow as part of scanning the physical samples.
However, Llola discloses wherein the scanner computing device is operable to perform a metallic/roughness workflow and a specular/glossy workflow as part of scanning the physical samples (interpreted as the scanner can output the standard PBR metalness + roughness set of maps and the same scanner can alternatively output the older PBR workflow that uses specular-color and glossiness maps) [Llola: 0059 “gITF separates BRDF shading approximations into two categories, namely metallic - rough ness and specular - glossiness, which are mutually exclusive. For metallic - roughness, different maps for the surface are provided for albedo ( base - color ), roughness factor and metalness factor.”](Llola teaches the metallic roughness workflow and the exact map it requires. Llola further discloses every element of the specular/glossiness workflow).
Holt, Forutanpour, and Llola are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Llola’s teachings of offering both metallic roughness and specular glossiness workflows during texture generation. The motivation for such a combination would allow the same scanned material to support either of the two standard PBR shading models, thereby increasing compatibility with common rendering pipelines and improving visual realism.
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Hillesland et al. (U.S. Patent No. 9,626,789).
Regarding claim 9, Holt in view of Forutanpour disclose the system of claim 8, dynamic digital model (Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects) but fail to explicitly disclose wherein the rendering computing device uses a UV map included in the digital material to maintain scale of the digital material when applying the digital material to the surface.
However, Hillesland discloses wherein the rendering computing device uses a UV map included in the digital material to maintain scale of the digital material when applying the digital material to the model surface (Hillesland: Col. 7, Lines 39-40 “In general, each face has its own implicit UV parameterization.”)(Hillesland: Col. 7, Lines 50-52 “Scale and offsets are applied to get the face UV range of 0.1 mapped into the atlas UV coordinates”)(teaches the renderer reading the UV map that is bundled with the texture, the “implicit UV parameterization”, and explicitly applying scale factors so the texture’s texel-to-world ratio stays correct when it is placed on the model).
Holt, Forutanpour, and Hillesland are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Hillesland’s teachings of using the UV map embedded in a digital material, applying scale and offset factors to maintain the material’s real-world size when it is wrapped onto a model surface. The motivation for such a combination would be the benefit of preventing pattern distortion, thereby ensuring accurate visual previews for customers.
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Loberg (U.S. Patent No. 9,536,340).
Regarding claim 10, Holt in view of Forutanpour disclose the system of claim 8, the dynamic digital model (Holt: Col. 4, Lines 22-23 “virtual display of the more cumbersome objects .”)(teaches the virtual (digital) displaying of the objects) but fail to explicitly disclose wherein the rendering computing device uses a gaming engine to generate the dynamic digital model that corresponds to the demonstration area and apply the digital material to the model surface.
However, Loberg discloses wherein the rendering computing device uses a gaming engine to generate the dynamic digital model that corresponds to the demonstration area and apply the digital material to the model surface (Loberg: Col. 6, Lines 41-46 “In one implementation, the graphical processing engine 130 is similar in some respects to a game engine, which takes data from one program component and passes the data to another program component, as necessary, to identify appropriate pixel information”)(Loberg: Col. 8, Lines 20-22 “The user can then view the table 215a and chair 205a and move, reposition, or change the design elements however the user sees fit”)(Loberg teaches using a game engine to build the scene graph and apply user-chosen materials to the objects).
Holt, Forutanpour, and Loberg are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Loberg’s teachings of using a game engine to build the room model and apply user-selected materials. The motivation for such a combination would be the benefit of smoother performance and easier material editing.
Claims 11, 12, 13, and 14 are rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Ghaleb (CA 3 072 632).
Regarding claim 11, Holt in view of Forutanpour disclose the system of claim 8, but fail to explicitly disclose wherein the projector zones overlap.
However, Ghaleb discloses wherein the projector zones overlap [Ghaleb: 71 “In embodiments, the throw of the projectors 23 overlap and feather at the edges thereof with the throw of adjacent projectors”].
Holt, Forutanpour, and Ghaleb are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Ghaleb’s teachings of overlapping projectors. The motivation for such a combination would be to eliminate visible seams and deliver a continuous image across large surfaces.
Regarding claim 12, Holt in view of Forutanpour disclose the system of claim 11, but fail to explicitly disclose wherein the rendering computing device reduces intensity of pixels in overlapping areas where the projector zones overlap.
However, Ghaleb discloses wherein the rendering computing device reduces intensity of pixels in overlapping areas where the projector zones overlap [Ghaleb: 71 “In embodiments, the throw of the projectors 23 overlap and feather at the edges thereof with the throw of adjacent projectors”](feathering at the edges inherently means reducing the brightness/intensity of each projector’s pixels).
Holt, Forutanpour, and Ghaleb are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Ghaleb’s teachings of reducing pixel intensity at edges. The motivation for such a combination would be to eliminate visible seams and deliver a continuous image across large surfaces.
Regarding claim 13, Holt in view of Forutanpour disclose the system of claim 8, but fail to explicitly disclose wherein the rendering computing device provides projector alignment grids to the projectors for mapping pixel density to the demonstration area.
However, Ghaleb discloses wherein the rendering computing device provides projector alignment grids to the projectors for mapping pixel density to the demonstration area [Ghaleb: 73 “the floorplan data 3 may specify a scale, and being configured in accordance with the dimensions of the projection surface level 25, the scaling module 15 is configured for scaling the floorplans appropriately such that the floorplan 26 scale to real-world dimensions”] (Ghaleb teaches a scale to the dimensions of the projection surface which corresponds to an alignment grid for projectors to project on the demonstration area).
Holt, Forutanpour, and Ghaleb are considered to be analogous to the claimed invention because they are in the same field of real-time digital visualization of physical materials. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Ghaleb’s teachings of scaling to the dimensions of the projection surface. The motivation for such a combination would be to have a dimensionally accurate projection.
Regarding claim 14, Holt in view of Forutanpour disclose the system of claim 13, but fail to explicitly disclose wherein the projector alignment grids include a blend zone.
However, Ghaleb discloses wherein the projector alignment grids include a blend zone [Ghaleb: 71 “may be configured for edge blending/feathering of the various segments”].
Claim 16 is rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in further view of Tang (U.S. Patent Publication No. 2013/0137324).
Regarding claim 16, Holt in view of Forutanpour disclose the system of claim 15, but fail to explicitly disclose further comprising a surface treatment applied to the first demonstration surface that prevents or reduces light bounce from one or more of the projectors.
However, Tang discloses further comprising a surface treatment applied to the first demonstration surface that prevents or reduces light bounce from one or more of the projectors (interpreted as the system also putting a special coating on the first surface so that projected light is absorbed rather than reflected)[Tang: 0004 “Such materials must possess high optical absorptive and low reflective”].
Holt, Forutanpour, and Tang are considered to be analogous to the claimed invention because they address projected image quality on physical surfaces. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt and Forutanpour to incorporate Tang’s teachings of using materials that absorb light rather than reflect light. The motivation for such a combination would be to reduce light bounce and improve image quality.
Claim 17 is rejected under 35 U.S.C. 103 as being unpatentable over Holt et al. (U.S. Patent No. 11,062,383), in view of Forutanpour et al. (U.S. Patent No. 10,008,024), in view of Tang (U.S. Patent Publication No. 2013/0137324), in further view of Saitoh (U.S. Patent No. 7,645,400).
Regarding claim 17, Holt in view of Forutanpour and Tang disclose the system of claim 16, but fail to explicitly disclose wherein the surface treatment comprises crushed carbon nanotubes applied in a polymer.
However, Saitoh discloses wherein the surface treatment comprises crushed carbon nanotubes applied in a polymer (Saitoh: Col. 25, Lines 32-36 “In addition, crushed carbon nanotubes obtained by crushing using a ball mill, vibration mill, sand mill, roll mill or other ball-type kneading device, as well as shortly cut carbon nanotubes obtained by chemical or physical treatment, can also be used.”).
Holt, Forutanpour, Tang, and Saitoh are considered to be analogous to the claimed invention because they address projected image quality on physical surfaces. Therefore, it would have been obvious to someone of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Holt, Forutanpour, and Tang to incorporate Saitoh’s teachings of using crushed carbon nanotubes. The motivation for such a combination would be to improve coating durability and manufacturability while preserving the desired light absorption performance.
Conclusion
THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to AHMED TAHA whose telephone number is (571)272-6805. The examiner can normally be reached 8:30 am - 5 pm, Mon - Fri.

Examiner interviews are available via telephone, in person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, XIAO WU, can be reached at (571)272-7761. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AHMED TAHA/Examiner, Art Unit 2613
/XIAO M WU/Supervisory Patent Examiner, Art Unit 2613