Prosecution Insights
Last updated: April 19, 2026
Application No. 17/911,898

METHOD AND APPARATUS FOR ADAPTING A SCENE RENDERING

Status: Final Rejection (§103)
Filed: Sep 15, 2022
Examiner: PEREN, VINCENT ROBERT
Art Unit: 2617
Tech Center: 2600 — Communications
Assignee: InterDigital CE Patent Holdings
OA Round: 4 (Final)

Grant Probability: 70% (Favorable)
Predicted OA Rounds: 5-6
Predicted Time to Grant: 2y 11m
Grant Probability With Interview: 90%

Examiner Intelligence

Career Allow Rate: 70% (above average): 266 granted / 382 resolved, +7.6% vs TC avg
Interview Lift: strong, +20.2% for resolved cases with interview
Typical Timeline: 2y 11m average prosecution; 15 applications currently pending
Career History: 397 total applications across all art units
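As a rough sketch of how the headline figures above follow from the raw counts, the arithmetic can be reproduced directly (variable names are illustrative, not from the source tool):

```python
# Illustrative reconstruction of the dashboard's headline examiner figures
# from the raw counts shown above (names are hypothetical).

granted = 266      # applications allowed by this examiner
resolved = 382     # total resolved dispositions (allowed + abandoned)
tc_delta = 7.6     # reported gap vs. the Tech Center average, in points

allow_rate = granted / resolved * 100
print(f"Career allow rate: {allow_rate:.1f}%")   # ~69.6%, displayed as 70%

# The Tech Center baseline implied by the reported delta:
tc_avg = allow_rate - tc_delta
print(f"Implied TC average: {tc_avg:.1f}%")      # ~62.0%
```

The displayed "70%" is simply the allow rate rounded to the nearest whole percent.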

Statute-Specific Performance

§101:  8.8%  (-31.2% vs TC avg)
§103: 46.8%  (+6.8% vs TC avg)
§102: 26.0%  (-14.0% vs TC avg)
§112: 13.7%  (-26.3% vs TC avg)

Baseline: Tech Center average estimate. Based on career data from 382 resolved cases.
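The per-statute deltas can be checked against the examiner's rates to recover the Tech Center baseline; a quick sketch (figures copied from the table above, structure hypothetical) shows every row implies the same estimate:

```python
# Recovering the Tech Center baseline implied by each statute's
# reported delta: baseline = examiner rate - delta.

stats = {              # statute: (examiner %, delta vs TC avg in points)
    "101": (8.8, -31.2),
    "103": (46.8, +6.8),
    "102": (26.0, -14.0),
    "112": (13.7, -26.3),
}

for statute, (rate, delta) in stats.items():
    baseline = rate - delta
    print(f"§{statute}: implied TC average = {baseline:.1f}%")
```

Every row implies the same ~40.0% baseline, consistent with the deltas being measured against a single Tech-Center-wide estimate rather than per-statute averages.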

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Obligation Under 37 CFR 1.56 – Joint Inventors

This application currently names joint inventors. In considering patentability of the claims, the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.

Response to Amendment

Applicant's amendment filed on December 10, 2025 has been entered. Claims 1, 5, 15, 18, 21 and 31 have been amended. Claims 2-4, 16-17 and 19-20 are canceled. Claims 1, 5-15, 18 and 21-32 remain pending in this application, with claims 1 and 18 being independent.

Claim Objections

Claim 18 is objected to because of the following informalities: “the model of the opening” (lines 9-10 of claim 18) lacks proper antecedent basis (i.e., “a model of an opening” would be proper). Appropriate correction is required.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:

1. Determining the scope and contents of the prior art;
2. Ascertaining the differences between the prior art and the claims at issue;
3. Resolving the level of ordinary skill in the pertinent art; and
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1, 5-15, 18 and 21-32 are rejected under 35 U.S.C. 103 as being unpatentable over VINCENT et al. (US 2022/0028159, hereinafter “VINCENT”) in view of GLASER (US 2015/0302637).
Regarding claim 1, VINCENT discloses a method (¶ [0011]: “techniques for using one or more computing devices to perform automated operations related to, with respect to a computer model of a building's interior, generating and displaying simulated lighting information in the model based on sunlight or other external light that is estimated to enter the building and be visible in particular rooms of the interior under specified conditions.”) comprising: obtaining information (e.g., ¶ [0011]: “specify the conditions”; ¶ [0013]: “specified conditions,”) indicating an indoor scene (e.g., ¶ [0013]: “a building's interior”; ¶ [0069]: “a computer model of a house or other building”) to be rendered under a virtual outdoor condition (e.g., ¶ [0013]: “the simulated lighting information for a building's interior may be generated in at least some embodiments to reflect specified conditions,” ¶ [0034]: “select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year,” e.g., ¶ [0034]: “at 4 pm in the winter for a home in the northern hemisphere” ¶ [0069]: “generate and present simulated lighting information on a computer model of a house or other building in accordance with specified target conditions.”) (¶ [0011]: “the user able to specify the conditions for which a simulated lighting display is generated.” ¶ [0012]: “a determined type of weather (e.g., typical weather for a specified time, or a specific type of weather that is selected to be modeled),” ¶ [0013]: “As noted above, the simulated lighting information for a building's interior may be generated in at least some embodiments to reflect specified conditions, such as a target time at which to generate the simulated lighting (e.g., a season-of-the-year and a time-of-day), an amount of the building interior to display (e.g., one or more specific rooms, the entire interior, etc.), a geographical location and/or orientation of the building, typical weather for the building location and 
target time, etc.” ¶ [0013]: “a model of a building interior may be displayed in at least some embodiments to a user in a displayed GUI on a client computing device, and the user may be able to specify via the GUI (or in another manner) at least some of the conditions for which the simulated lighting display is generated, such as one or more of the following: one or more target times at which to generate the simulated lighting; an amount of the house or other building interior to display (e.g., one or more specific rooms, the entire interior, etc.); a type of simulated lighting display mode (e.g., simulated lighting conditions for a single target time; an animation over a sequence of simulated lighting conditions for multiple target times within a period of time; a comparison of multiple simultaneous simulated lighting conditions, such as daytime and nighttime, or two or more different seasons at a given time-of-day, or two or more other types of different daytime times; etc.); effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.); effects on the simulated lighting of changes outside the building interior (e.g., adding or removing or changing a tree or other vegetation in an environment surrounding the building, such as in a yard of a house; adding or removing or changing an exterior building or other external structure, whether on a same property as the building or a nearby property; etc.);” ¶ [0018]: “the BMLSM system may receive information via computer network(s) 170 from end users of map viewer client computing devices 175 about specified conditions for which the lighting 
simulation information is generated, before generating and providing such simulated lighting information for display on the client computing devices 175,” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively). In this example, the user has further modified the 3D computer model to display only a portion of a selected room (e.g., via zooming and/or dragging or other positioning, not shown), which in this example is the living room. Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). 
The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0071]: “receives instructions from an end user, and determines one or more target times for which to generate simulated lighting information for one or more rooms of the indicated building, as well as optionally receiving other user-specified display options—in some embodiments and situations, the user may specify the one or more target times and/or the one or more building rooms via a GUI in which a version of the 3D computer model is displayed,”), wherein the indoor scene (e.g., ¶ [0013]: “a building's interior”; ¶ [0069]: “a computer model of a house or other building”) is associated with a first geographical location (e.g., ¶ [0012]: “the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north),” ¶ [0013]: “geographical location and/or orientation of the building,”) (¶ [0015]: “use 3D models and/or 2.5D models and/or 2D floor maps of multi-room buildings and other structures (e.g., that are generated from images acquired in the buildings or other structures) to display simulated lighting conditions for building interiors that is generated via automated operations of one or more computing systems for particular target times or otherwise for specified target conditions, including to use information about the actual as-built buildings (e.g., internal structural components and/or other interior elements, nearby external buildings and/or vegetation, actual building geographical location and/or orientation, actual typical weather patterns, etc.) 
rather than using information from plans on how the building is designed and should theoretically be constructed. Such described techniques may further provide benefits in allowing improved automated navigation of a building by mobile devices (e.g., semi-autonomous or fully-autonomous vehicles) via varying visibility of interior elements that are visible in different lighting conditions, including to significantly reduce their computing power used and time used to attempt to otherwise learn a building's layout. In addition, in some embodiments the described techniques may be used to provide an improved GUI in which an end user may more accurately and quickly obtain information about a building's interior (e.g., for use in navigating that interior, such as via a virtual tour), including in response to search requests, as part of providing personalized information to the end user, as part of providing value estimates and/or other information about a building to an end user, etc.” ¶ [0033]: “FIG. 2E continues the examples of FIGS. 2A-2D, and illustrates a 3D computer model 265e of the house 198 that is generated from images (such as those illustrated in FIGS. 2A-2C and/or other related images taken from at least some of the viewing locations 210), whether directly or via use of an intermediate 2D floor map such as floor map 230 of FIG. 2D—in this example, most of the types of added information shown in floor map 230 are not shown in the 3D model 265e for the sake of simplicity, but some or all such added information could similarly be shown on the 3D model 265e in some embodiments and situations. With respect to the floor map 230 of FIG. 2D, the visual representation of the 3D model 265e shown in FIG. 
2E includes additional visual representations of walls (e.g., based on estimated or measured heights of the walls), of doors and windows, etc.—while this example 3D model does not show actual images projected on the walls, such information may be further added in some embodiments and situations. The 3D model 265e for the house 198 may, for example, be presented to a BMLSM system operator user and/or end user in a GUI 260. In this example, the user-selectable control 228 remains to indicate a current floor that is displayed for the floor map, and to allow the end user to select a different floor to be displayed, although in other embodiments the 3D model may simultaneously show all floors or other levels together. In addition, in this example, the GUI 260 includes further additional user-selectable controls 295 to select various display modes or to otherwise select types of functionality to be provided, including a user-selectable control 296 (not yet selected) to cause simulated lighting information to be generated and presented in the model 265e. It will be appreciated that a variety of other types of information may be added in some embodiments, that some of the illustrated types of information may not be provided in some embodiments, and that visual indications of and user selections of controls and/or of linked and associated information may be displayed and selected in other manners in other embodiments.” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively). 
In this example, the user has further modified the 3D computer model to display only a portion of a selected room (e.g., via zooming and/or dragging or other positioning, not shown), which in this example is the living room. Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).”), wherein the indoor scene (e.g., ¶ [0014]: “building's interior.”) is associated with a geometric model (e.g., ¶ [0002]: “a three-dimensional computer model of an interior of an as-built building.” ¶ [0005]: “a 3D (three-dimensional) computer model of a building's interior.” ¶ [0011]: “the computer model of the building's interior is a 3D (three-dimensional) or 2.5D (two and a half dimensional) representation that is generated after the house is built” ¶ [0014]: “a computer 3D model (e.g., with full height information represented) or computer 2.5D model (e.g., with partial representations of height shown) of a building's interior.” ¶ [0033]: “the 3D model 265e shown in FIG. 
2E”; ¶ [0033]: “The 3D model 265e for the house 198” ¶ [0043]: “computer model of the building interior” ), and wherein the geometric model comprises a model of an opening to an outside of the indoor scene (e.g., ¶ [0033]: “the visual representation of the 3D model 265e shown in FIG. 2E includes additional visual representations of walls (e.g., based on estimated or measured heights of the walls), of doors and windows, etc.”) ¶ [0029]: “two windows 196-1,” ¶ [0034]: “the west window 196-2 of the living room”; ¶ [0036]: “the window”) (¶ [0011]: “the computer model of the building's interior is a 3D (three-dimensional) or 2.5D (two and a half dimensional) representation that is generated after the house is built and that shows physical components of the house's actual interior (e.g., walls, windows, doors, stairs, fireplaces, kitchen islands, cabinets, counters, lighting and/or plumbing fixtures and associated built-in elements such as sinks and showers/baths, curtains, wall paper or paint, floor coverings, etc.), such as from analysis of images acquired in the house's interior to reflect a current structure of the house (and optionally non-fixed or temporary elements in the house, such as furniture and/or furnishings).” ¶ [0012]: “after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.” ¶ [0033]: “the 3D model 265e shown in FIG. 2E includes additional visual representations of walls (e.g., based on estimated or measured heights of the walls), of doors and windows, etc.” ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 
1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 
2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).” NOTE: In order for the simulated lighting to enter the window in the 3D model, the window, in addition to the lighting, must be modeled.); instantiating a generic outdoor lighting model (e.g., ¶ [0034]: “sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere),” ¶ [0035]: “the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm,” ¶ [0036]: “the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere,”) based on the virtual outdoor condition (e.g., ¶ [0034]: “4 pm in the winter for a home in the northern hemisphere”; ¶ [0035]: “the new time-of-day”; ¶ [0036]: “the new season”; ¶ [0073]: “the routine continues to 640 to generate simulated lighting information for the room of the current combination at the target time of the current combination, such as by determining light entering the room from one or more external light sources at the current target time (e.g., by determining a position of the sun and/or moon in the sky for the target time and based on the buildings geographical location and orientation), and by using reflections or other light scatterings off walls and/or other structural components of the room's interior (e.g., via ambient occlusion processing using light transport matrix techniques and/or ray 
tracing techniques). In addition, other factors may optionally be considered during the generation of the simulated lighting information, such as a specified type of weather or likely weather for the current target time, effects of nearby buildings and/or vegetation, etc.”) (¶ [0012]: “In at least some embodiments, a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior. In addition, other factors that may reduce or otherwise alter such incoming light may similarly be determined and used in at least some embodiments, such as effects from other external buildings and/or vegetation adjacent to the building (e.g., by modeling the external buildings and/or vegetation as solid shapes such as polyhedra or prismatoids that block some or all light striking them), from a determined type of weather (e.g., typical weather for a specified time, or a specific type of weather that is selected to be modeled), etc. 
Using such types of information, the BMLSM system may in at least some embodiments perform an ambient occlusion calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room (e.g., from walls and other structural elements, furniture, etc.), such as using ray tracing techniques and/or light transport matrix techniques. In other embodiments, other light simulation techniques may be used, whether instead of or in addition to ambient occlusion, such as one or more of global illumination, radiosity, etc.” NOTE: At the very least, determining the position of the sun or moon in the sky for a specified time-of-day in order to model exterior lighting at the specified time-of-day constitutes instantiating a generic outdoor lighting model. ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 
2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”), wherein the generic outdoor lighting model (e.g., ¶ [0035]: “position of the external lighting source (in this case, the sun)”) is independent of the indoor scene (e.g., ¶ [0035]: “position of the external lighting source (in this case, the sun)” NOTE: The position of the external lighting source, e.g., the position of the sun, for a selected time-of-day and time-of-year, at a specific geographic location, is independent of the indoor scene of the 3D building model. 
In other words, a generic lighting model for sunlight based on the position of the sun only depends, in essence, on a generic model of the solar position at different times for specific geographic locations, and, as such, is independent of the indoor scene.) (¶ [0012]: “a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior. NOTE: The modeling of the position of an external lighting source (e.g., the position of the sun or moon in the sky at a specified time and location) is independent of the indoor scene. In other words, any model of the external lighting, for the specified external conditions, is independent of the indoor scene. ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 
1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 
2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”), computing a scene-dependent indoor lighting model (e.g., ¶ [0012]: “calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room” ¶ [0012]: “automated generation of simulated lighting information for a model of an interior of a house or other building”) for the indoor scene (e.g., ¶ [0011]: “the building interior” ¶ [0011]: “the building's interior”) (¶ [0011]: “generating and displaying simulated lighting information in the model based on sunlight or other external light that is estimated to enter the building and be visible in particular rooms of the interior under specified conditions.” ¶ [0012]: “The automated generation of simulated lighting information for a model of an interior of a house or other building under specified conditions”; ¶ [0012]: “after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) 
can be determined for particular rooms of the building interior.” ¶ [0012]: “perform an ambient occlusion calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room (e.g., from walls and other structural elements, furniture, etc.),” ¶ [0013]: “the simulated lighting information for a building's interior may be generated” ¶ [0011]: “the building interior model”; ¶ [0011]: “the computer model of the building's interior is a 3D (three-dimensional) or 2.5D (two and a half dimensional) representation that is generated after the house is built and that shows physical components of the house's actual interior (e.g., walls, windows, doors, stairs, fireplaces, kitchen islands, cabinets, counters, lighting and/or plumbing fixtures and associated built-in elements such as sinks and showers/baths, curtains, wall paper or paint, floor coverings, etc.), such as from analysis of images acquired in the house's interior to reflect a current structure of the house (and optionally non-fixed or temporary elements in the house, such as furniture and/or furnishings).”) by applying lighting parameters derived from the generic outdoor lighting model (e.g., ¶ [0012]: “after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.” ¶ [0012]: “Using such types of information, the BMLSM system may in at least some embodiments perform an ambient occlusion calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room (e.g., from walls and other structural elements, furniture, etc.), such as using ray tracing techniques and/or light transport matrix techniques. 
In other embodiments, other light simulation techniques may be used, whether instead of or in addition to ambient occlusion, such as one or more of global illumination, radiosity, etc.”) (¶ [0011]: “using one or more computing devices to perform automated operations related to, with respect to a computer model of a building's interior, generating and displaying simulated lighting information in the model based on sunlight or other external light that is estimated to enter the building and be visible in particular rooms of the interior under specified conditions.” ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 
2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”); and adapting a scene rendering (e.g., FIGS. 2F-2J; ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. 
Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).” ¶ [0022]: “2.5D model rendering of the building and/or a 3D model rendering of the building”; ¶ [0022]: “a displayed or otherwise generated computer model (e.g., a 2.5D or 3D model view that optionally includes images texture-mapped to walls of the displayed model)” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively). 
In this example, the user has further modified the 3D computer model to display only a portion of a selected room (e.g., via zooming and/or dragging or other positioning, not shown), which in this example is the living room. Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).”) by at least one of: sending the scene-dependent indoor lighting model for rendering to an external device (¶ [0011]: “the building interior model may be displayed in at least some embodiments to a user of a client computing device in a GUI (graphical user interface) displayed on the client computing device, with the user able to specify the conditions for which a simulated lighting display is generated.” ¶ [0018]: “the BMLSM system may receive information via computer network(s) 170 from end users of map viewer client computing devices 175 about specified conditions for which the lighting simulation information is generated, before generating and providing such simulated lighting information for display on the client computing devices 175,” ¶ [0022]: “One or more end users (not shown) of one or 
more map viewer client computing devices 175 may further interact over computer networks 170 with the BMLSM system 140 (and optionally the ICA system 160 and/or FMGM system 160), such as to obtain, display and interact with a generated computer model and/or floor map, including to obtain and present simulated lighting information that is generated for such a computer model based on user-specified conditions.” ¶ [0024]: “an exemplary building interior environment in which images are acquired and for which one or more computer models and/or 2D floor maps are generated, for further use by the BMLSM system to generate and provide simulated lighting conditions, as discussed in greater detail with respect to FIGS. 2A-2K, as well as for use in otherwise presenting the computer models and/or floor maps and/or images to users.” ¶ [0047]: “the BMLSM system 340 and/or the ICA system 389 and/or the FMGM system 379 in a single system or device,” ¶ [0047]: “The server computing system(s) 300 and executing BMLSM system 340, and server computing system(s) 380 and executing ICA system 389, and server computing system(s) 370 and executing FMGM system 379, may communicate with each other and with other computing systems and devices in this illustrated embodiment via one or more networks 399 (e.g., the Internet, one or more cellular telephone networks, etc.), such as to interact with user client computing devices 390 (e.g., used to view 3D computer models with generated and presented simulated lighting information, and optionally other associated information such as floor maps, images and/or other related information),” ¶ [0050]: “Some or all of the user client computing devices 390 (e.g., mobile devices), mobile image acquisition devices 360, optional other navigable devices 395 and other computing systems (not shown) may similarly include some or all of the same types of components illustrated for server computing system 300. 
As one non-limiting example, the mobile image acquisition devices 360 are each shown to include one or more hardware CPU(s) 361, I/O components 362, storage 365, and memory 367, with one or both of a browser and one or more client applications 368 (e.g., an application specific to the FMGM system and/or ICA system and/or BMLSM system) executing within memory 367, such as to participate in communication with the BMLSM system 340, ICA system 389, FMGM system 379 and/or other computing systems”); or rendering the indoor scene based on the scene-dependent indoor lighting model (Abstract: “The computer model may be a 3D (three-dimensional) or 2.5D representation that is generated after the house is built and that shows physical components of the actual house's interior (e.g., walls), and may be displayed to a user of a client computing device in a displayed GUI (graphical user interface) via which the user specifies conditions for which the simulated lighting display is generated.” ¶ [0011]: “using one or more computing devices to perform automated operations related to, with respect to a computer model of a building's interior, generating and displaying simulated lighting information in the model based on sunlight or other external light that is estimated to enter the building and be visible in particular rooms of the interior under specified conditions.” ¶ [0011]: “In addition, the building interior model may be displayed in at least some embodiments to a user of a client computing device in a GUI (graphical user interface) displayed on the client computing device, with the user able to specify the conditions for which a simulated lighting display is generated. 
Additional details are included below regarding the automated operations of the computing device(s) involved in the generating and displaying of the simulated lighting information, and some or all of the techniques described herein may, in at least some embodiments, be performed at least in part via automated operations of a Building Map Lighting Simulation Manager (“BMLSM”) system, as discussed further below.” ¶ [0012]: “using ray tracing techniques and/or light transport matrix techniques. In other embodiments, other light simulation techniques may be used, whether instead of or in addition to ambient occlusion, such as one or more of global illumination, radiosity, etc.” ¶ [0073]: “If it is instead determined in block 630 not to use previously defined lighting simulation information (e.g., if no such previously defined lighting simulation information is available, or if available previously defined lighting simulation information is not sufficiently close to the current target conditions), the routine continues to 640 to generate simulated lighting information for the room of the current combination at the target time of the current combination, such as by determining light entering the room from one or more external light sources at the current target time (e.g., by determining a position of the sun and/or moon in the sky for the target time and based on the building's geographical location and orientation), and by using reflections or other light scatterings off walls and/or other structural components of the room's interior (e.g., via ambient occlusion processing using light transport matrix techniques and/or ray tracing techniques).” NOTE: Displaying the 3D computer model with the simulated lighting, by necessity, requires rendering the scene and, as such, is inherently taught by VINCENT. Furthermore, generating the displayed indoor 3D model using ray-tracing techniques, etc., by definition refers to rendering the indoor scene. 
¶ [0022]: “2.5D model rendering of the building and/or a 3D model rendering of the building”; ¶ [0022]: “Accordingly, non-exclusive examples of an end user's interactions with a displayed or otherwise generated computer model (e.g., a 2.5D or 3D model view that optionally includes images texture-mapped to walls of the displayed model) and/or 2D floor map of a building may include one or more of the following: to change between a computer model view and a floor map view (collectively referred to herein as one or more mapping views); to change between a mapping view and a view of a particular image at a viewing location within or near the building's interior;” ¶ [0052]: “Alternatively, in other embodiments some or all of the software components and/or systems may execute in memory on another device and communicate with the illustrated computing systems via inter-computer communication. Thus, in some embodiments, some or all of the described techniques may be performed by hardware means that include one or more processors and/or memory and/or storage when configured by one or more software programs (e.g., by the BMLSM system software 340 executing on server computing systems 300 and/or on devices 360, by the ICA system software 389 executing on server computing systems 380, by the FMGM system software 379 executing on server computing systems 370, etc.) and/or data structures, such as by execution of software instructions of the one or more software programs and/or by storage of such software instructions and/or data structures, and such as to perform algorithms as described in the flow charts and other disclosure herein.”). 
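The lighting simulation VINCENT describes in the passages cited above turns on two computations: locating the sun from the building's geographical location and a specified time, and then determining whether a given opening admits direct light given its geographical orientation. A minimal sketch of those two steps follows; it is not taken from either reference, and the function names and the simplified declination/hour-angle solar formulas are illustrative assumptions (a production system would use ephemeris data):

```python
import math

def sun_position(latitude_deg, day_of_year, solar_hour):
    """Approximate solar elevation and azimuth (degrees) from the standard
    declination / hour-angle formulas. Azimuth is measured clockwise from
    true north (0 = north, 90 = east, 180 = south, 270 = west)."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)  # degrees; 0 at solar noon
    lat, d, h = (math.radians(x) for x in (latitude_deg, decl, hour_angle))
    # Sun direction expressed in local east / north / up coordinates.
    east = -math.cos(d) * math.sin(h)
    north = math.sin(d) * math.cos(lat) - math.cos(d) * math.sin(lat) * math.cos(h)
    up = math.sin(d) * math.sin(lat) + math.cos(d) * math.cos(lat) * math.cos(h)
    elevation = math.degrees(math.asin(max(-1.0, min(1.0, up))))
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    return elevation, azimuth

def window_admits_direct_light(window_facing_deg, latitude_deg, day_of_year, solar_hour):
    """True if direct sun can reach a vertical opening whose outward normal
    points at window_facing_deg (0 = north, 90 = east, 270 = west)."""
    elevation, azimuth = sun_position(latitude_deg, day_of_year, solar_hour)
    if elevation <= 0.0:  # sun below the horizon
        return False
    # Direct light enters only while the sun is within 90 degrees of the normal.
    diff = abs((azimuth - window_facing_deg + 180.0) % 360.0 - 180.0)
    return diff < 90.0
```

On VINCENT's own fact pattern (a west-facing window at 4 pm in winter, northern hemisphere), `window_admits_direct_light(270, 45, 355, 16)` reports direct light from a sun low in the western sky, while the same window at 9 am receives none, consistent with the ¶ [0034]-[0036] examples of simulated lighting changing with time-of-day and season.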
Although VINCENT discloses instantiating the generic outdoor lighting model based on the virtual outdoor condition and applying lighting parameters derived from the instantiated generic outdoor model to the model of the opening associated with the first geographical location, VINCENT fails to explicitly disclose that “the virtual outdoor condition is associated with a second geographical location that is different from the first geographical location,” and subsequently, mutatis mutandis, “applying lighting parameters derived from the generic outdoor model associated with the second geographical location,” (emphasis added). However, to the extent that VINCENT is not explicit as to these limitations, GLASER, working in the same field of endeavor, teaches and/or renders obvious: wherein the virtual outdoor condition (e.g., ¶ [0005]: “an environmental daylighting model associated with a designated geographical location for the architectural space model,”) is associated with a second geographical location (e.g., ¶ [0004]: “site location (e.g., geography, altitude, climate zone, etc.), site orientation (e.g., north angle, etc.),” ¶ [0005]: “an environmental daylighting model associated with a designated geographical location for the architectural space model,”) that is different from the first geographical location (e.g., a geographical location of an existing building where images have been captured and used to create a 3D model of the as-built building, as taught by VINCENT) (¶ [0005]: “According to one set of embodiments, a method is provided for simulation and analysis of lighting performance in architectural modeling environments. 
The method includes: associating, with a processor-implemented lighting modeling engine, lighting performance properties with a plurality of structural components defined as building geometry of an architectural space model in a three-dimensional computer-aided design (3D CAD) environment; formulating an environmental daylighting model associated with a designated geographical location for the architectural space model, a designated orientation for the architectural space model, and a plurality of keytimes; communicating the architectural space model and the environmental daylighting model to a lighting rendering engine remote from the lighting modeling engine; receiving, at the lighting modeling engine from the lighting rendering engine, lighting rendering data computed by the daylighting rendering engine by ray-tracing the architectural space model as a function of the structural components, the lighting performance properties, and the environmental daylighting model at each keytime; and outputting, via an interface of the lighting modeling engine, a plurality of images, each graphically representing, for a respective one of the keytimes, a depiction of the architectural space model and a distributed lighting performance metric computed as a function of the lighting rendering data.” ¶ [0031]: “Returning again to FIG. 1, embodiments of the environmental modeling sub-engine 130 can formulate an environmental daylighting model 135 that effectively defines the natural lighting conditions for simulation. In some implementations, the environmental daylighting model 135 is associated with a designated geographical location for the architectural space model, a designated orientation for the architectural space model, and a plurality of keytimes. The environmental daylighting model 135 can be defined in any suitable manner for use by the lighting rendering engine 180. 
For example, some embodiments implement the lighting rendering engine 180 as a local and/or remote (e.g., cloud-based, distributed, etc.) version of the RADIANCE Synthetic Imaging System (and/or any other suitable lighting rendering system). For example, one or more local clients can be installed as a mini-server on one or more client machines. In such implementations, the environmental daylighting model 135 can be defined as a “sky dome” or other numeric definition of the environmental data at each key time (e.g., any suitable, physically-based, analytical model of the sky at each keytime).” ¶ [0033]: “The designated orientation for the architectural space model can be selected in any suitable manner. For example, the 3D CAD modeling environment may include a definition of the geographic orientation of the modeled structure (e.g., in relation to “compass north,” or the like). Alternatively, some implementations of the environmental modeling sub-engine 130 permit a user to enter an orientation (e.g., numerically, by manipulating a rendering of the building geometry 165, etc.).” ¶ [0034]: “The designated geographical location can be defined in any suitable manner. For example, a user can be prompted (e.g., via the GUI 115) to select one of a number of preset geographic locations from a list or a map, click on a location of a map, enter a geographic place name (e.g., a city, state, etc.), enter a latitude and longitude, enter a landmark name (e.g., an airport, a weather station, etc.), etc. In some implementations, designating the geographical location involves receiving input from the user, then selecting (or offering selection of) one or more nearest preset locations. 
For example, a user can input a city and state, and the environmental modeling sub-engine 130 can identify a nearest weather station as the designated geographic location.” ¶ [0035]: “Other implementations enable one or more sky types to be selected and/or defined, such as a clear sky, an overcast sky, a sky to be generated based on climate data (e.g., average climate data and/or any other suitable function) for the designated location and keytime, etc. For example, daylighting data (e.g., position of the sun, average cloud cover, etc.) can be retrieved by the environmental modeling sub-engine 130 for the designated geographic location at each designated keytime.”). Thus, in order to obtain a more versatile apparatus for simulating lighting for building models, it would have been obvious to one of ordinary skill in the art to modify VINCENT's system so as to include a graphical user interface for prompting a user to select and/or input a desired geographical location for retrieving associated daylighting data for use in an environmental daylighting model, as clearly taught by GLASER. Moreover, given the modified apparatus including a graphical user interface for selecting or inputting a designated geographical location, it would have been obvious to one of ordinary skill in the art for the user to select or input any desired geographic location as a simple matter of choice. In other words, for the apparatus resulting from the combination of VINCENT and GLASER, given a 3D building model generated from images taken of an existing building at a first geographical location, as taught by VINCENT, it would be obvious that the user could enter any desired geographic location into the graphical user interface, including a second geographical location different from the first geographical location of the existing building. 
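GLASER's nearest-weather-station behavior (¶ [0034]: the user inputs a location and the environmental modeling sub-engine 130 identifies the nearest station) reduces to a nearest-neighbor lookup over preset coordinates. The following is a hedged sketch of that lookup; the three-station data set and all function names are invented for illustration, as neither reference discloses any particular stations:

```python
import math

# Hypothetical preset locations (name, latitude, longitude); GLASER describes
# selecting a nearest weather station but discloses no specific data set.
PRESETS = [
    ("Station A", 47.45, -122.31),
    ("Station B", 45.59, -122.60),
    ("Station C", 37.62, -122.38),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_preset(lat, lon):
    """Pick the preset location closest to the user-entered coordinates,
    analogous to designating the nearest weather station."""
    return min(PRESETS, key=lambda s: haversine_km(lat, lon, s[1], s[2]))
```

Under the proposed combination, the coordinates passed to `nearest_preset` need not be those of the as-built building: entering a second geographical location simply retrieves that location's daylighting data for the existing 3D model.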
For instance, for a user considering replicating an existing building located at a first geographic location at a second geographic location different from the first location, such a user would obviously be motivated to enter the second geographical location into the user interface in order to create an architectural lighting simulation of the 3D model of the existing building at the desired second geographical location. Indeed, it is a well-known and longtime common practice for residential developers, before actually building all of the houses on the lots (i.e., individual building sites) of a neighborhood under development, to first build model homes (i.e., physical houses) representing the various models/designs of houses being offered for sale. Thus, as a tool for selling (or buying) the houses on the various lots being offered, it would be very desirable for a buyer touring a model home (i.e., a physical home in a first geographical location) to be able to visualize how the interior lighting for that model of house would appear if a reproduction of that model of house were built on one of the lots being offered for sale in the development (i.e., second geographical locations). At least for this reason, one of ordinary skill in the art would be motivated to modify the system taught by VINCENT so as to incorporate a user interface for choosing from a selection of geographic locations, as taught by GLASER. Regarding claim 5 (depends on claim 1), VINCENT discloses: obtaining a lighting model element corresponding to the opening (e.g., ¶ [0034]: “an almost rectangular parallelogram of light entering directly through window 196-2”) based on the virtual outdoor condition and on the model of the opening (¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 
1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 
2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”). Regarding claim 6 (depends on claim 5), VINCENT discloses: the model of the opening (¶ [0012]: “effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.)”) comprises an orientation of the opening relative to at least one of north or a vertical (¶ [0012]: “based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north)” ¶ [0012]: “effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.”), the lighting model element corresponding to the opening being based on the virtual outdoor condition (¶ [0012]: “a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.)”) and on the orientation (¶ [0012]: “a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) 
and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.” ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. 
Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”). Regarding claim 7 (depends on claim 5), VINCENT discloses: the model of the opening comprises a position of the opening (¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). 
The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 
2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).” NOTE: Given the position of the sun in the western sky at 4 pm in the winter (i.e., the angle of the light rays passing through the window), in order to determine where the rectangular parallelogram of light entering through the west window 196-2 of the living room strikes the floor of the living room, by necessity, the position of the window in the westward-facing west wall of the living room must be known and, as such, is inherently taught by, at least, ¶ [0034] of VINCENT.), the lighting model element corresponding to the opening being based on the virtual outdoor condition and on the position (¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG.
2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”). 
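The geometric reasoning applied to claims 6 and 7 (a sun direction determined from orientation and time, projected through a window to a spot on the floor) can be sketched as follows. This is an illustrative Python sketch only; the coordinate conventions, function names, and numeric values are assumptions and are not taken from VINCENT:

```python
import math

def sun_direction(elevation_deg, azimuth_deg):
    """Unit vector pointing FROM the sun TOWARD the scene.
    Azimuth is measured clockwise from north, elevation above the horizon
    (a simplified convention chosen for illustration)."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    # Axes: x = east, y = north, z = up.
    to_sun = (math.cos(el) * math.sin(az),
              math.cos(el) * math.cos(az),
              math.sin(el))
    return tuple(-c for c in to_sun)  # light travels from the sun into the scene

def floor_hit(window_point, light_dir, floor_z=0.0):
    """Project a light ray entering at `window_point` onto the floor plane z=floor_z."""
    px, py, pz = window_point
    dx, dy, dz = light_dir
    if dz >= 0:
        return None  # ray is not heading downward; it never reaches the floor
    t = (floor_z - pz) / dz
    return (px + t * dx, py + t * dy, floor_z)

# Late-afternoon winter sun, low in the western sky (azimuth ~250 deg).
d = sun_direction(elevation_deg=10.0, azimuth_deg=250.0)
# A point on a west-facing window, 1.5 m above the floor.
hit = floor_hit((0.0, 0.0, 1.5), d)
```

A lower sun produces a longer projection across the floor, which is consistent with the low 4 pm winter sun in VINCENT's ¶ [0034] example casting a parallelogram of light well into the living room.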
Regarding claim 8 (depends on claim 1), VINCENT discloses: obtaining the virtual outdoor condition (e.g., ¶ [0034]: “the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting”) from a user interface (e.g., ¶ [0033]: “the GUI 260”) (¶ [0018]: “the BMLSM system may receive information via computer network(s) 170 from end users of map viewer client computing devices 175 about specified conditions for which the lighting simulation information is generated, before generating and providing such simulated lighting information for display on the client computing devices 175,” ¶ [0033]: “the GUI 260 includes further additional user-selectable controls 295 to select various display modes or to otherwise select types of functionality to be provided, including a user-selectable control 296 (not yet selected) to cause simulated lighting information to be generated and presented in the model 265e.” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively).” ¶ [0071]: “the user may specify the one or more target times and/or the one or more building rooms via a GUI in which a version of the 3D computer model is displayed,”). 
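The mapping cited for claim 8 (GUI controls 226-227 supplying the conditions under which simulated lighting is generated) can be reduced to a minimal sketch. The class and handler names below are hypothetical and serve only to illustrate how user selections from an interface might be collected into a condition object consumed by a lighting simulator:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetConditions:
    """User-specified conditions for which simulated lighting is generated
    (field names are illustrative, not drawn from VINCENT)."""
    time_of_day: str               # e.g., "16:00"
    season_of_year: str            # e.g., "winter"
    weather: Optional[str] = None  # e.g., "clear", "overcast"

def on_controls_changed(time_of_day: str, season: str) -> TargetConditions:
    """Handler sketch: translate GUI selections into the condition object."""
    return TargetConditions(time_of_day=time_of_day, season_of_year=season)

cond = on_controls_changed("16:00", "winter")
```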
Regarding claim 9 (depends on claim 1), VINCENT discloses: the virtual outdoor condition comprises an indication of at least one of a day of year or a time of day (¶ [0012]: “a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.” ). Regarding claim 10 (depends on claim 1), VINCENT discloses: the virtual outdoor condition comprises information indicating a weather condition (¶ [0012]: “other factors that may reduce or otherwise alter such incoming light may similarly be determined and used in at least some embodiments, such as effects from other external buildings and/or vegetation adjacent to the building (e.g., by modeling the external buildings and/or vegetation as solid shapes such as polyhedra or prismatoids that block some or all light striking them), from a determined type of weather (e.g., typical weather for a specified time, or a specific type of weather that is selected to be modeled), etc.” ¶ [0039]: “While not illustrated in FIGS. 
2A-2K, other factors may similarly be used to affect simulated lighting, such as a specified type of weather and/or typical weather for a target time (e.g., cloud cover, rain, snow, etc.).” ). Regarding claim 11 (depends on claim 1), VINCENT discloses: capturing an image of the indoor scene (e.g., ¶ [0017]: “acquired images 165”) (¶ [0014]: “one or more types of additional information may be associated with and optionally displayed with a computer 3D model (e.g., with full height information represented) or computer 2.5D model (e.g., with partial representations of height shown) of a building's interior. As one example, one or more types of additional information about a building may be received, associated and displayed with such a model (e.g., with particular locations in particular rooms) or otherwise accessible from the displayed model (e.g., upon selection by a user), such as one or more of the following: images;” ¶ [0014]: “in-room images for a room that are projected on the walls of the room shown in the model;” ¶ [0015]: “use 3D models and/or 2.5D models and/or 2D floor maps of multi-room buildings and other structures (e.g., that are generated from images acquired in the buildings or other structures) to display simulated lighting conditions for building interiors that is generated via automated operations of one or more computing systems for particular target times or otherwise for specified target conditions,” ¶ [0016]: “The term “acquire” or “capture” as used herein with reference to a building interior, viewing location, or other location (unless context clearly indicates otherwise) may refer to any recording, storage, or logging of media, sensor data, and/or other information related to spatial and/or visual characteristics of the building interior or subsets thereof, such as by a recording device or by another device that receives information from the recording device.” ¶ [0019]: “Various components of the mobile image acquisition device 185 are
illustrated in FIG. 1A,” ¶ [0020]: “acquiring multiple images at multiple associated viewing locations (e.g., in multiple rooms or other locations within a building or other structure and optionally around some or all of the exterior of the building or other structure), such as using visual data acquired via the mobile device(s) 185, and for subsequent use in generating and providing a representation of an interior of the building or other structure. For example, in at least some such embodiments, such techniques may include using one or more mobile devices (e.g., a camera having one or more fisheye lenses and mounted on a rotatable tripod or otherwise having an automated rotation mechanism; a camera having sufficient fisheye lenses to capture 360 degrees horizontally without rotation; a smart phone held and moved by a user, such as to rotate the user's body and held smart phone in a 360º circle around a vertical axis; a camera held by or mounted on a user or the user's clothing; a camera mounted on an aerial and/or ground-based drone or robotic device; etc.) 
to capture data from a sequence of multiple viewing locations within multiple rooms of a house (or other building), and to optionally further capture data involved in movement or travel between some or all of the viewing locations for use in linking the multiple viewing locations together, but without having distances between the viewing locations being measured or having other measured depth information to objects in an environment around the viewing locations (e.g., without using any depth-sensing sensors).” ¶ [0025]: “In operation, the mobile image acquisition device 185 arrives at a first viewing location 210A within a first room of the building interior (in this example, in a living room accessible via an external door 190-1), and captures a view of a portion of the building interior that is visible from that viewing location 210A (e.g., some or all of the first room, and optionally small portions of one or more other adjacent or nearby rooms, such as through doors, halls, stairs or other connecting passages from the first room). The view capture may be performed in various manners as discussed herein, and may capture information about a number of objects or other features (e.g., structural details) that are visible in images captured from the viewing location—in the example of FIG. 
1B, such objects or other features throughout the house include the doorways 190 (including 190-1 and 190-3) and 197 (e.g., with swinging and/or sliding doors), windows 196 (including 196-1, 196-2, 196-3 and 196-4), corners or edges 195 (including corner 195-1 in the northwest corner of the building 198, corner 195-2 in the northeast corner of the first room, corner 195-3 in the southwest corner of the first room, corner 195-4 at the northern edge of the inter-room passage between the first room and a hallway, etc.), furniture 191-193 (e.g., a couch 191; chair 192; table 193; etc.), pictures or paintings or televisions or other hanging objects 194 (such as 194-1 and 194-2) hung on walls, light fixtures, various built-in appliances or fixtures (not shown), etc.”), and wherein rendering the indoor scene comprises rendering the image of the indoor scene based on the lighting model of the indoor scene (¶ [0017]: “a system 160 that is executing on one or more server computing systems 180, and/or a system provided by application 157 executing on one or more mobile image acquisition devices 185) has used the acquired images 165 and optionally other information to generate one or more 2D floor maps 165 and/or computer models 165 (e.g., 3D and/or 2.5D models) for the one or more buildings or other structures. FIG. 1B shows one example of acquisition of such panorama images for a particular house at multiple viewing locations 210, and FIGS. 
2A-2K illustrate additional details about using a computer model generated from such panorama images to display generated simulated lighting information for an interior of the building, as discussed further below.” ¶ [0018]: “A BMLSM (Building Map Lighting Simulation Manager) system 140 is further executing on one or more server computing systems to use building models 145 (e.g., models 165 acquired from the FMGM system) and/or other mapping-related information (not shown) that result from the images 165 and optionally additional associated information in order to generate and display simulated lighting information for such models 145.” ¶ [0021]: “In the example of FIG. 1A, the FMGM system may perform automated operations involved in using images acquired at multiple associated viewing locations (e.g., in multiple rooms or other locations within a building or other structure and optionally around some or all of the exterior of the building or other structure) to generate a 2D floor map for the building or other structure and/or to generate a computer model for the building or other structure (e.g., a 3D model and/or a 2.5D model), such as by analyzing visual information available in the images, and for providing a representation of an interior of the building or other structure (e.g., for subsequent use in generating and presenting simulated lighting conditions for the interior of the building or other structure). 
For example, in at least some such embodiments, such techniques may include analyzing one or more images taken in a room to determine a shape of the room and/or to identify inter-room passages (e.g., doorways and other openings in walls) into and/or out of the room.” NOTE: In other words, images of a room in a building (i.e., a scene) are captured and used to generate a 3D model of the room, and subsequently, the 3D model of the imaged room (i.e., the indoor scene) and the specified simulated lighting conditions (i.e., the lighting model for the indoor scene) are used to render the imaged indoor scene based on the simulated lighting conditions for the interior of the building (i.e., the lighting model for the room). ¶ [0024]: “an exemplary building interior environment in which images are acquired and for which one or more computer models and/or 2D floor maps are generated, for further use by the BMLSM system to generate and provide simulated lighting conditions, as discussed in greater detail with respect to FIGS. 2A-2K, as well as for use in otherwise presenting the computer models and/or floor maps and/or images to users.” ¶ [0028]: “FIGS. 2F-2K illustrate further examples of generating and presenting simulated lighting information for the 3D computer model, such as for the building 198 and images' viewing locations 210 discussed in FIG. 1B.”). 
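The pipeline the NOTE for claim 11 describes (captured image, geometric model, lighting model, rendered image) can be sketched minimally. Here the "lighting model" is reduced to a per-pixel illuminance ratio applied to the captured image; the function name and ratio representation are illustrative assumptions, not VINCENT's implementation:

```python
def render_with_lighting(image, lighting_ratio):
    """image: rows of grayscale pixel values (0-255).
    lighting_ratio: same-shaped rows of scale factors produced by the
    lighting model (new illuminance / captured illuminance)."""
    out = []
    for img_row, ratio_row in zip(image, lighting_ratio):
        out.append([min(255, max(0, round(p * r)))
                    for p, r in zip(img_row, ratio_row)])
    return out

captured = [[100, 200], [50, 150]]
ratios = [[1.2, 0.5], [2.0, 1.0]]  # brighter near the window, dimmer elsewhere
rendered = render_with_lighting(captured, ratios)  # [[120, 100], [100, 150]]
```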
Regarding claim 12 (depends on claim 11), VINCENT discloses: removing a lighting effect in the image before applying the lighting model of the indoor scene (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);” ¶ [0013]: “In addition, a model of a building interior may be displayed in at least some embodiments to a user in a displayed GUI on a client computing device, and the user may be able to specify via the GUI (or in another manner) at least some of the conditions for which the simulated lighting display is generated, such as one or more of the following: one or more target times at which to generate the simulated lighting; an amount of the house or other building interior to display (e.g., one or more specific rooms, the entire interior, etc.); a type of simulated lighting display mode (e.g., simulated lighting conditions for a single target time; an animation over a sequence of simulated lighting conditions for multiple target times within a period of time; a comparison of multiple simultaneous simulated lighting conditions, such as daytime and nighttime, or two or more different seasons at a given time-of-day, or two or more other types of different daytime times; etc.); effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are 
included in the generated simulated lighting; etc.); effects on the simulated lighting of changes outside the building interior (e.g., adding or removing or changing a tree or other vegetation in an environment surrounding the building, such as in a yard of a house; adding or removing or changing an exterior building or other external structure, whether on a same property as the building or a nearby property; etc.);” NOTE: If an interior light source is removed from a room (e.g., a window or interior light), then, by necessity, it must be removed prior to generating the simulated lighting for the changed building interior. Otherwise, the generated simulated lighting would not properly correspond to the changed interior lighting sources in the modified building interior. As such, VINCENT clearly inherently teaches applying the lighting model of the indoor scene after removing a lighting effect in the image.). Regarding claim 13 (depends on claim 11), VINCENT discloses: lighting a virtual object inserted in the rendered image according to the lighting model of the indoor scene (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);”). 
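The inherency argument for claim 12 (a removed light source must be removed before the simulated lighting for the changed interior is regenerated) can be illustrated with a toy additive simulator. All names and values below are hypothetical:

```python
def simulate_lighting(sources):
    """Toy simulator: total illuminance is the sum of source intensities."""
    return sum(s["intensity"] for s in sources)

def relight_after_removal(sources, removed_name):
    """Remove the source FIRST, then regenerate the simulated lighting,
    mirroring the ordering the Office Action argues is inherent."""
    remaining = [s for s in sources if s["name"] != removed_name]
    return simulate_lighting(remaining)

scene = [{"name": "west_window", "intensity": 300},
         {"name": "ceiling_lamp", "intensity": 120}]
lux = relight_after_removal(scene, "west_window")  # 120
```

Simulating before removal would instead yield 420, i.e., lighting that "would not properly correspond to the changed interior lighting sources," which is the examiner's point.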
Regarding claim 14 (depends on claim 11), VINCENT discloses: wherein at least one area of the image is not modelled in the geometric model (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);”), and the method further comprising color correcting the at least one area based on the lighting model of the indoor scene (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);” NOTE: Removing a lighting source will cause “color correction” of the area(s) in the image that are affected by the removal of the lighting source (especially in the area in the image where the light source has been removed from the model). Also, changing a color of a surface in the 3D model (e.g., removing the color in the image(s) from the 3D model generated therefrom) will also cause the surface to be “color corrected” based on the given lighting model (i.e., simulated lighting).).
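The "color correction" reading applied to claim 14 (adjusting an image area that has no geometry in the model, based only on the scene-level lighting change) can be sketched as a simple per-channel scale by the illuminance ratio. The function, pixel values, and illuminance figures are illustrative assumptions:

```python
def color_correct(pixel, old_illuminance, new_illuminance):
    """Scale an RGB pixel in an unmodelled image area by the ratio of the
    lighting model's new illuminance to the captured illuminance."""
    ratio = new_illuminance / old_illuminance
    return tuple(min(255, round(c * ratio)) for c in pixel)

# A lamp is removed: illuminance in this area drops from 400 lx to 250 lx.
corrected = color_correct((200, 180, 160), 400.0, 250.0)
```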
Regarding claim 15 (depends on claim 1), VINCENT discloses: an opacity of the opening is configurable via a user interface (¶ [0013]: “a model of a building interior may be displayed in at least some embodiments to a user in a displayed GUI on a client computing device, and the user may be able to specify via the GUI (or in another manner) at least some of the conditions for which the simulated lighting display is generated,” ¶ [0013]: “Additional details are included elsewhere herein regarding types of user-selectable controls and other user selections in a displayed GUI.” ¶ [0042]: “In addition, an end user may specify one or more thresholds with respect to simulated lighting in various manners, such as a specified amount of lux or other illuminance measurement of an amount of light per amount of surface area, a specified amount of luminance of light reflected or emitted from a surface, a specified daylight factor that expresses an amount of daylight available inside a room (e.g., on a surface) as a percentage of an amount of unobstructed daylight available outside under overcast sky conditions, a specified value for daylight autonomy that corresponds to the percentage of the time when the target illuminance of a point in a space is met by daylight, etc., and the user-specified threshold(s) may be used as part of the determination and/or presentation of corresponding information, as noted above.” NOTE: If the user specifies a reduction (a limit) of the amount of luminance of light emitted from the surface of a window in the model, the corresponding effect is to reduce the opacity of the window (i.e., the opening). ¶ [0043]: “Furthermore, in at least some embodiments, an end user may be able to specify ‘what if’ scenarios related to simulated lighting, such as to specify an amount of occlusion of exterior lighting (e.g., an amount of occlusion of one or more windows of the building, such as a percentage, a square footage, etc. 
that is occluded), such as via manipulation of a displayed GUI slider control, and to see corresponding simulated lighting condition results on a displayed computer model of the building interior for one or more specified times or other conditions. In at least some such embodiments, some or all such information may be precomputed for some or all windows for one or more defined amounts (e.g., an enumerated group of percentage amounts), or the results may instead be dynamically calculated in part or in whole at a time of the specification by the end user (e.g., in a real time manner).”). Regarding claim 18, VINCENT discloses an apparatus (¶ [0011]: “one or more computing devices”; ¶ [0046]: “server computing system 300”; ¶ [0017]: “FIG. 1A is an example block diagram of various computing devices and systems that may participate in the described techniques in some embodiments.”) comprising at least one processor (¶ [0046]: “each server computing system 300 includes one or more hardware central processing units (“CPUs”) or other hardware processors 305,”) configured to: obtain information (e.g., ¶ [0011]: “specify the conditions”; ¶ [0013]: “specified conditions,”) indicating an indoor scene (e.g., ¶ [0013]: “a building's interior”; ¶ [0069]: “a computer model of a house or other building”) to be rendered under a virtual outdoor condition (e.g., ¶ [0013]: “the simulated lighting information for a building's interior may be generated in at least some embodiments to reflect specified conditions,” ¶ [0034]: “select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year,” e.g., ¶ [0034]: “at 4 pm in the winter for a home in the northern hemisphere” ¶ [0069]: “generate and present simulated lighting information on a computer model of a house or other building in accordance with specified target conditions.”) (¶ [0011]: “the user able to specify the conditions for which a simulated lighting display is generated.” ¶ [0012]: “a determined 
type of weather (e.g., typical weather for a specified time, or a specific type of weather that is selected to be modeled),” ¶ [0013]: “As noted above, the simulated lighting information for a building's interior may be generated in at least some embodiments to reflect specified conditions, such as a target time at which to generate the simulated lighting (e.g., a season-of-the-year and a time-of-day), an amount of the building interior to display (e.g., one or more specific rooms, the entire interior, etc.), a geographical location and/or orientation of the building, typical weather for the building location and target time, etc.” ¶ [0013]: “a model of a building interior may be displayed in at least some embodiments to a user in a displayed GUI on a client computing device, and the user may be able to specify via the GUI (or in another manner) at least some of the conditions for which the simulated lighting display is generated, such as one or more of the following: one or more target times at which to generate the simulated lighting; an amount of the house or other building interior to display (e.g., one or more specific rooms, the entire interior, etc.); a type of simulated lighting display mode (e.g., simulated lighting conditions for a single target time; an animation over a sequence of simulated lighting conditions for multiple target times within a period of time; a comparison of multiple simultaneous simulated lighting conditions, such as daytime and nighttime, or two or more different seasons at a given time-of-day, or two or more other types of different daytime times; etc.); effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose 
effects are included in the generated simulated lighting; etc.); effects on the simulated lighting of changes outside the building interior (e.g., adding or removing or changing a tree or other vegetation in an environment surrounding the building, such as in a yard of a house; adding or removing or changing an exterior building or other external structure, whether on a same property as the building or a nearby property; etc.);” ¶ [0018]: “the BMLSM system may receive information via computer network(s) 170 from end users of map viewer client computing devices 175 about specified conditions for which the lighting simulation information is generated, before generating and providing such simulated lighting information for display on the client computing devices 175,” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively). In this example, the user has further modified the 3D computer model to display only a portion of a selected room (e.g., via zooming and/or dragging or other positioning, not shown), which in this example is the living room. Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 
1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0071]: “receives instructions from an end user, and determines one or more target times for which to generate simulated lighting information for one or more rooms of the indicated building, as well as optionally receiving other user-specified display options—in some embodiments and situations, the user may specify the one or more target times and/or the one or more building rooms via a GUI in which a version of the 3D computer model is displayed,”), wherein the indoor scene (e.g., ¶ [0013]: “a building's interior”; ¶ [0069]: “a computer model of a house or other building”) is associated with a first geographical location (e.g., ¶ [0012]: “the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) 
and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north),” ¶ [0013]: “geographical location and/or orientation of the building,”) (¶ [0015]: “use 3D models and/or 2.5D models and/or 2D floor maps of multi-room buildings and other structures (e.g., that are generated from images acquired in the buildings or other structures) to display simulated lighting conditions for building interiors that is generated via automated operations of one or more computing systems for particular target times or otherwise for specified target conditions, including to use information about the actual as-built buildings (e.g., internal structural components and/or other interior elements, nearby external buildings and/or vegetation, actual building geographical location and/or orientation, actual typical weather patterns, etc.) rather than using information from plans on how the building is designed and should theoretically be constructed. Such described techniques may further provide benefits in allowing improved automated navigation of a building by mobile devices (e.g., semi-autonomous or fully-autonomous vehicles) via varying visibility of interior elements that are visible in different lighting conditions, including to significantly reduce their computing power used and time used to attempt to otherwise learn a building's layout. In addition, in some embodiments the described techniques may be used to provide an improved GUI in which an end user may more accurately and quickly obtain information about a building's interior (e.g., for use in navigating that interior, such as via a virtual tour), including in response to search requests, as part of providing personalized information to the end user, as part of providing value estimates and/or other information about a building to an end user, etc.” ¶ [0033]: “FIG. 2E continues the examples of FIGS. 
2A-2D, and illustrates a 3D computer model 265e of the house 198 that is generated from images (such as those illustrated in FIGS. 2A-2C and/or other related images taken from at least some of the viewing locations 210), whether directly or via use of an intermediate 2D floor map such as floor map 230 of FIG. 2D—in this example, most of the types of added information shown in floor map 230 are not shown in the 3D model 265e for the sake of simplicity, but some or all such added information could similarly be shown on the 3D model 265e in some embodiments and situations. With respect to the floor map 230 of FIG. 2D, the visual representation of the 3D model 265e shown in FIG. 2E includes additional visual representations of walls (e.g., based on estimated or measured heights of the walls), of doors and windows, etc.—while this example 3D model does not show actual images projected on the walls, such information may be further added in some embodiments and situations. The 3D model 265e for the house 198 may, for example, be presented to a BMLSM system operator user and/or end user in a GUI 260. In this example, the user-selectable control 228 remains to indicate a current floor that is displayed for the floor map, and to allow the end user to select a different floor to be displayed, although in other embodiments the 3D model may simultaneously show all floors or other levels together. In addition, in this example, the GUI 260 includes further additional user-selectable controls 295 to select various display modes or to otherwise select types of functionality to be provided, including a user-selectable control 296 (not yet selected) to cause simulated lighting information to be generated and presented in the model 265e. 
It will be appreciated that a variety of other types of information may be added in some embodiments, that some of the illustrated types of information may not be provided in some embodiments, and that visual indications of and user selections of controls and/or of linked and associated information may be displayed and selected in other manners in other embodiments.” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively). In this example, the user has further modified the 3D computer model to display only a portion of a selected room (e.g., via zooming and/or dragging or other positioning, not shown), which in this example is the living room. Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). 
The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).”); instantiate a generic outdoor lighting model (e.g., ¶ [0034]: “sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere),” ¶ [0035]: “the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm,” ¶ [0036]: “the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere,”) based on the virtual outdoor condition (e.g., ¶ [0034]: “4 pm in the winter for a home in the northern hemisphere”; ¶ [0035]: “the new time-of-day”; ¶ [0036]: “the new season”; ¶ [0073]: “the routine continues to 640 to generate simulated lighting information for the room of the current combination at the target time of the current combination, such as by determining light entering the room from one or more external light sources at the current target time (e.g., by determining a position of the sun and/or moon in the sky for the target time and based on the buildings geographical location and orientation), and by using reflections or other light scatterings off walls and/or other structural components of the room's interior (e.g., via ambient occlusion processing using light transport matrix techniques and/or ray tracing techniques). 
In addition, other factors may optionally be considered during the generation of the simulated lighting information, such as a specified type of weather or likely weather for the current target time, effects of nearby buildings and/or vegetation, etc.”) (¶ [0012]: “In at least some embodiments, a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior. In addition, other factors that may reduce or otherwise alter such incoming light may similarly be determined and used in at least some embodiments, such as effects from other external buildings and/or vegetation adjacent to the building (e.g., by modeling the external buildings and/or vegetation as solid shapes such as polyhedra or prismatoids that block some or all light striking them), from a determined type of weather (e.g., typical weather for a specified time, or a specific type of weather that is selected to be modeled), etc. 
Using such types of information, the BMLSM system may in at least some embodiments perform an ambient occlusion calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room (e.g., from walls and other structural elements, furniture, etc.), such as using ray tracing techniques and/or light transport matrix techniques. In other embodiments, other light simulation techniques may be used, whether instead of or in addition to ambient occlusion, such as one or more of global illumination, radiosity, etc.” NOTE: At the very least, determining the position of the sun or moon in the sky for a specified time-of-day in order to model exterior lighting at the specified time-of-day constitutes instantiating a generic outdoor lighting model. ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 
2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”), wherein the generic outdoor lighting model (e.g., ¶ [0035]: “position of the external lighting source (in this case, the sun)”) is independent of the indoor scene (e.g., ¶ [0035]: “position of the external lighting source (in this case, the sun)” NOTE: The position of the external lighting source, e.g., the position of the sun, for a selected time-of-day and time-of-year, at a specific geographic location, is independent of the indoor scene of the 3D building model. 
In other words, a generic lighting model for sunlight depends, in essence, only on a model of the solar position at a given time and geographic location, and is therefore independent of the indoor scene.) (¶ [0012]: "a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior." NOTE: The modeling of the position of an external lighting source (e.g., the position of the sun or moon in the sky at a specified time and location) is independent of the indoor scene. In other words, any model of the external lighting, for the specified external conditions, is independent of the indoor scene. ¶ [0034]: "Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG.
1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 
2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”), compute a scene-dependent indoor lighting model (e.g., ¶ [0012]: “calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room” ¶ [0012]: “automated generation of simulated lighting information for a model of an interior of a house or other building”) for the indoor scene (e.g., ¶ [0011]: “the building interior” ¶ [0011]: “the building's interior”) (¶ [0011]: “generating and displaying simulated lighting information in the model based on sunlight or other external light that is estimated to enter the building and be visible in particular rooms of the interior under specified conditions.” ¶ [0012]: “The automated generation of simulated lighting information for a model of an interior of a house or other building under specified conditions”; ¶ [0012]: “after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) 
can be determined for particular rooms of the building interior.” ¶ [0012]: “perform an ambient occlusion calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room (e.g., from walls and other structural elements, furniture, etc.),” ¶ [0013]: “the simulated lighting information for a building's interior may be generated” ¶ [0011]: “the building interior model”; ¶ [0011]: “the computer model of the building's interior is a 3D (three-dimensional) or 2.5D (two and a half dimensional) representation that is generated after the house is built and that shows physical components of the house's actual interior (e.g., walls, windows, doors, stairs, fireplaces, kitchen islands, cabinets, counters, lighting and/or plumbing fixtures and associated built-in elements such as sinks and showers/baths, curtains, wall paper or paint, floor coverings, etc.), such as from analysis of images acquired in the house's interior to reflect a current structure of the house (and optionally non-fixed or temporary elements in the house, such as furniture and/or furnishings).”) by applying lighting parameters derived from the generic outdoor lighting model (e.g., ¶ [0012]: “after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.” ¶ [0012]: “Using such types of information, the BMLSM system may in at least some embodiments perform an ambient occlusion calculation for each room in the building's interior to estimate an amount and direction of light entering the room and the effects of light reflection or other scattering within the room (e.g., from walls and other structural elements, furniture, etc.), such as using ray tracing techniques and/or light transport matrix techniques. 
In other embodiments, other light simulation techniques may be used, whether instead of or in addition to ambient occlusion, such as one or more of global illumination, radiosity, etc.”) (¶ [0011]: “using one or more computing devices to perform automated operations related to, with respect to a computer model of a building's interior, generating and displaying simulated lighting information in the model based on sunlight or other external light that is estimated to enter the building and be visible in particular rooms of the interior under specified conditions.” ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 
2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”); and adapt a scene rendering (e.g., FIGS. 2F-2J; ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. 
Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).” ¶ [0022]: “2.5D model rendering of the building and/or a 3D model rendering of the building”; ¶ [0022]: “a displayed or otherwise generated computer model (e.g., a 2.5D or 3D model view that optionally includes images texture-mapped to walls of the displayed model)” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively). 
In this example, the user has further modified the 3D computer model to display only a portion of a selected room (e.g., via zooming and/or dragging or other positioning, not shown), which in this example is the living room. Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).”) by at least one of: sending the scene-dependent indoor lighting model for rendering to an external device (¶ [0011]: “the building interior model may be displayed in at least some embodiments to a user of a client computing device in a GUI (graphical user interface) displayed on the client computing device, with the user able to specify the conditions for which a simulated lighting display is generated.” ¶ [0018]: “the BMLSM system may receive information via computer network(s) 170 from end users of map viewer client computing devices 175 about specified conditions for which the lighting simulation information is generated, before generating and providing such simulated lighting information for display on the client computing devices 175,” ¶ [0022]: “One or more end users (not shown) of one or 
more map viewer client computing devices 175 may further interact over computer networks 170 with the BMLSM system 140 (and optionally the ICA system 160 and/or FMGM system 160), such as to obtain, display and interact with a generated computer model and/or floor map, including to obtain and present simulated lighting information that is generated for such a computer model based on user-specified conditions.” ¶ [0024]: “an exemplary building interior environment in which images are acquired and for which one or more computer models and/or 2D floor maps are generated, for further use by the BMLSM system to generate and provide simulated lighting conditions, as discussed in greater detail with respect to FIGS. 2A-2K, as well as for use in otherwise presenting the computer models and/or floor maps and/or images to users.” ¶ [0047]: “the BMLSM system 340 and/or the ICA system 389 and/or the FMGM system 379 in a single system or device,” ¶ [0047]: “The server computing system(s) 300 and executing BMLSM system 340, and server computing system(s) 380 and executing ICA system 389, and server computing system(s) 370 and executing FMGM system 379, may communicate with each other and with other computing systems and devices in this illustrated embodiment via one or more networks 399 (e.g., the Internet, one or more cellular telephone networks, etc.), such as to interact with user client computing devices 390 (e.g., used to view 3D computer models with generated and presented simulated lighting information, and optionally other associated information such as floor maps, images and/or other related information),” ¶ [0050]: “Some or all of the user client computing devices 390 (e.g., mobile devices), mobile image acquisition devices 360, optional other navigable devices 395 and other computing systems (not shown) may similarly include some or all of the same types of components illustrated for server computing system 300. 
As one non-limiting example, the mobile image acquisition devices 360 are each shown to include one or more hardware CPU(s) 361, I/O components 362, storage 365, and memory 367, with one or both of a browser and one or more client applications 368 (e.g., an application specific to the FMGM system and/or ICA system and/or BMLSM system) executing within memory 367, such as to participate in communication with the BMLSM system 340, ICA system 389, FMGM system 379 and/or other computing systems”); or rendering the indoor scene based on the scene-dependent indoor lighting model (Abstract: “The computer model may be a 3D (three-dimensional) or 2.5D representation that is generated after the house is built and that shows physical components of the actual house's interior (e.g., walls), and may be displayed to a user of a client computing device in a displayed GUI (graphical user interface) via which the user specifies conditions for which the simulated lighting display is generated.” ¶ [0011]: “using one or more computing devices to perform automated operations related to, with respect to a computer model of a building's interior, generating and displaying simulated lighting information in the model based on sunlight or other external light that is estimated to enter the building and be visible in particular rooms of the interior under specified conditions.” ¶ [0011]: “In addition, the building interior model may be displayed in at least some embodiments to a user of a client computing device in a GUI (graphical user interface) displayed on the client computing device, with the user able to specify the conditions for which a simulated lighting display is generated. 
Additional details are included below regarding the automated operations of the computing device(s) involved in the generating and displaying of the simulated lighting information, and some or all of the techniques described herein may, in at least some embodiments, be performed at least in part via automated operations of a Building Map Lighting Simulation Manager (“BMLSM”) system, as discussed further below.” ¶ [0012]: “using ray tracing techniques and/or light transport matrix techniques. In other embodiments, other light simulation techniques may be used, whether instead of or in addition to ambient occlusion, such as one or more of global illumination, radiosity, etc.” ¶ [0073]: “If it is instead determined in block 630 not to use previously defined lighting simulation information (e.g., if no such previously defined lighting simulation information is available, or if available previously defined lighting simulation information is not sufficiently close to the current target conditions), the routine continues to 640 to generate simulated lighting information for the room of the current combination at the target time of the current combination, such as by determining light entering the room from one or more external light sources at the current target time (e.g., by determining a position of the sun and/or moon in the sky for the target time and based on the buildings geographical location and orientation), and by using reflections or other light scatterings off walls and/or other structural components of the room's interior (e.g., via ambient occlusion processing using light transport matrix techniques and/or ray tracing techniques).” NOTE: Displaying the 3D computer model with the simulated lighting, by necessity, requires rendering the scene and, as such, is inherently taught by VINCENT. Furthermore, generating the displayed indoor 3D model using ray-tracing techniques, etc., by definition refers to rendering the indoor scene. 
¶ [0022]: “2.5D model rendering of the building and/or a 3D model rendering of the building”; ¶ [0022]: “Accordingly, non-exclusive examples of an end user's interactions with a displayed or otherwise generated computer model (e.g., a 2.5D or 3D model view that optionally includes images texture-mapped to walls of the displayed model) and/or 2D floor map of a building may include one or more of the following: to change between a computer model view and a floor map view (collectively referred to herein as one or more mapping views); to change between a mapping view and a view of a particular image at a viewing location within or near the building's interior;” ¶ [0052]: “Alternatively, in other embodiments some or all of the software components and/or systems may execute in memory on another device and communicate with the illustrated computing systems via inter-computer communication. Thus, in some embodiments, some or all of the described techniques may be performed by hardware means that include one or more processors and/or memory and/or storage when configured by one or more software programs (e.g., by the BMLSM system software 340 executing on server computing systems 300 and/or on devices 360, by the ICA system software 389 executing on server computing systems 380, by the FMGM system software 379 executing on server computing systems 370, etc.) and/or data structures, such as by execution of software instructions of the one or more software programs and/or by storage of such software instructions and/or data structures, and such as to perform algorithms as described in the flow charts and other disclosure herein.”). 
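As an illustrative aside on the sun-position determination relied upon above (VINCENT, ¶ [0073]: determining a position of the sun for a target time based on the building's geographical location and orientation), such a computation can be sketched with a standard declination/hour-angle approximation. The function name and formulas below are the annotator's assumptions, not code from VINCENT; the sketch ignores longitude, the equation of time, and atmospheric refraction.

```python
import math

def sun_position(latitude_deg, day_of_year, solar_hour):
    """Approximate solar altitude and azimuth (both in degrees) for a
    given latitude, day of year (1-365), and local solar hour (0-24).
    Illustrative simplification of the geographic sun-position
    determination VINCENT describes; not production astronomy code."""
    lat = math.radians(latitude_deg)
    # Approximate solar declination for the day of year.
    decl = math.radians(
        -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    # Hour angle: the sun moves 15 degrees per hour from solar noon.
    h = math.radians(15.0 * (solar_hour - 12.0))
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(h))
    alt = math.asin(max(-1.0, min(1.0, sin_alt)))
    # Azimuth measured clockwise from north; sign resolved by the hour.
    cos_az = ((math.sin(decl) - math.sin(alt) * math.sin(lat))
              / max(1e-9, math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if solar_hour > 12.0:  # afternoon: sun is west of due south
        az = 2.0 * math.pi - az
    return math.degrees(alt), math.degrees(az)
```

For example, near the equinox at latitude 45° N and solar noon, the sketch returns an altitude of roughly 45° with the sun due south, consistent with the seasonal behavior described in VINCENT's ¶¶ [0035]-[0036].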
Although VINCENT discloses instantiating the generic outdoor lighting model based on the virtual outdoor condition and applying lighting parameters derived from the instantiated generic outdoor model to the model of the opening associated with the first geographical location, VINCENT fails to explicitly disclose that “the virtual outdoor condition is associated with a second geographical location that is different from the first geographical location,” and subsequently, mutatis mutandis, “applying lighting parameters derived from the generic outdoor model associated with the second geographical location,” (emphasis added). However, whereas VINCENT is not entirely explicit as to these features, GLASER, working in the same field of endeavor, teaches and/or renders obvious: wherein the virtual outdoor condition (e.g., ¶ [0005]: “an environmental daylighting model associated with a designated geographical location for the architectural space model,”) is associated with a second geographical location (e.g., ¶ [0004]: “site location (e.g., geography, altitude, climate zone, etc.), site orientation (e.g., north angle, etc.),” ¶ [0005]: “an environmental daylighting model associated with a designated geographical location for the architectural space model,”) that is different from the first geographical location (e.g., a geographical location of an existing building where images have been captured and used to create a 3D model of the as-built building, as taught by VINCENT) (¶ [0005]: “According to one set of embodiments, a method is provided for simulation and analysis of lighting performance in architectural modeling environments. 
The method includes: associating, with a processor-implemented lighting modeling engine, lighting performance properties with a plurality of structural components defined as building geometry of an architectural space model in a three-dimensional computer-aided design (3D CAD) environment; formulating an environmental daylighting model associated with a designated geographical location for the architectural space model, a designated orientation for the architectural space model, and a plurality of keytimes; communicating the architectural space model and the environmental daylighting model to a lighting rendering engine remote from the lighting modeling engine; receiving, at the lighting modeling engine from the lighting rendering engine, lighting rendering data computed by the daylighting rendering engine by ray-tracing the architectural space model as a function of the structural components, the lighting performance properties, and the environmental daylighting model at each keytime; and outputting, via an interface of the lighting modeling engine, a plurality of images, each graphically representing, for a respective one of the keytimes, a depiction of the architectural space model and a distributed lighting performance metric computed as a function of the lighting rendering data.” ¶ [0031]: “Returning again to FIG. 1, embodiments of the environmental modeling sub-engine 130 can formulate an environmental daylighting model 135 that effectively defines the natural lighting conditions for simulation. In some implementations, the environmental daylighting model 135 is associated with a designated geographical location for the architectural space model, a designated orientation for the architectural space model, and a plurality of keytimes. The environmental daylighting model 135 can be defined in any suitable manner for use by the lighting rendering engine 180. 
For example, some embodiments implement the lighting rendering engine 180 as a local and/or remote (e.g., cloud-based, distributed, etc.) version of the RADIANCE Synthetic Imaging System (and/or any other suitable lighting rendering system). For example, one or more local clients can be installed as a mini-server on one or more client machines. In such implementations, the environmental daylighting model 135 can be defined as a “sky dome” or other numeric definition of the environmental data at each key time (e.g., any suitable, physically-based, analytical model of the sky at each keytime).” ¶ [0033]: “The designated orientation for the architectural space model can be selected in any suitable manner. For example, the 3D CAD modeling environment may include a definition of the geographic orientation of the modeled structure (e.g., in relation to “compass north,” or the like). Alternatively, some implementations of the environmental modeling sub-engine 130 permit a user to enter an orientation (e.g., numerically, by manipulating a rendering of the building geometry 165, etc.).” ¶ [0034]: “The designated geographical location can be defined in any suitable manner. For example, a user can be prompted (e.g., via the GUI 115) to select one of a number of preset geographic locations from a list or a map, click on a location of a map, enter a geographic place name (e.g., a city, state, etc.), enter a latitude and longitude, enter a landmark name (e.g., an airport, a weather station, etc.), etc. In some implementations, designating the geographical location involves receiving input from the user, then selecting (or offering selection of) one or more nearest preset locations. 
For example, a user can input a city and state, and the environmental modeling sub-engine 130 can identify a nearest weather station as the designated geographic location.” ¶ [0035]: “Other implementations enable one or more sky types to be selected and/or defined, such as a clear sky, an overcast sky, a sky to be generated based on climate data (e.g., average climate data and/or any other suitable function) for the designated location and keytime, etc. For example, daylighting data (e.g., position of the sun, average cloud cover, etc.) can be retrieved by the environmental modeling sub-engine 130 for the designated geographic location at each designated keytime.”). Thus, in order to obtain a more versatile apparatus for simulating lighting for building models, it would have been obvious to one of ordinary skill in the art to modify VINCENT's system for simulating lighting for building models so as to include a graphical user interface for prompting a user to select and/or input a desired geographical location for retrieving associated daylighting data for use in an environmental daylighting model, as clearly taught by GLASER. Moreover, given the modified apparatus for simulating lighting for building models including a graphical user interface for selecting/inputting a designated geographical location, it would have been obvious to one of ordinary skill in the art for the user to select or input any desired geographic location as a simple matter of choice. In other words, for the apparatus resulting from the combination of VINCENT and GLASER, given a 3D building model generated from images taken of an existing building at a first geographical location, as taught by VINCENT, it would be obvious that the user could enter any desired geographic location into the graphical user interface, including a second geographical location different from the first geographical location of the existing building. 
For instance, a user considering replicating, at a second geographic location, an existing building located at a first geographic location would obviously be motivated to enter the second geographical location into the user interface in order to create an architectural lighting simulation of the 3D model of the existing building at the desired second geographical location. Indeed, it is a well-known and longstanding practice for residential developers, before actually building all of the houses on the lots (i.e., individual building sites) of a neighborhood under development, to first build model homes (i.e., physical houses) representing the various models/designs of houses being offered for sale. Thus, as a tool for selling (or buying) the houses on the various lots being offered, it would be very desirable for a buyer who is touring a model home (i.e., a physical home in a first geographical location) to be able to visualize how the interior lighting for that model of house would appear if a reproduction of that model of house were built on one of the lots being offered for sale in the development (i.e., second geographical locations). At least for this reason, one of ordinary skill in the art would be motivated to modify the system taught by VINCENT so as to incorporate a user interface for choosing from a selection of geographic locations, as taught by GLASER. Regarding claim 21 (depends on claim 18), VINCENT discloses: a lighting model element corresponding to the opening (e.g., ¶ [0034]: “an almost rectangular parallelogram of light entering directly through window 196-2”) is obtained based on the virtual outdoor condition and on the model of the opening (¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 
1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 
2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”). Regarding claim 22 (depends on claim 21), VINCENT discloses: the model of the opening (¶ [0012]: “effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.)”) comprises an orientation of the opening relative to at least one of north or a vertical (¶ [0012]: “based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north)” ¶ [0012]: “effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.”), the lighting model element corresponding to the opening being based on the virtual outdoor condition (¶ [0012]: “a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.)”) and on the orientation (¶ [0012]: “a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) 
and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.” ¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. 
Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”). Regarding claim 23 (depends on claim 21), VINCENT discloses: the model of the opening comprises a position of the opening (¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). 
The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 
2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).” NOTE: Given the position of the sun in the western sky at 4 pm in the winter (i.e., the angle of the light rays passing through the window), in order to determine where the rectangular parallelogram of light entering through the west window 196-2 of the living room strikes the floor of the living room, by necessity, the position of the window in the westward-facing west wall of the living room must be known and, as such, is inherently taught by, at least, ¶ [0034] of VINCENT.), the lighting model element corresponding to the opening being based on the virtual outdoor condition and on the position (¶ [0034]: “Given the westward-facing west wall of the living room (between corners 195-1 and 195-3 illustrated in FIG. 1B), the west window 196-2 of the living room will admit sunlight from the sun (not shown, but relatively low in the western sky at 4 pm in the winter for a home in the northern hemisphere), with simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation). The simulated lighting 225f may include not only an almost rectangular parallelogram of light entering directly through window 196-2 and striking the floor of the living room, but also further simulated lighting from light reflections and other light scattering (e.g., off of walls, the floor, the ceiling, etc.).” ¶ [0035]: “FIG. 
2G continues the examples of FIGS. 2A-2F, and illustrates a modified version 265g of the 3D model to reflect a change in the user-specified target conditions—in particular, the user has modified the control 226 in the GUI to change the time-of-day from 4 pm in FIG. 2F to noon in FIG. 2G. Accordingly, the generated simulated lighting information 225g shown in FIG. 2G has changed to reflect the changed position of the external lighting source (in this case, the sun) at the new time-of-day (e.g., to reflect that the sun is higher in the sky at noon than at 4 pm, causing the simulated lighting to cover a smaller part of the living room in this example).” ¶ [0036]: “FIG. 2H continues the examples of FIGS. 2A-2G, and illustrates a modified version 265h of the 3D model to reflect a further change in the user-specified target conditions—in particular, the user has modified the control 227 in the GUI to change the season-of-year from winter in FIGS. 2F and 2G to summer in FIG. 2H. Accordingly, the generated simulated lighting information 225h shown in FIG. 2H has changed to reflect the changed position of the external lighting source (in this case, the sun) for the new season (e.g., to reflect that the sun is further north during the summer in the northern hemisphere than during the winter in the northern hemisphere, causing the simulated lighting to enter the window at a different angle and to cover a different part of the living room).”). 
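As an illustrative aside on the “almost rectangular parallelogram of light” relied upon for claims 21-23 (VINCENT, ¶ [0034]): geometrically, that patch is the projection of the window aperture along the incoming sun direction onto the floor plane. The sketch below is the annotator's assumption of one way to compute it, not code from VINCENT; the function name and coordinate conventions are illustrative.

```python
def sunlight_patch_on_floor(window_corners, sun_dir):
    """Project the corners of a window along the incoming sun direction
    onto the floor plane z = 0, yielding the parallelogram of direct
    light described in VINCENT's para [0034] (illustrative sketch only).
    window_corners: list of (x, y, z) tuples; sun_dir: (dx, dy, dz)
    pointing from the sun toward the room (dz < 0 when the sun is up)."""
    dx, dy, dz = sun_dir
    if dz >= 0:
        return []  # sun at or below the horizon: no direct patch
    patch = []
    for (x, y, z) in window_corners:
        t = -z / dz  # ray parameter where the corner's ray reaches z = 0
        patch.append((x + t * dx, y + t * dy, 0.0))
    return patch
```

A lower sun (smaller |dz| relative to dx) pushes the projected corners farther from the wall, which matches the examiner's observation that the patch covers a larger part of the living room at 4 pm than at noon.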
Regarding claim 24 (depends on claim 18), VINCENT discloses: the virtual outdoor condition (e.g., ¶ [0034]: “the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting”) is obtained from a user interface (e.g., ¶ [0033]: “the GUI 260”) (¶ [0018]: “the BMLSM system may receive information via computer network(s) 170 from end users of map viewer client computing devices 175 about specified conditions for which the lighting simulation information is generated, before generating and providing such simulated lighting information for display on the client computing devices 175,” ¶ [0033]: “the GUI 260 includes further additional user-selectable controls 295 to select various display modes or to otherwise select types of functionality to be provided, including a user-selectable control 296 (not yet selected) to cause simulated lighting information to be generated and presented in the model 265e.” ¶ [0034]: “FIG. 2F continues the examples of FIGS. 2A-2E, and illustrates a modified version of the GUI that includes an updated 3D model 265f corresponding to user-selectable control 296 having been selected by the user, with the generated and presented simulated lighting conditions corresponding to user-modifiable controls 226-227 to specify target conditions represented by the simulated lighting (in this example, to select a target time for the simulated lighting, by selecting a time-of-day and a season-of-year, respectively).” ¶ [0071]: “the user may specify the one or more target times and/or the one or more building rooms via a GUI in which a version of the 3D computer model is displayed,”). 
Regarding claim 25 (depends on claim 18), VINCENT discloses: the virtual outdoor condition comprises an indication of at least one of a day of year or a time of day (¶ [0012]: “a position of an external lighting source (e.g., a position of the sun or moon in the sky; a location of one or more external lights, such as on an exterior of the building or otherwise on a property of the building, or on a streetlight; etc.) is determined for a building based at least in part on the building's geographical location (e.g., address, latitude and longitude or other GPS coordinates, etc.) and geographical orientation (e.g., compass-based or other cardinal directions for external walls, such as with respect to magnetic north or true north), and a specified time (e.g., a time-of-day, such as a time with hours and/or minutes specified according to a 24-hour clock; and/or a time-of-year, such as season-of-the-year and/or month-of-the-year and/or week-of-a-month and/or week-of-the-year and/or day-of-year/month/week)—after the external lighting source position is determined, effects of its resulting light that enters the building (e.g., through windows, doors, skylights, etc.) can be determined for particular rooms of the building interior.” ). Regarding claim 26 (depends on claim 18), VINCENT discloses: the virtual outdoor condition comprises information indicating a weather condition (¶ [0012]: “other factors that may reduce or otherwise alter such incoming light may similarly be determined and used in at least some embodiments, such as effects from other external buildings and/or vegetation adjacent to the building (e.g., by modeling the external buildings and/or vegetation as solid shapes such as polyhedra or prismatoids that block some or all light striking them), from a determined type of weather (e.g., typical weather for a specified time, or a specific type of weather that is selected to be modeled), etc.” ¶ [0039]: “While not illustrated in FIGS. 
2A-2K, other factors may similarly be used to affect simulated lighting, such as a specified type of weather and/or typical weather for a target time (e.g., cloud cover, rain, snow, etc.).”). Regarding claim 27 (depends on claim 18), VINCENT discloses: wherein an image of the indoor scene is captured (e.g., ¶ [0017]: “acquired images 165”) (¶ [0014]: “one or more types of additional information may be associated with and optionally displayed with a computer 3D model (e.g., with full height information represented) or computer 2.5D model (e.g., with partial representations of height shown) of a building's interior. As one example, one or more types of additional information about a building may be received, associated and displayed with such a model (e.g., with particular locations in particular rooms) or otherwise accessible from the displayed model (e.g., upon selection by a user), such as one or more of the following: images;” ¶ [0014]: “in-room images for a room that are projected on the walls of the room shown in the model;” ¶ [0015]: “use 3D models and/or 2.5D models and/or 2D floor maps of multi-room buildings and other structures (e.g., that are generated from images acquired in the buildings or other structures) to display simulated lighting conditions for building interiors that is generated via automated operations of one or more computing systems for particular target times or otherwise for specified target conditions,” ¶ [0016]: “The term “acquire” or “capture” as used herein with reference to a building interior, viewing location, or other location (unless context clearly indicates otherwise) may refer to any recording, storage, or logging of media, sensor data, and/or other information related to spatial and/or visual characteristics of the building interior or subsets thereof, such as by a recording device or by another device that receives information from the recording device.” ¶ [0019]: “Various components of the mobile image acquisition device 
185 are illustrated in FIG. 1A,” ¶ [0020]: “acquiring multiple images at multiple associated viewing locations (e.g., in multiple rooms or other locations within a building or other structure and optionally around some or all of the exterior of the building or other structure), such as using visual data acquired via the mobile device(s) 185, and for subsequent use in generating and providing a representation of an interior of the building or other structure. For example, in at least some such embodiments, such techniques may include using one or more mobile devices (e.g., a camera having one or more fisheye lenses and mounted on a rotatable tripod or otherwise having an automated rotation mechanism; a camera having sufficient fisheye lenses to capture 360 degrees horizontally without rotation; a smart phone held and moved by a user, such as to rotate the user's body and held smart phone in a 360º circle around a vertical axis; a camera held by or mounted on a user or the user's clothing; a camera mounted on an aerial and/or ground-based drone or robotic device; etc.) 
to capture data from a sequence of multiple viewing locations within multiple rooms of a house (or other building), and to optionally further capture data involved in movement or travel between some or all of the viewing locations for use in linking the multiple viewing locations together, but without having distances between the viewing locations being measured or having other measured depth information to objects in an environment around the viewing locations (e.g., without using any depth-sensing sensors).” ¶ [0025]: “In operation, the mobile image acquisition device 185 arrives at a first viewing location 210A within a first room of the building interior (in this example, in a living room accessible via an external door 190-1), and captures a view of a portion of the building interior that is visible from that viewing location 210A (e.g., some or all of the first room, and optionally small portions of one or more other adjacent or nearby rooms, such as through doors, halls, stairs or other connecting passages from the first room). The view capture may be performed in various manners as discussed herein, and may capture information about a number of objects or other features (e.g., structural details) that are visible in images captured from the viewing location—in the example of FIG. 
1B, such objects or other features throughout the house include the doorways 190 (including 190-1 and 190-3) and 197 (e.g., with swinging and/or sliding doors), windows 196 (including 196-1, 196-2, 196-3 and 196-4), corners or edges 195 (including corner 195-1 in the northwest corner of the building 198, corner 195-2 in the northeast corner of the first room, corner 195-3 in the southwest corner of the first room, corner 195-4 at the northern edge of the inter-room passage between the first room and a hallway, etc.), furniture 191-193 (e.g., a couch 191; chair 192; table 193; etc.), pictures or paintings or televisions or other hanging objects 194 (such as 194-1 and 194-2) hung on walls, light fixtures, various built-in appliances or fixtures (not shown), etc.”), and wherein rendering the indoor scene comprises rendering the image of the indoor scene based on the lighting model of the indoor scene (¶ [0017]: “a system 160 that is executing on one or more server computing systems 180, and/or a system provided by application 157 executing on one or more mobile image acquisition devices 185) has used the acquired images 165 and optionally other information to generate one or more 2D floor maps 165 and/or computer models 165 (e.g., 3D and/or 2.5D models) for the one or more buildings or other structures. FIG. 1B shows one example of acquisition of such panorama images for a particular house at multiple viewing locations 210, and FIGS. 
2A-2K illustrate additional details about using a computer model generated from such panorama images to display generated simulated lighting information for an interior of the building, as discussed further below.” ¶ [0018]: “A BMLSM (Building Map Lighting Simulation Manager) system 140 is further executing on one or more server computing systems to use building models 145 (e.g., models 165 acquired from the FMGM system) and/or other mapping-related information (not shown) that result from the images 165 and optionally additional associated information in order to generate and display simulated lighting information for such models 145.” ¶ [0021]: “In the example of FIG. 1A, the FMGM system may perform automated operations involved in using images acquired at multiple associated viewing locations (e.g., in multiple rooms or other locations within a building or other structure and optionally around some or all of the exterior of the building or other structure) to generate a 2D floor map for the building or other structure and/or to generate a computer model for the building or other structure (e.g., a 3D model and/or a 2.5D model), such as by analyzing visual information available in the images, and for providing a representation of an interior of the building or other structure (e.g., for subsequent use in generating and presenting simulated lighting conditions for the interior of the building or other structure). 
For example, in at least some such embodiments, such techniques may include analyzing one or more images taken in a room to determine a shape of the room and/or to identify inter-room passages (e.g., doorways and other openings in walls) into and/or out of the room.” NOTE: In other words, images of a room in a building (i.e., a scene) are captured and used to generate a 3D model of the room, and subsequently, the 3D model of the imaged room (i.e., the indoor scene) and the specified simulated lighting conditions (i.e., the lighting model for the indoor scene) are used to render the imaged indoor scene based on the simulated lighting conditions for the interior of the building (i.e., the lighting model for the room). ¶ [0024]: “an exemplary building interior environment in which images are acquired and for which one or more computer models and/or 2D floor maps are generated, for further use by the BMLSM system to generate and provide simulated lighting conditions, as discussed in greater detail with respect to FIGS. 2A-2K, as well as for use in otherwise presenting the computer models and/or floor maps and/or images to users.” ¶ [0028]: “FIGS. 2F-2K illustrate further examples of generating and presenting simulated lighting information for the 3D computer model, such as for the building 198 and images' viewing locations 210 discussed in FIG. 1B.”). 
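The NOTE above reads VINCENT as a pipeline: captured room images produce a 3D model of the indoor scene, which is then rendered under simulated lighting conditions (the lighting model). Purely as an illustration of that reading, a minimal sketch follows; every name here (`SceneModel`, `LightingModel`, `render_indoor_scene`) and the toy shading rule are hypothetical, not drawn from VINCENT's disclosure.

```python
from dataclasses import dataclass, field

@dataclass
class LightingModel:
    """Hypothetical stand-in for VINCENT's 'simulated lighting conditions'."""
    sun_elevation_deg: float                               # target-time sun position
    interior_sources: list = field(default_factory=list)  # lumens per fixture

@dataclass
class SceneModel:
    """Hypothetical stand-in for the 3D model built from the room images."""
    surfaces: list  # wall/floor/ceiling identifiers
    openings: int   # count of windows/doorways identified in the images

def render_indoor_scene(scene: SceneModel, lighting: LightingModel) -> dict:
    """Render the indoor scene based on the lighting model: each surface
    gets a brightness derived from daylight through openings plus any
    interior sources (a toy shading rule, not VINCENT's actual method)."""
    daylight = max(0.0, lighting.sun_elevation_deg / 90.0) * scene.openings
    interior = sum(lighting.interior_sources) / 1000.0
    return {s: daylight + interior for s in scene.surfaces}
```

On this toy rule, raising the sun elevation or adding an interior source brightens every surface, which is exactly the dependency of the rendered image on the lighting model that the examiner attributes to VINCENT.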
Regarding claim 28 (depends on claim 27), VINCENT discloses: a lighting effect is removed in the image before applying the lighting model of the indoor scene (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);” ¶ [0013]: “In addition, a model of a building interior may be displayed in at least some embodiments to a user in a displayed GUI on a client computing device, and the user may be able to specify via the GUI (or in another manner) at least some of the conditions for which the simulated lighting display is generated, such as one or more of the following: one or more target times at which to generate the simulated lighting; an amount of the house or other building interior to display (e.g., one or more specific rooms, the entire interior, etc.); a type of simulated lighting display mode (e.g., simulated lighting conditions for a single target time; an animation over a sequence of simulated lighting conditions for multiple target times within a period of time; a comparison of multiple simultaneous simulated lighting conditions, such as daytime and nighttime, or two or more different seasons at a given time-of-day, or two or more other types of different daytime times; etc.); effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are 
included in the generated simulated lighting; etc.); effects on the simulated lighting of changes outside the building interior (e.g., adding or removing or changing a tree or other vegetation in an environment surrounding the building, such as in a yard of a house; adding or removing or changing an exterior building or other external structure, whether on a same property as the building or a nearby property; etc.);” NOTE: If an interior light source is removed from a room (e.g., a window or interior light), then, by necessity, it must be removed prior to generating the simulated lighting for the changed building interior. Otherwise, the generated simulated lighting would not properly correspond to the changed interior lighting sources in the modified building interior. As such, VINCENT clearly inherently teaches applying the lighting model of the indoor scene after removing a lighting effect in the image.). Regarding claim 29 (depends on claim 27), VINCENT discloses: a virtual object inserted in the rendered image is lighted according to the lighting model of the indoor scene (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);”). 
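The inherency argument for claim 28 above turns on an order of operations: a removed light source must be dropped from the model before the simulated lighting is generated, or the result would not correspond to the modified interior. As a hypothetical sketch of that ordering only (`simulate_lighting` and the source records are illustrative, not from VINCENT):

```python
def simulate_lighting(sources, removed_ids):
    """Generate simulated lighting for the modified interior: the removed
    sources are filtered out first, so their effects cannot appear in the
    result -- the ordering the examiner treats as inherent."""
    active = [s for s in sources if s["id"] not in removed_ids]
    return sum(s["lumens"] for s in active)

sources = [{"id": "lamp1", "lumens": 800}, {"id": "window1", "lumens": 5000}]
# Removing lamp1 before simulating leaves only the window's contribution:
total = simulate_lighting(sources, removed_ids={"lamp1"})  # 5000
```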
Regarding claim 30 (depends on claim 27), VINCENT discloses: wherein at least one area of the image is not modelled in the geometric model (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);”), and wherein the at least one area is color corrected based on the lighting model of the indoor scene (¶ [0013]: “effects on the simulated lighting of changes to the building interior (e.g., adding or removing a window; adding or removing part or all of a wall; changing furnishings or moveable elements; changing the color and/or texture of a surface, such as a wall or a floor or a ceiling or a countertop; adding or removing lighting sources at specified locations in the interior and whose effects are included in the generated simulated lighting; etc.);” NOTE: Removing a lighting source will cause “color correction” of the area(s) in the image that are affected by the removal of the lighting source (especially in the area in the image where the light source has been removed from the model). Also, changing a color of a surface in the 3D model (e.g., removing the color in the image(s) from the 3D model generated therefrom) will also cause the surface to be “color corrected” based on the given lighting model (i.e., simulated lighting).).
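The NOTE's reading of “color corrected” for claim 30 can be pictured as scaling an unmodelled image area by the change in illuminance under the lighting model. This is a hypothetical linear approximation for illustration only; `color_correct` and its parameters are not taken from VINCENT, and real relighting would be considerably more involved.

```python
def color_correct(region_rgb, original_lux, simulated_lux):
    """Scale pixel values in an area that is not modelled in the geometric
    model by the illuminance ratio implied by the lighting model (a simple
    linear gain, clamped to the 8-bit range)."""
    if original_lux <= 0:
        return list(region_rgb)
    gain = simulated_lux / original_lux
    return [min(255, round(c * gain)) for c in region_rgb]

# Halving the simulated illuminance darkens the area proportionally:
corrected = color_correct([200, 180, 160], original_lux=300.0, simulated_lux=150.0)
# corrected == [100, 90, 80]
```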
Regarding claim 31 (depends on claim 18), VINCENT discloses: an opacity of the opening is configurable via a user interface (¶ [0013]: “a model of a building interior may be displayed in at least some embodiments to a user in a displayed GUI on a client computing device, and the user may be able to specify via the GUI (or in another manner) at least some of the conditions for which the simulated lighting display is generated,” ¶ [0013]: “Additional details are included elsewhere herein regarding types of user-selectable controls and other user selections in a displayed GUI.” ¶ [0042]: “In addition, an end user may specify one or more thresholds with respect to simulated lighting in various manners, such as a specified amount of lux or other illuminance measurement of an amount of light per amount of surface area, a specified amount of luminance of light reflected or emitted from a surface, a specified daylight factor that expresses an amount of daylight available inside a room (e.g., on a surface) as a percentage of an amount of unobstructed daylight available outside under overcast sky conditions, a specified value for daylight autonomy that corresponds to the percentage of the time when the target illuminance of a point in a space is met by daylight, etc., and the user-specified threshold(s) may be used as part of the determination and/or presentation of corresponding information, as noted above.” NOTE: If the user specifies a reduction (a limit) of the amount of luminance of light emitted from the surface of a window in the model, the corresponding effect is to reduce the opacity of the window (i.e., the opening). ¶ [0043]: “Furthermore, in at least some embodiments, an end user may be able to specify ‘what if’ scenarios related to simulated lighting, such as to specify an amount of occlusion of exterior lighting (e.g., an amount of occlusion of one or more windows of the building, such as a percentage, a square footage, etc. 
that is occluded), such as via manipulation of a displayed GUI slider control, and to see corresponding simulated lighting condition results on a displayed computer model of the building interior for one or more specified times or other conditions. In at least some such embodiments, some or all such information may be precomputed for some or all windows for one or more defined amounts (e.g., an enumerated group of percentage amounts), or the results may instead be dynamically calculated in part or in whole at a time of the specification by the end user (e.g., in a real time manner).”). Regarding claim 32 (depends on claim 1), claim 32 is directed to a non-transitory computer readable medium storing instructions which, when executed by one or more processors, cause the one or more processors to perform the method of claim 1, and, as such, claim 32 is rejected for the same reasons as claim 1. Response to Arguments Applicant's arguments filed December 10, 2025 have been fully considered but they are not persuasive. On page 6 of the REMARKS (2nd paragraph), applicant argues that the claimed invention differs from ¶ [0032] of applicant’s specification. In response to applicant's argument that the references fail to show certain features of the invention, it is noted that the features upon which applicant relies (i.e., the features described in the citation of ¶ [0032] of applicant’s specification) are not recited in the rejected claim(s). Although the claims are interpreted in light of the specification, limitations from the specification are not read into the claims. See In re Van Geuns, 988 F.2d 1181, 26 USPQ2d 1057 (Fed. Cir. 1993). In the paragraph spanning pages 6-7 of the REMARKS, applicant argues that the examiner’s rationale for combining VINCENT and GLASER “is not prima facie evidence of obviousness, but rather hindsight reasoning that rises to the impermissible level.” The examiner respectfully disagrees. 
In response to applicant's argument that the examiner's conclusion of obviousness is based upon improper hindsight reasoning, it must be recognized that any judgment on obviousness is in a sense necessarily a reconstruction based upon hindsight reasoning. But so long as it takes into account only knowledge which was within the level of ordinary skill at the time the claimed invention was made, and does not include knowledge gleaned only from the applicant's disclosure, such a reconstruction is proper. See In re McLaughlin, 443 F.2d 1392, 170 USPQ 209 (CCPA 1971). On page 7 of the REMARKS (lines 2-3 and 9-10), applicant argues “There is no suggestion of other locations or hypothetically copied buildings” and “Vincent’s system for documenting reality would not have any reason to incorporate a feature from a pre-construction design tool like Glaser, and in fact, discourages it (above).” In response to applicant’s argument that there is no teaching, suggestion, or motivation to combine the references, the examiner recognizes that obviousness may be established by combining or modifying the teachings of the prior art to produce the claimed invention where there is some teaching, suggestion, or motivation to do so found either in the references themselves or in the knowledge generally available to one of ordinary skill in the art. See In re Fine, 837 F.2d 1071, 5 USPQ2d 1596 (Fed. Cir. 1988), In re Jones, 958 F.2d 347, 21 USPQ2d 1941 (Fed. Cir. 1992), and KSR International Co. v. Teleflex, Inc., 550 U.S. 398, 82 USPQ2d 1385 (2007). In this case, as already noted above, for residential developers (i.e., builders), it is a well-known and longtime common practice, before actually building all of the houses on the lots of a neighborhood under development (i.e., the geographically separated building sites), to first build model homes (i.e., physical houses) representing the various models/designs of houses being offered for sale, but not yet constructed. 
Thus, as a tool for selling (or buying) a house to be built on one of the various lots being offered, it would be very desirable for the buyer that is physically touring a model home (i.e., a physical home in a first geographical location) to be able to visualize how the interior lighting, for that same physical model of house, would appear when built on one of the lots being offered for sale in the development (i.e., second geographical locations). At least for this reason, one of ordinary skill in the art would be motivated to modify the system taught by VINCENT so as to incorporate a user interface for choosing from a selection of geographic locations, as taught by GLASER. Furthermore, the examiner also notes, VINCENT arguably does suggest (in ¶ [0034]) the possibility of a user interface allowing a house’s geographical location and/or geographical orientation to be user-selectable (¶ [0034]: “simulated lighting 225f being generated and displayed in this example for the specified target conditions (which may include the house's geographical location, not shown, as well as geographical orientation, but which are not user-selectable in this example since the house 198 is not moveable with respect to geographical location or orientation).” (emphasis added) ). In the 1st sentence of the 2nd paragraph (i.e., the 1st full paragraph) on page 7 of the REMARKS, applicant states, “with Glaser, changing the designated geographical location would just cause the environmental daylighting model to change to the appropriate lighting for the newly designated geographical location.” The examiner agrees. However, in the 2nd sentence of the 2nd paragraph (i.e., the 1st full paragraph) on page 7 of the REMARKS, applicant argues, “Thus, both references are limited to the proposition of representing the geographically correct lighting for the given location.” The examiner respectfully disagrees. 
Nowhere in Glaser is there anything that prohibits (or prevents) a user from using the user interface to enter (or select) any geographical location desired when specifying an environmental daylighting model. Thus, Glaser is not limited to selecting an outdoor lighting model for any particular geographical location. To the contrary, Glaser teaches providing a user interface that allows a user, as a matter of user choice, to enter any geographical location desired to select a corresponding outdoor lighting model. The examiner reminds applicant, the test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981). In the last sentence of the 2nd paragraph (i.e., the 1st full paragraph) on page 7 of the REMARKS, applicant alleges, “even if the references were combined, there was no showing of prima facie evidence of obviousness for a "virtual outdoor condition is associated with a second geographical location that is different from the first geographical location" as recited in independent claims 1 and 18.” The examiner respectfully disagrees for the reasons already provided and explained above in the rejections of claims 1 and 18. Furthermore, as a reminder, the test for obviousness is not whether the features of a secondary reference may be bodily incorporated into the structure of the primary reference; nor is it that the claimed invention must be expressly suggested in any one or all of the references. Rather, the test is what the combined teachings of the references would have suggested to those of ordinary skill in the art. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981). 
Also, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986). Conclusion At present, it is not apparent to the examiner which part of the application could serve as a basis for new and allowable claims. However, should the applicant nevertheless regard some particular matter as patentable, the examiner encourages applicant to appropriately amend the claims to include such matter and to indicate in the REMARKS the difference(s) between the prior art and the claimed invention as well as the significance thereof. Furthermore, should applicant decide to amend the claims, examiner respectfully requests that the applicant please indicate in the REMARKS from which page(s), line(s) or claim(s) of the originally filed application that any amendments are derived. See MPEP § 2163(II)(A) (There is a strong presumption that an adequate written description of the claimed invention is present in the specification as filed, Wertheim, 541 F.2d at 262, 191 USPQ at 96; however, with respect to newly added or amended claims, applicant should show support in the original disclosure for the new or amended claims.). Action is Final Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. 
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action. Contact Information Any inquiry concerning this communication or earlier communications from the examiner should be directed to VINCENT PEREN who can be reached by telephone at (571) 270-7781, or via email at vincent.peren@uspto.gov. The examiner can normally be reached on Monday-Friday from 10:00 A.M. to 6:00 P.M. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, KING POON, can be reached at telephone number (571) 272-7440. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from Patent Center. Status information for published applications may be obtained from Patent Center. Status information for unpublished applications is available through Patent Center for authorized users only. Should you have questions about access to Patent Center, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/patents/uspto-automated-interview-request-air-form. /VINCENT PEREN/ Examiner, Art Unit 2617 /KING Y POON/Supervisory Patent Examiner, Art Unit 2617

Prosecution Timeline

Sep 15, 2022: Application Filed
Nov 16, 2024: Non-Final Rejection — §103
Feb 18, 2025: Response Filed
May 28, 2025: Final Rejection — §103
Aug 25, 2025: Request for Continued Examination
Aug 28, 2025: Response after Non-Final Action
Sep 17, 2025: Non-Final Rejection — §103
Dec 10, 2025: Response Filed
Mar 23, 2026: Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology, based on the 5 most recent grants:

Patent 12592017: Rendering XR Avatars Based on Acoustical Features (granted Mar 31, 2026; 2y 5m to grant)
Patent 12586282: AVATAR COMMUNICATION (granted Mar 24, 2026; 2y 5m to grant)
Patent 12555314: THREE-DIMENSIONAL SHADING METHOD, APPARATUS, AND COMPUTING DEVICE, AND STORAGE MEDIUM (granted Feb 17, 2026; 2y 5m to grant)
Patent 12555296: ADAPTING SIMULATED CHARACTER INTERACTIONS TO DIFFERENT MORPHOLOGIES AND INTERACTION SCENARIOS (granted Feb 17, 2026; 2y 5m to grant)
Patent 12541913: METHOD AND APPARATUS FOR REBUILDING RELIGHTABLE IMPLICIT HUMAN BODY MODEL (granted Feb 03, 2026; 2y 5m to grant)


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 70% (90% with interview, a +20.2% lift)
Median Time to Grant: 2y 11m
PTA Risk: High

Based on 382 resolved cases by this examiner. Grant probability derived from career allow rate.
