DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 3-11, and 13-20 are rejected under 35 U.S.C. 103 as being unpatentable over Marotta et al., US 20210334538 A1 (herein, Marotta) in view of Lection, US 20180058864 A1 (herein, Lection).
Regarding Claims 1 and 11, Marotta discloses a system (FIG. 1, #100 – exemplary system) for fusing sensor data (¶[0061] – “…periodic uploads and downloads of information.”) from drones (FIG. 1, #140 – drones) in a virtual environment (¶[0054] – “…to generate data of the kitchen, …”), the system comprising:
a plurality of sensors (FIG. 1, #s 120 and 125 – LIDAR and photographic cameras, respectively) configured to collect sensor data by observing a real-world landscape (¶[0014] – “…measuring a plurality of dimensions of a landscape…”);
a command center computing system (FIG. 1, #110 – server) comprising:
a processor (FIG. 1, #162 – processor);
non-volatile memory (FIG. 1, #130 – database storage) comprising a sensor data integration platform application (¶[0061] – “…cloud computing…”);
where the sensor data integration platform application, when executed, instructs the processor to perform (¶[0061] – “…enabling near real-time uploads and downloads of information…”):
obtaining geometry data describing the real-world landscape (¶[0054] – “…remodeling a kitchen…”);
drawing a map within a virtual environment using a 3-D visualization software and the geometry data (¶[0023] – “…generate a new floor plan of the floor of the commercial building based upon the received 3D model of the floor. The generated new floor plan may be a 3D floor plan.”);
receiving sensor data and location data from the sensors (¶[0255] – “…receiving sensor data via wireless communication or data transmission…”); and
Marotta discloses the drones, map, virtual environment, landscape, and sensor data but does not disclose,
placing a plurality of projectors on the map within the virtual environment corresponding to sensors in the region of the real-world landscape;
projecting the sensor data onto the map at locations indicated by the location data using the projectors corresponding to the sensors that the sensor data is received from.
However, Lection teaches,
placing a plurality of projectors on the map within the virtual environment corresponding to sensors in the region of the real-world landscape (FIG. 7, ¶[0063] – “…, drone icons 122 are also displayed on the map 114, which likewise represent the real-time locations of the drones in the region.”);
projecting the sensor data onto the map at locations indicated by the location data using the projectors corresponding to the sensors that the sensor data is received from (FIG. 7, ¶[0064] – “…products (or icons representative of those products and/or an alphanumeric list of those products) 126 are displayed, which are available for sale and loaded on that particular delivery vehicle. In some embodiments, the user may “drag and drop” products 126 from the available product window 124…”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system as disclosed by Marotta to include the plurality of projectors placed on the map at locations indicated by data received from the sensors, as taught by Lection. Doing so enhances the capability of the system by providing to the user multiple instances of the location of the drones on the display map.
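For context, the combined elements describe a fusion loop: one virtual projector per physical sensor, placed on the map and fed that sensor's data. The following is a minimal illustrative sketch, not code from either reference; all names (Projector, VirtualMap, place_projector, project) are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Projector:
    """Virtual-environment stand-in for one real-world sensor."""
    sensor_id: str
    position: tuple                 # (x, y, z) in map coordinates
    frame: Optional[bytes] = None   # last sensor frame cast onto the map

@dataclass
class VirtualMap:
    projectors: dict = field(default_factory=dict)

    def place_projector(self, sensor_id, position):
        # One projector per real-world sensor, placed at the sensor's
        # mapped location within the virtual environment.
        self.projectors[sensor_id] = Projector(sensor_id, position)

    def project(self, sensor_id, frame, position):
        # Route incoming sensor data to the projector corresponding to
        # the sensor it was received from, at the reported location.
        proj = self.projectors[sensor_id]
        proj.position = position
        proj.frame = frame
```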
Regarding Claims 3 and 13, modified Marotta further discloses, wherein at least some of the sensors are mounted to drones (¶[0059] – “…data may include data from a camera on the drone, a LIDAR camera on the drone,…”).
Regarding Claims 4 and 14, modified Marotta further discloses, further comprising:
receiving user input captured on a graphical user interface (¶[0094] – “…visually navigating through the 3D model based upon navigation input received from a human user such as a selection of a directional arrow on the 3D model. In some embodiments, the human user input is received…”);
directing a drone identified by the user input to move as indicated by the user input (¶[0093] – “…a drone flying exterior to the house may provide data verifying where a window or skylight is…”).
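As an illustration of this element, a minimal dispatch sketch follows; the event keys and the move_to method are assumptions, not drawn from Marotta.

```python
def handle_user_input(event, drones):
    # Direct the drone identified by the GUI event to the position the
    # user indicated (hypothetical event structure and drone API).
    drone = drones[event["drone_id"]]
    drone.move_to(event["target_position"])  # e.g., issue a waypoint command
```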
Regarding Claims 5 and 15, modified Marotta further discloses, where at least one of the drones is in motion (FIG. 1 illustrates a drone flying and ¶[0093] – “drone flying”).
Regarding Claims 6 and 16, modified Marotta further discloses, wherein placing a plurality of projectors on the map further comprises:
retrieving a status and type (FIG. 1 illustrates the type of drone and its in-flight status) of each drone from a drone information database (FIG. 1, ¶[0085] “…a drone data monitoring application 142 for monitoring drone data…”);
retrieving telemetry and control information of each drone from the drone information database (¶[0062] – “…separate databases may be used for storing different types of information and/or making different calculations…”).
Modified Marotta teaches the telemetry, drone, and virtual environment but does not disclose,
receiving telemetry from each drone where the telemetry indicates a location of the drone;
determining location coordinates in the coordinate system of the virtual environment using the telemetry; and
placing a projector within the virtual environment at the determined location.
However, Lection teaches,
receiving telemetry from each drone where the telemetry indicates a location of the drone (FIG. 4 illustrates location of drone);
determining location coordinates in the coordinate system of the virtual environment using the telemetry (FIG. 7, ¶[0063] – “the map interface screen 112 includes a map 114”); and
placing a projector within the virtual environment at the determined location (FIG. 7, ¶[0063] – “…drone icons 122 are also displayed on the map 114…”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system as disclosed by Marotta to include the aforementioned drone location, coordinates, and placement in the virtual environment as taught by Lection. Doing so enhances the capability of the system by providing to the user multiple instances of the location of the drones on the display map.
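For illustration, the coordinate-conversion element reads as a geodetic-to-local transform. A minimal sketch, assuming telemetry reports latitude, longitude, and altitude, and using an equirectangular approximation around the map origin (adequate over a single surveyed landscape):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def telemetry_to_map_coords(lat, lon, alt, origin_lat, origin_lon):
    """Convert geodetic drone telemetry into local (x, y, z) coordinates
    in the virtual environment, relative to the map origin."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS_M * d_lon * math.cos(math.radians(origin_lat))  # east
    y = EARTH_RADIUS_M * d_lat                                       # north
    return (x, y, alt)  # z taken as altitude above the map plane
```

The returned coordinates would then drive projector placement, as in the earlier sketch.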
Regarding Claims 7 and 17, modified Marotta further discloses, wherein projecting the sensor data onto the map further comprises:
retrieving telemetry of a drone (¶[0107] – “…gather data from a drone 140…”);
extracting sensor data from the telemetry (¶[0107] – “…data may include data from a camera on the drone, a LIDAR camera on the drone…”);
converting the sensor data to a visual format (¶[0002] – “and visualization systems”);
determining an orientation of the sensor from which the sensor data was received (¶[0094] – “…as a selection of a directional arrow on the 3D model…”);
rotating a projector corresponding to the sensor to match the sensor orientation (¶[0069] – “…drone data (e.g., a LIDAR camera, a photographic camera, or any other data coming from a drone inside or outside of the building) may aide in building the 3D model…” – i.e., the orientation); and
projecting the visual format of the sensor data onto the map using the location data (¶[0019] – “…a LIDAR-based virtual map of the store from processor analysis of the LIDAR data; determine locations of individual goods in the store; overlay the locations of the individual goods onto the LIDAR-based virtual map; and generate an updated LIDAR-based virtual map of the store displaying aisles of the store…”).
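The orientation and rotation elements can be illustrated by converting a sensor's reported yaw and pitch into a pointing vector for its projector; the direction attribute below is hypothetical.

```python
import math

def rotate_projector(projector, yaw_deg, pitch_deg):
    """Orient a virtual projector to match the reporting sensor, so the
    visualized data is cast onto the map along the sensor's own axis."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    projector.direction = (
        math.cos(pitch) * math.sin(yaw),  # east component
        math.cos(pitch) * math.cos(yaw),  # north component
        math.sin(pitch),                  # up component
    )
```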
Regarding Claims 8 and 18, modified Marotta further discloses, wherein at least some of the sensor data is video (¶[0420] – “…utility line by overlaying the indication of the location of the utility line onto an image or video data…”).
Regarding Claims 9 and 19, modified Marotta further discloses, wherein at least some of the sensor data is invisible wavelength (¶[0005] – “…A server may receive light detection and ranging (LIDAR) data…”).
Regarding Claims 10 and 20, modified Marotta further discloses, where the sensor data integration platform application, when executed, instructs the processor to perform rendering the virtual environment on a display (¶[0381] – “…via the one or more processors, transceivers, sensors, and/or servers: displaying, on a display, the generated recommendation for placement of the object in the room.”).
Claims 2 and 12 are rejected under 35 U.S.C. 103 as being unpatentable over Marotta et al., US 20210334538 A1 (herein, Marotta) in view of Lection, US 20180058864 A1 (herein, Lection), and further in view of Thaller et al., US 10636209 B1 (herein, Thaller).
Regarding Claims 2 and 12, modified Marotta teaches the geometry data but does not disclose, wherein the geometry data is retrieved from Esri ArcGIS.
However, Thaller teaches, wherein the geometry data is retrieved from Esri ArcGIS (Col. 4, lines 26-37 – “…the modeling software “City Engine,” available from Esri R&D Center, which uses such vector data to model a 3-D city. The output are buildings and other infrastructure of a city which are stored in scene database 50 in preparation for downloading to a client computer in real time as needed…”).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system as disclosed by modified Marotta to retrieve the geometry data from Esri's geographic information system (GIS) platform, ArcGIS, as taught by Thaller. Doing so enhances the capability of the system by providing an enhanced mapping capability to the user.
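For reference, geometry retrieval of this kind is commonly performed against the public ArcGIS REST API. A minimal sketch querying a feature layer for geometry inside a bounding box follows; the layer URL is a placeholder, and the sketch is illustrative rather than Thaller's implementation.

```python
import requests

def fetch_geometry(layer_url, bbox):
    """Query an ArcGIS FeatureServer layer (e.g., ".../FeatureServer/0")
    for features intersecting bbox = (xmin, ymin, xmax, ymax)."""
    params = {
        "where": "1=1",
        "geometry": ",".join(str(v) for v in bbox),
        "geometryType": "esriGeometryEnvelope",
        "spatialRel": "esriSpatialRelIntersects",
        "outFields": "*",
        "f": "geojson",  # return a GeoJSON FeatureCollection
    }
    resp = requests.get(f"{layer_url}/query", params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()
```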
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The references cited but not relied upon in this Office action pertain to systems for fusing sensor data from drones in a virtual environment.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LUIS G DEL VALLE whose telephone number is (303)297-4313. The examiner can normally be reached Monday-Friday, 0730 - 1630 MST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Anne Antonucci can be reached at (313) 446-6519. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LUIS G DEL VALLE/Examiner, Art Unit 3666
/ANNE MARIE ANTONUCCI/Supervisory Patent Examiner, Art Unit 3666