DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more. Claim 1 recites: receive photographic data including one or more images of a structure; in response to receiving the photographic data, apply the photographic data to a structure assessment model configured to determine a structural status of the structure, wherein the structure assessment model is trained using historical photographic data including a plurality of historical images of structures; receive an output from the structure assessment model, wherein the output comprises at least an estimated cost of repairing damage having occurred to the structure; and based upon the output, transmit a message to a user computing device associated with the structure that causes display of the estimated cost. This judicial exception is not integrated into a practical application because each of the independent claims recites abstract ideas (mathematical concepts and fundamental economic practices) and does not integrate them into a practical application. The additional elements, viewed individually and in combination, amount to nothing more than implementing the abstract ideas on generic computing devices with routine data acquisition, user prompting, and financial transactions. The remaining dependent claims are also rejected under 35 U.S.C. § 101 as directed to abstract ideas (mathematical concepts, certain methods of organizing human activity, and mental processes) and as failing to recite additional elements that integrate the abstract ideas into a practical application or that add significantly more than the abstract ideas.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 1 is rejected under 35 U.S.C. 103 as being unpatentable over D1 [US 10580075 B1] in view of D2 [US 11080838 B1].
Claim 1. A computing device comprising at least one memory and at least one processor in communication with the at least one memory, the at least one processor configured to:
receive photographic data including one or more images of a structure; [D1, Fig. 3] A photo is received.
in response to receiving the photographic data, apply the photographic data to a structure assessment model configured to determine a structural status of the structure, wherein the structure assessment model is trained using historical photographic data including a plurality of historical images of structures; [D1, Fig. 3] The analysis of the image is used to determine damage.
receive an output from the structure assessment model, wherein the output comprises at least an estimated cost of repairing damage having occurred to the structure; and [D1, Fig. 3] From the analysis there is a damage estimate that is determined.
based upon the output, transmit a message to a user computing device associated with the structure that causes display of the estimated cost. [D1, Fig. 3] A proposed settlement is sent to a user's mobile device.
D1, see column 16 and lines 20-32, teaches the server may also determine a damage estimate (e.g., an estimate for repairing and/or replacing any damaged parts) after analyzing the photos based on predefined rules. The damage estimate may be generated by comparing the photos submitted by the mobile device with photos of similarly damaged vehicles or with photos of non-damaged vehicles of similar make/model. To perform this comparison, the server may access a database (e.g., network device) of photos of vehicles with various types of damage and/or vehicles with no damage. D1 does not explicitly mention a trained model to complete this analysis; however, D2, see column 5 and lines 23-65, teaches the damage prediction model may use artificial neural networks (e.g., neural network), such as multi-layer deep neural networks. In some embodiments, the damage prediction model may be trained using sets of sample image data (e.g., training sets) that are correctly labeled to identify objects portrayed in the image data. For example, the sample image data may be used to train the damage prediction model to recognize various rooftop types (e.g., gable roof, flat roof, and hip roof), rooftop materials (e.g., asphalt shingles, metal, slate, tile, and wood), and rooftop slopes (e.g., flat, low, and steep slopes). Additionally, or alternatively, the sample image data may be used to train the damage prediction model to recognize and distinguish between different types of rooftop damages (e.g., hail damage, water damage, wind damage, storm damage, structural damage due to fallen trees, damage due to wildlife, and intentional man-made mechanical damage).
For example, the damage prediction model may be trained to recognize various patterns of rooftop damage associated with each type of rooftop damage (e.g., repetitive dents in an isolated area of a rooftop, cracked/torn/chipped tiles throughout the rooftop, damaged fascia boards, missing shingles, bruised shingles, discoloration, and holes in the rooftop). In further embodiments, the IA server may implement one or more sets of test image data (e.g., validation image data) to evaluate how accurately the damage prediction model learned to classify rooftop images. It would have been obvious before the effective filing date of the claimed invention to one of ordinary skill in the art to combine the teachings of D1, wherein an analysis is performed in order to determine damage and provide an estimate, with the teachings of D2, wherein the analysis is completed using a trained algorithm and applied to roofing damage rather than only vehicle damage. One skilled in the art would have been motivated to modify D1 in this manner in order to utilize the analysis aspect of D2, in which the trained algorithm is used to provide the damage analysis. Therefore, one of ordinary skill in the art, such as an individual with a bachelor's degree in electrical engineering, could have combined the elements as claimed by known methods, and in combination, each element merely performs the same function as it does separately. It is for at least the aforementioned reasons that the Examiner has reached a conclusion of obviousness with respect to claim 1.
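For illustration only, the comparison-based estimation D1 describes (matching a submitted photo against a database of labeled reference photos) can be sketched as a nearest-reference lookup. The feature vectors, labels, costs, and function names below are hypothetical stand-ins chosen by the editor and appear in neither reference:

```python
# Illustrative sketch only: a nearest-reference damage estimator in the
# spirit of D1's photo-comparison approach. Feature vectors stand in for
# real image features; all names, labels, and costs are hypothetical.
import math

# Hypothetical reference database: (feature_vector, damage_label, repair_cost)
REFERENCE_DB = [
    ((0.9, 0.1, 0.0), "hail_damage", 4200.00),
    ((0.1, 0.8, 0.1), "wind_damage", 2500.00),
    ((0.0, 0.1, 0.9), "no_damage", 0.00),
]

def estimate_damage(features):
    """Return (label, estimated_cost) of the closest reference photo."""
    # Pick the database entry whose feature vector is nearest (Euclidean).
    _, label, cost = min(REFERENCE_DB,
                         key=lambda rec: math.dist(rec[0], features))
    return label, cost
```

D2's contribution would replace the hand-built reference comparison with a trained classifier, but the input/output contract sketched here (image features in, damage label and cost estimate out) is the same.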
Claim 2. The computing device of claim 1, wherein the at least one processor is further programmed to receive the photographic data from the user computing device. [D1, Fig. 3] D1 teaches the server receives image data from a mobile device.
Claim 3. The computing device of claim 2, wherein the at least one processor is further configured to cause the user computing device to display a graphical user interface (GUI) prompting a user to submit the one or more images of the structure. [D1, Fig. 4 and 5(all)] D1 teaches the graphical user interface with instructions on how to submit photos.
Claim 4. The computing device of claim 3, wherein the at least one processor is further configured to: in response to receiving the photographic data, determine that one or more additional images are necessary for the structure assessment model to satisfy a threshold confidence score; and cause the user computing device to prompt within the GUI the user to submit the one or more additional images of the structure. [D1, Fig. 3] D1 teaches determining whether photos are acceptable and, based upon that determination, providing instructions to the user to submit additional photos.
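The conditional re-prompting recited in claim 4 amounts to a threshold check on model confidence. A minimal sketch follows; the threshold value, return strings, and all identifiers are assumptions, not part of the claims or references:

```python
# Illustrative sketch of the claim 4 logic: prompt for additional images
# until model confidence satisfies a threshold. All values are hypothetical.
CONFIDENCE_THRESHOLD = 0.90

def needs_more_images(confidence_scores):
    """True if no confidence score so far meets the threshold."""
    return max(confidence_scores, default=0.0) < CONFIDENCE_THRESHOLD

def handle_submission(confidence_scores):
    """Decide whether to re-prompt the user via the GUI or proceed."""
    if needs_more_images(confidence_scores):
        return "PROMPT_USER_FOR_ADDITIONAL_IMAGES"
    return "PROCEED_WITH_ASSESSMENT"
```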
Claim 5. The computing device of claim 1, wherein the at least one processor is further configured to train the structure assessment model using historical records comprising the historical photographic data including a plurality of historical images of structures. Claim 5 is rejected for reasons similar to those described in claim 1.
Claim 6. The computing device of claim 5, wherein the at least one processor is further configured to: update the historical records to updated historical records comprising a new historical record, the new historical record comprising the received photographic data and the determined structural status of the structure; and re-train the trained structure assessment model using the updated historical records. [D2, Column 6, Lines 1-14] D2 teaches the updating of the model’s parameters in a dynamic process.
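The update-and-re-train cycle recited in claim 6 can be sketched as follows. The trivial average-cost "model" and the record fields are hypothetical stand-ins for an actual structure assessment model and its training data:

```python
# Illustrative sketch of claim 6: append a new historical record, then
# re-train. The "model" is a trivial label -> mean repair cost lookup;
# all field names and values are hypothetical.
from collections import defaultdict

def train(records):
    """Build a status -> mean repair cost table from historical records."""
    totals = defaultdict(lambda: [0.0, 0])
    for rec in records:
        t = totals[rec["status"]]
        t[0] += rec["cost"]  # running cost total for this status
        t[1] += 1            # record count for this status
    return {status: total / n for status, (total, n) in totals.items()}

historical = [
    {"status": "hail_damage", "cost": 4000.0},
    {"status": "hail_damage", "cost": 5000.0},
]
model = train(historical)                      # initial training

new_record = {"status": "hail_damage", "cost": 6000.0}
historical.append(new_record)                  # update the historical records
model = train(historical)                      # re-train on updated records
```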
Claim 7. The computing device of claim 5, wherein the historical records further include historical structural data associated with the plurality of historical images. D1, see column 16 and lines 20-32, teaches the server may also determine a damage estimate (e.g., an estimate for repairing and/or replacing any damaged parts) after analyzing the photos based on predefined rules. The damage estimate may be generated by comparing the photos submitted by the mobile device with photos of similarly damaged vehicles or with photos of non-damaged vehicles of similar make/model.
Claim 8. The computing device of claim 7, wherein the historical structural data includes at least one of the following: roof material, roof age, shingle type, roof slant angle, age of roof, total area of a roof, and roof occlusion. [D2, Column 5, Lines 23-65] D2 teaches the determination of rooftop damage, occlusions, rooftop type, material, etc.
Claim 9. The computing device of claim 1, wherein the processor is further configured to:
receive an acceptance of the estimated cost submitted at the user computing device; and
in response to the acceptance, transfer a settlement amount determined based upon the estimated cost to an account associated with the structure. [D1, Fig. 3] See determination of estimate, sending of proposal, acceptance, and payout.
Claim 10. The computing device of claim 1, wherein the processor is further configured to generate a flight plan for a drone, the flight plan causing the drone to capture the one or more images of the structure. [D2, Column 4, Lines 36-55 and Column 11, Lines 25-43] D2 teaches that the drone is deployed to fly through the area to obtain data associated with the damage to the property and is in communication with the server. The drone is an automated device receiving instructions.
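A "flight plan" of the kind claim 10 recites could, for illustration, be a simple boustrophedon (lawnmower) sweep of waypoints over the roof area. The units, spacing, and function name below are assumptions and do not come from D2:

```python
# Illustrative sketch of a claim 10 flight plan: waypoints sweeping a
# rectangular roof area in back-and-forth passes. Hypothetical names/units.
def generate_flight_plan(width_m, length_m, spacing_m, altitude_m):
    """Return (x, y, z) waypoints covering the area in parallel passes."""
    waypoints = []
    x = 0.0
    forward = True
    while x <= width_m:
        # Alternate sweep direction on each pass so the path is continuous.
        ys = (0.0, length_m) if forward else (length_m, 0.0)
        for y in ys:
            waypoints.append((x, y, altitude_m))
        forward = not forward
        x += spacing_m
    return waypoints
```

In D2's arrangement the server would dispatch such waypoints to the drone, which captures images at each position and returns them for analysis.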
Claim 11. Claim 11 is rejected for reasons similar to those described in claim 1.
Claims 12-20. The claims are rejected for reasons similar to those described in claims 2-10, respectively.
Claim 21. Claim 21 is rejected for reasons similar to those described in claim 1.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Amandeep Saini whose telephone number is (571)272-3382. The examiner can normally be reached M-F (8AM-4PM).
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/AMANDEEP SAINI/Supervisory Patent Examiner, Art Unit 2662