DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Status of Claims
This Office Action is in response to the application filed on 02 May 2024. Claims 1-20 are presently pending and are presented for examination.
Information Disclosure Statement
The Information Disclosure Statement was submitted on 03 December 2024. The submission is in compliance with the provisions of 37 CFR 1.97. Accordingly, the Information Disclosure Statement is being considered by the Examiner.
Priority
Request for priority to Provisional App. No. 63/552,025 is acknowledged. Examiner notes that the current claims do not appear to be fully supported by the provisional application and further notes that the Applicant may be requested to perfect one or more of the claims in the situation where applied prior art has an effective date falling between the 09 February 2024 filing date of the provisional application and the 02 May 2024 filing date of the non-provisional application. No action on the part of the Applicant is requested at this time.
Claim Objections
Claims 1 and 13 are objected to because of the following informalities:
Claim 1: “the memory having computer-executable instructions stored thereon that” should be “the at least one memory having computer-executable instructions stored thereon that”; and
Claim 13: “a score indicating the proficiency of a pilot of the airplane” should be “a score indicating [[the]] a proficiency of a pilot of the airplane”.
Appropriate correction is required.
Specification
The disclosure is objected to because of the following informalities:
Reference character “2401” has been used to designate both “…where an application receives flight data associated with a flight path, such as in a similar manner to receiving flight data as part of the process for transmitting a social media post described above.” (para. 0149) and “…where the application generates one or more measure of the turbulence experienced while the plane flew along the flight path based on the flight data and the additional data.” (para. 0151). Correcting para. 0151 will resolve this objection.
Appropriate correction is required.
Drawings
The drawings are objected to under 37 CFR 1.83(a) because they fail to show 1301, 1302, and 1403, as described in the specification. Specifically:
para. 0117 recites “act 1301” as “…where the application identifies one or more attributes of a user of the social media service for airplane pilots” and FIG. 13 shows 1301 as “Identify attributes indicating a user’s flight experience”;
para. 0118 recites “act 1302” as “…where the application identifies one or more instances of content based on the attributes of the user” and FIG. 13 shows 1302 as “Identify one or more instances of content based on the identified attributes of the user’s flight experience”; and
para. 0125 recites “act 1403, where the application updates a repository of off-airport landing area information maintained, managed, accessed, updated, or otherwise used by the social media service” and FIG. 14 shows 1403 as “Update a repository of off-airport landing area information based on the indication of off-airport landing data and the indicated off-airport landing area”.
Any structural detail that is essential for a proper understanding of the disclosed invention should be shown in the drawing. MPEP § 608.02(d). Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. The figure or figure number of an amended drawing should not be labeled as “amended.” If a drawing figure is to be canceled, the appropriate figure must be removed from the replacement sheet, and where necessary, the remaining figures must be renumbered and appropriate changes made to the brief description of the several views of the drawings for consistency. Additional replacement sheets may be necessary to show the renumbering of the remaining figures. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(5) because they include the following reference character(s) not mentioned in the description:
2102, 2103, 2104, 2105, 2106, 2107, and 2108, in FIG. 21;
2403, in FIG. 24 (Correcting para. 0151 will resolve this objection.);
2831, in FIG. 28 (Replacing 2831 with 2832, to match para. 0179, will resolve this objection.); and
2856 and 2856a, in FIG. 28.
Corrected drawing sheets in compliance with 37 CFR 1.121(d), or amendment to the specification to add the reference character(s) in the description in compliance with 37 CFR 1.121(b) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
The drawings are objected to as failing to comply with 37 CFR 1.84(p)(4) because reference character “2840” has been used to designate both “server” (FIG. 28 and para. 0182) and “communications programs” (para. 0181). Corrected drawing sheets in compliance with 37 CFR 1.121(d) are required in reply to the Office action to avoid abandonment of the application. Any amended replacement drawing sheet should include all of the figures appearing on the immediate prior version of the sheet, even if only one figure is being amended. Each drawing sheet submitted after the filing date of an application must be labeled in the top margin as either “Replacement Sheet” or “New Sheet” pursuant to 37 CFR 1.121(d). If the changes are not accepted by the examiner, the applicant will be notified and informed of any required corrective action in the next Office action. The objection to the drawings will not be held in abeyance.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 9, 11-13, and 18 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 9 recites the limitation “the user of the second user device has interacted with the post”. There is insufficient antecedent basis for this limitation (“the user”) in the claim. Examiner is interpreting “the user” as “a user”.
Claim 11 recites the limitation "the representation of the flight path". There is insufficient antecedent basis for this limitation in the claim. Examiner is interpreting claim 11 as if it depends on claim 10, instead of claim 1. As claim 12 depends on claim 11, claim 12 is similarly rejected.
Claim 12 recites the limitation "the post". There is insufficient antecedent basis for this limitation in the claim. Examiner is interpreting claim 12 as if claim 11 depends on claim 10, instead of claim 1.
Claim 13 recites the limitation “based on the flight data and the additional data.” It is unclear whether “the flight data” in this limitation is referring to the “flight data associated with the flight path”, recited earlier in claim 13, or the “flight data from one or more flight data repositories based on the indicated airplane” from claim 1. It is also unclear whether these two recitations of “flight data” (in claim 1 and claim 13) are the same. Examiner is interpreting these two recitations of “flight data” as being different, and that the flight data recited in claim 13 (“based on the flight data and the additional data.”) is referring to “flight data associated with the flight path”, recited earlier in claim 13.
Claim 18 recites the limitation “generate, based on the flight data and the additional data”. It is unclear whether “the flight data” in this limitation is referring to the “flight data associated with the flight path”, recited earlier in claim 18, or the “flight data from one or more flight data repositories based on the indicated airplane” from claim 1. It is also unclear whether these two recitations of “flight data” (in claim 1 and claim 18) are the same. Examiner is interpreting these two recitations of “flight data” as being different, and that the flight data recited in claim 18 (“generate, based on the flight data and the additional data”) is referring to “flight data associated with the flight path”, recited earlier in claim 18.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
101 Analysis – Step 1
Claim 1 is directed to a system (i.e., a machine), claim 19 is directed to a non-transitory computer-readable medium (i.e., a machine), and claim 20 is directed to a method (i.e., a process). Therefore, independent claims 1, 19, and 20 are each within at least one of the four statutory categories.
101 Analysis – Step 2A, Prong I
Regarding Prong I of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether they recite subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) certain methods of organizing human activity, and/or c) mental processes.
Independent claims 1, 19, and 20 include limitations that recite an abstract idea (emphasized below) and claim 1 will be used as a representative claim for the remainder of the 101 rejection. Claim 1 recites:
A system comprising:
a display;
at least one processor; and
at least one memory coupled to the at least one processor, the memory having computer-executable instructions stored thereon that, when executed by the at least one processor, cause the system to:
receive an indication of an airplane associated with a user;
receive an indication of flight data from one or more flight data repositories based on the indicated airplane;
identify one or more airports based on the flight data;
generate a flight path based on the identified one or more airports, the flight data, and the indicated airplane;
receive an indication of descriptive data regarding the flight path;
cause the display to visually display a pattern showing an indication of the flight path and an indication of the descriptive data to the user on a first user device; and
cause the flight path and the descriptive data to be stored for transmission to a second user device.
The examiner submits that the foregoing bolded limitations constitute a “mental process” because under its broadest reasonable interpretation, the claim covers performance of the limitation in the human mind. For example, the “identify…” limitation encompasses a person looking at flight data for an aircraft and identifying which airports are relevant to the aircraft, and the “generate…” limitation encompasses a person looking at flight data for an aircraft and drawing a flight path of the aircraft based on data about the aircraft. Accordingly, the claim recites at least one abstract idea.
101 Analysis – Step 2A, Prong II
Regarding Prong II of the Step 2A analysis in the 2019 PEG, the claims are to be analyzed to determine whether the claim, as a whole, integrates the abstract idea into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application.”
In the present case, the additional limitations beyond the above-noted abstract idea are as follows (where the underlined portions are the “additional limitations” while the bolded portions continue to represent the “abstract idea”):
A system comprising:
[A] a display;
[A] at least one processor; and
[A] at least one memory coupled to the at least one processor, the memory having computer-executable instructions stored thereon that, when executed by the at least one processor, cause the system to:
[B] receive an indication of an airplane associated with a user;
[B] receive an indication of flight data from one or more flight data repositories based on the indicated airplane;
identify one or more airports based on the flight data;
generate a flight path based on the identified one or more airports, the flight data, and the indicated airplane;
[B] receive an indication of descriptive data regarding the flight path;
[B] cause the display to visually display a pattern showing an indication of the flight path and an indication of the descriptive data to the user on a first user device; and
[B] cause the flight path and the descriptive data to be stored for transmission to a second user device.
For the following reason(s), the examiner submits that the above identified additional limitations do not integrate the above-noted abstract idea into a practical application.
Regarding the additional limitations identified with [A], “a display”, “at least one processor”, and “at least one memory…having computer-executable instructions stored thereon…”, examiner submits that these limitations are merely using a computer to implement an abstract idea. Regarding the additional limitations identified with [B], “receive an indication of an airplane…”, “receive an indication of flight data…,” “receive an indication of descriptive data…”, “cause the display to visually display…”, and “cause the flight path and the descriptive data to be stored for transmission…,” the examiner submits that these limitations are insignificant extra-solution activities. In particular, the receiving limitations are recited at a high level of generality and amount to mere data gathering, which is a form of insignificant extra-solution activity. The displaying limitation is also recited at a high level of generality (i.e., as a general means of displaying the result of the data gathering and abstract idea), and amounts to mere post-solution displaying, which is a form of insignificant extra-solution activity. Lastly, the “cause the flight path and the descriptive data to be stored for transmission” limitation merely describes receiving or transmitting data over a network.
Thus, taken alone, the additional elements do not integrate the abstract idea into a practical application. Further, looking at the additional limitations as an ordered combination or as a whole, the limitations add nothing that is not already present when looking at the elements taken individually. For instance, there is no indication that the additional elements, when considered as a whole, reflect an improvement in the functioning of a computer or an improvement to another technology or technical field, apply or use the above-noted judicial exception to effect a particular treatment or prophylaxis for a disease or medical condition, implement/use the above-noted judicial exception with a particular machine or manufacture that is integral to the claim, effect a transformation or reduction of a particular article to a different state or thing, or apply or use the judicial exception in some other meaningful way beyond generally linking the use of the judicial exception to a particular technological environment, such that the claim as a whole is not more than a drafting effort designed to monopolize the exception (MPEP § 2106.05). Accordingly, the additional limitations do not integrate the abstract idea into a practical application because they do not impose any meaningful limits on practicing the abstract idea.
101 Analysis – Step 2B
Regarding Step 2B of the 2019 PEG, representative independent claim 1 does not include additional elements (considered both individually and as an ordered combination) that are sufficient to amount to significantly more than the judicial exception, for the same reasons as those discussed above with respect to determining that the claim does not integrate the abstract idea into a practical application. As discussed above with respect to integration of the abstract idea into a practical application, the additional limitations of “a display”, “at least one processor”, and “at least one memory…having computer-executable instructions stored thereon…” amount to nothing more than applying the exception using a generic computer component. Generally applying an exception using a generic computer component cannot provide an inventive concept. And, as discussed above, the additional limitations of “receive an indication of an airplane…”, “receive an indication of flight data…,” “receive an indication of descriptive data…”, “cause the display to visually display…”, and “cause the flight path and the descriptive data to be stored for transmission…” are insignificant extra-solution activities.
Further, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be re-evaluated in Step 2B to determine whether it is more than what is well-understood, routine, conventional activity in the field. The additional limitations of “receive an indication of an airplane…”, “receive an indication of flight data…,” “receive an indication of descriptive data…”, and “cause the flight path and the descriptive data to be stored for transmission…” are well-understood, routine, and conventional activities because the background recites that the display, processors, and memory are all conventional displays, processors, and computer-based memories, and the specification does not provide any indication that the system is anything other than a conventional computer. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well-understood, routine, and conventional function when it is claimed in a merely generic manner. The additional limitation of “cause the display to visually display…” is a well-understood, routine, and conventional activity because the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere displaying of data is a well-understood, routine, and conventional function. Claims 19 and 20 recite similar limitations to those discussed above with regard to claim 1, and therefore discussion is omitted for brevity. Hence, independent claims 1, 19, and 20 are not patent subject matter eligible.
Dependent claims 2-18 do not recite any further limitations that cause the claims to be patent subject matter eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine, and conventional additional elements that do not integrate the judicial exception into a practical application. The additional limitations of the dependent claims include additional aspects of the abstract idea, such as identifying, detecting, determining, updating, selecting, extrapolating, and generating, and include additional aspects of well-understood, routine, and conventional additional elements, such as receiving data, transmitting data, and displaying data. Therefore, dependent claims 2-18 are not patent subject matter eligible under the same rationale as provided for in the rejection of independent claim 1.
Therefore, claims 1-20 are subject matter ineligible under 35 U.S.C. § 101.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-8, 10-11, 14, 17, and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by US-8700236-B1, hereinafter “Berman”.
Regarding claim 1, and analogous claims 19 and 20, Berman discloses A system (Berman, FIG. 1-3; col. 1, line 15: “The present disclosure relates to maintaining an online flight logbook to help track flying activities for a pilot, and more particularly, to systems [i.e., A system] and methods for recording and publishing related flight information.”) comprising:
Regarding claims 19 and 20, Berman also discloses a non-transitory computer-readable medium having contents configured to cause at least one processor to perform a method (Berman, FIG. 2-3, computer-readable medium 295, computer-readable medium 395; Claim 11) and a method (Berman, col. 1, line 15: “The present disclosure relates to maintaining an online flight logbook to help track flying activities for a pilot, and more particularly, to systems and methods [i.e., A method] for recording and publishing related flight information.”).
a display (Berman, FIG. 1, remote location-aware mobile device 200; col. 4, line 19: “The detailed description that follows is represented largely in terms of processes and symbolic representations of operations by conventional computer components, including a processor, memory storage devices for the processor, connected display devices and input devices.”);
at least one processor (Berman, FIG. 2-3, processing unit 210, processing unit 310; col. 4, line 19: “The detailed description that follows is represented largely in terms of processes and symbolic representations of operations by conventional computer components, including a processor [i.e., at least one processor], memory storage devices for the processor, connected display devices and input devices.”); and
at least one memory coupled to the at least one processor (Berman, FIG. 2-3, memory 250, memory 350; col. 4, line 19: “The detailed description that follows is represented largely in terms of processes and symbolic representations of operations by conventional computer components, including a processor, memory storage devices for the processor [i.e., at least one memory coupled to the at least one processor], connected display devices and input devices.”),
the memory having computer-executable instructions stored thereon that, when executed by the at least one processor (Berman, FIG. 2-3; col. 7, line 63: “The memory 350 stores program code for a number of applications, which includes executable instructions for receiving routine 360 (see FIG. 22, discussed below), generation routine 365 (see FIG. 22, discussed below), authorization routine 370 (see FIG. 24, discussed below), designation routine 375 (see FIGS. 22 and 23, discussed below), and publishing/posting routine 380 (see FIGS. 17 and 20, discussed below).”), cause the system to:
receive an indication of an airplane associated with a user (Berman, col. 3, line 64: “…a mobile flight logbook system and method are described that overcome the hereinafore-mentioned disadvantages of the heretofore-known devices of this general type and that provide automatic generation of a flight logbook entry using a remote location-aware mobile device and then store the generated entry in an online flight logbook maintained on a flight server. Among other features, the remote location-aware device automatically detects aircraft takeoffs and landings for the pilot and populates flight logbook entry based on information available to the location-aware mobile device [i.e., receive an indication of an airplane associated with a user].”);
receive an indication of flight data from one or more flight data repositories based on the indicated airplane (Berman, col. 5, line 46: “…mobile flight logbook systems and methodologies, which collect flight information via a remote location-aware mobile device [i.e., receive an indication of flight data from one or more flight data repositories based on the indicated airplane] and transmit the collected flight information to a flight server. The remote location-aware mobile device automatically generates an entry for a flight logbook, including actively updating the number of detected landings associated with the flight.”);
identify one or more airports based on the flight data (Berman, col. 9, line 29: “The application has access to a database of airports 412 so that it can quickly look up the airport nearest to its position even when there is no network connectivity available, and it also keeps track of the information about the current flight in progress 415 so that it can build up the flight information as it learns more about it from the location services.”);
generate a flight path based on the identified one or more airports, the flight data, and the indicated airplane (Berman, col. 10, line 41: “…Flight Telemetry represents a text-based table of flight data, expressed as a series of rows and columns. The format of this is fairly arbitrary, but if it includes a column of timestamps, latitudes, and longitudes, then the path of the flight can be reconstructed from this.”);
receive an indication of descriptive data regarding the flight path (Berman, col. 6, line 23: “…the remote location-aware mobile device 200 may generate a flight entry from collected flight data 150 for a flight logbook 400. The flight logbook may include both flight data describing facts of the flight and pilot experience data sharing notable experiences in the context of the flight [i.e., receive an indication of descriptive data regarding the flight path]. Examples of pilot experience data may include personal notes and multimedia, such as pictures and video, taken during the flight to better share the experience with those reviewing the flight later.”);
cause the display to visually display a pattern showing an indication of the flight path and an indication of the descriptive data to the user on a first user device (Berman, FIG. 11 and 16; col. 16, line 9: “The pilot in this example can select the level of detail that they wish to display about the flight. In this particular embodiment, the choice is simple and stark: show simply the route of flight, or show a variety of details, but any degree of granularity for these preferences could be specified.”); and
cause the flight path and the descriptive data to be stored for transmission to a second user device (Berman, FIG. 4, myflightbook server 401, internet 408, social networks 416; col. 9, line 45: “The application then submits the flight to the web application 402, which stores a record of the flight in the flight database 406 [i.e., cause the flight path and the descriptive data to be stored]. If the request to submit the flight indicates that the flight should also be posted to one or more social networking sites, then for each requested site the web application 402 looks up an authorization token in the user database 407 and, if one is found, submits a link to the flight over the Internet 408 to the appropriate social networking site 416 [i.e., for transmission to a second user device].”).
Regarding claim 2, Berman discloses The system of claim 1, wherein, to cause the flight path and descriptive data to be stored for transmission to the second user device, the computer-executable instructions further causes the system to:
transmit data indicating the flight path and the descriptive data to a server configured to distribute the flight path and descriptive data to one or more user devices (Berman, FIG. 4, myflightbook server 401, internet 408, social networks 416; col. 9, line 45: “The application then submits the flight to the web application 402, which stores a record of the flight in the flight database 406 [i.e., transmit data indicating the flight path and the descriptive data to a server]. If the request to submit the flight indicates that the flight should also be posted to one or more social networking sites, then for each requested site the web application 402 looks up an authorization token in the user database 407 and, if one is found, submits a link to the flight over the Internet 408 to the appropriate social networking site 416 [i.e., configured to distribute the flight path and descriptive data to one or more user devices].”).
Regarding claim 3, Berman discloses The system of claim 1, wherein, to generate the flight path, the computer-executable instructions further causes the system to:
receive an indication of a departure airport and a destination airport (Berman, col. 4, line 59: “The terms “flight information” and “flight data” are synonymous and generally refer to data collected for a particular flight. Flight information may include a variety of information associated with the flight, such as the date of the flight, aircraft information (e.g., aircraft registration number (tail number), make, and model), route of flight, number of landings, number of instrument approaches, amount of time spent as member of flight crew, any multimedia (e.g., video, sound, or photographs) associated with the flight, comments about the flight, and any other information that may be useful to the pilot or necessary for regulatory, employment, or insurance purposes. The route of flight may include a starting airport, any intermediate airports, and a final airport [i.e., receive an indication of a departure airport and a destination airport].”);
identify a starting time for the flight path based on the flight data and the departure airport (Berman, FIG. 5, data structures 500, “Flights”, “Flight Start Time (1st Takeoff)”);
identify an ending time for the flight path based on the flight data and the destination airport (Berman, FIG. 5, data structures 500, “Flights”, “Flight End Time (Last Landing)”);
identify one or more intermediate airports based on the starting time for the flight path, the ending time for the flight path, and the flight data (Berman, col. 4, line 59: “The terms “flight information” and “flight data” are synonymous and generally refer to data collected for a particular flight. Flight information may include a variety of information associated with the flight, such as the date of the flight, aircraft information (e.g., aircraft registration number (tail number), make, and model), route of flight, number of landings, number of instrument approaches, amount of time spent as member of flight crew, any multimedia (e.g., video, sound, or photographs) associated with the flight, comments about the flight, and any other information that may be useful to the pilot or necessary for regulatory, employment, or insurance purposes. The route of flight may include a starting airport, any intermediate airports [i.e., identify one or more intermediate airports based on the starting time for the flight path, the ending time for the flight path, and the flight data], and a final airport.”);
generate the flight path based on the departure airport, the destination airport, the starting time, the ending time, and the one or more intermediate airports (Berman, FIG. 16, col. 18, line 13: “In FIG. 16, when the flight is displayed on a map, an icon can then be overlayed on the map, showing where the image was taken, or the image can include a link to zoom the map to its location. In one embodiment, the process of geotagging of images uses EXIF and adds overlays of and callouts over maps.”).
Regarding claim 4, Berman discloses The system of claim 1, wherein the computer-executable instructions further cause the system to:
identify at least one instance of content based on the indicated airplane, the flight path, the one or more airports, or the user associated with the indicated airplane; and cause the display to display the at least one instance of content (Berman, FIG. 16; col 18, line 1: “Referring now to FIG. 16, a screenshot 1600 is shown of Picture Geotagging 1650 in accordance with one embodiment. When pilots fly for fun, it is quite common to want to take pictures. But it can be difficult to determine what the picture was of or where it was taken is after the fact. Location aware mobile devices, such as Smartphones and tablets, help address this problem by enabling the user pilot to take a picture with the device's built-in camera, and then automatically add latitude/longitude information to the picture, and store the image with the record of the flight in the pilot's flight logbook [i.e., identify at least one instance of content based on the indicated airplane, the flight path, the one or more airports, or the user associated with the indicated airplane]. In one embodiment, a pilot may also geotag video taken using the camera of the location-aware mobile device and append it to the flight information. In FIG. 16, when the flight is displayed on a map, an icon can then be overlayed on the map, showing where the image was taken, or the image can include a link to zoom the map to its location [i.e., cause the display to display the at least one instance of content].”).
Regarding claim 5, Berman discloses The system of claim 1, wherein, to receive the indication of flight data, the computer-executable instructions further cause the system to:
detect that the indicated airplane has taken off from or landed at an airport; and in response to detecting that the indicated airplane has taken off from or landed at an airport, transmit a request for flight data to a flight repository (Berman, col. 4, line 3: “Among other features, the remote location-aware device automatically detects aircraft takeoffs and landings [i.e., detect that the indicated airplane has taken off from or landed at an airport] for the pilot and populates flight logbook entry [i.e., and in response to detecting that the indicated airplane has taken off from or landed at an airport, transmit a request for flight data to a flight repository] based on information available to the location-aware mobile device.”).
Regarding claim 6, Berman discloses The system of claim 1,
wherein, the flight data includes flight data for two or more flights (Berman, col. 10, line 12: “The Flights data structure includes Date of flight, Comments, Starting/Ending Hobbs timer, Engine Start/End Time, Flight Start/End Time, Number of Landings, Route of flight, Total Time, and Flight Telemetry.”), and
wherein to generate the flight path, the computer-executable instructions further cause the system to: receive an indication of two or more stops via user input; receive an indication of an order of the two or more stops via user input (Berman, FIG. 10A-10C; col. 13, line 52: “Pilots typically record their route of flight in a de-duped manner. For example, if they fly from San Francisco to Portland, do several landings at Portland and then continue to Seattle, they would record this as SFO-PDX-SEA (and record the appropriate count of landings) rather than as SFO-PDX-PDX-PDX-SEA [i.e., the flight data includes flight data for two or more flights]. In the present invention, there are several scenarios where it is desirable to append the code for the nearest airport in a de-duped manner: at engine start (typically noting the initial airport), upon take-off or landing, at engine stop, or when the pilot explicitly requests it (e.g., hitting a “nearest” button) [i.e., receive an indication of two or more stops via user input]. As a practical matter, it is sufficient to append the nearest airport code at engine start, upon landing, or upon pilot request. The algorithm for appending is fairly straightforward: the airport nearest to the most recently received position report is looked up in the application’s database [i.e., receive an indication of an order of the two or more stops via user input].”); and
generate a flight track based on the indication of flight data, the indication of two or more stops, and the indication of the order of the two or more stops (Berman, col. 10, line 12: “The Flights data [i.e., based on the indication of flight data] structure includes Date of flight, Comments, Starting/Ending Hobbs timer, Engine Start/End Time, Flight Start/End Time, Number of Landings [i.e., the indication of two or more stops], Route of flight [i.e., the indication of the order of the two or more stops], Total Time, and Flight Telemetry.”; col. 10, line 32: “Route of flight represents a sequence of airport codes that describe the route of the flight. E.g., “SEA BOS LHR” would mean Seattle to Boston to London [i.e., the indication of the order of the two or more stops]. In one embodiment, Total Time represents the total time that is recorded for the flight; when a pilot says that they have a given number of hours of flight experience, it is based on this time. This is typically the time from engine start to engine shutdown, but pilots may choose for this to be exclusively airborne time (i.e., exclusive of taxi and idle time). In one embodiment, Flight Telemetry represents a text-based table of flight data, expressed as a series of rows and columns. The format of this is fairly arbitrary, but if it includes a column of timestamps, latitudes, and longitudes, then the path of the flight can be reconstructed [i.e., generate a flight track] from this.”).
Regarding claim 7, Berman discloses The system of claim 1,
wherein the flight data includes data indicating one or more of: at least one latitude; at least one longitude; at least one altitude; or at least one groundspeed (Berman, col. 13, line 28: “Engine start/stop times are used to determine when to autodetect and record telemetry data…If the engine is running, the application looks for takeoffs and landings, and records flight telemetry [i.e., wherein the flight data includes data indicating one or more of: at least one latitude; at least one longitude; at least one altitude; or at least one groundspeed] based on received location updates. When the engine is shut down, the system assumes that the airplane is parked and stops recording telemetry or looking for takeoffs/landings.”; col. 10, line 41: “…Flight Telemetry represents a text-based table of flight data, expressed as a series of rows and columns. The format of this is fairly arbitrary, but if it includes a column of timestamps, latitudes, and longitudes [i.e., at least one latitude; at least one longitude; at least one altitude; or at least one groundspeed], then the path of the flight can be reconstructed from this.”).
Regarding claim 8, Berman discloses The system of claim 1, wherein the computer-executable instructions further cause the system to:
identify a user of the second user device; identify an airport ribbon display setting associated with the user of the second user device; identify one or more airports indicated by the flight path; and cause an airport ribbon to be displayed on the second user device based on the identified one or more airports and the airport ribbon display setting (Berman, col. 16, line 7: “The pilot in this example can select the level of detail that they wish to display about the flight. In this particular embodiment, the choice is simple and stark: show simply the route of flight, or show a variety of details, but any degree of granularity for these preferences could be specified.”; Note: It would have been obvious to one of ordinary skill in the art, before the effective filing date of the claimed invention, that digital content displayable on multiple types of devices must be responsive to the display limitations and/or settings of each type of device. For example, the screen size or the number of items to display must be considered, or the content will be distorted, partially displayed, or otherwise difficult to view by the device user.).
Regarding claim 10, Berman discloses The system of claim 1, wherein the computer-executable instructions further cause the system to:
cause the second user device to display a post based on the flight path and descriptive data, wherein the post includes a representation of the flight path (Berman, FIG. 11, FIG. 16 [i.e., the post includes a representation of the flight path]; col. 18, line 23: “For social networking integration, the user can select on the device from among several popular social networking sites (currently Facebook and Twitter, but any number can be supported). The user can additionally choose how much detail to expose about the flight: if the flight is marked as being shared, then details such as the pilot's name, the aircraft and type, and any pictures associated with the flight [i.e., based on…descriptive data] are displayed to visitors [i.e., cause the second user device to display a post based on the flight path and descriptive data]. Otherwise, all details other than the date of the flight and the route (airport-to-airport) [i.e., based on the flight path] are suppressed. Obviously, any range of settings in between these could be offered as well, but these were the simplest to use. When the user submits the flight, they check boxes indicating which (if any) social networks on which to share the flight. This simply sets flags which are passed up along with the flight to the flight logbook server. After storing the flight in the database, a web link (“URL”) to the flight is generated for and passed to each selected social network, along with either the user-provided description of the flight (e.g., “Flew with Joe to Chelan for lunch”) or a generic comment that is generated by the server (e.g., “Flight added.”).”), and
wherein the representation of the flight path includes an indication of one or more of: a starting point of the flight path; an ending point of the flight path; a groundspeed of an airplane that flew along the flight path; a crosswind experienced by an airplane that flew along the flight path; a turbulence experienced by an airplane that flew along the flight path; a g-force experienced by an airplane that flew along the flight path; a time of day during which an airplane flew along the flight path; a stop made by an airplane that flew along the flight path; or a radio call made by an occupant of an airplane that flew along the flight path (Berman, FIG. 16 [i.e., wherein the representation of the flight path includes an indication of one or more of: a starting point of the flight path; an ending point of the flight path…]).
Regarding claim 11, Berman discloses The system of claim 1,
wherein the representation of the flight path includes two or more colors (Berman, FIG. 16, col. 18, line 13: “In FIG. 16, when the flight is displayed on a map, an icon can then be overlayed on the map, showing where the image was taken, or the image can include a link to zoom the map to its location. In one embodiment, the process of geotagging of images uses EXIF and adds overlays of and callouts over maps.”).
Regarding claim 14, Berman discloses The system of claim 1,
wherein the flight data includes data generated by one or more of: a global-positioning-system device; an automatic dependent surveillance broadcasting device; a wearable user device; and a black box of the airplane (Berman, FIG. 2, location sensor (GPS) 245; col. 6, line 66: “As shown in FIG. 2, the location-aware mobile device 200 includes a location sensor 245 for providing navigational information and/or accessing a third party navigation system, such as a Global Positioning System (GPS) or Galileo to obtain satellite navigation information.”).
Regarding claim 17, Berman discloses The system of claim 1, wherein, to generate the flight path, the computer-executable instructions further cause the system to:
detect that the flight data does not include data for one or more locations at which the airplane flew during a flight; and extrapolate flight data for the airplane based on at least a portion of the flight data; and combine the flight data and the extrapolated flight data to obtain final flight data that includes flight data for the one or more locations for which data was detected as not included (Berman, col. 15, line 4: “This causes the system to move back to state (1002), but since this is treated as a final step before the completion and submission of the flight (most flights, after all, do not last long after the source of thrust is removed), hobbs, total time, and cross-country time are auto-filled (1012) at this point [i.e., detect that the flight data does not include data for one or more locations at which the airplane flew during a flight; and extrapolate flight data for the airplane based on at least a portion of the flight data] before returning to state (1002) [i.e., final flight data includes flight data for the one or more locations for which data was detected as not included]. The auto-fill process (1012) is described in more detail below in FIGS. 13, 14, and 15. From state (1002), the process can begin again, or the pilot can submit the existing flight from the device (1004) to the server, where it is processed (1005), which is described below.”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 9 is/are rejected under 35 U.S.C. 103 as being unpatentable over Berman, as applied to claim 1 above, and further in view of “Reactions Now Available Globally”, hereinafter “Facebook”.
Regarding claim 9, Berman discloses The system of claim 1, wherein the computer-executable instructions further cause the system to:
cause the second user device to display a post based on the flight path and descriptive data (Berman, FIG. 28a, FIG. 29),
Berman does not appear to explicitly disclose the following:
wherein the post includes a user interface component indicating that the user of the second user device has interacted with the post; determining whether the user of the