Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Status of the Claims
This action is in response to applicant’s filing on November 04, 2024. Claims 1-20 are pending.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., an abstract idea) without significantly more.
In sum, claims 1-20 are rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception to patentability (i.e., a law of nature, a natural phenomenon, or an abstract idea) and does not include an inventive concept that is something “significantly more” than the judicial exception under the January 2019 patentable subject matter eligibility guidance (2019 PEG) analysis which follows.
Under the 2019 PEG step 1 analysis, it must first be determined whether the claims are directed to one of the four statutory categories of invention (i.e., process, machine, manufacture, or composition of matter). Applying step 1 of the analysis for patentable subject matter to the claims, it is determined that the claims are directed to the statutory category of a process and a machine. Therefore, we proceed to step 2A, Prong 1.
Revised Guidance Step 2A - Prong 1
Under the 2019 PEG step 2A, Prong 1 analysis, it must be determined whether the claims recite an abstract idea that falls within one or more designated categories of patent ineligible subject matter (i.e., organizing human activity, mathematical concepts, and mental processes) that amount to a judicial exception to patentability.
Here, the claims recite the abstract idea of “determining a plurality of stops; detecting, for each of the plurality of stops, a time of arrival and a time of departure based on the time, velocity, and heading in each of the plurality of readings associated with each of the plurality of stops; determining a transition time segment based on a time of departure from a first stop of the plurality of stops and a time of arrival at a second stop of the plurality of stops; and updating an expected timeline for a delivery route based on the transition time segment” as recited in independent claims 1 and 11.
The steps fall within one or more of the three enumerated 2019 PEG categories of patent ineligible subject matter, specifically, a mental process, since each of the above steps could alternatively be performed in the human mind or with the aid of pen and paper. That is, a driver could do this in the normal course of driving. This conclusion follows from CyberSource Corp. v. Retail Decisions, Inc., where our reviewing court held that section 101 did not embrace a process defined simply as using a computer to perform a series of mental steps that people, aware of each step, can and regularly do perform in their heads. 654 F.3d 1366, 1373 (Fed. Cir. 2011); see also In re Grams, 888 F.2d 835, 840-41 (Fed. Cir. 1989); In re Meyer, 688 F.2d 789, 794-95 (CCPA 1982); Elec. Power Grp., LLC v. Alstom S.A., 830 F.3d 1350, 1354 (Fed. Cir. 2016) (“we have treated analyzing information by steps people go through in their minds, or by mathematical algorithms, without more, as essentially mental processes within the abstract-idea category”).
Additionally, mental processes remain unpatentable even when automated to reduce the burden on the user of what once could have been done with pen and paper. See CyberSource, 654 F.3d at 1375 (“That purely mental processes can be unpatentable, even when performed by a computer, was precisely the holding of the Supreme Court in Gottschalk v. Benson.”).
Revised Guidance Step 2A - Prong 2
Under the 2019 PEG step 2A, Prong 2 analysis, the identified abstract idea to which the claim is directed does not include limitations that integrate the abstract idea into a practical application, since the recited features of the abstract idea are being applied on a computer or computing device or via software programming that is simply being used as a tool (“apply it”) to implement the abstract idea. (See, e.g., MPEP § 2106.05(f)).
In addition, limitations reciting data gathering such as “receiving a plurality of readings from a global positioning system (GPS) enabled mobile computing device, wherein each of the plurality of readings comprises a time, a velocity, and a heading” are also insignificant pre-solution activity that merely gather data and, therefore, do not integrate the exception into a practical application for that additional reason. See In re Bilski, 545 F.3d 943, 963 (Fed. Cir. 2008) (en banc), aff’d on other grounds, 561 U.S. 593 (2010) (characterizing data gathering steps as insignificant extra-solution activity); see also CyberSource, 654 F.3d at 1371-72 (noting that even if some physical steps are required to obtain information from a database (e.g., entering a query via a keyboard, clicking a mouse), such data-gathering steps cannot alone confer patentability); OIP Techs., Inc. v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015) (presenting offers and gathering statistics amounted to mere data gathering). Accord Guidance, 84 Fed. Reg. at 55 (citing MPEP § 2106.05(g)).
Revised Guidance Step 2B
Under the 2019 PEG step 2B analysis, the additional elements are evaluated to determine whether they amount to something “significantly more” than the recited abstract idea (i.e., an inventive concept). Here, the additional elements, such as: system; mobile computing device; one or more processors; and GPS do not amount to an inventive concept since, as stated above in the step 2A, Prong 2 analysis, the claims are simply using the additional elements as a tool to carry out the abstract idea (i.e., “apply it”) on a computer or computing device and/or via software programming. (See, e.g., MPEP § 2106.05(f)). The additional elements are specified at a high level of generality to simply implement the abstract idea and are not themselves being technologically improved. (See, e.g., MPEP § 2106.05 I.A.); (see also, ¶¶ 95-98, 199-202 of the specification). See Alice, 573 U.S. at 223 (“[T]he mere recitation of a generic computer cannot transform a patent-ineligible abstract idea into a patent-eligible invention.”). Thus, these elements, taken individually or together, do not amount to “significantly more” than the abstract ideas themselves.
The additional elements of the dependent claims merely refine and further limit the abstract idea of the independent claims and do not add any feature that is an “inventive concept” which cures the deficiencies of their respective parent claim under the 2019 PEG analysis. None of the dependent claims considered individually, including their respective limitations, include an “inventive concept” of some additional element or combination of elements sufficient to ensure that the claims in practice amount to something “significantly more” than patent-ineligible subject matter to which the claims are directed.
The elements of the instant process steps when taken in combination do not offer substantially more than the sum of the functions of the elements when each is taken alone. The claims as a whole do not amount to significantly more than the abstract idea itself because the claims do not effect an improvement to another technology or technical field (e.g., the field of computer coding technology is not being improved); the claims do not amount to an improvement to the functioning of an electronic device itself which implements the abstract idea (e.g., the general purpose computer and/or the computer system which implements the process are not made more efficient or technologically improved); the claims do not perform a transformation or reduction of a particular article to a different state or thing (i.e., the claims do not use the abstract idea in the claimed process to bring about a physical change. See, e.g., Diamond v. Diehr, 450 U.S. 175 (1981), where a physical change, and thus patentability, was imparted by the claimed process; contrast, Parker v. Flook, 437 U.S. 584 (1978), where a physical change, and thus patentability, was not imparted by the claimed process); and the claims do not move beyond a general link of the use of the abstract idea to a particular technological environment.
As for dependent claims 2-10 and 12-20, these claims include all the limitations of the independent claim from which they depend and therefore recite the same abstract idea. The claims also fail to add additional limitations that would amount to significantly more than the abstract idea. Therefore, the claims as a whole, considering all claim elements both individually and in combination, are not patent eligible.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-6, 9, 11-16, and 19 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Davidson, US 2015/0185031 A1.
Regarding claim 1, Davidson teaches a method for delivering items, the method comprising:
receiving a plurality of readings from a global positioning system (GPS) enabled mobile computing device, (Davidson, see at least ¶ [0090] “FIG. 3 illustrates a detailed schematic block diagram of an exemplary telematics device 102 according to one embodiment. In the illustrated embodiment, the telematics device 102 includes the following components: a processor 201, a location-determining device or sensor 202 (e.g., GPS sensor), a real-time clock 203, J-Bus protocol architecture 204, an electronic control module (ECM) 205, a port 206 for receiving data from vehicle sensors 410 located in one of the delivery vehicles 100 (shown in FIG. 2), a communication port 207 for receiving instruction data, a radio frequency identification (RFID) tag 212, a power source 208, a data radio 209 for communication with a WWAN, a WLAN and/or a WPAN, FLASH, DRAM, and NVRAM memory modules 210, and a programmable logic controller (PLC) 211.”) wherein each of the plurality of readings comprises a time, a velocity, and a heading; (Davidson, see at least ¶ [0091] “In one embodiment, the location sensor 202 may be one of several components available in the telematics device 102. The location sensor 202 may be, for example, a GPS-based sensor compatible with a low Earth orbit (LEO) satellite system, medium Earth orbit satellite system, or a Department of Defense (DOD) satellite system. Alternatively, triangulation may be used in connection with various cellular towers positioned at various locations throughout a geographic area in order to determine the location of the delivery vehicle 100 and/or its driver. The location sensor 202 may be used to receive position, time, and speed information/data.” And ¶ [0106] “This concept may also be applied to other variable parameters sensed by vehicle sensors, such as vehicle heading”)
determining, based on the plurality of readings, a plurality of stops; (Davidson, see at least ¶ [0125] “ the vehicle with which that driver is associated at the time the service data is captured (e.g., a vehicle identification number such as 16234), the location of the mobile device 110 at the time the service data is captured (e.g., GPS coordinates), the type of service data captured (e.g., delay code, stop status), and—where applicable—the stop number at which the service data is captured (e.g., stop 3).”)
detecting, for each of the plurality of stops, a time of arrival and a time of departure based on the time, velocity, and heading in each of the plurality of readings associated with each of the plurality of stops; (Davidson, see at least ¶ [0172] “Next, at step 1006, the data segmenting module 1000 identifies and stores various vehicle trip segments based on the identified engine idle segments. According to various embodiments, a vehicle trip generally represents a vehicle's transit time from an origin location to a destination location (e.g., beginning when the vehicle's engine is turned on at the origin location and ending when the vehicle's engine is turned off at the destination location). In step 1006, the data segmenting module 1000 identifies such vehicle trips and breaks each vehicle trip into a Start of Trip segment, a Travel segment, and an End of Trip segment. Generally, the Start of Trip segment begins with the vehicle's engine turning on at its origin location and ends when the vehicle 100 first begins to move, the Travel segment beings when the vehicle 100 beings to move and ends when the vehicle 100 stops at its destination location, and the End of Trip segment begins when the vehicle 100 stops at its destination location and ends when the vehicle's engine is turned off.”)
determining a transition time segment based on a time of departure from a first stop of the plurality of stops and a time of arrival at a second stop of the plurality of stops; (Davidson, see at least ¶ [0389] “Next, the travel delay determining module can identify all travel delay segments in the retrieved segmented data (which may be based images and/or image data of the vehicle 100 in a stationary position). As discussed herein, travel delay segments identified by the data segmenting module 1000 may each represent a period of engine idle time occurring during a Travel segment (e.g., when a vehicle is stopped at an intersection, stopped in heavy traffic, or for some other reason). Next, the travel delay determining module can sum the duration of all identified travel delay segments to determine the total amount of travel delay time indicated by the retrieved data for the corresponding streets, street segments, geographic areas, geofenced areas, and/or user-specified criteria.”) and
updating an expected timeline for a delivery route along which the GPS enabled mobile computing device is traveling based on the transition time segment. (Davidson, see at least ¶ [0403] “According to various embodiments, the travel delay determining module may also be configured for generating a graphical representation of these calculated values and for providing an interactive user interface configured to enable a user to modify the various parameters noted above and perform multiple calculations.”)
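For purposes of illustrating the claim 1 steps mapped above (grouping GPS readings into stops, detecting each stop's arrival and departure times, and computing the transition time segment between consecutive stops), the following sketch shows one way such a process could be carried out. All function and field names are hypothetical and are drawn neither from the claims nor from Davidson; the velocity threshold is an assumed parameter.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Reading:
    time: float      # seconds since start of route
    velocity: float  # meters per second
    heading: float   # degrees, 0-360

def detect_stops(readings: List[Reading], v_stop: float = 0.5) -> List[Tuple[float, float]]:
    """Group consecutive near-zero-velocity readings into stops and
    return (arrival_time, departure_time) pairs."""
    stops, current = [], []
    for r in readings:
        if r.velocity <= v_stop:
            current.append(r)       # reading belongs to the current stop
        elif current:
            # vehicle moved again: close out the stop just ended
            stops.append((current[0].time, current[-1].time))
            current = []
    if current:
        stops.append((current[0].time, current[-1].time))
    return stops

def transition_segment(stops: List[Tuple[float, float]], i: int, j: int) -> float:
    """Transition time segment: departure from stop i to arrival at stop j."""
    return stops[j][0] - stops[i][1]
```

Under this sketch, six readings with velocities 0, 0, 5, 6, 0, 0 taken ten seconds apart would yield two stops, and the transition time segment between them would be the gap between the first stop's departure and the second stop's arrival.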
Regarding claim 2, Davidson teaches the method for delivering items, further comprising determining whether the GPS enabled mobile computing device was stationary at the time of each of the plurality of readings. (Davidson, see at least ¶ [0107] “In addition, vehicle events may be defined by a combination of conditions indicated by various vehicle sensors 410. For example, in certain embodiments, the telematics device 102 may be configured to detect instances of stationary vehicle engine idling (e.g., where the engine is on and the vehicle is not moving) based on a combination of data from a vehicle engine sensor and a vehicle speed sensor”)
Regarding claim 3, Davidson teaches the method for delivering items, wherein the time of arrival for a stop of the plurality of stops is associated with an earliest in time reading of the plurality of readings of the stop that has a heading indicating travel toward the stop. (Davidson, see at least ¶ [0100] “As described in greater detail below, in various embodiments, the telematics device 102 may be configured to capture and store telematics data from the vehicle sensors 410 at predefined time intervals and in response to detecting the occurrence of one or more of a plurality of predefined vehicle events. Generally, a vehicle event may be defined as a condition relating to any parameter or combination of parameters measurable by the one or more vehicle sensors 410 (e.g., the engine idling, vehicle speed exceeding a certain threshold, etc.). As such, the telematics device 102 may be configured to continuously monitor the various vehicle sensors 410 and detect when the data being generated by one or more the vehicle sensors 410 indicates one or more of the plurality of predefined vehicle events. In response to detecting a vehicle event, the telematics device 102 captures data from all of the vehicle sensors 410 or a particular subset of the vehicle sensors 410 associated with the detected vehicle event.”)
Regarding claim 4, Davidson teaches the method for delivering items, wherein the time of departure for a stop of the plurality of stops is associated with a last in time reading of the plurality of readings of the stop that has a velocity and heading indicating travel away from the stop. (Davidson, see at least ¶ [0134] “For example, telematics data and contextual data concurrently captured by the telematics device 102 may be stored in a data record, where each data field in the data record represents a unique data entry (e.g., a measurement of vehicle speed, GPS coordinates, the time and date the data was captured, and an ID number of the vehicle from which the data was captured).”)
Regarding claim 5, Davidson teaches the method for delivering items, wherein detecting the time of departure of a stop of the plurality of stops comprises detecting the time the GPS enabled mobile computing device begins moving following the stop based on the velocity and heading in the plurality of readings associated with the stop. (Davidson, see at least ¶ [0134] “For example, telematics data and contextual data concurrently captured by the telematics device 102 may be stored in a data record, where each data field in the data record represents a unique data entry (e.g., a measurement of vehicle speed, GPS coordinates, the time and date the data was captured, and an ID number of the vehicle from which the data was captured).”)
Regarding claim 6, Davidson teaches the method for delivering items, wherein updating the expected timeline for the delivery route comprises: comparing the transition time segment to an expected transition time segment between the first stop and the second stop; and based on a difference between the transition time segment and the expected transition time segment, updating the expected timeline for the delivery route. (Davidson, see at least ¶ [0174] “Next, at step 1008, the data segmenting module 1000 identifies and stores Travel delay segments based on the previously identified engine idle segments and Travel segments. According to various embodiments, a Travel Delay segment represents a period of engine idle time occurring during a Travel segment (e.g., when a vehicle is stopped at an intersection or stopped in heavy traffic). As such, to identify Travel delay segments, the data segmenting module 1000 reviews all engine idle segments identified in step 1004, identifies those engine idle segments occurring during any Travel segment identified in step 1006 (e.g., by comparing contextual data indicating the time each engine idle segment begins and ends with the time periods represented by each Travel segment), and defines those engine idle segments as Travel delay segments in the Segmented Data Set. Accordingly, in one embodiment, each stored Travel Delay segment is defined by data indicating the segment's start time (e.g., 12:17:23), end time (e.g., 12:17:54), and the segment type (e.g., Travel Delay).”) and ¶ [0396] “To determine the total planned idle time for a given vehicle within the streets, street segments, geographic areas, geofenced areas, and/or user-specified criteria, the travel delay determining module sets the number of planned stops received via user input as both the number of planned start of trip events and the number of planned end of trip events. 
Finally, based on the earlier calculated travel delays per mile value for the streets, street segments, geographic areas, geofenced areas, and/or user-specified criteria and the aforementioned parameters, the travel delay determining module determines the total planned idle time for the vehicle”)
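The timeline-update limitation of claim 6 (comparing the observed transition time segment to the expected one and updating the expected timeline by the difference) can be sketched as follows. This is a hypothetical illustration only, not the applicant's or Davidson's implementation; it assumes the expected timeline is a list of expected arrival times and shifts every arrival after the affected segment by the observed delay or gain.

```python
from typing import List

def update_timeline(expected: List[float], observed_transition: float, seg_index: int) -> List[float]:
    """Shift all expected arrival times after stop seg_index by the
    difference between the observed and expected transition times."""
    expected_transition = expected[seg_index + 1] - expected[seg_index]
    delta = observed_transition - expected_transition
    return expected[:seg_index + 1] + [t + delta for t in expected[seg_index + 1:]]
```

For example, if arrivals were expected at t = 0, 100, and 200 but the first transition actually took 120 time units, the remaining expected arrivals would each slip by 20.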
Regarding claim 9, Davidson teaches the method for delivering items, wherein the delivery route includes a third stop positioned between the first stop and the second stop, and wherein the third stop is not included in the determined plurality of stops. (Davidson, see at least ¶ [0174] “Next, at step 1008, the data segmenting module 1000 identifies and stores Travel delay segments based on the previously identified engine idle segments and Travel segments. According to various embodiments, a Travel Delay segment represents a period of engine idle time occurring during a Travel segment (e.g., when a vehicle is stopped at an intersection or stopped in heavy traffic). As such, to identify Travel delay segments, the data segmenting module 1000 reviews all engine idle segments identified in step 1004, identifies those engine idle segments occurring during any Travel segment identified in step 1006 (e.g., by comparing contextual data indicating the time each engine idle segment begins and ends with the time periods represented by each Travel segment), and defines those engine idle segments as Travel delay segments in the Segmented Data Set. Accordingly, in one embodiment, each stored Travel Delay segment is defined by data indicating the segment's start time (e.g., 12:17:23), end time (e.g., 12:17:54), and the segment type (e.g., Travel Delay).”)
Claims 11-16 are rejected using substantially the same rationale as claims 1-6, respectively, above.
Claim 19 is rejected using substantially the same rationale as claim 9 above.
Allowable Subject Matter
Claims 7-8, 10, 17-18 and 20 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to BRIAN P SWEENEY whose telephone number is (313) 446-4906. The examiner can normally be reached on Monday-Thursday from 7:30 AM to 5:00 PM.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, James J. Lee, can be reached at telephone number 571-270-5965. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from Patent Center. Status information for published applications is available through Patent Center; status information for unpublished applications is available through Patent Center to authorized users only. Should you have questions about access to the USPTO patent electronic filing system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
Examiner interviews are available via a variety of formats. See MPEP § 713.01. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) Form at https://www.uspto.gov/InterviewPractice.
/BRIAN P SWEENEY/ Primary Examiner, Art Unit 3668