Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Application Status
The present Office action is in response to the amendment filed 12/10/2025. Claims 9 and 17 are cancelled. Claims 8, 10 and 16 are amended. Claims 1-8, 10-16 and 18-20 are currently pending in the application.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-8, 10-16 and 18-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to an abstract idea without significantly more.
In regard to independent claim 8, analyzed as the representative claim:
Step 1: Statutory Category?
Independent Claim 8 recites “A method for generating teaching actions for drivers, comprising:”. Independent Claim 8 falls within the “process” category of 35 U.S.C. § 101.
Step 2A – Prong 1: Judicial Exception Recited?
The Independent Claim 8/Revised 2019 Guidance Table below identifies in italics the specific claim limitations found to recite an abstract idea and in bold the additional (non-abstract) claim limitations that are generic computer components.
Independent Claim 8
Revised 2019 Guidance
A method for generating teaching actions for drivers, comprising:
A process (method) is a statutory
subject matter class. See 35 U.S.C.
§ 101 (“Whoever invents or
discovers any new and useful
process, machine, manufacture, or
composition of matter, or any new
and useful improvement thereof,
may obtain a patent therefor, subject
to the conditions and requirements of
this title.”).
[L1] obtaining driving data from a plurality of driving scenarios, the driving data comprises vehicle trajectory information and corresponding scene context information, the plurality of driving scenarios comprising instructed driving events and uninstructed driving events;
Obtaining driving data … is an additional element that adds insignificant extra-solution activity to the judicial exception, e.g., mere data gathering. See January 2019 Memorandum, 84 Fed. Reg. 55, n. 31.
Alternatively, “obtaining driving data …” could be performed as a mental process, i.e., concept performed in the human mind or using pencil and paper (including an observation, evaluation, judgment, opinion) and a “[c]ertain method[] of organizing human activity. . . managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)”. See January 2019 Memorandum, 84 Fed. Reg. at 52.
[L2] encoding, with a behavior model, the driving data, wherein the encoded driving data comprises an indication that a corresponding one of the plurality of driving scenarios comprises one of the instructed driving events or the uninstructed driving events;
The “behavior model” is an additional non-abstract limitation, namely, a generic computer component.
Abstract: “encoding the driving data …” could be performed as a mental process, i.e., concept performed in the human mind or using pencil and paper (including an observation, evaluation, judgment, opinion) and a “[c]ertain method[] of organizing human activity. . . managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)”. See January 2019 Memorandum, 84 Fed. Reg. at 52.
[L3] determining, with a trajectory estimator processing the encoded driving data, one or more driving skill transitions based on a presence or an absence of the indication;
The “trajectory estimator” is an additional non-abstract limitation, namely, a generic computer component.
Abstract: “determining one or more driving skill transitions based on a presence or an absence of the indication” could be performed as a mental process, i.e., concept performed in the human mind or using pencil and paper (including an observation, evaluation, judgment, opinion) and a “[c]ertain method[] of organizing human activity. . . managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)”. See January 2019 Memorandum, 84 Fed. Reg. at 52.
[L4] causing a teacher action model to learn a teacher policy encoding from the determined one or more driving skill transitions and the encoded driving data; and
The “teacher action model” is an additional non-abstract limitation, namely, a generic computer component.
Causing a teacher action model to learn a teacher policy encoding … is an additional element that adds insignificant extra-solution activity to the judicial exception, e.g., mere data gathering. See January 2019 Memorandum, 84 Fed. Reg. 55, n. 31.
[L5] generating, with the teacher action model, a teaching action for one of the plurality of driving scenarios.
The “teacher action model” is an additional non-abstract limitation, namely, a generic computer component.
Abstract: “generating a teaching action…” could be performed as a mental process, i.e., concept performed in the human mind or using pencil and paper (including an observation, evaluation, judgment, opinion) and a “[c]ertain method[] of organizing human activity. . . managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)”. See January 2019 Memorandum, 84 Fed. Reg. at 52.
The published Specification discloses “techniques for training a skill advancement model for a driver and generating a teaching action for the driver” (¶ 1). Additionally, it is common knowledge that a driving instructor riding in a student driver’s car could readily observe or evaluate the student driver, dashboard data and/or contextual information such as traffic condition, traffic signs and weather. Furthermore, the driving instructor can demonstrate proper, improper and any other desired driving behavior. Based at least on the above, it is apparent that other than reciting the additional non-abstract limitations of the “behavior model”, “trajectory estimator” and “teacher action model” noted in the Independent Claim 8/Revised 2019 Guidance Table above, nothing in the claim precludes the steps from practically being performed by a human, in the mind, and/or using pen and paper. The mere nominal recitation of the “behavior model”, “trajectory estimator” and “teacher action model” does not take the claim out of the method of organizing human activity, mental processes and mathematical concept groupings. Accordingly, the claim recites a judicial exception (Step 2A, Prong One: YES).
Step 2A – Prong 2: Integrated into a Practical Application?
The body of the claim, as noted in the Independent Claim 8/Revised 2019 Guidance Table above, recites the additional limitations of the “behavior model”, “trajectory estimator” and “teacher action model”. The published Specification provides supporting exemplary descriptions of generic computer components: at least ¶ 8: … instructions may be executed by only one processor or by multiple processors in a distributed fashion, such that each apparatus of the one or more apparatuses may include one processor or multiple processors, and/or such that performance may be by only one apparatus or in a distributed fashion across multiple apparatuses); one or more computer program products embodied on one or more computer-readable storage media comprising code for performing any portion of any method described herein (e.g., such that code may be stored in only one computer-readable medium or across computer-readable media in a distributed fashion); and/or one or more apparatuses comprising one or more means for performing any portion of any method described herein (e.g., such that performance would be by only one apparatus or by multiple apparatuses in a distributed fashion). By way of example, an apparatus may comprise a processing system, a device with a processing system, or processing systems cooperating over one or more networks. An apparatus may comprise one or more memories; and one or more processors configured to cause the apparatus to perform any portion of any method described herein. 
In some examples, one or more of the processors may be preconfigured to perform various functions or operations described herein without requiring configuration by software; ¶ 11:… artificial intelligence (AI) model comprising a skill advancement model and a teacher action model…; ¶ 24: the AI model including the teacher action model and the skill advancement model described herein, having a neural network, are trained from trajectory or map encoders and additional vehicle signals, which enable a model of driver capabilities …; ¶ 26: … the AI model 100 comprising a skill advancement model 101, a teacher action model 108, and optionally a rewards model/estimator 110…; ¶ 50: … the AI model 300 is depicted as a neural network which may include one or more layers 305, 310, 315, 320, having one or more nodes 301, connected by node connections 302. The one or more layers 305, 310, 315, 320, may include an input layer 305, one or more hidden layers 310, 315, and an output layer 320. The input layer 305 represents the raw information that is fed into the neural network … a behavior model of the AI model 300 may encode the driving data 252… The lack of details about the recited additional elements indicates that these additional elements are generic, or part of generic computer elements performing generic computer-implemented steps. The claimed limitations of “obtaining driving data …”, “encoding the driving data …”, “determining one or more driving skill transitions …”, “causing a teacher action model to learn a teacher policy encoding …”, and “generating a teaching action…” as recited in the claim do not purport to improve the functioning of the “behavior model”, “trajectory estimator” and “teacher action model”, do not improve the technology of the technical field, and do not require a “particular machine.” Rather, they are performed using generic computer components. 
Further, the claim as a whole fails to effect any particular transformation of an article to a different state. The recited steps in the claim fail to provide meaningful limitations to limit the judicial exception. In this case, the claim merely uses the claimed computer elements as a tool to perform the abstract idea.
Considering the elements of the claim both individually and as “an ordered combination,” the functions performed by the additional elements at each step of the process are purely conventional. Each step performed in the claim does no more than require a generic computer to perform a generic computer function. Thus, the claimed elements have not been shown to integrate the judicial exception into a practical application as set forth in the Revised Guidance, which references the Manual of Patent Examining Procedure (“MPEP”) §§ 2106.04(d) and 2106.05(a)–(c) and (e)–(h). Because the abstract idea is not integrated into a practical application, the claim is directed to the judicial exception. (Step 2A, Prong Two: NO).
Step 2B: Claim provides an Inventive Concept?
As discussed with respect to Step 2A Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using generic computer components. The same analysis applies here in Step 2B, i.e., mere instructions to apply an exception using generic computer components cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B.
Because the published Specification, as noted above (¶¶ 8, 11, 24, 26, 50), describes the “behavior model”, “trajectory estimator” and “teacher action model” in general terms, without describing the particulars, the claim limitations may be broadly but reasonably construed as reciting conventional computer components and techniques. See MPEP § 2106.05(d), as modified by the USPTO Berkheimer Memorandum. The Berkheimer Memorandum, Section III(A)(1), explains that a specification that describes additional elements “in a manner that indicates that the additional elements are sufficiently well-known that the specification does not need to describe the particulars of such additional elements to satisfy 35 U.S.C. § 112(a)” can show that the elements are well-understood, routine, and conventional. See also Intellectual Ventures I LLC v. Erie Indem. Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017) (“The claimed mobile interface is so lacking in implementation details that it amounts to merely a generic component (software, hardware, or firmware) that permits the performance of the abstract idea, i.e., to retrieve the user-specific resources.”). The generic description of the “behavior model”, “trajectory estimator” and “teacher action model” indicates the claim steps are well-known enough that no further description is required for a skilled artisan to understand the process, and that these computer components are all used in a manner that is well-understood, routine, and conventional in the field. In particular, the recited data gathering (i.e., [L1] “obtaining driving data”) is nothing more than well-understood, routine, and conventional activity because it is not distinguished from generic, conventional data gathering with a computer. See Elec. Power Grp., LLC v. Alstom S.A., 830 F.3d 1350, 1356 (Fed. Cir. 2016) (claims to gathering, analyzing, and displaying data in real time using conventional, generic technology do not have an inventive concept). Hence, the additional elements are generic, well-known, and conventional computing elements. The use of the additional elements either alone or in combination amounts to no more than mere instructions to apply the judicial exception using generic computer components. Mere instructions to apply an exception using generic computer components cannot provide an inventive concept, and thus the claims are patent ineligible. (Step 2B: NO).
In regard to independent Claim 1:
Independent claim 1 is an apparatus, which falls within the “machine” category of 35 U.S.C. § 101. The apparatus is claimed as configured for generating teaching actions for drivers, comprising: one or more memories; and one or more processors coupled to the one or more memories and configured to cause the apparatus to perform steps comparable to the steps of Claim 8. As a result, independent claim 1 is rejected similarly to representative independent Claim 8.
In regard to independent Claim 16:
Independent claim 16 is a non-transitory computer-readable medium, which falls within the “manufacture” category of 35 U.S.C. § 101. The non-transitory computer-readable medium is claimed as comprising processor-executable instructions that, when executed by one or more processors of an apparatus, cause the apparatus to perform a method comprising steps comparable to the steps of Claim 8. As a result, independent claim 16 is rejected similarly to representative independent Claim 8.
In regard to the dependent claims:
Dependent claims 2-7, 10-15 and 18-20 include all the limitations of respective independent claims 1, 8 and 16 from which they depend and as such recite the same abstract idea(s) noted above for claims 1, 8 and 16. None of the additional claim activities is used in some unconventional manner, nor does any produce some unexpected result. An invocation to use known technology in the manner it is intended to be used for its ordinary purpose is both generic and conventional. As per MPEP §§ 2106.05(a)–(c), (e)–(h), none of the limitations of claims 2-7, 10-15 and 18-20 integrates the judicial exception into a practical application. While dependent claims 2-7, 10-15 and 18-20 may have a narrower scope than the representative claim, no claim contains an “inventive concept” that transforms the corresponding claim into a patent-eligible application of the otherwise ineligible abstract idea(s). Therefore, dependent claims 2-7, 10-15 and 18-20 are not drawn to patent-eligible subject matter as they are directed to (an) abstract idea(s) without significantly more.
Response to Arguments
Rejections under 35 U.S.C. § 101
Step 2A
Prong One: The Claim Does Not Recite an Abstract Idea
Applicant argues that “the recitations of independent claims 1, 8, and 16 are analogous to the example claims of Example 39, which is an example of a claim for training a neural network that is considered eligible for patent protection” and “disagrees with the Examiner[’s] assertions that independent claims 1, 8, and 16 are directed to a judicial exception under Step 2A: Prong One”. Applicant’s arguments have been fully considered but they are not persuasive.
Representative claim 8 is readily distinguishable from Example 39 at least in that, unlike the hypothetical claim in Example 39, which the Office determined does not recite any judicial exception, claim 8 recites subject matter that falls within the methods of organizing human activity, mental processes and mathematical concept groupings of abstract ideas as noted above. In particular, educational institutions for instructors have long been prevalent in the world. As noted by Applicant, “the present application is directed to techniques for training a skill advancement model for a driver and generating a teaching action for the driver,” which simply uses a machine learning technique in a particular environment. Reviewing courts have found claims to be directed to abstract ideas when they recited similar subject matter. See Recentive Analytics, Inc. v. Fox Corp., 134 F.4th 1205, 1208 (Fed. Cir. 2025) (using a generic machine learning technique in a particular environment). Thus, reciting training and execution of a machine learning model does not take representative claim 8 out of the abstract realm. See id.; see also SAP Am., Inc. v. InvestPic, LLC, 898 F.3d 1161, 1169-70 (Fed. Cir. 2018) (holding that the additional limitations do not integrate the abstract idea into a practical application where they require “already available computers, with their already available basic functions, to use as tools in executing the claimed process”). As the Federal Circuit explained in Recentive Analytics, using existing machine learning technology to speed up a task does not render a claim patent eligible. See Recentive Analytics, 134 F.4th at 1214.
The hypothetical claim of Example 39 recites steps of collecting digital facial images; applying transformations to each digital facial image; creating a first training set; training the neural network in a first stage; and training the neural network in a second stage. The second stage used the first training set with digital non-facial images detected incorrectly as facial images in the first stage to train the neural network. As per Applicant’s disclosure, the “behavior model” encodes “driving data, wherein the encoded driving data comprises an indication that a corresponding one of the plurality of driving scenarios comprises one of the instructed driving events or the uninstructed driving events”. It is apparent that, unlike the hypothetical claim in Example 39, representative claim 8 is directed to “generating teaching actions for drivers”, which is an abstract idea in the form of a method of organizing human activity, a mental process and a mathematical concept.
Additionally, the combination of features recited in the claim of Example 39 provided an improved facial detection model which, unlike prior models, could detect faces in distorted images while limiting the number of false positives. Eligibility Examples at 8. In particular, prior neural network models used for detecting facial images suffered from an inability to detect human faces in images having shifts, distortions, and variations in scale and rotation of the face pattern. Id. To address this problem, the claim applied mathematical transformations to an acquired set of facial images (thereby introducing shifts, distortions, and variations in scale and rotation of the face pattern) to develop an expanded training set, and trained the neural network using this expanded set. Id. While training with the expanded set better detects human faces in images having shifts, distortions, and variations in scale and rotation of the face pattern, it also suffers from increased false positives when classifying non-facial images. Id. To reduce these false positives, the claim retrains the neural network with an updated training set containing the false positives produced after face detection has been performed on non-facial images. Id.
No analogous technological improvement is apparent in representative claim 8. Rather than provide a patent-eligible application of the judicial exception, the steps of “encoding, with a behavior model, the driving data” and “causing a teacher action model to learn a teacher policy” amount to no more than adding the words “apply it.” See MPEP § 2106.05(f). The fact that these elements link the judicial exception to the technical field of machine learning is insufficient to render claim 8 patent eligible. See id. at § 2106.05(h).
Applicant does not indicate, and the Examiner fails to see, how the claim steps performed by the “behavior model” and “teacher action model”, namely the “encoding, with a behavior model, the driving data” and “causing a teacher action model to learn a teacher policy encoding”, are similar to the retraining with an updated training set of the neural network of hypothetical Example 39. Contrary to Applicant’s assertions that “the human mind is not equipped to perform the claim limitations”, the instant disclosure is akin to an on-board driving instructor or other passenger observing/monitoring a vehicle operator’s driving habits, to “obtain driving data from a plurality of driving scenarios, the driving data comprises vehicle trajectory information and corresponding scene context information, the plurality of driving scenarios comprising instructed driving events and uninstructed driving events”, “determine, with a trajectory estimator processing the encoded driving data, one or more driving skill transitions based on a presence or an absence of the indication” and “generate a teaching action for one of the plurality of driving scenarios”. Hence, the fact pattern of representative claim 8 does not match the fact pattern of the claim of Example 39.
Prong Two: If the Claim Recites a Judicial Exception, Evaluate Whether the Judicial Exception Is Integrated Into a Practical Application
Applicant emphasized the following limitations to assert “a practical application of the alleged judicial exception”: “obtain driving data from a plurality of driving scenarios, the driving data comprises vehicle trajectory information and corresponding scene context information, the plurality of driving scenarios comprising instructed driving events and uninstructed driving events; encode, with a behavior model, the driving data, wherein the encoded driving data comprises an indication that a corresponding one of the plurality of driving scenarios comprises one of the instructed driving events or the uninstructed driving events; determine, with a trajectory estimator processing the encoded driving data, one or more driving skill transitions based on a presence or an absence of the indication; cause a teacher action model to learn a teacher policy encoding from the determined one or more driving skill transitions and the encoded driving data; and generate, with the teacher action model, a teaching action for one of the plurality of driving scenarios.” Applicant’s arguments have been fully considered but they are not persuasive.
It is common knowledge that a driving instructor in a vehicle with a student driver can visually obtain driving data, determine one or more driving skill transitions based on evaluation, and generate a teaching action. These are steps generally used in teaching processes, in any desired sequence. The abstract idea alone cannot integrate itself into a practical application. Trading Techs. Int’l, Inc. v. IBG LLC, 921 F.3d 1378, 1385 (Fed. Cir. 2019); see also Synopsys, Inc. v. Mentor Graphics Corp., 839 F.3d 1138, 1151 (Fed. Cir. 2016) (“[A] claim for a new abstract idea is still an abstract idea.”).
Representative claim 8 does not improve a computer or solve a technical problem, but rather merely recites a known practice of generating a teaching action with, at most, an implied requirement to perform the method using generic computer components. However, “merely requir[ing] generic computer implementation” “does not move into [§] 101 eligibility territory.” buySAFE, Inc. v. Google, Inc., 765 F.3d 1350, 1354 (Fed. Cir. 2014). Thus, Applicant has not shown claim 8 is directed to a practical application because Applicant has not shown claim 8 recites an additional element that (1) improves the functioning of a computer or other technology, (2) is applied with any particular machine, (3) effects a transformation of a particular article to a different state, or (4) is applied in a meaningful way. See MPEP § 2106.05.
Step 2B
Applicant’s reference to DDR Holdings is misplaced. In particular, Applicant does not explain how, and the Examiner fails to see how, representative claim 8 parallels the claims in DDR Holdings. In DDR Holdings, the court found that the “claimed solution [was] necessarily rooted in computer technology in order to overcome a problem specifically arising in the realm of computer networks.” DDR Holdings, LLC v. Hotels.com, L.P., 773 F.3d 1245, 1257 (Fed. Cir. 2014). “In particular, the ‘399 patent’s claims address the problem of retaining website visitors that, if adhering to the routine, conventional functioning of Internet hyperlink protocol, would be instantly transported away from a host’s website after ‘clicking’ on an advertisement and activating a hyperlink.” Id. As previously noted, the claimed steps have long been performed by driving instructors. Although the instant case discloses “techniques for training a skill advancement model for a driver and generating a teaching action for the driver” (¶ 2) and “technical solutions to the technical problem, for example, by implementing multi-task learning to the problem of training a machine-learning teacher in combination with an estimator to determining whether or not to emit a teaching cue and condition future behavior prediction and resulting metrics on whether teaching was provided” (¶ 20), courts have established that the use of machine learning is not necessarily rooted in technology. In particular, Applicant does not direct the Examiner’s attention to any portion of the Specification indicating that any of the claimed “behavior model”, “trajectory estimator” and “teacher action model” performs anything other than well-understood, routine, and conventional functions, such as obtaining data, encoding data, determining data and generating data. As the Federal Circuit explained in Recentive Analytics, using existing machine learning technology to speed up a task does not render a claim patent eligible. See Recentive Analytics, 134 F.4th at 1214.
As previously noted, the claim steps have long been performed by driving instructors and, as such, not necessarily rooted in technology. See DDR Holdings, 773 F.3d at 1257 (holding that the claims were directed to statutory subject matter because they claim a solution “necessarily rooted in computer technology in order to overcome a problem specifically arising in the realm of computer networks”). Even were one to consider these problems to be Internet-centric, the claim recites an invention that is merely the routine use of the computer components with generally claimed programming. See DDR Holdings, 773 F.3d at 1258–59 (cautioning that “not all claims purporting to address Internet-centric challenges are eligible for patent” and contrasting the claims to those at issue in Ultramercial, Inc. v. Hulu LLC, 772 F.3d 709 (Fed. Cir. 2014) in that, in DDR Holdings, the computer network was not operating in its “normal, expected manner” and the claims did not “recite an invention that is . . . merely the routine or conventional use of the Internet”). The purported solution comprises a generic computer operating in its ordinary and conventional capacity to perform the functions of receiving and analyzing data and sending data based on the results of the analyses. See supra; see also Alice, 573 U.S. at 224–26. Here, the limitations are recited functionally without implementation details on how they are technologically performed such that they would not be routine and conventional uses of the claimed component.
Applicant then argues that “[A]s a whole, the recited claims provide techniques for training a skill advancement model for a driver and generating a teaching action for the driver. More specifically, the systems and methods are directed to approaches that enable the AI model to learn efficiently from both instructed data and uninstructed data by combining multitask training with skills and behavior prediction and separating behavior to be conditioned on whether or not instructions are happening. The inclusion of an explicit variable providing an indication that ‘teaching is happening’ or not with trajectory prediction has shown to help training the AI model with respect to teaching cues and actions” and that “[T]he corresponding recitations in the claims express the ‘inventive concept.’” Applicant’s arguments have been fully considered but they are not persuasive because Applicant’s contentions are unsubstantiated by any persuasive evidence on this record and, therefore, are mere attorney argument and conclusory statements that are entitled to little probative value. See In re Geisler, 116 F.3d 1465, 1470 (Fed. Cir. 1997); see also Enzo Biochem, Inc. v. Gen-Probe Inc., 424 F.3d 1276, 1284 (Fed. Cir. 2005) (“Attorney argument is no substitute for evidence.”).
In view of the foregoing, the Examiner maintains that each of Applicant’s pending claims 1-8, 10-16 and 18-20, considered as a whole, is directed to a patent-ineligible abstract idea that is not integrated into a practical application, and does not include an inventive concept.
Rejections under 35 U.S.C. §§ 102 and 103
The current rejections under 35 U.S.C. §§ 102 and 103 are withdrawn in view of Applicant’s amendment and remarks.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
The prior art made of record and not relied upon is listed in the attached PTO Form 892 and is considered pertinent to applicant's disclosure.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to EDDY SAINT-VIL, whose telephone number is (571) 272-9845. The examiner can normally be reached Mon-Fri, 6:30 AM - 6:00 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, PETER VASAT can be reached on (571) 270-7625. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/EDDY SAINT-VIL/Primary Examiner, Art Unit 3715