DETAILED ACTION
This is a non-final Office Action on the merits.
Claims 1-16 are currently pending and are addressed below.
Information Disclosure Statement
The information disclosure statement (IDS) submitted has been considered by the examiner.
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d).
Claim Objections
Claims 1-2, 7-8, and 11-12 are objected to because of the following informalities:
Claim 1 recites, “wherein if the object…” in the last clause. The term “if” appears to be unnecessary for the claim limitation and calls into question whether the limitation is positively claimed. It is recommended that the term “if” be removed.
Claim 2 recites, “wherein the control unit is operable to reconstruct the image of the front surroundings…”, which is obscure. It is unclear how this “reconstruction” happens or how one of ordinary skill in the art would accomplish reconstructing an image; the language is oddly worded, or an element may be missing from the claim. The claim is understood to mean detecting an object in the image.
Claim 12 recites language similar to claim 2 above and is objected to under the same rationale.
Claim 11 recites language similar to claim 1 above and is objected to under the same rationale.
Claim 7 appears to have a typographical error in line 2, wherein it states “wherein if is the control unit”. Examiner believes this is a typographical error.
Claim 8 shows similar language to claim 7 above and is objected to by the same rationale.
Claim Rejections - 35 USC § 101
35 U.S.C. 101 reads as follows:
Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.
Claims 1-16 are rejected under 35 U.S.C. 101 because:
Step 1: Statutory Category – Yes
The claim recites a system. The claim falls within one of the four statutory categories. MPEP 2106.03.
Step 2A, Prong One Evaluation: Judicial Exception – Yes – Mental Processes
In Step 2A, Prong one of the 2019 Patent Eligibility Guidance (PEG), a claim is to be analyzed to determine whether it recites subject matter that falls within one of the following groups of abstract ideas: a) mathematical concepts, b) mental processes, and/or c) certain methods of organizing human activity.
The Office submits that the following limitation(s) constitute judicial exceptions in terms of “mental processes” because, under their broadest reasonable interpretation, the limitations can be “performed in the human mind, or by a human using a pen and paper”. See MPEP 2106.04(a)(2)(III).
The claim recites the limitations:
detect an object from the image of the front surroundings and an object from the image of the rear surroundings using the prior information,
compare the detected object from the image of the front surroundings with the detected object from the image of the rear surroundings so as to check if there is an object which is not detected from the image of the front surroundings but detected from the image of the rear surroundings,
wherein if the object is not detected from the image of the front surroundings but detected from the image of the rear surroundings, the control unit is operable to determine that the object has fallen from the vehicle.
which constitute judicial exceptions in terms of “mental processes” because, under its broadest reasonable interpretation in light of the specification, the claim covers performance using mental processes.
The claim recites detect an object from the image… . This limitation, as drafted, is a simple process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of “a control unit”. For example, a person can look at a photo, or out a window, and see whether there is a box in front of or behind a vehicle. This step is directed to a mental process.
The claim recites compare the detected object from the image of the front surroundings… . This limitation, as drafted, is a simple process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of “a control unit”. For example, a person can see out in front of a vehicle and behind a vehicle and notice objects are behind the vehicle that were not in front of the vehicle. This step is directed to a mental process.
The claim recites … determine that the object has fallen from the vehicle. This limitation, as drafted, is a simple process that, under its broadest reasonable interpretation, covers performance of the limitation in the mind but for the recitation of “a control unit”. For example, a person can determine that a dog seen on the road behind the vehicle, which looks like their dog and was not seen in front of the vehicle, is their dog that jumped out of the bed of the truck. This step is directed to a mental process.
These limitations, as drafted, are simple processes that, under their broadest reasonable interpretation, cover performance of the limitations in the mind but for the recitation of “a control unit comprising electronics…”. That is, other than reciting “a control unit,” nothing in the claim elements precludes the steps from practically being performed in the mind.
Thus, the claim recites a mental process.
Step 2A, Prong Two Evaluation: Practical Application – No
Claim 1 is evaluated as to whether, as a whole, it integrates the recited judicial exception into a practical application. As noted in the 2019 PEG, it must be determined whether any additional elements in the claim beyond the abstract idea integrate the exception into a practical application in a manner that imposes a meaningful limit on the judicial exception. The claim recites use of “a control unit comprising electronics…”. This is understood as a generic computing component, as there is no improvement to the computing itself. The courts have indicated that additional elements merely using a computer to implement an abstract idea, adding insignificant extra-solution activity, or generally linking use of a judicial exception to a particular technological environment or field of use do not integrate a judicial exception into a “practical application”.
The Office submits that the foregoing limitation(s) recite additional elements that do not integrate the recited judicial exception into a practical application.
The claim recites additional elements or steps of: “at least one camera operable to obtain an image…”. The receiving of data from sensors and/or from external sources is recited at a high level of generality (i.e., as a general means of gathering vehicle data for use in the evaluating step) and amounts to mere data gathering, which is a form of insignificant extra-solution activity.
The claim additionally recites the elements of “an output unit operable to output a signal at an output of the output unit” and “to control the output unit to output the signal”. These elements are recited at a high level of generality (i.e., as a general means of outputting the results of the evaluating step) and amount to mere post-solution outputting, which is a form of insignificant extra-solution activity.
The “control unit comprising electronics and having input coupled…” merely describes how to generally “apply” the otherwise mental judgments using a generic or general-purpose processing circuit, i.e., a computer. The control unit is recited at a high level of generality and merely automates the evaluating step.
These additional elements amount to insignificant extra-solution activity appended to the identified abstract idea per MPEP 2106.05(g).
Accordingly, even in combination, these additional elements do not integrate the abstract idea into a practical application because they do not impose any meaningful limit on practicing the abstract idea.
Step 2B Evaluation: Inventive Concept – No
Claim 1 is evaluated as to whether the claim as a whole amounts to significantly more than the recited exception, i.e., whether any additional element, or combination of additional elements, adds an inventive concept to the claim.
Per the evaluation in step 2A Prong Two, the additional elements in the claim amount to no more than mere instructions to apply the exception using a generic computer component. The same analysis applies here in 2B, i.e., mere instructions to apply an exception on a generic computer cannot integrate a judicial exception into a practical application at Step 2A or provide an inventive concept in Step 2B.
Under the 2019 PEG, a conclusion that an additional element is insignificant extra-solution activity in Step 2A should be reevaluated in Step 2B. Here, the steps of determining particular properties is understood as collecting/organizing data which is considered to be extra-solution activities in Step 2A, and thus they are reevaluated in Step 2B to determine if they are more than what is well-understood, routine, conventional activity in the field. The specification does not provide any indication that the computing system is anything other than a conventional computer. MPEP 2106.05(d)(II), and the cases cited therein, including Intellectual Ventures I, LLC v. Symantec Corp., 838 F.3d 1307, 1321 (Fed. Cir. 2016), TLI Communications LLC v. AV Auto. LLC, 823 F.3d 607, 610 (Fed. Cir. 2016), and OIP Techs., Inc., v. Amazon.com, Inc., 788 F.3d 1359, 1363 (Fed. Cir. 2015), indicate that mere collection or receipt of data over a network is a well‐understood, routine, and conventional function when it is claimed in a merely generic manner (as it is here). Further, the Federal Circuit in Trading Techs. Int’l v. IBG LLC, 921 F.3d 1084, 1093 (Fed. Cir. 2019), and Intellectual Ventures I LLC v. Erie Indemnity Co., 850 F.3d 1315, 1331 (Fed. Cir. 2017), for example, indicated that the mere displaying of data is a well understood, routine, and conventional function. Accordingly, a conclusion that the collecting step is well-understood, routine, conventional activity is supported under Berkheimer.
Thus, the claim is ineligible.
Independent claim 11 recites similar limitations and is found ineligible under the § 101 guidance per a rationale similar to that applied to claim 1 above.
Dependent Claims
Dependent claims 2-10 and 12-16 do not recite any further limitations that cause the claims to be patent eligible. Rather, the limitations of the dependent claims are directed toward additional aspects of the judicial exception and/or well-understood, routine and conventional additional elements that do not integrate the judicial exception into a practical application. Therefore, the dependent claims are not patent eligible under the same rationale as provided in the rejection of claims 1 and 11 above.
Therefore, the dependent claims are ineligible under 35 USC §101.
Examiner's Note
Examiner has cited particular paragraphs / columns and line numbers or figures in the references as applied to the claims below for the convenience of the applicant. Although the specified citations are representative of the teachings in the art and are applied to the specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested from the applicant, in preparing the responses, to fully consider the references in entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the examiner. Applicant is reminded that the Examiner is entitled to give the broadest reasonable interpretation to the language of the claims. Furthermore, the Examiner is not limited to Applicants' definition which is not specifically set forth in the claims.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-8 and 10-16 are rejected under 35 U.S.C. 103 as being unpatentable over the English translation of DE 102021000241 (“Klause”) in view of US Pat. No. 11,535,242 (“Al”) and further in view of US Pat. No. 9,449,258 (“Palacio”).
As per claim 1 Klause discloses a vehicle for facilitating detecting an object fallen from the vehicle, comprising:
at least one sensor operable to obtain information of an object for storing prior information when the object is being loaded into the vehicle [¶ 6 In a method for detecting a loss of cargo from a vehicle of the type mentioned above, the loading state of the vehicle is monitored according to the invention by evaluating a compression state of a suspension of the vehicle by means of at least one spring travel sensor (loaded object prior information)], and an image of rear surroundings of the vehicle when the vehicle is moving [¶3 monitor the load status during the journey. … This involves monitoring the vehicle’s load sensors… can be validated using the sensor signals provided by the reverse-facing sensors(using rear facing sensors such as cameras), ¶9 the sensor is at least one… a camera (load sensor data and camera sensor data taken for comparison/validation)];
an output unit operable to output a signal at an output of the output unit [¶13 If a loss of cargo is detected, the vehicle can automatically stop and/or be controlled in a targeted manner, for example to change from a motorway lane to a hard shoulder. … braking maneuver… Furthermore, by issuing acoustic and/or visual warnings…]; and
a control unit comprising electronics and having an input coupled to an output of the at least one camera and an output coupled to an input of the output unit, the control unit operable to detect an object loaded on the vehicle from the load sensors and an object from the image of the rear surroundings using the prior information, and to compare the detected object (from the charged/loaded sensors) with the detected object from the image of the rear surroundings so as to check if there is an object which is not detected by a sensor change(charge sensor detecting loss of suspension load/not there) but detected from the image of the rear surroundings [¶23 To increase the reliability of correctly detecting a charge loss (when cargo falls from truck),… a charge loss is made plausible by evaluating sensor signals from at least one further sensor 3. The additional sensor 3 is… a camera. The at least one additional sensor 3 monitors in particular a rear area H of the vehicle 2 and/or advantageously also a right vehicle area R and/or left vehicle area L with respect to a longitudinal axis A of the vehicle. (by comparison of the two sensor’s data, suspension charge and camera, it is reliably determined that cargo has fallen from the truck)],
wherein if the object is not detected by the charge loss but detected from the image of the rear surroundings, the control unit is operable to determine that the object has fallen from the vehicle, and to control the output unit to output the signal [¶23 To increase the reliability of correctly detecting a charge loss (when cargo falls from truck),… a charge loss is made plausible by evaluating sensor signals from at least one further sensor 3. The additional sensor 3 is… or a camera. The at least one additional sensor 3 monitors in particular a rear area H of the vehicle 2 and/or advantageously also a right vehicle area R and/or left vehicle area L with respect to a longitudinal axis A of the vehicle. (by comparison of the two sensor’s data, suspension charge and camera, it is reliably determined that cargo has fallen from the truck), ¶13 If a loss of cargo is detected, the vehicle can automatically stop and/or be controlled (control the output)].
Klause is silent as to obtaining an image of the front surroundings of the vehicle, making a comparison of images from cameras for identification, and making a determination from the image of the front surroundings.
Al discloses obtaining an image of the front surroundings of the vehicle and making a determination from the image of the front surroundings by further comparing the front image of the vehicle [col:5;11-16 In this case, too, it can be assumed that no object is leaning against the motor vehicle. During the later clearance test, an image comparison of the at least one camera image with the at least one reference image is then carried out, col:6;20 One camera 11 may be a front camera, Fig. 1.11 (shows both a front and rear camera)].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Klause with the teachings of Al to use image reference data to identify a change in the vehicle environment through correlated image-data comparisons, for the purpose of improving awareness of the vehicle environment by analyzing surveillance data.
Klause and Al are silent as to making a comparison between two camera images for making an identification.
Palacio discloses making a comparison between two images from two cameras for making an identification [abstract: target object captured by a first camera is identified in images captured by a second camera, Fig. 1].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Klause in view of Al with the teachings of Palacio to use two images for comparison in making an identification, by comparing information from one image with information found in a second image, for the purpose of identifying an object by validating the analyzed image data, thereby improving object verification through the use of environmental sensor data across multiple sensors.
As per claim 11 Klause discloses a method for facilitating detecting an object fallen from the vehicle comprising:
obtaining, by at least one sensor operable to obtain information of an object for storing prior information when the object is being loaded into the vehicle [¶ 6 In a method for detecting a loss of cargo from a vehicle of the type mentioned above, the loading state of the vehicle is monitored according to the invention by evaluating a compression state of a suspension of the vehicle by means of at least one spring travel sensor (loaded object prior information)];
obtaining, by the at least one camera, an image of rear surroundings of the vehicle when the vehicle is moving [¶3 monitor the load status during the journey. … This involves monitoring the vehicle’s load sensors… can be validated using the sensor signals provided by the reverse-facing sensors(using rear facing sensors such as cameras), ¶9 the sensor is at least one… a camera (load sensor data and camera sensor data taken for comparison/validation)];
detecting, by a control unit including electronics and having an input coupled to an output of the at least one camera [¶13 If a loss of cargo is detected, the vehicle can automatically stop and/or be controlled in a targeted manner, for example to change from a motorway lane to a hard shoulder. … braking maneuver… Furthermore, by issuing acoustic and/or visual warnings…], an object loaded on the vehicle from the load/charge sensors using the prior information [¶ 6 In a method for detecting a loss of cargo from a vehicle of the type mentioned above, the loading state of the vehicle is monitored according to the invention by evaluating a compression state of a suspension of the vehicle by means of at least one spring travel sensor (loaded object prior information)];
detecting, by the control unit, an object from the image of the rear surroundings using the prior information [¶3 monitor the load status during the journey. … This involves monitoring the vehicle’s load sensors… can be validated using the sensor signals provided by the reverse-facing sensors(using rear facing sensors such as cameras), ¶9 the sensor is at least one… a camera (load sensor data and camera sensor data taken for comparison/validation)]; and
comparing, by the control unit, the detected object from the load/charge sensors with the detected object from the image of the rear surroundings so as to check if there is an object which is not detected from the image of the front surroundings but detected from the image of the rear surroundings [¶23 To increase the reliability of correctly detecting a charge loss (when cargo falls from truck),… a charge loss is made plausible by evaluating sensor signals from at least one further sensor 3. The additional sensor 3 is… or a camera. The at least one additional sensor 3 monitors in particular a rear area H of the vehicle 2 and/or advantageously also a right vehicle area R and/or left vehicle area L with respect to a longitudinal axis A of the vehicle. (by comparison of the two sensor’s data, suspension charge and camera, it is reliably determined that cargo has fallen from the truck)];
if the object which is not detected from the charge/load loss but detected from the image of the rear surroundings, determining, by the control unit, that the object has fallen from the vehicle [¶23 To increase the reliability of correctly detecting a charge loss (when cargo falls from truck),… a charge loss is made plausible by evaluating sensor signals from at least one further sensor 3. The additional sensor 3 is… or a camera. The at least one additional sensor 3 monitors in particular a rear area H of the vehicle 2 and/or advantageously also a right vehicle area R and/or left vehicle area L with respect to a longitudinal axis A of the vehicle. (by comparison of the two sensor’s data, suspension charge and camera, it is reliably determined that cargo has fallen from the truck)]; and
controlling an output unit having an input coupled to an output of the control unit, to output a signal [¶13 If a loss of cargo is detected, the vehicle can automatically stop and/or be controlled in a targeted manner, for example to change from a motorway lane to a hard shoulder. … braking maneuver… Furthermore, by issuing acoustic and/or visual warnings…].
Klause and Al are silent as to making a comparison between two camera images for making an identification.
Palacio discloses making a comparison between two images from two cameras for making an identification [abstract: target object captured by a first camera is identified in images captured by a second camera, Fig. 1].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Klause in view of Al with the teachings of Palacio to use two images for comparison in making an identification, by comparing information from one image with information found in a second image, for the purpose of identifying an object by validating the analyzed image data, thereby improving object verification through the use of environmental sensor data across multiple sensors.
As per claims 2 and 12 Klause is silent to however Al discloses further wherein the control unit is operable to reconstruct the image of the front surroundings to detect the object from the image of the front surroundings [col:5;9-11 For example, after the motor vehicle is locked, a user can be prompted to trigger the capture of the at least one reference image if for example the user considers, col:6;20 one camera 11 may be a front camera].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Klause with the teachings of Al to use image reference data to identify a change in the vehicle environment through correlated image-data comparisons, for the purpose of improving awareness of the vehicle environment by analyzing surveillance data.
As per claims 3 and 13 Klause discloses further wherein the control unit is operable to detect the object from load/charge sensor when the object is being loaded into the vehicle, and to store the detected object from load/charge sensor data as the prior information [¶ 6 In a method for detecting a loss of cargo from a vehicle of the type mentioned above, the loading state of the vehicle is monitored according to the invention by evaluating a compression state of a suspension of the vehicle by means of at least one spring travel sensor (loaded object prior information)].
Klause is silent as to an image obtained (during a reference scenario) and the use of the image as prior information.
Al discloses an image obtained (during a reference scenario) and the image as prior information(reference image). [col:5;9-11 For example, after the motor vehicle is locked, a user can be prompted to trigger the capture of the at least one reference image if for example the user considers].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Klause with the teachings of Al to use image reference data to identify a change in the vehicle environment through correlated image-data comparisons, for the purpose of improving awareness of the vehicle environment by analyzing surveillance data.
As per claim 4 Klause discloses further wherein the control unit includes a neural network and the object from the image is detected by the neural network [¶3 camera images generated by a camera can also be evaluated using machine learning such as artificial intelligence].
As per claims 5 and 14 Klause discloses further wherein the control unit is operable to store detected load of the object as the prior information, so that the detected load of the object is used for a correlation with at least one of the detected object from the image of the front surroundings or the detected object from the image of the rear surroundings [¶3 monitor the load status during the journey. … This involves monitoring the vehicle’s load sensors… can be validated using the sensor signals provided by the reverse-facing sensors(using rear facing sensors such as cameras), ¶6 In a method for detecting a loss of cargo from a vehicle of the type mentioned above, the loading state of the vehicle is monitored according to the invention by evaluating a compression state of a suspension of the vehicle by means of at least one spring travel sensor(object prior information), ¶14 If the vehicle is transporting different types of cargo, one type of lost cargo can be determined, for example, by evaluating camera images.]
Klause is silent as to storing an image for use in correlation (as a reference image).
Al discloses to store an image, the image for use as correlation (reference image). [col:5;9-11 For example, after the motor vehicle is locked, a user can be prompted to trigger the capture of the at least one reference image if for example the user considers].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Klause with the teachings of Al to use image reference data to identify a change in the vehicle environment through correlated image-data comparisons, for the purpose of improving awareness of the vehicle environment by analyzing surveillance data.
As per claims 6 and 15 Klause discloses further wherein if it is determined by the control unit that the object has fallen from the vehicle, the output unit is operable to, with the output signal, alert a driver of the vehicle with at least one of audio information, video information and haptic feedback to a steering wheel [¶13 driver of the vehicle can also be informed of the loss of cargo via a corresponding warning. For example, a corresponding warning message in the form of text, a logo or symbol or the like can be displayed on a screen in a driver's cab or vehicle interior, and a corresponding warning tone can be emitted via a loudspeaker.].
As per claim 7 Klause discloses further wherein if is the control unit determines that the object has fallen from the vehicle, the control unit is operable to inform another vehicle in a vicinity of the vehicle, of an existence of the fallen object via wireless communication [¶13 Furthermore, by issuing acoustic and/or visual warnings, other road users can be warned of the damaged vehicle or the lost cargo. … Information about the vehicle accident(loss cargo) can also be transmitted to third parties such as an authority, for example a traffic authority and/or the police. A transport company can also be notified if the vehicle is a truck.(third parties are understood to use vehicles) ¶14 When reporting an accident, … information about a lost cargo type be transmitted, ¶22 and secondly, other road users and/or an authority can be warned about the lost load 1 ¶23 via a wireless communication interface].
As per claim 8 Klause discloses further wherein if is the control unit determines that the object has fallen from the vehicle, the control unit is operable to monitor vehicle dynamics of the vehicle and modify a path of the vehicle to take the fallen object back [¶13 If a loss of cargo is detected, the vehicle can automatically stop and/or be controlled(modified path)… This ensures the vehicle does not stray too far from the lost cargo(path modified to take back fallen object, automatic controls are understood as modifying a path that would be used to take a fallen object back)].
As per claims 10 and 16 Klause discloses further wherein if the control unit modifies the path of the vehicle, the control unit is operable to inform another vehicle in a vicinity of the vehicle of the modified path of the vehicle via wireless communication [¶13 If a loss of cargo is detected, the vehicle can automatically stop and/or be controlled(modified path)… This ensures the vehicle does not stray too far from the lost cargo(path modified to take back fallen object)… Furthermore, by issuing acoustic and/or visual warnings, other road users can be warned of the damaged vehicle or the lost cargo.].
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over the English translation of DE 102021000241 (“Klause”) in view of US Pat. No. 11,535,242 (“Al”) and US Pat. No. 9,449,258 (“Palacio”), and further in view of US Pat. No. 11,062,582 (“Van”).
As per claim 9 Klause discloses further wherein if the control unit modifies the path of the vehicle, the output unit is operable to display the cargo loss alert causing the modified path [¶13 If a loss of cargo is detected, the vehicle can automatically stop and/or be controlled in a targeted manner,… The driver of the vehicle can also be informed of the loss of cargo via a corresponding warning. For example, a corresponding warning message in the form of text, a logo or symbol or the like can be displayed on a screen in a driver's cab.]
Klause in view of Al and Palacio is silent as to displaying the modified path.
Van discloses further displaying a modified path to the detected lost cargo [col:9;35-39 as well as display a map and present driving directions to the selected destination via, e.g., the user interface 210. In some instances, the navigation system 215 may develop the route according to a user preference. … such as, for example, an instruction to navigate to a last known geographic point at which the cargo was lost…].
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Klause in view of Al and Palacio with the teachings of Van to display navigation routes for a vehicle so that a user knows how to reach a desired location by having a display show the user a map, improving ease of navigation through updated routing data.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PAUL A CASTRO whose telephone number is (571)272-4836. The examiner can normally be reached 10am-6pm on campus.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ramon Mercado, can be reached at 571-270-5744. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
PAUL A. CASTRO
Examiner
Art Unit 3662
/P.A.C/Examiner, Art Unit 3658
/Ramon A. Mercado/Supervisory Patent Examiner, Art Unit 3658