DETAILED ACTION
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claims 1-15 are pending.
Drawings
Figure 1 is objected to because it consists of a series of indistinct, unlabeled boxes. Descriptive labels would facilitate an understanding of the invention without undue searching of the specification. As presented, Figure 1 does not immediately convey any information and should be amended so that one viewing the figure may quickly determine what elements are depicted.
Claim Objections
Claims 4 and 7 are objected to because of the following informalities:
Claim 4, line 2: “the the” should be replaced with --the--.
Claim 7, line 2: “the the” should be replaced with --the--.
Appropriate correction is required.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “positioning means” in claims 1-15; and “drive unit” in claims 14-15.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1-5, 7, 8, and 10-15 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Ebrahimi Afrouzi et al. (US Publication No. 2022/0066456).
Ebrahimi Afrouzi teaches:
Re claim 1. A method for determining a work zone for an unmanned autonomous vehicle in a terrain, wherein the unmanned autonomous vehicle comprises a camera for capturing images of the terrain and a positioning means for determining a position of the unmanned autonomous vehicle on the terrain (Paragraph [0941]: “the processor may detect a type of flooring (e.g., tile, marble, wood, carpet, etc.) based on patterns and other visual clues processed by the camera.” Paragraph [1039]: “a reading of a floor sensor of the robot and a floor map may be used by the processor of the robot to adjust the likelihood of the state of the robot being within the particular region of the phase space coinciding with the type of floor sensed. In an additional example, a measured Wi-Fi™ signal strength and a map of the expected Wi-Fi™ signal strength within the phase space may be used by the processor of the robot to adjust the phase space probability distribution.”), the method comprising the steps of:
determining a set of at least one point, each point of the set being located within the work zone to be determined in the terrain, in which, for each point in the set, at least one image of a ground of the terrain at said point is captured and classifications for ground types of the grounds in the captured images are determined (Figs. 174B and 174C; paragraph [0695]: “visual indications of classifications of floor surfaces in different areas of the map, visual indications of a path that the robot has taken during a current session or other work sessions, visual indications of a path that the robot is currently following and has computed to plan further movement in the future, and visual indications of a path that the robot has taken between two points in the environment, like between a point A and a point B on different sides of a room or a building in a point-to-point traversal mode…. depicting a line of a path traced by the robot or adjusting a visual attribute of areas or portions of areas that have been covered, like color or shade or areas or boundaries. In some embodiments, the visual attributes may be varied based upon attributes of the environment sensed by the robot, like an amount of bacteria or a classification of a flooring type since by the robot. In some embodiments, a visual odometer implemented with a downward facing camera may capture images of the floor, and those images of the floor, or a segment thereof, may be transmitted to the application to apply as a texture in the visual representation of the working environment in the map, for instance, with a map depicting the appropriate color of wood floor texture, tile, or the like to scale in the different areas of the working environment.” Paragraph [1242]: “the processor may represent and distinguish environmental characteristics using ordinal, cardinal, or nominal values, like numerical scores in various dimensions or descriptive categories that serve as nominal values. 
For example, the processor may denote different driving surface types, such as carpet, grass, rubber, hardwood, cement, and tile by numerical categories, such as 1, 2, 3, 4, 5 and 6, respectively…. the processor may combine the numerical values with a map of the environment forming a multidimensional map describing environmental characteristics of different locations within the environment, e.g., in a multi-channel bitmap. In some embodiments, the processor may update the map with new sensor data collected and/or information inferred from the new sensor data in real-time or after a work session.”);
exploring a contiguous part of the terrain autonomously with the unmanned autonomous vehicle, where the unmanned autonomous vehicle departs from a point in the set of at least one point within the contiguous part of the terrain, wherein the unmanned autonomous vehicle remains within a perimeter of the contiguous part of the terrain, wherein the unmanned autonomous vehicle considers an obstacle or a transition to a ground type which is different from the ground type for the ground at the mentioned point from the set of at least one point, as part of the perimeter, wherein the unmanned autonomous vehicle determines a position of the unmanned autonomous vehicle on the terrain with the aid of the positioning means during the autonomous exploration (paragraph [0571]: “the processor of the robot may first build a global map of a first area (e.g., a bedroom) and cover that first area before moving to a next area to map and cover.” Paragraph [0582]: “during the first work session the robot may enter a second room after mapping a first room”. Paragraph [1039]: “a reading of a floor sensor of the robot and a floor map may be used by the processor of the robot to adjust the likelihood of the state of the robot being within the particular region of the phase space coinciding with the type of floor sensed. In an additional example, a measured Wi-Fi™ signal strength and a map of the expected Wi-Fi™ signal strength within the phase space may be used by the processor of the robot to adjust the phase space probability distribution.” Paragraph [1142]: “the processor may use a traversability algorithm to determine different areas that may be safely traversed by the robot, from which a coverage plan of the robot may be taken…. the local map includes all local sensor data (e.g., obstacle data, cliff data, debris data, previous stalls, floor transition data, floor type data, etc.). 
In some embodiments, the traversability algorithm may determine a best two-dimensional coverage area based on the portion of data taken from the map.” Paragraph [1149]: “the processor may use a traversability algorithm (e.g., a probabilistic method such as a feasibility function) to evaluate possible coverage areas to determine areas in which the robot may have a reasonable chance of encountering a successful traverse (or climb)… FIG. 190 illustrates an example of a path 10500 that is not traversable by the robot because of the sudden increase in the value of z between two adjacent cells.” Paragraph [1154]: “a sudden change in a type of driving surface in an area or a sudden discovery of a cliff in an area may impact the probability of traversability of the area.”);
repeating the previous step from a next point in the set of at least one point, where the next point is not located in a contiguous part of the terrain that has already been explored autonomously (paragraph [0571]: “the processor of the robot may first build a global map of a first area (e.g., a bedroom) and cover that first area before moving to a next area to map and cover.” Paragraph [0582]: “during the first work session the robot may enter a second room after mapping a first room”); and
creating a map of the work zone based on the determined positions of the unmanned autonomous vehicle, where the work zone corresponds to the autonomously explored contiguous parts of the terrain (paragraph [0571]: “the processor of the robot may first build a global map of a first area (e.g., a bedroom) and cover that first area before moving to a next area to map and cover.” Paragraph [0582]: “during the first work session the robot may enter a second room after mapping a first room”).
Re claim 2. Further comprising the additional step of determining a coordinate of a first end point and a coordinate of a second end point, the first end point and the second end point defining a line that is considered part of the perimeter of the contiguous part by the unmanned autonomous vehicle while autonomously exploring a contiguous part of the terrain (Fig. 250 and paragraph [1403]: “the user may use the application to create boundary zones or virtual barriers and cleaning areas….The user may move control point 5513 to change the shape of the zone 5500 by dragging control point 5513, such as in direction 5514.” Paragraph [1436]: “the user may use the user interface of the application to create zones by adding dividers to the map that divide the map into two or more zones.”).
Re claim 3. Wherein the coordinates of the first end point and the second end point are determined by drawing a line on a digital map of the terrain (Fig. 250 and paragraph [1403]: “the user may use the application to create boundary zones or virtual barriers and cleaning areas….The user may move control point 5513 to change the shape of the zone 5500 by dragging control point 5513, such as in direction 5514.”).
Re claim 4. Wherein the the unmanned autonomous vehicle dynamically adjusts the map of the work zone based on identified obstacles and/or based on a changed ground type within the work zone (paragraph [1142]: “the multidimensional and dynamic map includes a global and local map of the environment, constantly changing in real-time as new data is sensed. In some embodiments, the global map includes all global sensor data (e.g., LIDAR data, depth sensor data) and the local map includes all local sensor data (e.g., obstacle data, cliff data, debris data, previous stalls, floor transition data, floor type data, etc.).”).
Re claim 5. Wherein the unmanned autonomous vehicle determines classifications for ground types of grounds on the terrain with the aid of a neural network (paragraph [1193]: “the image sensor may be positioned and programmed to capture images of an area below the robot….the identification of the object that is included in the image data input by the camera is based on provided data for identifying the object and the image training data set. In some embodiments, training of the classifier is accomplished through a deep learning method, such as supervised or semi-supervised learning. In some embodiments, a trained neural network identifies and classifies objects in captured images.” Paragraph [1237]: “the processor may use a classifier such as a convolutional neural network to classify real-time sensor data of a location within the environment into different environmental characteristic classes such as driving surface types”).
Re claim 7. Wherein the the set of at least one point is determined by moving the unmanned autonomous vehicle along a route, wherein the unmanned autonomous vehicle is only moved over parts of the terrain that are part of the work zone to be determined, wherein a point is added to the set of at least one point, by using the positioning means to determine a position of the unmanned autonomous vehicle on the terrain and at the same time taking at least one image of a ground of the terrain using the camera of the unmanned autonomous vehicle at the specified determined position and wherein the unmanned autonomous vehicle automatically adds a point to the set of at least one point at regular intervals (Paragraph [1242]: “the processor may represent and distinguish environmental characteristics using ordinal, cardinal, or nominal values, like numerical scores in various dimensions or descriptive categories that serve as nominal values. For example, the processor may denote different driving surface types, such as carpet, grass, rubber, hardwood, cement, and tile by numerical categories, such as 1, 2, 3, 4, 5 and 6, respectively…. the processor may combine the numerical values with a map of the environment forming a multidimensional map describing environmental characteristics of different locations within the environment, e.g., in a multi-channel bitmap. In some embodiments, the processor may update the map with new sensor data collected and/or information inferred from the new sensor data in real-time or after a work session.”).
Re claim 8. Wherein the unmanned autonomous vehicle moves along the route by following a person, wherein the unmanned autonomous vehicle captures images of the person with the camera, wherein the person is recognized in the captured images by using image recognition (paragraph [0750]: “A robot may follow a user based on readings from a heat camera as data from a heat camera may be used to distinguish the living (e.g., humans, animals, etc.) from the non-living (e.g., desks, chairs, and pillars in an airport).”).
Re claim 10. Wherein the unmanned autonomous vehicle creates a map of the route after determining the set of at least one point (paragraph [1163]: “the processor may convert the grid map into a routing graph G consisting of nodes N connected by edges E.”).
Re claim 11. Wherein the set of at least one point is determined by indicating the at least one point on a digital map of the terrain (path 6403, Fig. 253C).
Re claim 12. Wherein the positioning means makes use of a Global Navigation Satellite System (paragraph [1443]: “SLAM, GPS, and a camera capturing visual information may be used in real time and may be synched to provide optimal performance.”).
Re claim 13. Wherein the positioning means uses the camera of the unmanned autonomous vehicle (Paragraph [0941]: “the processor may detect a type of flooring (e.g., tile, marble, wood, carpet, etc.) based on patterns and other visual clues processed by the camera.” Paragraph [1039]: “a reading of a floor sensor of the robot and a floor map may be used by the processor of the robot to adjust the likelihood of the state of the robot being within the particular region of the phase space coinciding with the type of floor sensed.”).
Re claim 14. An unmanned autonomous vehicle for performing tasks on a terrain, comprising:
a drive unit for moving the unmanned autonomous vehicle across the terrain (paragraph [0855]: “one or more wheels of the robot may be driven by one or more electric motors.”);
a camera for capturing images of grounds of the terrain (Paragraph [0941]: “the processor may detect a type of flooring (e.g., tile, marble, wood, carpet, etc.) based on patterns and other visual clues processed by the camera.”);
a positioning means for determining a position of the unmanned autonomous vehicle on the terrain (Paragraph [1039]: “a reading of a floor sensor of the robot and a floor map may be used by the processor of the robot to adjust the likelihood of the state of the robot being within the particular region of the phase space coinciding with the type of floor sensed. In an additional example, a measured Wi-Fi™ signal strength and a map of the expected Wi-Fi™ signal strength within the phase space may be used by the processor of the robot to adjust the phase space probability distribution.”); and
a memory and a processor (Fig. 10), the processor being configured to perform the following operations:
determining a set of at least one point, each point of the set being located within the work zone to be determined in the terrain, in which, for each point in the set, at least one image of a ground of the terrain at said point is captured and classifications for ground types of the grounds in the captured images are determined (Figs. 174B and 174C; paragraph [0695]: “visual indications of classifications of floor surfaces in different areas of the map, visual indications of a path that the robot has taken during a current session or other work sessions, visual indications of a path that the robot is currently following and has computed to plan further movement in the future, and visual indications of a path that the robot has taken between two points in the environment, like between a point A and a point B on different sides of a room or a building in a point-to-point traversal mode…. depicting a line of a path traced by the robot or adjusting a visual attribute of areas or portions of areas that have been covered, like color or shade or areas or boundaries. In some embodiments, the visual attributes may be varied based upon attributes of the environment sensed by the robot, like an amount of bacteria or a classification of a flooring type since by the robot. In some embodiments, a visual odometer implemented with a downward facing camera may capture images of the floor, and those images of the floor, or a segment thereof, may be transmitted to the application to apply as a texture in the visual representation of the working environment in the map, for instance, with a map depicting the appropriate color of wood floor texture, tile, or the like to scale in the different areas of the working environment.” Paragraph [1242]: “the processor may represent and distinguish environmental characteristics using ordinal, cardinal, or nominal values, like numerical scores in various dimensions or descriptive categories that serve as nominal values. 
For example, the processor may denote different driving surface types, such as carpet, grass, rubber, hardwood, cement, and tile by numerical categories, such as 1, 2, 3, 4, 5 and 6, respectively…. the processor may combine the numerical values with a map of the environment forming a multidimensional map describing environmental characteristics of different locations within the environment, e.g., in a multi-channel bitmap. In some embodiments, the processor may update the map with new sensor data collected and/or information inferred from the new sensor data in real-time or after a work session.”);
exploring a contiguous part of the terrain autonomously with the unmanned autonomous vehicle, where the unmanned autonomous vehicle departs from a first point in the set of at least one point within the contiguous part of the terrain, wherein the unmanned autonomous vehicle remains within a perimeter of the contiguous part of the terrain, wherein the unmanned autonomous vehicle considers an obstacle or a transition to a ground type which is different from the ground type for the ground at the first point, as part of the perimeter, wherein the unmanned autonomous vehicle determines a position of the unmanned autonomous vehicle on the terrain with the positioning means during the autonomous exploration (paragraph [0571]: “the processor of the robot may first build a global map of a first area (e.g., a bedroom) and cover that first area before moving to a next area to map and cover.” Paragraph [0582]: “during the first work session the robot may enter a second room after mapping a first room”. Paragraph [1039]: “a reading of a floor sensor of the robot and a floor map may be used by the processor of the robot to adjust the likelihood of the state of the robot being within the particular region of the phase space coinciding with the type of floor sensed. In an additional example, a measured Wi-Fi™ signal strength and a map of the expected Wi-Fi™ signal strength within the phase space may be used by the processor of the robot to adjust the phase space probability distribution.” Paragraph [1142]: “the processor may use a traversability algorithm to determine different areas that may be safely traversed by the robot, from which a coverage plan of the robot may be taken…. the local map includes all local sensor data (e.g., obstacle data, cliff data, debris data, previous stalls, floor transition data, floor type data, etc.).
In some embodiments, the traversability algorithm may determine a best two-dimensional coverage area based on the portion of data taken from the map.” Paragraph [1149]: “the processor may use a traversability algorithm (e.g., a probabilistic method such as a feasibility function) to evaluate possible coverage areas to determine areas in which the robot may have a reasonable chance of encountering a successful traverse (or climb)… FIG. 190 illustrates an example of a path 10500 that is not traversable by the robot because of the sudden increase in the value of z between two adjacent cells.” Paragraph [1154]: “a sudden change in a type of driving surface in an area or a sudden discovery of a cliff in an area may impact the probability of traversability of the area.”);
repeating the previous step from a next point in the set of at least one point, where the next point is not located in the contiguous part of the terrain that has already been explored autonomously (paragraph [0571]: “the processor of the robot may first build a global map of a first area (e.g., a bedroom) and cover that first area before moving to a next area to map and cover.” Paragraph [0582]: “during the first work session the robot may enter a second room after mapping a first room”); and
creating a map of the work zone based on the determined positions of the unmanned autonomous vehicle, where the work zone corresponds to the autonomously explored parts of the terrain (paragraph [0571]: “the processor of the robot may first build a global map of a first area (e.g., a bedroom) and cover that first area before moving to a next area to map and cover.” Paragraph [0582]: “during the first work session the robot may enter a second room after mapping a first room”).
Re claim 15. The unmanned autonomous vehicle according to claim 14, wherein the terrain is a garden and the unmanned autonomous vehicle autonomously maintains the garden (Fig. 111. Paragraph [1252]: “garden watering robot”. Paragraph [1276]: “robotic lawn mower”).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Ebrahimi Afrouzi et al. (US Publication No. 2022/0066456) as applied to claim 7 above, and further in view of Kim et al. (US Publication No. 2021/0068605).
The teachings of Ebrahimi Afrouzi have been discussed above. Ebrahimi Afrouzi fails to specifically teach: (re claim 6) wherein the map of the work zone is created while the unmanned autonomous vehicle is being charged at a charging station.
Kim teaches, at paragraph [0173], docking robots at their charging stations to perform near field communication with the charging stations, thereby mapping the start locations on the charging stations. This allows for such robots to have a start location coincident with the robot’s charging station, thus increasing the likelihood that the robot will be able to return to its charging station as the charging station is at a known location.
In view of Kim’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the method as taught by Ebrahimi Afrouzi, (re claim 6) wherein the map of the work zone is created while the unmanned autonomous vehicle is being charged at a charging station, with a reasonable expectation of success, since Kim teaches docking robots at their charging stations to perform near field communication with the charging stations, thereby mapping the start locations on the charging stations. This allows for such robots to have a start location coincident with the robot’s charging station, thus increasing the likelihood that the robot will be able to return to its charging station as the charging station is at a known location.
Claim 9 is rejected under 35 U.S.C. 103 as being unpatentable over Ebrahimi Afrouzi et al. (US Publication No. 2022/0066456) as applied to claim 7 above, and further in view of Bal et al. (US Publication No. 2020/0047343).
The teachings of Ebrahimi Afrouzi have been discussed above. Ebrahimi Afrouzi fails to specifically teach: (re claim 9) wherein the route is a closed route in which the unmanned autonomous vehicle automatically begins the step of autonomous exploration after the route is closed.
Bal teaches, at Fig. 39 and paragraph [0158], a user may teach a robotic platform an outer perimeter of a room by walking the robotic platform along the outer wall, and then the robot may autonomously navigate within the area bounded by the closed path. This provides an intuitive method for a user to teach an area to be serviced by a robot in which the user moves through the actual area to be serviced.
In view of Bal’s teachings, it would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to include, with the method as taught by Ebrahimi Afrouzi, (re claim 9) wherein the route is a closed route in which the unmanned autonomous vehicle automatically begins the step of autonomous exploration after the route is closed, with a reasonable expectation of success, since Bal teaches a user may teach a robotic platform an outer perimeter of a room by walking the robotic platform along the outer wall, and then the robot may autonomously navigate within the area bounded by the closed path. This provides an intuitive method for a user to teach an area to be serviced by a robot in which the user moves through the actual area to be serviced.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SPENCER D PATTON whose telephone number is (571)270-5771. The examiner can normally be reached Monday to Friday 9:00-5:00 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571)272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SPENCER D PATTON/ Primary Examiner, Art Unit 3656