DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 01/16/2026 has been entered.
Status of Claims
This non-final rejection is in response to Applicant’s amended filing of 01/16/2026.
Claims 1, 3, 8-15, and 17-23 are currently pending and have been examined. Applicant has amended claims 1 and 19; cancelled claims 6 and 16; and added new claims 21-23.
Response to Arguments
Applicant's arguments with respect to claims 1-20, rejected under 35 U.S.C. § 103, have been fully considered but are only partially persuasive.
Regarding Part One arguments against Choi as a reference, the Examiner concedes that Choi does not recite the amended limitations. However, this concession is moot because the new ground of rejection does not rely on Choi for any teaching or matter specifically challenged in the argument. Instead, Zhou is relied on to suggest the missing features, as set forth in the rejection below.
Regarding Part Two arguments against Zhou as a reference, Applicant asserts that Zhou does not disclose the limitation “a positioning accuracy of the second positioning is greater than a positioning accuracy of the first positioning” because Zhou does not transition between positioning modalities, does not rely on the identifier in performing the second positioning, and does not implement higher accuracy absent a trigger from the autonomous mobile device traveling to dock with a charging station. The Examiner respectfully disagrees. The limitation does not recite a transition between positioning modalities, nor is such a transition pertinent to the second positioning having a greater positioning accuracy than the first positioning. While positioning in relation to a charging station identifier is not taught or suggested in Zhou, Zhou is not relied on to teach those features; they are disclosed by Choi. Zhou is relied on for improving the positioning of an autonomous mobile device by adjusting its positioning accuracy, and it is reasonable to combine Zhou with Choi, Aldred, and Ebrahimi because Zhou explicitly discloses that its positioning methods improve a return path to a charging station, itself positioned in a generated map of working areas in which the self-moving device travels (¶ [0186-0189]). Therefore, the references are reasonable to combine because all are directed toward positioning an autonomous moving robot within a working area and in relation to a corresponding charging station.
Regarding Part Three arguments against Ebrahimi as a reference, Applicant reiterates arguments from preceding actions as to the inapplicability of Ebrahimi’s use of a neural network to detect the pose of the charging station identifier because the network is directed toward objects. The Examiner respectfully disagrees. While the Examiner acknowledges the neural network is used to detect objects rather than explicitly a charging station identifier, it does consider different poses of the detected objects to further improve the network’s object detection by matching key points of the detected objects (see at least ¶ [0313], Figs. 203A-208C). This is directly applicable to Choi’s detection of feature points and patterns of a marker on a charging station. Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the neural network of Ebrahimi into the combination of Choi and Aldred with a reasonable expectation of success because all inventions are directed toward positioning an autonomous moving robot within a working area and in relation to a corresponding charging station. This would help the moving robot recognize and authenticate its charging station.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claim 23 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claim 23 recites the limitation "…determining the distance and the angle between the two adjacent feature elements…" There is insufficient antecedent basis for this limitation in the claim. Neither claim 21 nor claim 1, from which claim 23 depends, recites an angle from which claim 23’s determination can be made.
The Examiner observes that claim 22, also dependent on claims 21 and 1, introduces “…an angle between the feature elements…” and presumes that claim 23 was intended to depend from claim 22. Appropriate correction is required.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1, 3, 8-9, 11-15, and 17-23 are rejected under 35 U.S.C. 103 as being unpatentable over Choi (US 20190072975 A1; reference provided in IDS filed 07/17/2024) in view of Aldred et al. (US 20160091899 A1; reference provided in IDS filed 07/17/2024), Ebrahimi et al. (US 20210089040 A1), and Zhou et al. (US 20210157334 A1).
Regarding claims 1 and 19, Choi discloses an autonomous mobile device comprising a controller (claim 19; see at least abstract and ¶ [0035] and Fig. 1B) configured to execute a method for controlling the autonomous mobile device (claim 1; see at least abstract), the method comprising:
performing second positioning on the autonomous mobile device when determining to obtain a second current pose of the autonomous mobile device in a second coordinate system, the second positioning comprising positioning by detecting an identifier on the charging station using the autonomous mobile device (see at least ¶ [0115], [0148], and [0153-0156] disclosing a moving robot using images to determine its relative position, posture, and coordinate information to a marker on a charging station);
wherein the autonomous mobile device is equipped with an image collector (see at least ¶ [0084-0087] disclosing an upper camera), and the performing second positioning on the autonomous mobile device to obtain the second current pose of the autonomous mobile device in the second coordinate system comprises:
acquiring an environment image collected by the image collector of the autonomous mobile device (see at least ¶ [0085] disclosing using the upper camera to image surrounding areas of the moving robot);
and determining, when the environment image comprises an image of the identifier, the second current pose of the autonomous mobile device in the second coordinate system based on pose information of the identifier in the environment image and a real pose of the identifier in the second coordinate system, wherein a relative pose between the identifier and the charging station is constant (see at least ¶ [0115], [0118], and [0122] disclosing the camera identifying the marker on the charging station and evaluating the distance and angle the moving robot is from the marker and charging station).
While Choi discloses calculating a distance between the main body of the moving robot and the marker on the charging station (see at least ¶ [0121-0124]), Choi does not explicitly disclose performing first positioning on the autonomous mobile device to acquire a first current pose of the autonomous mobile device in a first coordinate system;
and performing second positioning on the autonomous mobile device when determining, based on the first current pose and a first preset pose of a charging station in the first coordinate system, that a distance between the autonomous mobile device and the charging station is less than or equal to a first preset distance.
However, Aldred suggests performing first positioning on the autonomous mobile device to acquire a first current pose of the autonomous mobile device in a first coordinate system (see at least ¶ [0043-0046] disclosing setting the initial location of a mobile robot and its corresponding docking station, with the robot being set as the origin of a coordinate reference frame);
and performing second positioning on the autonomous mobile device when determining, based on the first current pose and a first preset pose of a charging station in the first coordinate system, that a distance between the autonomous mobile device and the charging station is less than or equal to a first preset distance (see at least ¶ [0043-0046] disclosing operating the mobile robot within a predetermined maximum radius of the docking station).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the positioning and distance considerations of Aldred into the moving robot control methods of Choi with a reasonable expectation of success because both inventions are directed toward operating autonomous moving robots in cooperation with their respective charging stations. This allows the robot to operate within visual range of the charging station so that it can be guided back to the charging station more accurately using image data (see at least ¶ [0046]).
The combination of Choi and Aldred does not disclose inputting the environment image into a neural network model for identifying the identifier, and acquiring an identification result outputted from the neural network model, wherein, when the identification result indicates that the identifier is identified, the identification result comprises the pose information of the identifier in the environment image.
However, Ebrahimi suggests inputting the environment image into a neural network model for identifying the identifier, and acquiring an identification result outputted from the neural network model, wherein, when the identification result indicates that the identifier is identified, the identification result comprises the pose information of the identifier in the environment image (see at least ¶ [0593] disclosing using a neural network on image information to identify 2- and 3-dimensional features of an environment, including signature patterns).
While Ebrahimi does not disclose explicitly training a neural network on image data to recognize identifiers on a charging station, the use of a neural network trained on image data of the surroundings to recognize patterns directly applies and would have use in identifying a moving robot’s charging station. Therefore it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the neural network of Ebrahimi into the combination of Choi and Aldred with a reasonable expectation of success because all inventions are directed toward positioning an autonomous moving robot within a working area and in relation to a corresponding charging station. This would help the moving robot recognize and authenticate its charging station.
While Choi suggests determining, based on the second current pose and a second preset pose of the charging station in the second coordinate system, a second planned path for directing the autonomous mobile device to a docking position of the charging station (see at least ¶ [0012], [0083], and [0132] disclosing the controller directing the moving robot to follow a path to the charging station based on movement distance and movement direction as well as comparing it to a traveling route), the combination of Choi, Aldred, and Ebrahimi does not explicitly disclose planning, based on the second current pose of the autonomous mobile device and a second preset pose of the charging station in the second coordinate system, a second planned path for directing the autonomous mobile device to a docking position of the charging station;
and a positioning accuracy of the second positioning is greater than a positioning accuracy of the first positioning.
However, Zhou suggests planning, based on the second current pose of the autonomous mobile device and a second preset pose of the charging station in the second coordinate system, a second planned path for directing the autonomous mobile device to a docking position of the charging station (see at least ¶ [0186-0189] disclosing a path generation module that generates a return path to a charging station, including generating first and second segments of the path);
and a positioning accuracy of the second positioning is greater than a positioning accuracy of the first positioning (see at least ¶ [0269] and [0279] disclosing a moving object moving through a working area to provide current, high-precision positioning data and modifying base station data based on real-time error evaluation).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the satellite positioning of Zhou into the combination of Choi, Aldred, and Ebrahimi with a reasonable expectation of success because all inventions are directed toward positioning an autonomous moving robot within a working area and in relation to a corresponding charging station. This would help strengthen the accuracy of the positioning information of the moving robot within the working area.
Regarding claim 3, Choi does not disclose the second coordinate system takes a pose of the charging station as a coordinate origin;
or, the second coordinate system takes a pose of the autonomous mobile device as the coordinate origin.
However, Aldred teaches the second coordinate system takes a pose of the autonomous mobile device as the coordinate origin (see at least ¶ [0043-0046] disclosing setting the initial location of a mobile robot and its corresponding docking station, with the robot being set as the origin of a coordinate reference frame).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the positioning and coordinate considerations of Aldred into the moving robot control methods of Choi with a reasonable expectation of success because both inventions are directed toward operating autonomous moving robots in cooperation with their respective charging stations. This allows the robot to operate within visual range of the charging station so that it can be guided back to the charging station more accurately using image data (see at least ¶ [0046]).
Regarding claim 8, Choi does not explicitly disclose adjusting the pose of the autonomous mobile device when the environment image does not comprise the image of the identifier, and acquiring the environment image after adjusting a pose collected by the image collector of the autonomous mobile device, until a termination condition is satisfied.
However, Aldred suggests adjusting the pose of the autonomous mobile device when the environment image does not comprise the image of the identifier, and acquiring the environment image after adjusting a pose collected by the image collector of the autonomous mobile device, until a termination condition is satisfied (see at least ¶ [0060-0062] disclosing the docking control module making the mobile robot perform a target search procedure during image acquisition).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the docking control module of Aldred into the moving robot control methods of Choi with a reasonable expectation of success because both inventions are directed toward operating autonomous moving robots in cooperation with their respective charging stations. This would help the moving robot properly align and connect with its charging station.
Regarding claim 9, Choi suggests determining, based on the second current pose and the second preset pose of the charging station, a traversable global path between the autonomous mobile device and the docking position of the charging station for use as the second planned path (see at least ¶ [0132] disclosing controlling the moving robot so that it follows a traveling route to the charging station).
Regarding claim 11, Choi discloses determining whether the autonomous mobile device is in a mapped region (see at least ¶ [0045], [0054], [0086] disclosing the controller and memory storing and using map information of a traveling region the moving robot operates within);
and performing first positioning on the autonomous mobile device based on visual mapping data corresponding to the mapped region when the autonomous mobile device is in the mapped region, to obtain the first current pose of the autonomous mobile device in the first coordinate system, wherein the first coordinate system is a visual coordinate system corresponding to the visual mapping data (see at least ¶ [0045], [0054], [0086] disclosing the controller and memory storing and using map information of a traveling region the moving robot operates within as part of positioning the moving robot and obstacles in its vicinity).
Regarding claim 12, Choi does not explicitly disclose performing feature identification on the environment image collected by the image collector of the autonomous mobile device;
and computing the first current pose of the autonomous mobile device in the first coordinate system based on identified feature data and the visual mapping data.
However, Aldred suggests performing feature identification on the environment image collected by the image collector of the autonomous mobile device (see at least ¶ [0060-0062] disclosing the docking control module making the mobile robot perform a target search procedure during image acquisition);
and computing the first current pose of the autonomous mobile device in the first coordinate system based on identified feature data and the visual mapping data (see at least ¶ [0052-0062] disclosing the docking control module making the mobile robot perform a target search procedure during image acquisition and change its position and orientation to get into a docking state).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the docking control module of Aldred into the moving robot control methods of Choi with a reasonable expectation of success because both inventions are directed toward operating autonomous moving robots in cooperation with their respective charging stations. This would help the moving robot properly align and connect with its charging station.
Regarding claim 13, the combination of Choi, Aldred, and Ebrahimi does not disclose at least acquiring satellite positioning data corresponding to the autonomous mobile device when the autonomous mobile device is not in the mapped region;
and determining a pose of the autonomous mobile device in a geographic coordinate system corresponding to the satellite positioning data at least based on the satellite positioning data, for use as the first current pose.
However, Zhou suggests at least acquiring satellite positioning data corresponding to the autonomous mobile device when the autonomous mobile device is not in the mapped region (see at least ¶ [0150-0154] and [0164] disclosing determining the position and coordinates of an autonomous lawn mower in and outside of a mapped working area using satellite information);
and determining a pose of the autonomous mobile device in a geographic coordinate system corresponding to the satellite positioning data at least based on the satellite positioning data, for use as the first current pose (see at least ¶ [0150-0154] and [0164] disclosing determining the position and coordinates of an autonomous lawn mower in and outside of a mapped working area using satellite information).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the satellite positioning of Zhou into the combination of Choi, Aldred, and Ebrahimi with a reasonable expectation of success because all inventions are directed toward positioning an autonomous moving robot within a working area and in relation to a corresponding charging station. This would help strengthen the accuracy of the positioning information of the moving robot within the working area.
Regarding claim 14, Choi does not explicitly disclose determining whether the autonomous mobile device is in a docking pose when the autonomous mobile device moves along the second planned path;
determining a pose adjustment value of the autonomous mobile device based on the second current pose and the docking pose when the autonomous mobile device is not in the docking pose;
and adjusting the pose of the autonomous mobile device based on the pose adjustment value, and continuing to execute the determining whether the autonomous mobile device is in the docking pose, until the autonomous mobile device is in the docking pose.
However, Aldred suggests determining whether the autonomous mobile device is in a docking pose when the autonomous mobile device moves along the second planned path (see at least ¶ [0052-0059] and Fig. 6-8 disclosing a docking control module for guiding the mobile robot into a docking state);
determining a pose adjustment value of the autonomous mobile device based on the second current pose and the docking pose when the autonomous mobile device is not in the docking pose (see at least ¶ [0052-0059] and Fig. 6-8 disclosing a docking control module for guiding the mobile robot into a docking state where multiple alignment positions are determined based on pixel height and width of targets on the docking station and further orient the mobile robot based on the determination);
and adjusting the pose of the autonomous mobile device based on the pose adjustment value, and continuing to execute the determining whether the autonomous mobile device is in the docking pose, until the autonomous mobile device is in the docking pose (see at least ¶ [0052-0059] and Fig. 6-8 disclosing a docking control module for guiding the mobile robot into a docking state where multiple alignment positions are determined based on pixel height and width of targets on the docking station and further orient the mobile robot based on the determination).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the docking control module of Aldred into the moving robot control methods of Choi with a reasonable expectation of success because both inventions are directed toward operating autonomous moving robots in cooperation with their respective charging stations. This would help the moving robot properly align and connect with its charging station.
Regarding claim 15, Choi does not explicitly disclose determining a docking path when the autonomous mobile device is in the docking pose, and driving the autonomous mobile device to move to the docking position along the docking path.
However, Aldred suggests determining a docking path when the autonomous mobile device is in the docking pose, and driving the autonomous mobile device to move to the docking position along the docking path (see at least ¶ [0052-0059] and Fig. 6-8 disclosing a docking control module for guiding the mobile robot into a docking state and directing the mobile robot to connect to the docking station).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the docking control module of Aldred into the moving robot control methods of Choi with a reasonable expectation of success because both inventions are directed toward operating autonomous moving robots in cooperation with their respective charging stations. This would help the moving robot properly align and connect with its charging station.
Regarding claim 17, Choi discloses a non-transitory computer storage medium, storing a computer program therein, wherein the computer program, when executed by a processor, implements the method for controlling an autonomous mobile device according to claim 1 (see at least abstract and ¶ [0054-0055] disclosing memory storing operating information and control program for utilizing that information).
Regarding claim 18, Choi discloses a method for controlling an autonomous mobile device (see at least abstract and ¶ [0035] and Fig. 1B), comprising:
acquiring a second charging station pose of a charging station and a second current pose of the autonomous mobile device, wherein the second charging station pose is obtained by detecting pose information of an identifier on the charging station using the autonomous mobile device (see at least ¶ [0115], [0148], and [0153-0156] disclosing a moving robot using images to determine its relative position, posture, and coordinate information to a marker on a charging station), the identifier comprises an array of feature elements (see at least ¶ [0099-0113] disclosing the marker comprising a variety of possible patterns, shapes, and appearance), and the pose information of the identifier comprises a distance and an angle between the feature elements (see at least ¶ [0109], [0122], and [0128] disclosing calculating a distance and angle from the moving robot to the marker pattern);
and directing the autonomous mobile device to a docking position of the charging station based on the second charging station pose and the second current pose along the second planned path obtained according to the method for controlling the autonomous mobile device of claim 1 (see at least ¶ [0012], [0083], and [0132] disclosing the controller directing the moving robot to follow a path to the charging station based on movement distance and movement direction as well as comparing it to a traveling route).
Choi does not disclose the second current pose is a reference pose.
However, Aldred suggests the second current pose is a reference pose (see at least ¶ [0043-0046] disclosing setting the initial location of a mobile robot and its corresponding docking station, with the robot being set as the origin of a coordinate reference frame).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the positioning and distance considerations of Aldred into the moving robot control methods of Choi with a reasonable expectation of success because both inventions are directed toward operating autonomous moving robots in cooperation with their respective charging stations. This allows the robot to operate within visual range of the charging station so that it can be guided back to the charging station more accurately using image data (see at least ¶ [0046]).
Regarding claim 20, Choi discloses the autonomous mobile device is a lawn mower (see at least ¶ [0089] disclosing the moving robot to be a lawn mower).
Regarding claim 21, Choi discloses the identifier comprises an element array composed of feature elements, the feature elements comprise at least one of: a feature point, a feature line, and a feature pattern (see at least ¶ [0099-0113] and Figs. 2A-2C disclosing the marker comprising a variety of possible patterns, shapes, and appearance), and the pose information of the identifier at least comprises categories and poses of feature elements in the identified identifier (see at least ¶ [0099-0113] and Figs. 2A-2C disclosing the marker comprising a variety of possible patterns, shapes, and appearance).
Regarding claim 22, Choi discloses the poses of the feature elements comprise a distance and an angle between the feature elements (see at least ¶ [0109], [0122], and [0128] disclosing calculating a distance and angle from the moving robot to the marker pattern).
Regarding claim 23, Choi discloses the real pose of the identifier comprises a spatial distance between two adjacent feature elements of a given category in the identifier in the second coordinate system (see at least ¶ [0113] and Fig. 2B disclosing the pattern including longitudinal lines offset by an interval);
and the determining the second current pose of the autonomous mobile device in the second coordinate system based on the pose information of the identifier in the environment image and the real pose of the identifier in the second coordinate system comprises:
determining the distance and the angle between the two adjacent feature elements of the given category based on the categories and the poses of the feature elements in the identifier (see at least ¶ [0109], [0122], and [0153-0156] disclosing calculating the coordinates, direction, distance, and angle of the moving robot relative to the longitudinal lines of the marker pattern);
and determining the second current pose of the autonomous mobile device in the second coordinate system based on the distance and the spatial distance between the two adjacent feature elements of the given category and the angle between the two adjacent feature elements when the angle is present (see at least ¶ [0121-0122] and [0153-0156] disclosing calculating the coordinates, direction, distance, and angle of the moving robot relative to the longitudinal lines of the marker pattern).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Choi in view of Aldred et al., Ebrahimi et al., and Zhou et al., as applied to claim 9 above, and in further view of Grufman et al. (US 20170364088 A1).
Regarding claim 10, Choi discloses performing obstacle detection and second positioning on an environment where the autonomous mobile device is located when the autonomous mobile device moves along the second planned path, to obtain obstacle information and the second current pose of the autonomous mobile device at a current moment (see at least ¶ [0061-0064], [0070-0072], and [0075-0077] disclosing the moving robot equipped with obstacle sensors to detect obstacles in a movement direction).
The combination of Choi, Aldred, Ebrahimi, and Zhou does not explicitly disclose planning a local path based on edge information among the obstacle information, the second current pose at the current moment, and the second preset pose of the charging station, when the obstacle information indicates that an obstacle is detected, and using the local path as a new second planned path.
However, Grufman suggests planning a local path based on edge information among the obstacle information, the second current pose at the current moment, and the second preset pose of the charging station, when the obstacle information indicates that an obstacle is detected, and using the local path as a new second planned path (see at least ¶ [0019], [0040], [0051], and [0060] disclosing a robotic mower using edge detection with sensors to identify objects and boundaries to avoid).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to incorporate the object avoidance of Grufman into the combination of Choi, Aldred, Ebrahimi, and Zhou with a reasonable expectation of success because all inventions are directed toward positioning an autonomous moving robot within a working area and in relation to a corresponding charging station. This would help the moving robot operate safely within the working area.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JARED C BEAN whose telephone number is (571)272-5255. The examiner can normally be reached 7:30AM - 5:00PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Navid Z Mehdizadeh can be reached at (571) 272-7691. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.C.B./Examiner, Art Unit 3669
/NAVID Z. MEHDIZADEH/Supervisory Patent Examiner, Art Unit 3669