DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The references cited in the IDS, submitted on 10/06/2025, have been considered.
Response to Amendment
The action is responsive to the Amendment filed on January 7, 2026. No claims were amended, cancelled, or added. Thus, claims 1-20 are pending.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102(a)(1) that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-10 and 12-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kwak (U.S. Patent Publication 2022/0138464 A1).
Regarding claim 1, Kwak teaches a method for adjusting performance characteristics of a farming machine as it travels through a field (Kwak: Abstract; [“…image processing module configured to determine a characteristic of the plant matter based on the image. A control system is configured to control one or more of the actuatable applicator mechanisms to apply the substance on the agricultural surface based on the determined characteristic…”]), the method comprising:
inputting images captured by the farming machine as it travels through the field into a performance model (Kwak: FIGS. 1-2; ¶¶39-57 [“Image capture system 224 includes one or more image capture components 260 configured to capture images of the field, and image processing components 238 are configured to process those images. Examples of an image processing component 238 include an image signal processor or image processing module (IPM)…spraying system 102 applies the substance to the field in a generally uniform pattern as agricultural spraying machine 100 traverses the field…targeted areas are identified using images acquired by image capture system 224 and processed by image processing components 238 to identify locations of crop plants and/or weed plants, to be sprayed, within those images…”]; FIGS. 5-1, 5-2, 5-3; ¶¶61-70 [“…the plant classifier is configured to classify portions of an image as representing non-crop plants or weeds, to be targeted for precision spraying operations. In another example, a crop classifier can be used to classify areas of an image as including crop plants, and spraying system 102 can be controlled to spray the other areas of the field having plants that are not classified as the target crop type. In either case, the classifier is utilized to identify areas of the field to be sprayed for the given spraying operation…the classifier is applied to the ROI in the image to detect target plant areas to be sprayed. The detection of the target plant areas at block 438 can be based on location within the image (block 440) and/or based on color within the image (block 442). Also, as noted above, the ROI can be normalized for changes in height, as represented at block 444. Again, ROI normalization can include adjusting the portion of the image being analyzed for plant detection, to normalize the ROI for the distance of the camera to the field surface when the camera acquired the image. 
Of course, the detection at block 438 can be performed in other ways, as represented by block 446…it is noted that in one example image processing component 238 can obtain crop location data to augment the detection of the target plant areas. For example, crop location data can be obtained from a prior planting operation, or otherwise, and indicate the locations where crop seeds were planted, which can be indicative of the crop rows, the crop spacing within rows, etc.…the image processing performed by image processing component 238 can analyze RGB color vectors from a pixel clustering algorithm to determine whether an area of the image that represents a plant indicates a crop plant or a non-crop plant or otherwise identifies the plant as a target plant type to be sprayed for the particular spraying operation. The target plant areas detected in the image can then be correlated to the respective areas of the field…”]);
identify plants in the images using an original identification sensitivity for the performance model (Kwak: FIGS. 1-2; ¶¶39-57; FIGS. 5-1, 5-2, 5-3; ¶¶61-70 {See above.}),
determine expected performance characteristics of the farming machine when treating plants identified in the images using the original identification sensitivity (Kwak: FIGS. 1-2; ¶¶39-57; FIGS. 5-1, 5-2, 5-3; ¶¶61-70 {See above.}),
access target performance characteristics for the farming machine (Kwak: FIGS. 1-2; ¶¶39-57; FIGS. 5-1, 5-2, 5-3; ¶¶61-70 {See above.}; FIG. 3-2; ¶¶71-77 [“…operational characteristics of the machine are obtained at block 450. For example, operational characteristics can include indications of the machine speed and/or boom height. For example, higher machine speeds and boom heights can result in lower confidence metrics due to lower image capture quality. Also, image processing data, such as data from an image processing module can be obtained. For example, image processing data can include image processing fault data,…targeted areas are identified using images acquired by image capture system 224 and processed by image processing components 238 to identify locations of crop plants and/or weed plants, to be sprayed, within those images…if the confidence metric is above the threshold, operation proceeds to block 456 in which control system 206 controls spraying system in a first spraying mode, that is based on the image data. In the present example, the first spraying mode provides image-based precision spraying, wherein selected nozzles 108 are operated to spray one or more discrete dispersal areas at block 458…If the confidence metric is below the threshold at block 455, operation proceeds to block 460 in which control system 206 controls spraying system 102 to operate in a second spraying mode…”]), and
determine a modified identification sensitivity for the performance model expected to achieve the target performance characteristics for the farming machine (Kwak: FIGS. 3-2, 5-1, 5-2, 5-3; ¶¶71-76 [“…operational characteristics of the machine are obtained at block 450…operational characteristics can include indications of the machine speed and/or boom height…image processing data, such as data from an image processing module can be obtained. For example, image processing data can include image processing fault data, diagnostic information…confidence metric, generated at block 448, is compared to a threshold. The threshold can be pre-defined, user-defined, or defined in other ways…the threshold is based on the detection sensitivity identified at block 412….if the confidence metric is above the threshold, operation proceeds to block 456 in which control system 206 controls spraying system in a first spraying mode…the first spraying mode provides image based precision spraying, wherein selected nozzles 108 are operated to spray one or more discrete dispersal areas at block 458. An example of image-based precision spraying is illustrated above with respect to FIG. 3-2. If the confidence metric is below the threshold at block 455, operation proceeds to block 460 in which control system 206 controls spraying system 102 to operate in a second spraying mode…it is based on the operational characteristics obtained at block 450 that affected the confidence metric generation (e.g., reasons why the confidence metric was below the threshold)…Referring again to block 454, it can be seen that changes to the detection sensitivity can increase or decrease operation in the first mode. That is, increases in the detection sensitivity can increase the confidence in the classification resulting in more frequent precision spraying mode operation. 
Conversely, decreases in the detection sensitivity can decrease the confidence in the classification resulting in less frequent precision spraying mode operation.”]);
inputting additional images captured by the farming machine as it continues to travel through the field into the performance model, the performance model identifying a plant in the field using the modified identification sensitivity (Kwak: FIGS. 3-2, 5-1, 5-2, 5-3; ¶¶78-84 [“…other cameras 130 can be calibrated and utilized for image capture and control of corresponding nozzles 108. At block 472, user interface generator 231 generates a user interface to operator 228 (or other user, such as remote user 218). The user interface can display a video feed from one or more of cameras 260, as represented by block 473…camera feeds from multiple cameras 260 can be stitched at block 475…the video feed can be displayed with overlays with display elements representing configuration of the imaging system and/or spraying system, as represented by block 476. For example, an overlay can include display elements that represent the camera ROI (block 477), sensitivity (block 478), spray length/width (block 479) and can include other display elements 480 as well…an ROI display element identifies the nozzle(s) that are mapped to the particular ROI, as represented by block 481. At block 482, the user interface display includes user input mechanisms that are configured to receive input from operator 228 (or other user) to control the imaging system and/or spraying system…Based on the user input, the configuration of the imaging system and/or spraying system is adjusted at block 489. For example, a configuration adjustment can include changing the ROI at block 490, changing the sensitivity at block 491, changing the spray length and/or width at block 492, or other adjustments to the configuration at block 493. To adjust the configuration at block 489, control system 206 generates corresponding control signals to perform the configuration action…”]); and
treating the plant in the field using a treatment mechanism of the farming machine (Kwak: FIGS. 1-2; ¶¶39-57; FIGS. 5-1, 5-2, 5-3; ¶¶61-70 {See above.}).
Regarding Claims 14 and 20, each claim recites limitations found within Claim 1, and is rejected under the same rationale applied to the rejection of Claim 1.
Additionally regarding claims 14 and 20, Kwak additionally discloses a processor, and a non-transitory computer readable storage medium storing computer program instructions for calibrating performance characteristics (Kwak: FIGS. 1-2; ¶¶39-41 [“Machine 100 includes one or more processors or servers 250, a data store 251, and can include other items 252 as well. Data store 251 is configured to store data for use by machine 100. For example, data store 251 can store field location data that identifies a location of the field…”]).
Regarding claim 2, Kwak teaches all the limitations of the parent claim 1 as shown above. Kwak additionally discloses accessing target performance characteristics for the farming machine comprises: receiving the target performance characteristics from a manager of the farming machine before the farming machine travels through the field; as the farming machine travels through the field, autonomously determining the modified identification sensitivity that achieves the target performance characteristics; and modifying the performance model to identify plants in images using modified identification sensitivity (Kwak: FIGS. 1-2; ¶¶39-57 {See above.}; FIGS. 3-2, 5-1, 5-2, 5-3; ¶¶61-65 [“Block 404 identifies a particular one of cameras 260 (i.e., selecting a first camera) to be calibrated and used to acquire images to be processed to control the corresponding nozzles 108 that are mapped to the ROI of that particular camera. At block 406, for the selected camera 260, a ROI is identified, within the camera FOV. The FOV for a particular camera can be defined as a default FOV, represented at block 408…a FOV can be user selected, as represented at block 410…a detection sensitivity can be identified for the selected camera 260. Again, the detection sensitivity can be a default setting, as represented at block 414, and/or user selected, as represented at block 416. The detection sensitivity controls operation of the imaging system in acquiring images and processing those images to determine the location of plants to be sprayed. For example, detection sensitivity control can include defining a camera sensitivity at block 418. For example, adjustments to the camera sensitivity can define the number of pixels acquired by the camera for a given ROI. Also, block 412 can include changes to the functionality of image processing component 238 that processes the images acquired by the particular camera 260.”]).
Regarding claim 15, the claim recites limitations found within Claim 2, and is rejected under the same rationale applied to the rejection of Claim 2.
Regarding claim 3, Kwak teaches all the limitations of the parent claim 1 as shown above. Kwak additionally discloses displaying the expected performance characteristics on a display of the farming machine; receiving the target performance characteristics as an input from the display of the farming machine; determining the modified identification sensitivity that achieves the target performance characteristics; and modifying the performance model to identify plants in images using modified identification sensitivity (Kwak: FIGS. 1-2; ¶¶39-57 {See above.}; FIGS. 3-1, 3-2; ¶¶78-85 [“…cameras 260, as represented by block 473. For example, a single camera feed can be provided in the interface, as represented at block 474. Also, camera feeds from multiple cameras 260 can be stitched at block 475. Also, the video feed can be displayed with overlays with display elements representing configuration of the imaging system and/or spraying system, as represented by block 476. For example, an overlay can include display elements that represent the camera ROI (block 477), sensitivity (block 478), spray length/width (block 479) and can include other display elements 480 as well…At block 482, the user interface display includes user input mechanisms that are configured to receive input from operator 228 (or other user) to control the imaging system and/or spraying system. For example, user input mechanisms can receive input to adjust the configuration (block 483), select the camera or cameras to display the video feed in the user interface display (block 484), or other user input mechanisms…user input through the user input mechanisms is detected. Based on the user input, the configuration of the imaging system and/or spraying system is adjusted at block 489. For example, a configuration adjustment can include changing the ROI at block 490, changing the sensitivity at block 491, changing the spray length and/or width at block 492, or other adjustments…”]).
Regarding claim 16, the claim recites limitations found within Claim 3, and is rejected under the same rationale applied to the rejection of Claim 3.
Regarding claim 4, Kwak teaches all the limitations of the parent claim 1 as shown above. Kwak additionally discloses accessing target performance characteristics for the farming machine comprises: transmitting the expected performance characteristics to a remote system; receiving the target performance characteristics from the remote system; determining the modified identification sensitivity that achieves the target performance characteristics; and modifying the performance model to identify plants in images using modified identification sensitivity (Kwak: FIGS. 1-2; ¶¶39-57; FIGS. 3-1, 3-2; ¶¶78-85 {See above.}; FIG. 21; ¶¶135-137 [“…one example of the architecture shown in FIG. 2, where agricultural spraying machine 100 communicates with elements in a remote server architecture 10. In an example, remote server architecture 10 can provide computation, software, data access, and storage services that do not require end-user knowledge of the physical location or configuration of the system that delivers the services. In various examples, remote servers can deliver the services over a wide area network, such as the internet, using appropriate protocols. For instance, remote servers can deliver applications over a wide area network…”]).
Regarding claim 17, the claim recites limitations found within Claim 4, and is rejected under the same rationale applied to the rejection of Claim 4.
Regarding claim 5, Kwak teaches all the limitations of the parent claim 1 as shown above. Kwak additionally discloses determining an accuracy quantification for the original identification sensitivity; and determining a chemical usage quantification for the original identification sensitivity (Kwak: FIGS. 1-2; ¶¶39-57; FIGS. 3-1, 3-2; ¶¶78-85 {See above.}; FIG. 11; ¶¶95-100 [“Diagnostic data collector 602 is configured to collect diagnostic data from image capture system 224 and/or image processing components 238. Illustratively, collector 602 includes a confidence metric detector 614 configured to detect or otherwise identify the confidence metrics generated by image processing component 238…Spraying system 102 is selectively operated between first and second modes as machine 100 traverses the field based on image-based plant classification and corresponding confidence metrics. When one or more of image processing components 238 detect target plant areas with high confidence (confidence metrics above the threshold), the system is operated in a first, precision spraying mode…”]).
Regarding claim 18, the claim recites limitations found within Claim 5, and is rejected under the same rationale applied to the rejection of Claim 5.
Regarding claim 6, Kwak teaches all the limitations of the parent claim 5 as shown above. Kwak additionally discloses the accuracy quantification and the chemical usage quantification are based on a threshold number of images (Kwak: FIGS. 1-2; ¶¶39-57; FIGS. 3-1, 3-2; ¶¶78-85; FIG. 11; ¶¶95-100 {See above.}; ¶¶72-73 [“…image processing data, such as data from an image processing module can be obtained. For example, image processing data can include image processing fault data, diagnostic information, etc. Examples are discussed in further detail below…At block 452, the confidence metric, generated at block 448, is compared to a threshold. The threshold can be pre-defined, user-defined, or defined in other ways…”]).
Regarding claim 7, Kwak teaches all the limitations of the parent claim 5 as shown above. Kwak additionally discloses the accuracy quantification and the chemical usage quantification are based on an area in the field where the images were captured (Kwak: FIGS. 1-2; FIGS. 3-1, 3-2, 11; ¶¶39-57, ¶¶72-73, ¶¶78-85, ¶¶95-100 {See above.}).
Regarding claim 8, Kwak teaches all the limitations of the parent claim 5 as shown above. Kwak additionally discloses the accuracy quantification and the chemical usage quantification are based on plants identified with low accuracy (Kwak: FIGS. 1-2; FIGS. 3-1, 3-2, 11; ¶¶39-57, ¶¶72-73, ¶¶78-85, ¶¶95-100 {See above.}).
Regarding claim 9, Kwak teaches all the limitations of the parent claim 5 as shown above. Kwak additionally discloses determining an economic quantification for the original identification sensitivity (Kwak: FIGS. 1-2; FIGS. 3-1, 3-2; ¶¶55-56 [“…broadcast spraying can result in over-spraying (e.g., applying the substance in areas where the substance is not required) or under-spraying (e.g., applying too little of the substance in areas where the substance is required). Over-spraying can result in increased production costs and a potential negative environmental yield impact. Under-spraying, on the other hand, can reduce yields due to sub-optimal substance application…Precision spraying applications in precision farming and application techniques can reduce the use of substances, such as pesticides resulting in reduced grower costs and a reduction in environmental stress…”]).
Regarding claim 19, the claim recites limitations found within Claim 9, and is rejected under the same rationale applied to the rejection of Claim 9.
Regarding claim 10, Kwak teaches all the limitations of the parent claim 9 as shown above. Kwak additionally discloses identifying plants in images using a first model (Kwak: FIGS. 5-1, 5-2, 5-3; ¶¶61-63 [“…a classifier model to be used by plant classifier 278 is selected or otherwise obtained based on a target plant type.…”]) and determining the expected performance characteristics for the farming machine using a second model (Kwak: FIGS. 5-1, 5-2, 5-3; ¶¶69-71 [“…the image processing performed by image processing component 238 can analyze RGB color vectors from a pixel clustering algorithm to determine whether an area of the image that represents a plant indicates a crop plant or a non-crop plant or otherwise identifies the plant as a target plant type to be sprayed for the particular spraying operation. The target plant areas detected in the image can then be correlated to the respective areas of the field. These areas of the field are identified as containing target plants to be sprayed. Additionally, a confidence metric is generated representing the classification of the target plant area, as represented by block 448. The confidence metric indicates the confidence in the classification of the target plant area as an area to be sprayed by the spraying operation…”]).
Regarding claim 12, Kwak teaches all the limitations of the parent claim 1 as shown above. Kwak additionally discloses determining the expected performance characteristics of the farming machine when treating plants identified in the images using the original identification sensitivity are less than a threshold performance characteristic; and in response to determining the expected performance characteristics are less than the threshold performance characteristic, transmitting a notification to a manager of the farming machine indicating the expected performance characteristics (Kwak: FIGS. 5-1, 5-2, 5-3; ¶¶61-63; FIGS. 1-2; ¶¶39-57; FIGS. 3-1, 3-2; ¶¶78-85 {See above.}).
Regarding claim 13, Kwak teaches all the limitations of the parent claim 12 as shown above. Kwak additionally discloses a recommendation including the target performance characteristics based on the expected performance characteristics for the original identification sensitivity and the threshold performance characteristic; and wherein the notification includes the recommendation (Kwak: FIGS. 1-2, 3-2, 5-1, 5-2, 5-3; ¶¶61-63, ¶¶71-76, ¶¶78-85 {See above.}).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Kwak in view of Sibley (U.S. Patent Publication 2023/0011505 A1).
Regarding claim 11, Kwak teaches all the limitations of its parent claim as shown above. However, Kwak fails to explicitly teach a convolutional neural network comprising: a single encoder to encode images onto a latent layer of the convolutional neural network; a first decoder to decode the latent layer to identify plants; and a second decoder to decode the latent layer to determine the expected performance characteristics.
Sibley, in an analogous art, discloses detecting and controlling growth of undesirable vegetation in a field (Sibley: ¶006). Therein, Sibley teaches a convolutional neural network comprising: a single encoder to encode images onto a latent layer of the convolutional neural network; a first decoder to decode the latent layer to identify plants; and a second decoder to decode the latent layer to determine the expected performance characteristics (Sibley: FIG. 40; ¶¶399-400 [“…object identification may be performed using a deep learning algorithm. The deep learning algorithm may be implemented using a deep neural network. Images may be input to the deep learning algorithm which produces a list of objects perceived in the input images…a multi-layer perceptron architecture (MLP) for implementing the deep learning. In some embodiments, a convolutional neural network (CNN) may be used for object identification. The CNN may use a set of hyperparameters that characterize or describe the operation of the CNN. Such parameters include number of hidden layers, activation functions, error functions, batch size, and so on…the offsite computing resources 10200 may be used to optimize hyperparameters for the machine learning algorithm used by the onsite platform 10400. For example, at the beginning of each run made by the onsite platform 10400 in the field, the offsite computing resources 10200 may download (e.g., at 21200) a new set of models or ML behavior to the onsite platform 10400…”]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement the features of providing a performance model which includes a convolutional neural network comprising: a single encoder to encode images onto a latent layer of the convolutional neural network; a first decoder to decode the latent layer to identify plants; and a second decoder to decode the latent layer to determine the expected performance characteristics, as disclosed by Sibley, into Kwak, with the motivation and expected benefit of accurately identifying plants in images. This improvement to Kwak was within the ordinary ability of one of ordinary skill in the art based on the teachings of Sibley. Therefore, it would have been obvious to one of ordinary skill in the art to combine the teachings of Kwak and Sibley to obtain the invention as specified in claim 11.
Response to Arguments
Applicant’s arguments filed on January 7, 2026 have been fully considered but are not persuasive. Applicant is thanked for the arguments presented in an effort to overcome the outstanding rejections under 35 U.S.C. 102 and 103. However, the rejections of claims 1-20 under 35 U.S.C. 102 and 103 are maintained.
In regard to claims 1-20, rejected under 35 U.S.C. 102 and 103, the Examiner’s position and supporting remarks are presented in the rejections above.
Applicant argues (Remarks, pp. 11-12): “At best, Kwak describes a user or a controller being able to select a sensitivity for an identification model. Kwak at [0064]. This disclosure is used to, e.g., define the size of plants for detection which can thereby change the amount of treatment applied to a field (e.g., high sensitivity identifies more plants which causes more treatments, and vice versa). The claims, however, disclose linking target performance characteristics to identification sensitivity and modifying the identification sensitivity to achieve those performance characteristics, rendering them patentably distinct from the disclosure of Kwak.” Applicant’s arguments are not well taken.
Kwak discloses, in FIGS. 1-2, 3-2, 5-1, 5-2, 5-3; ¶¶39-57, ¶¶61-70, ¶¶71-77 {See above.}, obtaining operational characteristics of the machine, which can include indications of the machine speed and/or boom height, along with image processing data, and processing that information to identify locations of crop plants and/or weed plants to be sprayed. Additionally, Kwak discloses determining a modified identification sensitivity for the performance model expected to achieve the target performance characteristics for the farming machine (“if the confidence metric is above the threshold, operation proceeds to block 456 in which control system 206 controls spraying system in a first spraying mode…the first spraying mode provides image based precision spraying, wherein selected nozzles 108 are operated to spray one or more discrete dispersal areas at block 458. An example of image-based precision spraying is illustrated above with respect to FIG. 3-2. If the confidence metric is below the threshold at block 455, operation proceeds to block 460 in which control system 206 controls spraying system 102 to operate in a second spraying mode…it is based on the operational characteristics obtained at block 450 that affected the confidence metric generation (e.g., reasons why the confidence metric was below the threshold)…Referring again to block 454, it can be seen that changes to the detection sensitivity can increase or decrease operation in the first mode. That is, increases in the detection sensitivity can increase the confidence in the classification resulting in more frequent precision spraying mode operation. Conversely, decreases in the detection sensitivity can decrease the confidence in the classification resulting in less frequent precision spraying mode operation.”). Thus, Kwak anticipates Applicant’s claimed invention.
Therefore, the rejection of the independent claims, claims 1, 14, and 20, and the dependent claims, claims 2-10, 12, 13, and 15-19, under 35 USC § 102, as being anticipated by Kwak, is maintained.
Similarly, the rejection of dependent claim 11, under 35 USC § 103, as being unpatentable over Kwak in view of Sibley, is maintained.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JEFFREY P AIELLO whose telephone number is (303) 297-4216. The examiner can normally be reached 8 AM - 4:30 PM EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Shelby Turner can be reached on (571) 272-6334. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JEFFREY P AIELLO/Primary Examiner, Art Unit 2857