Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
This Final Office Action is in response to Applicant communication dated 12/10/2025.
Status of Claims
Applicant’s response dated 12/10/2025 cancelled claim 22; claims 9-10, 18, and 20 were previously cancelled. Claims 1-8, 11-17, 19, and 21 are pending and rejected as follows.
Response to Arguments
Applicant’s arguments with respect to 35 U.S.C. 103 are acknowledged but found unpersuasive for the reasons below.
Applicant argues that Milne’s interpolation is not the same as the claimed “correcting” limitation because Milne does not disclose or suggest the use of hyperspectral imaging in any form, or how Milne’s teaching could be adapted to line images taken by a hyperspectral imaging camera (Remarks P. 6). This argument is unpersuasive. Torrione, not Milne, is relied upon to teach “hyperspectral imaging data”, e.g. [0034]: “also be applicable in hyperspectral”. Torrione further describes tracking and disambiguating cuttings across multiple frames [0031] using a variety of particle detection and image processing techniques [0041]. Although Torrione describes predicting particle position based on velocity/movement tracking [0032], Torrione fails to clearly describe correcting the position of pixels associated with the particles in the imaging data across frames. Milne, however, describes interpolating particle position to form a complete trajectory across frames, e.g. [0215], which reads on “correcting the position of pixels associated with particles of interest in the plurality of [imaging data]”. Therefore, the combination of Torrione in view of Milne teaches or suggests the claimed limitation.
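For illustration only (hypothetical code, not part of the record or of either cited reference; the function name and data layout are assumptions), the gap-filling interpolation Milne describes in [0215] amounts to linearly interpolating virtual positions between the last and next observed frames and tagging them as virtual:

```python
def interpolate_gap(trajectory, frame_a, frame_b):
    """Linearly interpolate virtual particle positions for frames in which
    a tracked particle was lost, given its last and next observed frames.

    trajectory maps frame index -> (x, y) observed position.
    Returns a dict of frame -> (x, y, is_virtual) covering [frame_a, frame_b].
    """
    xa, ya = trajectory[frame_a]
    xb, yb = trajectory[frame_b]
    span = frame_b - frame_a
    out = {}
    for f in range(frame_a, frame_b + 1):
        t = (f - frame_a) / span
        x = xa + t * (xb - xa)
        y = ya + t * (yb - ya)
        # Tag interpolated ("virtual") points so they can be distinguished
        # from true measured particle data, as Milne [0215] requires.
        is_virtual = f not in trajectory
        out[f] = (x, y, is_virtual)
    return out
```

Linking the two partial trajectories through these virtual points yields the “complete trajectory” of Milne [0218].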
Examiner notes that the claims do not specify how the imaging is performed or how the correcting is performed. For example, the claims do not recite correcting individual 1D line scans; they merely specify that the position of pixels associated with particles of interest in the hyperspectral imaging data is corrected. Examiner further notes that Applicant’s specification describes 2D images of particle position and movement/speed-based correction of pixel position/particle location across images, e.g. Original Specification P. 8 and P. 12.
Therefore, the combination of Torrione in view of Milne reads on the broadest reasonable interpretation of the claims.
Applicant further argues that “Even though Torrione suggests that its techniques could be used with a hyperspectral camera (paragraph [0034] of Torrione), there is no disclosure of performing spatial scanning with the hyperspectral camera which would lead to the required line images. Therefore, there is no disclosure of generating a hyperspectral imaging data set comprising a plurality of lines of hyperspectral data derived from line images taken by the hyperspectral imaging device positioned along a drilling fluid cuttings path, and no disclosure of using the tracking information of an optical camera to correct hyperspectral imaging data obtained via spatial scanning.” (Remarks P. 7). This argument is unpersuasive. As noted above, the claims do not specify how the hyperspectral imaging is performed or how the correcting is performed. Further, the claims do not recite spatial scanning; they merely specify that “line images” are obtained using a hyperspectral imaging device. As noted above, Torrione teaches performing the method using hyperspectral imaging and thus teaches “hyperspectral imaging data”; these images are of the fluid front line and thus are “line images”, e.g. [0065]. Therefore, the teachings of Torrione read on the BRI of the claims.
For the purpose of compact prosecution, Examiner notes that if the claims were amended to specify that hyperspectral one-dimensional line scans are obtained and that the correcting is applied to each individual 1D line scan in the HSI domain prior to stitching the 2D image with corrected shapes, as described in the specification, e.g. Original Specification Page 12, lines 1-10 (reflected in Fig. 3 and Fig. 4), such an amendment may advance prosecution and overcome the current prior art rejection.
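Purely as an illustrative sketch of the pipeline the specification is understood to describe (hypothetical names; integer pixel shifts assumed; not a characterization of Applicant’s actual implementation), correcting each 1D line scan by its optically tracked displacement before stitching might look like:

```python
def correct_and_stitch(line_scans, shifts, width):
    """Shift each 1D hyperspectral line scan by its tracked per-line
    displacement, then stack the corrected lines into a 2D image.

    line_scans: list of lists (one intensity value per pixel).
    shifts: per-line integer pixel offsets from optical tracking.
    width: output image width; out-of-range pixels are dropped, gaps are 0.
    """
    image = []
    for scan, shift in zip(line_scans, shifts):
        row = [0] * width
        for x, value in enumerate(scan):
            x_corr = x + shift          # apply tracking-based correction
            if 0 <= x_corr < width:
                row[x_corr] = value
        image.append(row)               # stitch corrected lines into a 2D image
    return image
```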
Applicant is welcome to contact the undersigned for an interview if it would be helpful to advance prosecution.
For at least these reasons, Applicant’s arguments are unpersuasive and the 35 U.S.C. 103 rejection is maintained.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-8, 11-17, 19, and 21 are rejected under 35 U.S.C. 103 as being unpatentable over Torrione, US 2016/0130928 A1 (hereinafter “Torrione”), in view of Milne et al., US 2014/0177932 A1 (hereinafter “Milne”).
Regarding claims 1 and 11,
Torrione teaches: A method of analyzing drilling cuttings using image data output from a hyperspectral imaging device and at least one optical camera, comprising: (Fig. 1, Abstract: “The invention relates to a system and method of for measuring the characteristics and volume of drill cuttings. The system comprises at least one camera operably connected to a processor for recording characteristics of drill cutting particles wherein said processor is configured to perform particle detection, extract features of said particles, or both.”, [0034]: “Techniques similar to those discussed may also be applicable in hyperspectral”; [0033]: “In embodiments that comprise multiple cameras 102, LIDAR, RGB-D cameras and/or other distance sensing equipment 103, particles 104 may be identified using the observed “height” of the cuttings 104 as compared to the expected background height.”)
generating a hyperspectral imaging data set comprising a plurality of lines of hyperspectral data derived from line images taken by the hyperspectral imaging device positioned along a drilling fluid cuttings path; (Fig. 1; [0065] Once detected, the fluid front 118 location (line, quadratic, spline formulation, and/or any other demarcation) may be tracked over time. This may be accomplished through many different techniques. For example purposes only, tracking the fluid front 118 over time may be accomplished by appropriately parameterizing the fluid front 118 representation and leveraging time-sensitive Kalman or Particle Filtering approaches to update the location of the fluid front 118 in a frame. Preferably this would be done in many frames, and most preferably in every frame.)
obtaining tracking information with respect to particles of interest from the output of the at least one optical camera; ([0031]: “use persistence and/or tracking techniques to identify cuttings 104. Cuttings 104 often maintain approximately constant shape and size as they travel across the shaker 206. As a result, individual cuttings 104 may be able to be tracked and/or disambiguated across multiple frames. Tracking cuttings 104 may be accomplished using any of a number of tracking techniques, (e.g., Kalman filters, particle filters, and/or other ad-hoc tracking techniques). This may enable resolution of the cuttings 104 as multiple “looks” are aggregated on each cutting 104.”, [0020]: “Information about these particles 104 may be accumulated on a central computing resource 110. In the case of multiple cameras 102, information about the camera's 102 relative pose and orientation, and the corresponding particle 104 bounding boxes may be used to combine information about particles 104 that may be visible in both cameras 102. The resulting information about particles 104 may then be tracked over time and logged to a database 112 for later retrieval and further analysis.”)
correcting [images] associated with particles of interest in the plurality of lines of hyperspectral imaging data based on the obtained tracking information to generate corrected hyperspectral imaging data; ([0029]: “Cuttings may additionally appear to “move” at an approximately constant velocity across a shaker table 206. These features may enable background estimation and/or subtraction techniques to be used to identify individual cuttings 104.”; [0041]: “Particle 104 detection can be accomplished using any of a number of image processing techniques, including but not limited to corner detection, blob detection, edge detection, background subtraction, motion detection, direct object detection, adaptive modeling, statistical descriptors and a variety of similar techniques. The proper particle 104 detection approach may be site-dependent, based on the local lithology. Object detection may also be obtained using standard background depth estimation and/or subtraction approaches.”) and
analyzing the corrected hyperspectral imaging data to characterize the cuttings. ([0078]: “performing particle detection 306 using the data and a processor 110, extracting feature data 308 of any detected particles; [0019]-[0020]: “this information may be tracked over time and changes in the statistical distributions of the particles 104 may be flagged and brought to the mud-logger's or drill-team's attention with, for example, a plain-text description of the observed change (e.g., “the average cutting size has increased by 20% in the last 3 minutes”), and the corresponding video data. This information could also be used in a supervised classification system, trained using prior data to identify specific discrete cases—e.g., “cutting is from a cave-in”, or “cutting is due to X”. Supervised classification may be used on each particle 104 independently, or on the statistics of the particles 104 in the current frame and recent time in aggregate.”; [0042]-[0043])
Although Torrione describes a variety of particle detection techniques, the use of background subtraction and correction techniques for identifying the fluid front/line (e.g. [0064]), and describes predicting particle position based on velocity (e.g. [0032]), Torrione fails to clearly articulate correcting the position of pixels associated with the particles of interest.
Milne however, in analogous art of particle detection, describes correcting particle positions based on the obtained tracking information, i.e.:
correcting the position of pixels associated with particles of interest in the plurality of lines of hyperspectral imaging data based on the obtained tracking information to generate corrected hyperspectral imaging data; (Milne: [0215]: “Should the particle re-appear at an expected location within a certain timeframe, the processor can link the trajectories and interpolate virtual particle data for the interim frames. Note that from a regulatory standpoint it is important to be clear that virtual particle data is appropriately tagged so that it can be distinguished from true measured particle data.”; [0218]: “It can also use predictive tracking and the particle's velocity to extrapolate an expected particle position. If the particle appears again in an expected position, the virtual positions can be linked to form a complete trajectory.”)
Before the effective filing date of the claimed invention, it would have been obvious to a person having ordinary skill in the art to have modified Torrione’s system and method for tracking and characterizing drilling cuttings, as described above, to include correcting particle positions in the imaging data based on the tracking information, in view of Milne, with the motivation to increase the effectiveness and accuracy of particle detection and tracking and to account for blind spots/“lost” cuttings by incorporating velocity estimations to interpolate missing particle positions across frames (e.g. Torrione: [0032], [0039]; Milne: [0160]-[0161], [0215]).
Regarding claims 2 and 12, Torrione further teaches: further comprising, distinguishing between background and particles of interest in the optical camera output of a portion of the drilling fluid cuttings path that includes the hyperspectral imaging line position, and differentiating between particles of interest and background in the hyperspectral imaging data based on the step of distinguishing. (See Fig. 1, [0029] In some embodiments, discrete cuttings 104 may be identified on or near the shaker 206, and/or as they fall off the end of the shaker 206 using one of many image processing features and techniques. Background subtraction and/or change detection may be used to identify cuttings 104 in part because cuttings may appear different than the typical background, which may consist of a shale shaker 206, shale shaker screen 208, and/or other background features.; [0059]: “The visual texture of the splashing, vibrating, and/or moving fluid behind the fluid front 118 stands in contrast to the relatively regular texture of the rest of the shaker 206. As a result, it may be possible to detect the fluid front 118 using texture features. These features may be used to distinguish an area from the shaker table 206 and/or background features, (e.g., since the distinguished area differs from the shaker 206 and/or background), and/or used in a multi-class classification framework (e.g., a 2-class support vector machine (“SVM”)) to distinguish the “shaker” and/or “background” class from the “fluid” class.”)
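For illustration only (hypothetical code, not part of the record), the background-subtraction/change-detection technique Torrione [0029] describes for distinguishing cuttings from the shaker background reduces to thresholding each pixel’s deviation from an estimated background:

```python
def detect_foreground(frame, background, threshold):
    """Flag pixels whose absolute deviation from the estimated background
    exceeds a threshold -- a minimal form of the background-subtraction
    technique Torrione [0029] describes for identifying cuttings.

    frame and background are equal-length lists of pixel intensities.
    Returns a boolean mask, True where a cutting (foreground) is likely.
    """
    return [abs(p - b) > threshold for p, b in zip(frame, background)]
```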
Regarding claims 3 and 13, Torrione further teaches: further comprising tracking movement of particles in the optical camera output to obtain the tracking information and associating the particles of interest with the tracking information. ([0032]: “Still more embodiments may use fluid and/or cuttings 104 velocity estimation to identify cuttings 104. Cuttings 104 often move across the shaker screen 208 at approximately the same velocity as one another. This velocity may be estimated across all of the observed cuttings 104 and/or be tracked (e.g., with a Kalman filter or particle filters). This information may then be used to identify other cuttings 104 and/or predict the eventual locations of cuttings 104 that may be temporarily lost during the tracking and identification stage. Changes in this velocity may also be flagged to an operator.”)
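As a hypothetical sketch only (names and data layout are assumptions, not the reference’s implementation), the constant-velocity prediction Torrione [0032] describes for locating temporarily lost cuttings can be expressed as:

```python
def predict_position(position, velocity, frames_ahead):
    """Predict where a temporarily lost cutting will reappear, assuming the
    approximately constant velocity Torrione [0032] describes.

    position and velocity are (x, y) tuples; frames_ahead is an integer
    number of frames over which to extrapolate.
    """
    x, y = position
    vx, vy = velocity
    return (x + vx * frames_ahead, y + vy * frames_ahead)
```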
Regarding claims 4 and 14, Torrione further teaches: further comprising obtaining depth information, wherein the particles of interest are distinguished from the background using the depth information. ([0041]: “Object detection may also be obtained using standard background depth estimation and/or subtraction approaches. The use of distancing equipment 103, such as LIDAR and/or RGB-D cameras, may have advantages with regard to these techniques.”; [0033]: “In embodiments that comprise multiple cameras 102, LIDAR, RGB-D cameras and/or other distance sensing equipment 103, particles 104 may be identified using the observed “height” of the cuttings 104 as compared to the expected background height.”)
Regarding claims 5 and 15, Torrione further teaches: wherein differentiating between particles of interest and background in the hyperspectral imaging data comprises masking particles of interest in the optical data based on the step of distinguishing, and applying the mask to the hyperspectral imaging data to differentiate between particles of interest and background in the hyperspectral imaging data. ([0043]: “Given the particle's 104 bounding boxes, various features about each detected particle 104 may then be extracted. These include various object shape parameters (e.g., image moments), texture features, HOG features, color descriptors, integral channel features, or the raw pixel data.”, [0041]: “Particle 104 detection can be accomplished using any of a number of image processing techniques, including but not limited to corner detection, blob detection, edge detection, background subtraction, motion detection, direct object detection, adaptive modeling, statistical descriptors and a variety of similar techniques”)
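For illustration only (hypothetical code, not part of the record; co-registration of the optical and hyperspectral pixels is assumed), applying a particle mask derived from the optical data to the hyperspectral data can be sketched as:

```python
def apply_optical_mask(hsi_line, mask):
    """Apply a particle mask derived from co-registered optical data to one
    hyperspectral line scan, separating particle spectra from background.

    hsi_line: list of per-pixel spectra (each a list of band intensities).
    mask: boolean list, True where the optical data shows a particle.
    Returns (particle_spectra, background_spectra).
    """
    particles = [s for s, m in zip(hsi_line, mask) if m]
    background = [s for s, m in zip(hsi_line, mask) if not m]
    return particles, background
```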
Regarding claims 6, 16, and 21, Torrione further teaches: wherein associating the particles of interest with tracking information comprises determining the speed of movement associated with pixels in the optical camera output. ([0032]: “Still more embodiments may use fluid and/or cuttings 104 velocity estimation to identify cuttings 104. Cuttings 104 often move across the shaker screen 208 at approximately the same velocity as one another. This velocity may be estimated across all of the observed cuttings 104 and/or be tracked (e.g., with a Kalman filter or particle filters). This information may then be used to identify other cuttings 104 and/or predict the eventual locations of cuttings 104 that may be temporarily lost during the tracking and identification stage. Changes in this velocity may also be flagged to an operator.”)
Regarding claims 7 and 17, Torrione further teaches: wherein the portion of the drilling fluid cuttings path is at least a portion of a shaker table. (see Fig. 1-2 and [0018], “shaker table 206”)
Regarding claim 8, Torrione further teaches: wherein the capture of images by at least one of the hyperspectral camera and the optical camera are synchronized with the frequency of movement of the shaker table. ([0027]: “cameras 102 and/or distance-sensing equipment 103 may be configured to move in response to pre-determined criteria. This movement may comprise rotation, panning, tilting and/or zoom adjustments along any axis. The movement may be automated”; [0019]: “Depending on the speed of the belt 204 and the rate at which particles 104 are moving, cameras may collect frames as slow as 0.003 Hz (1 frame/5 minutes) or much faster.”; [0054]: “the processor 110 may initiate, interrupt, alter or inhibit an automated activity using a machinery control system 116. The machinery control system 116 may increase or decrease the speed of a belt 204. The machinery control system 116 may adjust the tilt of a shale shaker 206 or may make adjustments to the functioning of any other piece of equipment 210.”)
Regarding claim 19, Torrione further teaches: A computer program embodied on a non-transitory computer readable medium and comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of claim 1. (Fig. 1, computer 110, database 112, [0020])
Pertinent Prior Art
The prior art made of record and not relied upon is considered pertinent to Applicant's disclosure: Teodorescu, US 2017/0089153 A1, describes a system and method for monitoring and analyzing drill cuttings for size distribution and density using captured image data.
Conclusion
THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SHELBY A TURNER, whose telephone number is (571) 272-6334. The examiner may also be contacted via email at Shelby.Turner1@uspto.gov; however, “without a written authorization by applicant in place, the USPTO will not respond via Internet e-mail to an Internet correspondence” (MPEP 502.02, subsection II). The examiner can normally be reached M-F, 10-6 ET.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Technology Center Director Allana Bidder, can be reached at (571) 272-5560. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SHELBY A TURNER/Supervisory Patent Examiner, Art Unit 2857