Prosecution Insights
Last updated: April 19, 2026
Application No. 18/176,693

Systems and Methods to Identify Cargo

Final Rejection — §102, §103
Filed: Mar 01, 2023
Examiner: MUKUNDHAN, ROHAN TEJAS
Art Unit: 2663
Tech Center: 2600 — Communications
Assignee: The Boeing Company
OA Round: 2 (Final)
Grant Probability: 100% (Favorable)
Expected OA Rounds: 3-4
Time to Grant: 3y 2m
With Interview: 99%

Examiner Intelligence

Career Allow Rate: 100% — above average (9 granted / 9 resolved; +38.0% vs TC avg)
Interview Lift: +0.0% (minimal; based on resolved cases with interview)
Avg Prosecution: 3y 2m (typical timeline)
Total Applications: 34 across all art units (25 currently pending)
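The headline figures above are internally consistent; a quick arithmetic sketch reproduces them from the raw counts. The derived Tech Center average is inferred here from the +38.0% delta and is not stated on the page; the variable names are illustrative.

```python
# Reproduce the examiner-intelligence figures from the raw counts shown above.
granted, resolved, pending = 9, 9, 25

allow_rate = 100 * granted / resolved    # career allow rate: 100.0%
tc_avg = allow_rate - 38.0               # implied Tech Center average: 62.0%

# With every resolved case granted (9/9), total applications are
# simply grants plus pending cases: 9 + 25 = 34, matching the page.
total_applications = granted + pending

print(allow_rate, tc_avg, total_applications)  # 100.0 62.0 34
```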

Statute-Specific Performance

§101: 8.8% (-31.2% vs TC avg)
§103: 52.1% (+12.1% vs TC avg)
§102: 16.5% (-23.5% vs TC avg)
§112: 22.7% (-17.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 9 resolved cases
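The four deltas can be cross-checked against the listed rates. In the sketch below, the implied-average computation is this note's own arithmetic, not a figure stated on the page:

```python
# Examiner allowance rate per statute and its delta vs the Tech Center
# average, copied from the table above. Subtracting the delta recovers
# the implied TC average for each statute.
stats = {
    "§101": (8.8, -31.2),
    "§103": (52.1, +12.1),
    "§102": (16.5, -23.5),
    "§112": (22.7, -17.3),
}

for statute, (rate, delta) in stats.items():
    print(f"{statute}: examiner {rate}% vs implied TC avg {round(rate - delta, 1)}%")
```

Every row recovers the same 40.0% baseline, which suggests the "black line" estimate is a single TC-wide figure rather than a per-statute one.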

Office Action

§102 §103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendment

In response to the amendment of 22 September 2025, the rejection of claim 6 under 35 U.S.C. 112(b) is withdrawn.

Response to Arguments

Applicant’s arguments filed 22 September 2025 have been fully considered. Examiner notes the thorough review of the prior office action and the amendment clarifying sensor types and hierarchical object determination. However, Applicant’s arguments are not persuasive. Within this argument, Examiner’s response is directed to independent claim 1, treated as representative of independent claims 14 and 18, which recite identical steps despite being directed to different statutory categories. Applicant’s key argument, as best understood by Examiner, is summarized as follows: Kirmani appears to sense an object with different sensors but, unlike the invention of the instant application, does not disclose how object identification is handled when object information from multiple sensors matches or does not match. Respectfully, Examiner disagrees. Examiner agrees with Applicant that Kirmani discloses an object detection method which utilizes multiple sensors for object identification. However, Kirmani further discloses handling object identification when different sensors return different object IDs.
Specifically, Kirmani discloses addressing mismatches within identification data (paragraphs 0048-0053 of Kirmani, wherein the match/mismatch identification method is the match confidence determination between each type of sensor; a high-confidence level indicates that multiple sensors (e.g., image sensors, weight sensors, optical sensors (barcodes/QR codes, etc.)) agree on the object detected; a low-confidence level indicates a mismatch between different identification sensor signals; and wherein, in the case of a low-confidence interval, a determined hierarchy of sensor signals is used to determine the identity of the object. The system of Kirmani defaults to the optical scanner’s signal (paras. 0068, 0086, and 0089, although there is a notice to the user that a mismatch has occurred), but also utilizes visually observed dimensions and weight mismatches for validation.). Even taking into account the amendments which further specify the plurality of different types of sensors, detecting a sensor hierarchy, and identifying the object based on the highest-rated sensor in the hierarchy, the system of Kirmani still anticipates all limitations of claim 1. Thus, the prior art rejection of independent claims 1, 14, and 18 and all dependents, modified below to reflect the amendments and the change of dependency, is maintained.

Claim Objections

Claims 1 and 14 are objected to because of the following informalities:
Claim 1 reads as open-ended; there is no “and” prior to the introduction of the computing device. In claim 1, “different types of sensors; determine that the final” should read “different types of sensors; and determine that the final”.
Claim 14 reads as open-ended; there is no “and” prior to the introduction of the computing device.
Appropriate correction is required.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C.
102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

Claims 1-7, 9-10, 12, and 14-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kirmani et al. (US PG Pub 20220405704, hereafter referred to as Kirmani).

Regarding claim 1, Kirmani describes a system to identify cargo (paras. 0010-0012, wherein the system is intended for detection and identification of cargo based on a plurality of sensors), the system comprising: a plurality of different types of sensors with each of the types of sensors configured to sense different aspects of the cargo and to transmit signals corresponding to the aspects (paras.
0023-0032, wherein the plurality of sensor types include weight sensors, cameras or other image sensors, and optical/barcode sensors; and wherein the weight sensors transmit signals corresponding to detected weight and/or object weight changes, the image sensors transmit a video stream, and the optical sensor transmits identification information by scanning barcodes); a computing device that receives the signals from the sensors, the computing device configured to: determine initial identifications of the cargo for each of the sensors based on the sensed aspects of the sensor (paras. 0024, 0029, and 0034, wherein the initial identifications are determined by the image processors of para. 0024 and central processor/CPU of paras. 0029 and 0034); when the initial identifications match from the different types of sensors, determine that a final identification of the cargo is equal to the initial identifications (paras. 0048-0053 and figs. 3, 4A, and 4B for initial identification methods either succeeding or failing, wherein a high level of confidence in the proper package being loaded is a result of the plurality of identification criteria (sensor signal outputs) being satisfied, and wherein a lower level of confidence arises from the initial identification criteria not matching; and paras. 0054, 0056-0068, and 0079-0089, and figs. 5 and 6A-6C, wherein at the final verification time, the barcode scan, weight, and image properties at the first time and the final time are compared to check for package movement en route); and when the initial identifications do not match from the different types of sensors: determine a hierarchy for the different types of sensors; [and] determine that the final identification is equal to the initial identification from the type of sensors with a highest rating according to the hierarchy (paras.
0048-0053, wherein the hierarchy is determined based on weight sensor signals, image sensor signals, and barcode scan based on the mismatch observed to produce a high-confidence identification; a high-confidence level indicates that multiple sensors (e.g., image sensors, weight sensors, optical sensors (barcodes/QR codes, etc.)) agree on the object detected; and paras. 0068, 0086, and 0089 disclosing the identification method defaulting to the optical sensor’s signal upon a mismatch, although there is a notice to the user that a mismatch has occurred).

Regarding claim 2, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein the computing device is further configured to pair an image of the cargo with the final identification (para. 0072, wherein a tracking subroutine is executed by the image processing CPU if a change between paired (time 1 vs time 2) images is observed, and wherein an absolute-difference image comparison might take place for proper package verification).

Regarding claim 3, Kirmani discloses all limitations of claim 2. Kirmani further discloses wherein the image of the cargo is one or more 3D scans of the cargo (paras. 0079-0080, wherein 3D images might be obtained through either the plurality of cameras, depth sensing cameras, acoustic scanning, or machine learning generation).

Regarding claim 4, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein the computing device is further configured to determine a confidence value of the final identification of the cargo and pair the confidence value with the final identification (paras. 0049-0052, and element 312 of fig. 3, wherein the confidence value of the final identification of the cargo is a result of repeated increases of the overall confidence level from an initial baseline as a result of repeated registration and confirmation of different sensor signals up to the point of a match).
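The match/mismatch handling that the examiner reads onto claim 1 (agreeing sensors yield the final identification directly, while a mismatch falls back to the highest-rated sensor in a hierarchy) can be sketched as follows. This is an illustrative reconstruction of the claimed logic, not code from the application or from Kirmani; the sensor names and ratings are invented.

```python
def identify(initial_ids: dict[str, str], hierarchy: dict[str, int]) -> str:
    """initial_ids maps sensor type -> its initial identification;
    hierarchy maps sensor type -> rating (higher wins on a mismatch)."""
    ids = set(initial_ids.values())
    if len(ids) == 1:
        # All sensor types agree: the final ID equals the initial IDs.
        return ids.pop()
    # Mismatch: defer to the highest-rated sensor in the hierarchy,
    # much as Kirmani is said to default to the optical scanner's signal.
    top = max(initial_ids, key=lambda sensor: hierarchy[sensor])
    return initial_ids[top]

# Illustrative use: optical scanning outranks image and weight sensing.
ranks = {"optical": 3, "image": 2, "weight": 1}
print(identify({"optical": "PKG-1", "image": "PKG-1", "weight": "PKG-1"}, ranks))  # PKG-1
print(identify({"optical": "PKG-1", "image": "PKG-2", "weight": "PKG-2"}, ranks))  # PKG-1
```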
Regarding claim 5, Kirmani discloses all limitations of claim 4. Kirmani further discloses wherein the computing device is further configured to determine the confidence value based on a first one of the initial identifications of the cargo (paras. 0024-0026, wherein the initial degree of confidence is based on a barcode scan and image calculation); and determine that the first one of the initial identifications matches a second one of the initial identifications and increase the confidence value (paras. 0050-0052 and fig. 3 element 312, wherein the confidence value is increased based on the weight sensor readings).

Regarding claim 6, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein the computing device is further configured to determine that the first one of the initial identifications of the cargo is different than a third one of the initial identifications and decrease the confidence value (paras. 0024-0026 and 0049-0054, wherein the initial degree of confidence is based on a barcode scan and image calculation, and wherein the confidence value is decreased based on a detected discrepancy in package weight or dimensions between a package in the vehicle’s cargo bay and cargo information within the database).

Regarding claim 7, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein one of the sensors comprises a camera (paras. 0023-0025, wherein the plurality of cameras are disclosed within a cargo area) and one of the aspects is a storage position of the cargo (paras. 0069-0073 and 0095-0098, wherein paras. 0069-0073 describe an initial “matching” process between the image processing system and barcode scanning to determine an initial position, and paras. 0095-0098 describe augmented loading techniques, wherein position and extended location of the package might be further tracked using marking or light-based techniques for directing loading location).
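Claims 4 through 6 describe a confidence value seeded by a first identification, raised when another sensor agrees, and lowered when one disagrees. A minimal sketch of that scheme follows; the baseline and step sizes are invented for illustration and do not come from the application or Kirmani.

```python
def confidence(first_id: str, other_ids: list[str],
               baseline: float = 0.5, step: float = 0.2) -> float:
    """Start from a baseline set by the first initial identification,
    then raise or lower confidence as each further sensor's initial
    identification matches or contradicts it (clamped to [0, 1])."""
    score = baseline
    for other in other_ids:
        score += step if other == first_id else -step
    return round(max(0.0, min(1.0, score)), 2)

print(confidence("PKG-1", ["PKG-1", "PKG-1"]))  # 0.9: both other sensors agree
print(confidence("PKG-1", ["PKG-2"]))           # 0.3: a mismatch lowers confidence
```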
Regarding claim 9, Kirmani discloses all limitations of claim 2. Kirmani further discloses wherein one of the sensors comprises an optical reader and one of the aspects is an optical code that is on the cargo (paras. 0030-0031, wherein the reader is a barcode scanner, and the optical code is a cargo tag).

Regarding claim 10, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein the sensors and the computing device are mounted on a vehicle (para. 0116, wherein the holding area containing the sensors is mobile, and may be placed within a method of transportation, such as within a delivery truck).

Regarding claim 11, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein the computing device is configured to determine a storage position of the cargo based on an image of the cargo; determine a description for the storage position from a loading instructions report; and determine one of the initial identifications of the cargo as the description.

Regarding claim 12, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein the computing device is configured to determine a first identification of the cargo based on an image of the cargo captured by a camera (paras. 0024 and 0032, wherein the optical observation system captures images of the cargo); determine a second identification of the cargo based on an optical code on the cargo (para. 0024, wherein the optical code is a barcode and an optical sensor is configured to scan the barcode to determine a second identification); and determine that the first identification and the second identification match and that the final identification is the first identification and the second identification (para. 0052, wherein the scanned barcode and images correspond to each other, wherein an image of the package has almost identical measurements to that of the item in a database which has been looked up by the barcode scan).
Regarding claim 14, Kirmani describes a system to identify cargo (paras. 0010-0012, wherein the system is intended for detection and identification of cargo based on a plurality of sensors), the system comprising: a first sensor configured to sense a first aspect of the cargo (paras. 0023-0024, wherein the first sensor is (or sensors are) image sensor(s) configured to observe the location of the cargo); a second sensor configured to sense a different second aspect of the cargo (paras. 0023-0025, wherein the second sensor is (or sensors are) weight sensor(s) configured to measure the weight of the cargo within the shelf or section within/upon which the cargo is located); a computing device configured to: determine a hierarchy with the first sensor and the second sensor (paras. 0050-0053, wherein the hierarchy is determined based on weight sensor signals, image sensor signals, and barcode scan based on the mismatch observed to produce a high-confidence identification); determine a first initial identification of the cargo based on signals from the first sensor (paras. 0024, 0029, and 0032-0034, wherein the initial identifications are determined by the image sensors of para. 0024 and central processor/CPU of paras. 0029 and 0032-0034); determine a second initial identification of the cargo based on signals from the second sensor (paras. 0024-0025, 0029, and 0032-0034, wherein the second initial identifications are determined by the weight sensors of para. 0024-0025 and central processor/CPU of paras. 0029 and 0032-0034); identify the cargo in a first manner when the first initial identification matches the second initial identification (para. 0052 and fig. 
4A, wherein the first and second identification methods matching indicates a high level of confidence that the appropriate cargo is present); and identify the cargo in a second manner when the first initial identification is different than the second initial identification by selecting the initial identification of the first sensor and the second sensor with a higher rating according to the hierarchy (paras. 0049-0054 and fig. 4B, wherein the first and second identification methods not matching indicates a lowered level of confidence that the appropriate cargo is present, and might indicate the wrong cargo was loaded, and wherein the confidence is “weighed” by the hierarchy of the sensor signals).

Regarding claim 15, Kirmani discloses all limitations of claim 14. Kirmani further discloses wherein the computing device is further configured to pair an image of the cargo with a final identification (para. 0072, wherein a tracking subroutine is executed by the image processing CPU if a change between paired (time 1 vs time 2) images is observed, and wherein an absolute-difference image comparison might take place for proper package verification).

Regarding claim 16, Kirmani discloses all limitations of claim 14. Kirmani further discloses wherein the first manner comprises determining a final identification of the cargo as the first initial identification (paras. 0023-0026 and 0052 and fig. 4A, wherein the first manner, represented by the final identification as described in paras. 0023-0026, is conducted by ensuring that the two identification methods match, which indicates a high level of confidence that the appropriate cargo is present as per para. 0052 and fig. 4A).

Regarding claim 17, Kirmani discloses all limitations of claim 14. Kirmani further discloses wherein the first sensor, the second sensor, and the computing device are mounted on an aircraft (para. 0045, wherein the delivery vehicle 202 might be an airplane for cargo identification and tracking).
Regarding claim 18, Kirmani describes a method of identifying cargo (paras. 0010-0012, wherein the system is intended for detection and identification of cargo based on a plurality of sensors), the method comprising: determining a hierarchy of sensors that detect the cargo (paras. 0050-0053, wherein the hierarchy is determined based on weight sensor signals, image sensor signals, and barcode scan based on the mismatch observed to produce a high-confidence identification); determining initial identifications of the cargo based on different aspects that are sensed by one or more sensors (paras. 0023-0032, wherein the weight sensors transmit signals corresponding to detected weight and/or object weight changes, the image sensors transmit a video stream, and the optical sensor transmits identification information by scanning barcodes); comparing the initial identifications (paras. 0048-0053 and figs. 3, 4A, and 4B for initial identification methods either succeeding or failing, wherein a high level of confidence in the proper package being loaded is a result of the plurality of identification criteria (sensor signal outputs) matching, and wherein a lower level of confidence arises from the initial identification criteria not matching; and paras. 0054, 0056-0068, and 0079-0089, and figs. 5 and 6A-6C, wherein at the final verification time, the barcode scan, weight, and image properties at the first time and the final time are compared to check for package movement en route); determining that a final identification of the cargo is the same as the initial identifications when the initial identifications match and determining that the final identification is the initial identification from the one or more sensors with a higher rating of the hierarchy (paras. 0054, 0056-0068, and 0079-0089, and figs.
5 and 6A-6C, wherein at the final verification time, the barcode scan, weight, and image properties at the first time and the final time are compared to check for package movement en route); and determining a confidence value of the final identification based on the hierarchy of the sensors used to determine the final identification (paras. 0049-0054 and fig. 4B, wherein the first and second identification methods not matching indicates a lowered level of confidence that the appropriate cargo is present, and might indicate the wrong cargo was loaded, and wherein the confidence is “weighed” by the hierarchy of the sensor signals, and wherein the confidence value of the final identification of the cargo is a result of repeated increases of the overall confidence level from an initial baseline as a result of repeated registration and confirmation of different sensor signals up to the point of a match).

Regarding claim 19, Kirmani discloses all limitations of claim 18. Kirmani further discloses wherein the method further compris[es] capturing an image of the cargo (para. 0033, wherein the camera collects images and/or videos (image streams)) and pairing the image with the final identification (para. 0072, wherein a tracking subroutine is executed by the image processing CPU if a change between paired (time 1 vs time 2) images is observed, and wherein an absolute-difference image comparison might take place for proper package verification).

Regarding claim 20, Kirmani discloses all limitations of claim 18. Kirmani further discloses wherein determining the confidence value comprises determining a baseline value based on a first one of the initial identifications (paras.
0049-0050, where the initial baseline of confidence is established by checking whether the optical scan of the optical code of the package and the cargo image match); and increasing the baseline value based on a second one of the initial identifications matching the first one of the initial identifications (paras. 0050-0052 and fig. 3 element 312, wherein the confidence value is increased based on the weight sensor readings matching information from either or both of the image sensors and the optical sensor scanning the optical code).

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claim 8 is rejected under 35 U.S.C. 103 as being obvious over Kirmani in further view of Podnar et al. (US PG Pub 20180111698, hereafter referred to as Podnar). Regarding claim 8, Kirmani discloses all limitations of claim 2.
Kirmani does not disclose wherein one of the sensors comprises an RFID reader and one of the aspects is a predetermined identification that is stored in an RFID tag that is mounted on the cargo. However, Podnar discloses wherein one of the sensors comprises an RFID reader and one of the aspects is a predetermined identification that is stored in an RFID tag that is mounted on the cargo (paras. 0024 and 0028 and fig. 4, wherein the check-in process allows RFID tags to be mounted atop the cargo, and the reader is mounted within the luggage system and hold). Specifically, Podnar discloses an intelligent baggage handling method which tags pieces of baggage with unique identifiers based on weights, sizes, and tags before generating a baggage map for intelligent loading and unloading. Therefore, both Kirmani and Podnar disclose methods for tracking and monitoring cargo/baggage being transported in vehicles using multiple different types of sensors to track the cargo en route. Thus, it would have been obvious for one having ordinary skill in the art prior to the effective filing date of the claimed invention to have used the RFID tag and reader disclosed by Podnar within the method of Kirmani as a simple substitution of a known sensor element for another (potentially the optical reader method of Kirmani) to yield the predictable result of non-contact-based baggage tracking through a trackable signal.

Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Kirmani in further view of Burch et al. (US Patent No. 10,878,364, hereafter referred to as Burch). Regarding claim 11, Kirmani discloses all limitations of claim 1. Kirmani further discloses wherein the computing device is configured to determine a storage position of the cargo based on an image of the cargo (paras. 0069-0073 and 0095-0098, wherein paras. 0069-0073 describe an initial “matching” process between the image processing system and barcode scanning to determine an initial position, and paras.
0095-0098 describe augmented loading techniques, wherein position and extended location of the package might be further tracked using marking or light-based techniques for directing loading location); and determine one of the initial identifications of the cargo (paras. 0023-0026, any of the image identification, weight sensor identification, or the optical code scan). Kirmani does not disclose determining a description for the storage position from a loading instructions report to be used as the description. However, Burch discloses determining a description for the storage position from a loading instructions report (para. 0164, wherein a screen includes specific item identification, location, loading instructions, and placement instructions of a specified cargo item). Specifically, Burch discloses a logistics information and management system wherein a scanner identifies an item and returns information regarding item location, placement, and further instructions. Therefore, Kirmani and Burch both disclose systems and methods for cargo logistics, specifically with respect to cargo location, placement, and item characteristics as observed by sensors. Thus, it would have been obvious to one having ordinary skill in the art prior to the effective filing date of the claimed invention to utilize the description from loading instructions as disclosed by Burch within the method of Kirmani as the application of a known technique to a known device to yield the predictable improvement of another identification method for the system of Kirmani.

Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Kirmani in further view of Huber (US Patent No. 9,162,765). Regarding claim 13, Kirmani discloses all limitations of claim 10. Kirmani further discloses wherein the computing device is further configured to determine a position of cargo within an alignment area (paras.
0069-0073 and 0095-0098, wherein paras. 0069-0073 describe an initial “matching” process between the image processing system and barcode scanning to determine an initial position, and paras. 0095-0098 describe augmented loading techniques, wherein position and extended location of the package might be further tracked using marking or light-based techniques for directing loading location). Kirmani does not disclose determining a lane in which the cargo is moved based on the position within the alignment area; and determining a storage position based on the lane. However, Huber discloses determining a lane in which the cargo is moved based on the position within the alignment area; and determining a storage position based on the lane (Col. 5 line 38 - col. 6 line 13, wherein the lanes for movement are disclosed as “rows”, cargo conveying devices are responsible for moving the cargo into position, and the cargo storage positions are indexed by row). Specifically, Huber discloses a sensor-mediated method for efficient aircraft cargo loading. Therefore, both Kirmani and Huber disclose sensor-mediated methods for cargo identification and location identification within a cargo hold or bay. Thus, it would have been obvious to one having ordinary skill in the art prior to the effective filing date of the claimed invention to utilize the row-wise cargo loading and locating methodology of Huber within the method of Kirmani as the application of a known method to a known device, in this case, the known organization method of Huber to the device of Kirmani, to yield the predictable result of more streamlined cargo organization and loading (assisted by the conveyors disclosed by Huber) and easier image-based location of cargo.

Conclusion

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ROHAN TEJAS MUKUNDHAN whose telephone number is (571) 272-2368. The examiner can normally be reached Monday - Friday, 9AM - 6PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Gregory Morse, can be reached at (571) 272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).
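The reply-period arithmetic recited above can be made concrete. In the sketch below, the month-addition helper is this note's own, and the mailing date is taken from the prosecution timeline (Final Rejection, Jan 08, 2026):

```python
import calendar
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months to a date, clamping to month-end when needed."""
    y, m = divmod(d.month - 1 + months, 12)
    y, m = d.year + y, m + 1
    return date(y, m, min(d.day, calendar.monthrange(y, m)[1]))

mailed = date(2026, 1, 8)           # final action mailing date (from the timeline)
shortened = add_months(mailed, 3)   # three-month shortened statutory period
absolute = add_months(mailed, 6)    # six-month statutory maximum for reply
print(shortened, absolute)          # 2026-04-08 2026-07-08
```

By this arithmetic, the Apr 09, 2026 RCE on the timeline would have landed one day past the shortened period, which would ordinarily call for an extension of time under 37 CFR 1.136(a).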
If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/ROHAN TEJAS MUKUNDHAN/
Examiner, Art Unit 2663

/GREGORY A MORSE/
Supervisory Patent Examiner, Art Unit 2698

Prosecution Timeline

Mar 01, 2023
Application Filed
Jun 17, 2025
Non-Final Rejection — §102, §103
Jun 23, 2025
Examiner Interview Summary
Jun 23, 2025
Applicant Interview (Telephonic)
Sep 22, 2025
Applicant Interview (Telephonic)
Sep 22, 2025
Response Filed
Sep 22, 2025
Examiner Interview Summary
Jan 08, 2026
Final Rejection — §102, §103
Apr 09, 2026
Request for Continued Examination
Apr 13, 2026
Response after Non-Final Action

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602740
UNSUPERVISED LEARNING-BASED SCALE-INDEPENDENT BLUR KERNEL ESTIMATION FOR SUPER-RESOLUTION
2y 5m to grant — Granted Apr 14, 2026
Patent 12593827
MONITORING SYSTEM FOR INDIVIDUAL GROWTH MONITORING OF LIVESTOCK ANIMALS
2y 5m to grant — Granted Apr 07, 2026
Patent 12586384
Method and Device for Camera-Based Determination of a Distance of a Moving Object in the Surroundings of a Motor Vehicle
2y 5m to grant — Granted Mar 24, 2026
Patent 12585252
METHOD FOR AUTOMATICALLY ADJUSTING MANUFACTURING LIMITS PRESCRIBED ON AN ASSEMBLY LINE
2y 5m to grant — Granted Mar 24, 2026
Patent 12548294
DETERMINING A DEGREE OF REALISM OF AN ARTIFICIALLY GENERATED VISUAL CONTENT
2y 5m to grant — Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 100%
With Interview: 99% (+0.0%)
Median Time to Grant: 3y 2m
PTA Risk: Moderate
Based on 9 resolved cases by this examiner. Grant probability derived from career allow rate.
