Prosecution Insights
Last updated: April 19, 2026
Application No. 18/116,081

MONITORING PACKAGE PICKUPS USING VIDEO

Non-Final OA (§102, §103, §112)
Filed: Mar 01, 2023
Examiner: BEATTY, TY MITCHELL
Art Unit: 2663
Tech Center: 2600 — Communications
Assignee: Objectvideo Labs LLC
OA Round: 3 (Non-Final)

Grant Probability: 70% (Favorable)
Estimated OA Rounds: 3-4
Estimated Time to Grant: 3y 1m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 70% (19 granted / 27 resolved; +8.4% vs TC avg, above average)
Interview Lift: +42.3% (strong), among resolved cases with interview
Avg Prosecution: 3y 1m (typical timeline); 15 applications currently pending
Total Applications: 42 (career history, across all art units)
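The headline figures above are simple ratios over the examiner's resolved cases. As a minimal sketch of how such metrics might be derived: only the 19-granted / 27-resolved figures come from the dashboard itself; the function names and the interview-lift inputs in the example are illustrative assumptions, not sourced from the page.

```python
# Sketch: deriving dashboard-style examiner metrics from case counts.
# Only 19 granted / 27 resolved comes from the page; other inputs are
# illustrative assumptions.

def allow_rate(granted: int, resolved: int) -> float:
    """Career allowance rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(rate_with_interview: float, rate_without: float) -> float:
    """Percentage-point change in allowance rate when an interview was held."""
    return rate_with_interview - rate_without

# 19 granted out of 27 resolved, displayed on the page rounded to 70%
print(f"Career allow rate: {allow_rate(19, 27):.1f}%")
```

Note that 19/27 is closer to 70.4%, so the dashboard is rounding down to a whole percentage.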

Statute-Specific Performance

§101: 7.1% (-32.9% vs TC avg)
§102: 27.1% (-12.9% vs TC avg)
§103: 42.8% (+2.8% vs TC avg)
§112: 23.1% (-16.9% vs TC avg)
Tech Center averages are estimates • Based on career data from 27 resolved cases

Office Action

Grounds of rejection: §102, §103, §112
DETAILED ACTION

Notice of Pre-AIA or AIA Status

1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. The Amendment filed 10 February, 2025 (hereinafter "the Amendment") has been entered and considered. Claims 1, 16, and 19 have been amended. Claim 21 has been added. Claim 7 has been cancelled. Claims 1-6 and 8-21, all the claims pending in the application, are rejected. All modifications to the rejection set forth in the present action were necessitated by Applicant's claim amendments.

Response to Amendment

2. In view of the amendments to independent claims 1, 16, and 19, the rejections below have been modified to address the new claim language.

Prior Art Rejections

On page 8 of the Amendment, the Applicant contends that Gutke does not teach or suggest the newly added features of the independent claims, specifically, "causing presentation of an indication of which package is to be retrieved, the indication comprising a projection onto an area in which the package is physically located or an audible indication". The Examiner respectfully disagrees and points to Gutke, P[0030]: "In some embodiments, the package detection system 20 may display the instructions on the screen 34, or may deliver an audible message through a speaker, or may send an electronic message to the delivery person's electronic device, or may employ other suitable communication methods. In other embodiments, the package detection system 20 may convey the delivery instructions and simultaneously unlock a secure box (or provide the delivery person 12 with an access code to manually unlock the box) to receive the delivery. 
The secure box may be equipped with internal UV lighting to perform anti-microbial treatment of the package 10 before the package 10 is recovered by the recipient and brought inside the home.", and in P[0034]: "At step 314, the package detection system 20 may provide feedback to the delivery person 12 regarding package delivery. If the package 10 has been delivered in accordance with the delivery instructions, the package detection system 20 may send a confirmation message to the delivery person 12 or emit a successful sound to confirm the delivery. Similarly, the package detection system 20 may notify the delivery person 12 that the package 10 has not been placed properly and request that the package 10 be moved to the proper location. At step 316, once the package 10 has been properly delivered, the package detection system 20 may send an electronic message to the recipient regarding package delivery and placement, and providing package delivery details, such as a time stamp and identity of the delivery company and delivery person 12. In some embodiments, the package detection system 20 may provide images or video of the delivery process and the package location, or may provide a live feed to facilitate regular monitor of the package 10 throughout the day." Gutke thus explicitly discloses audible instructions which provide information on which package is to be retrieved. In view of the foregoing, Gutke does indeed teach the newly added features of independent claim 1. Accordingly, the prior art rejections based on Gutke are modified to address the new claim language. Applicant asserts the same arguments for independent claims 16 and 19 and all the dependent claims. These arguments are addressed above.

Claim Rejections - 35 USC § 112

The following is a quotation of 35 U.S.C. 112(b):

(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.

The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:

The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.

3. Claim 21 is rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention. Claim 21 recites the limitation "an area" in Line 3. This is unclear, as this area is already recited in parent claim 1. The limitation should read "the area", to correspond to independent claim 1.

Claim Rejections - 35 USC § 102

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless –

(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. 
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.

4. Claims 1-3, 12-13, and 16-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US 20220129841 A1: Steven Gutke et al. (hereinafter "Gutke").

Regarding claim 1, a computer-implemented method (Gutke, Fig. 2) comprising: for a first image (Gutke, P[0015]: "The package detection system 20 may further include a data reading engine 26 in communication with the imagers 22. The data reading engine 26 may be capable of reading and decoding optical codes, RFID tags, or other data types to identify the package 10.", where the package detection system is also in communication with multiple mobile devices, P[0026]: "The package detection system 20 may further include a data reading engine 26 in communication with the imagers 22. The data reading engine 26 may be capable of reading and decoding optical codes, RFID tags, or other data types to identify the package 10. The data reading engine 26 may be any suitable engine capable of reading barcode symbols from the package 20 or from another display device, such as a mobile device carried by the delivery person 12, and may encompass laser scanners, imaging scanners (such as CMOS or CCD-based imagers), cameras, and/or image sensors."); determining information for the package retrieval (Gutke discloses determining package instructions from the label in P[0029]: "In some embodiments, package barcode data may be used to identify the package 10 and obtained directly via the reading engine 26. 
In such embodiments, the package detection system 20 may use this item information to determine whether special delivery instructions for the package 10 are available based on package value and/or identity."), and for a second image that depicts a package at a property and a person retrieving the package, determining, using the determined information for the package retrieval and data from the second image or from a device for the person, whether the person is authorized to retrieve the package (Gutke, P[0037]: "the package detection system 20 at step 410 determines an identity of the person collecting the package 10. For example, the package detection system 20 may include facial recognition software for authorized users, such as family members, friends, and neighbors. When the package 20 is retrieved, the package detection system 20 may obtain an image of the person collecting the package 10 and analyze that image to verify that the person is an authorized user."); and in response to determining that the person is authorized to retrieve the package: i) causing presentation of an indication of which package is to be retrieved, the indication comprising a projection onto an area in which the package is physically located or an audible indication (Gutke, P[0030]: "In some embodiments, the package detection system 20 may display the instructions on the screen 34, or may deliver an audible message through a speaker, or may send an electronic message to the delivery person's electronic device, or may employ other suitable communication methods. In other embodiments, the package detection system 20 may convey the delivery instructions and simultaneously unlock a secure box (or provide the delivery person 12 with an access code to manually unlock the box) to receive the delivery. 
The secure box may be equipped with internal UV lighting to perform anti-microbial treatment of the package 10 before the package 10 is recovered by the recipient and brought inside the home.", and in P[0034]: "At step 314, the package detection system 20 may provide feedback to the delivery person 12 regarding package delivery. If the package 10 has been delivered in accordance with the delivery instructions, the package detection system 20 may send a confirmation message to the delivery person 12 or emit a successful sound to confirm the delivery. Similarly, the package detection system 20 may notify the delivery person 12 that the package 10 has not been placed properly and request that the package 10 be moved to the proper location. At step 316, once the package 10 has been properly delivered, the package detection system 20 may send an electronic message to the recipient regarding package delivery and placement, and providing package delivery details, such as a time stamp and identity of the delivery company and delivery person 12. In some embodiments, the package detection system 20 may provide images or video of the delivery process and the package location, or may provide a live feed to facilitate regular monitor of the package 10 throughout the day." Where Gutke explicitly discloses audible instructions which provide information on which package is to be retrieved.); and ii) determining to skip performing an automated action that would have been performed if the person was not authorized to retrieve the package (Gutke, P[0037]: "When the package 20 is retrieved, the package detection system 20 may obtain an image of the person collecting the package 10 and analyze that image to verify that the person is an authorized user. At step 412, if the person is an authorized user, the package detection system 20 takes no further action and the delivery process is completed. 
If the person is not an authorized user, at step 414, the package detection system 20 may issue an alarm or alert notifying the intended recipient that the package 10 has been recovered by an unauthorized user.").

Regarding claim 2, wherein determining, using the determined information for the package retrieval and the data from the second image or from the device for the person, whether the person is authorized to retrieve the package comprises (Gutke, P[0037]: "When the package 20 is retrieved, the package detection system 20 may obtain an image of the person collecting the package 10 and analyze that image to verify that the person is an authorized user. At step 412, if the person is an authorized user, the package detection system 20 takes no further action and the delivery process is completed. If the person is not an authorized user, at step 414, the package detection system 20 may issue an alarm or alert notifying the intended recipient that the package 10 has been recovered by an unauthorized user."): determining, using the data from the second image, that the person is potentially not authorized to retrieve the package (Gutke, P[0037]: "When the package 20 is retrieved, the package detection system 20 may obtain an image of the person collecting the package 10 and analyze that image to verify that the person is an authorized user."); in response to determining, using the data from the second image, that the person is potentially not authorized to retrieve the package, presenting, using a device, a prompt requesting information indicating that they are authorized to retrieve the package (Gutke, P[0038]: "Upon receiving these alerts from other connected systems, the package detection system 20 may alter its default instructions and restrict access to a home, require additional approval from the recipient, or take other suitable precautions to strengthen access criteria as needed."); in response to presenting the prompt, determining a response 
to the prompt (the person responds to the prompt by inputting a password as contemplated by Gutke in P[0028]: "other authorization information may be used, such as a login/password combination or other suitable protocols.", which provides additional approval as disclosed in P[0038]: "require additional approval from the recipient-"); and determining, using the response to the prompt, whether the person is authorized to retrieve the package (where the recipient of the alert/alarm is able to confirm or deny that the package was collected by the appropriate/authorized person/party when errors arise such as incorrect determination of the appropriate/authorized person/party. Furthermore, Gutke discloses in P[0031]: "the package detection system 20 may trigger an alarm and/or contact the recipient or the authorities.", and furthermore, Gutke discloses that a video or image of the unauthorized recipient may be used to confirm whether or not the recipient was truly authorized in P[0037]: "The package detection system 20 may include a photo or video of the unauthorized user for aiding in identifying the person.").

Regarding claim 3, presenting, using the device, the prompt comprises sending an instruction to a second device installed at the property to cause the second device to present the prompt is disclosed by Gutke in P[0012]: "the package detection system 20 may communicate with other IoT technologies (such as sensors, cameras, security systems, computers, and mobile phones) to observe the package delivery process and to continuously monitor the package 10 after delivery until the package 10 is recovered by an authorized recipient … and/or may alert the recipient if the package 10 is moved or potential theft activity is detected in the area.", where the alert prompts the receiver. 
Furthermore, Gutke discloses having a networking device for sending and receiving instructions located at the property in P[0019]: "a network interface 36 to communication with one or more other systems or devices, such as a server, a host computer, a mobile phone, a database, or any suitable IoT devices.", where the mobile phone of the recipient is installed at the location of the property where the recipient is to be located, and where a host computer is contemplated without mobility being mentioned; and determining the response to the prompt comprises: capturing, with a microphone or a camera (Gutke, P[0014]: "the package detection system 20 may include one or more imaging cameras"), sensor data that indicates the response to the prompt (Gutke, P[0018]: "the package detection system 20 may include an input controller 30 to receive user input from an input device 32, such as a keyboard/keypad, a stylus, microphone, or other suitable device to allow the delivery person 12 to communicate with the recipient as further discussed below. The package detection system 20 may also include a display screen 34 capable of presenting data, prompts, or other communications to the delivery person 12. In some embodiments, the display screen 34 may be a touch screen display capable of receiving input from the delivery person 12 to communicate with the recipient when desired."); and analyzing the sensor data to determine the response to the prompt (where the response to the prompt is taken via "a keyboard/keypad, a stylus, microphone, or other suitable device", and therefore the response to the prompt is analyzed for the algorithm to continue operating to the next step.). 
Regarding claim 12, as best understood by the examiner, wherein determining to skip performing the automated action (where the automated action is sending the alarm/alert) that would have been performed if the person was not authorized to retrieve the package comprises determining to skip one or more confirmation actions (where the confirmation action is confirming/denying the package handler’s identification, which is the prompted authorization action.), and the confirmation actions comprise one or more actions likely to increase a confidence that the person was not authorized to retrieve the package. Where skipping the confirmation action, which is the authorization action, causes the automated action to be skipped. Where the identification of the package handler is determined automatically or visually by a user as taught by Gutke in P[0031]: "the package detection system 20 requires proper authentication (e.g., company ID, facial recognition matching, visual confirmation by recipient via camera, etc.) to provide access to the home", and therefore access to the package, allowing the package handler to skip further authorization processing. Furthermore, "visual confirmation by recipient" increases confidence that the person was not authorized to retrieve the package, and then the user/recipient does not provide a prompt to provide further authorization access.

Regarding claim 13, wherein determining to skip the one or more confirmation actions comprises determining to skip sending a message to a device for an authorized person that prompts the authorized person to perform an action is disclosed by Gutke in P[0031]: "the package detection system 20 requires proper authentication (e.g., company ID, facial recognition matching, visual confirmation by recipient via camera, etc.) to provide access to the home", and therefore access to the package, allowing the package handler to skip further authorization processing/confirmation actions. 
Furthermore, "visual confirmation by recipient" increases confidence that the person was not authorized to retrieve the package, and then the user/recipient does not provide a prompt to provide further authorization access, since the recipient has already visually identified that the package handler is not authorized.

Regarding claims 16-17 and 19-20, Gutke discloses a method, system (Abstract: "relates to systems and methods"), and a non-transitory storage/memory (P[0019]: "memory unit 40-"). Claims 16 and 19 recite features nearly identical to those recited in claim 1. Claims 16 and 19 are rejected for reasons analogous to those discussed above in conjunction with claim 1. Claims 17 and 20 recite features nearly identical to those recited in claim 2. Claims 17 and 20 are rejected for reasons analogous to those discussed above in conjunction with claim 2. Claim 18 recites features nearly identical to those recited in claim 3. Claim 18 is rejected for reasons analogous to those discussed above in conjunction with claim 3.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

5. Claims 2-6, 8-11, 14-15, 17-18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Gutke in view of US 20160182856 A1: Michael Child et al. (hereinafter "Child").

Regarding claim 2, as outlined above, Gutke appears to include all features of Claim 2. 
However, to the extent that "prompt" means anything more than getting a response from the user, then to the extent that it is not already shown, Child discloses prompt-based authentication for package pickups in P[0139]: "Upon determining the person is unknown and detecting the unknown person taking the package, notification module 750 may sound an alarm. Similarly, package detection module 1015 may monitor a package left outside a premises for pickup to determine whether the package is taken by an authorized delivery person or not. A person may be determined to be authorized to interact with the package based on detection of a uniform, delivery truck, company logo, badge or identification, barcode, etc. In some cases, authorization may be determined based on facial recognition, passcode query, and the like. For example, a user approaching the package may be prompted to provide a spoken code or a bade bar code in order to authorize interaction with the package." It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Gutke to incorporate prompt-based authentication for package pickups, as taught by Child, to arrive at the claimed invention discussed above. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. It is predictable that the proposed modification would have provided the benefit of increased security by utilizing additional layers of authentication.

Regarding claim 3, as outlined above, Gutke appears to include all features of Claim 3. 
However, to the extent that "analyzing the sensor data" means anything more than accepting a response from the user, then to the extent that it is not already shown, Child discloses analyzing the sensor data from microphone/images provided after being prompted to authenticate identity in P[0100]: "the user interface 725 may include an audio device, such as an external speaker system, a microphone, an external display device such as a display screen, and/or an input device-", and further in P[0139]: "a user approaching the package may be prompted to provide a spoken code or a bade bar code in order to authorize interaction with the package."

Regarding claim 4, wherein: presenting the prompt comprises sending an instruction to the device for the person to cause the device to present the prompt, which describes basic prompting, where instructions must be sent in order for the prompt to populate (Child, P[0139]: "a user approaching the package may be prompted to provide a spoken code or a bade bar code in order to authorize interaction with the package."); and determining the response to the prompt comprises receiving, from the device for the person, the response to the prompt, which describes receiving a response to the prompt, which is disclosed by Child in P[0139]: "a user approaching the package may be prompted to provide a spoken code or a bade bar code in order to authorize interaction with the package." Therefore, the response to the prompt is determined in order to determine authorization for handling the package. 
Regarding claim 5, presenting the prompt comprises presenting the prompt for at least one of a visual credential or a verbal passphrase (Child, P[0139]: "a user approaching the package may be prompted to provide a spoken code or a bade bar code in order to authorize interaction with the package."); and determining the response to the prompt comprises receiving sensor data encoding at least one of the visual credential or the verbal passphrase is disclosed by Gutke in P[0016]: "The data reading engine 26 may be capable of reading and decoding optical codes, RFID tags, or other data types … The data reading engine 26 may be any suitable engine … such as a mobile device … and may encompass laser scanners, imaging scanners (such as CMOS or CCD-based imagers), cameras, and/or image sensors.", and since "decoding" is disclosed, the sensor data is encoded. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Gutke to encode the sensor data, as taught by Child, to arrive at the claimed invention discussed above. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. It is predictable that the proposed modification would have provided the benefit of minimizing package theft by adding improved security measures. 
Regarding claim 6, wherein determining, using the determined information for the package retrieval and the data from the second image or from the device for the person, whether the person is authorized to retrieve the package comprises: recognizing, using the data from the second image and the information for the package retrieval, an identifying mark on the person or a vehicle for the person (Child discloses authorization based on identifying marks in an image in P[0046]: “The present systems and methods use computer vision technology (based on a camera) to detect information related to packages, including detecting: a package, a person carrying a package, characteristics related to a person carrying a package (e.g., facial features, uniform, vehicle, etc.)”); and in response to recognizing the identifying mark on the person or a vehicle for the person, determining that the person is authorized to retrieve the package is disclosed by Child in P[0013]: “The object may be identified as a package based at least in part on the image analysis. 
Upon identifying the object as a package, the systems and methods may monitor the package for unauthorized interaction.", and furthermore in P[0123]: "Upon detecting a package and/or other feature, the package detection module 1015 monitors the package and determines whether a person that is delivering, moving, and/or removing the detected package is authorized to do so."

Regarding claim 8, wherein determining, using the determined information for the package retrieval and the data from the second image or from the device for the person, whether the person is authorized to retrieve the package comprises: in response to detecting a depiction of the person in the image, sending, to the device for the person, a request for data relating to an authorization of the person retrieving the package (Child, P[0139]: "A person may be determined to be authorized to interact with the package based on detection of a uniform, delivery truck, company logo, badge or identification, barcode, etc. In some cases, authorization may be determined based on facial recognition, passcode query, and the like. For example, a user approaching the package may be prompted to provide a spoken code or a bade bar code in order to authorize interaction with the package. For example, package detection module 1015 may be configured to detect a specified delivery service for delivery or pickup, such as UPS®, FEDEX®, DHL®, etc. Accordingly, package detection module 1015 may determine whether the person is wearing a uniform of the expected delivery service, whether the delivery truck is from the expected delivery service, etc."); and determining, using a response or an indication that no response was received, whether the person is authorized to retrieve the package, where Child discloses authorization based upon "detection of a uniform, delivery truck, company logo, badge or identification, barcode, etc. 
In some cases, authorization may be determined based on facial recognition, passcode query, and the like.", and where the response to confirm authorization is entering a passcode, yielding a response that the person is authorized to retrieve the package.

Regarding claim 9, wherein the device for the person comprises a pickup vehicle, a handheld device, or a badge for the person is disclosed by Child in P[0139]: "package detection module 1015 may use facial recognition, voice recognition, pattern detection/learning, device identification (e.g., detecting an identifier associated with a device carried by a user)", where the device is carried by a user and is therefore handheld.

Regarding claim 10, wherein determining, using the determined information for the package retrieval and the data from the second image or from the device for the person, whether the person is authorized to retrieve the package comprises determining, using a) package data for the package and b) the data from the second image or from the device for the person, whether the person is authorized to retrieve the package is disclosed by Child in P[0067]: "upon detecting a package and/or other feature (e.g., a bar code on a label on a package … determines whether a person that is delivering, moving, and/or removing the detected package is authorized to do so.)", which requires information from the person attempting to handle the package.

Regarding claim 11, Gutke does not explicitly disclose determining whether a delivery service identified in the second image or from the device for the person matches the delivery service identified from the information for the package retrieval. 
However, Child discloses wherein determining, using the determined information for the package retrieval and the data from the second image or from the device for the person, whether the person is authorized to retrieve the package comprises determining whether a delivery service identified in the second image or from the device for the person matches the delivery service identified from the information for the package retrieval in P[0139]: "package detection module 1015 may monitor a package left outside a premises for pickup to determine whether the package is taken by an authorized delivery person or not. A person may be determined to be authorized to interact with the package based on detection of a uniform, delivery truck, company logo, badge or identification, barcode, etc. In some cases, authorization may be determined based on facial recognition, passcode query, and the like." It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Gutke to provide verification of delivery drivers based on service provider information, as taught by Child, to arrive at the claimed invention discussed above. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. It is predictable that the proposed modification would have provided the benefit of reducing false alarms for received packages.

Regarding claim 14, wherein determining, using the determined information for the package retrieval and the data from the second image or from the device associated with the person, whether the person is authorized to retrieve the package comprises: determining, using the data from the second image or from the device for the person, a confidence level that indicates a likelihood that the person is authorized to retrieve the package (Child, P[0051]: "using the other additional data may increase a probability and/or a confidence level assessed. 
… For example, identifying a package's presence based on image and/or video data of an area may yield a first probability and/or confidence level. Then, based on additional comparisons, calculations, analysis, identifications, and/or actions, the probability and/or confidence level may increase, stay the same, and/or decrease. For example, the probability may increase based at least in part on motion detection data, person detection data, clothing detection data, shape detection data, and/or uniform detection data, etc.”, and furthermore in P[0052]: “A notification that a package has been delivered, picked up, and/or moved from a location (based at least in part on a probability and/or confidence level)”); and determining whether the confidence level satisfies a confidence threshold is disclosed by Child in P[0015]: “a notification may be sent to a user based at least in part on whether a probability of the object event exceeds a predetermined probability threshold.”

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Gutke to include a threshold for the confidence score that is based on a likelihood that the person is authorized to retrieve the package, as taught by Child, to arrive at the claimed invention discussed above. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. It is predictable that the proposed modification would have provided the benefit of reducing the number of false positives (alarms triggered for authorized personnel).
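The mechanism the rejection maps from Child (several detection signals contributing to a confidence level that is then checked against a preset threshold) can be sketched as follows. This is a minimal illustration only; the signal names, weights, and threshold value below are hypothetical and are not taken from Child or from the claims.

```python
# Hypothetical weights for detection signals contributing to an
# authorization confidence level (illustrative values, not from Child).
SIGNAL_WEIGHTS = {
    "motion": 0.15,
    "person": 0.25,
    "uniform": 0.30,
    "badge": 0.30,
}

def authorization_confidence(signals: dict[str, bool]) -> float:
    """Sum the weights of the detection signals that fired."""
    return sum(w for name, w in SIGNAL_WEIGHTS.items() if signals.get(name))

def is_authorized(signals: dict[str, bool], threshold: float = 0.5) -> bool:
    """Treat a person as authorized when confidence meets the threshold."""
    return authorization_confidence(signals) >= threshold

# A person detected with a uniform and badge clears the threshold;
# motion alone does not, so a notification/alarm would be suppressed.
print(is_authorized({"person": True, "uniform": True, "badge": True}))  # True
print(is_authorized({"motion": True}))  # False
```

The point of the sketch is the structure Child describes: additional signals raise (or lower) the confidence level, and only the threshold comparison decides whether a notification issues.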
Regarding claim 15, determining, using data from a second image or from a device for the person (Gutke, P[0013]: “the package detection system 20 may be a doorbell camera, an overhead security camera, a standalone camera, or other suitable device … the package detection system 20 may communicate with a security system 42, a mobile phone or other electronic device”, which is the device for the person.), whether the person will likely retrieve the package for which the person is authorized to retrieve from two or more packages (Child, P[0043]: “notify the user about information related to one or more delivered/picked-up packages (e.g., delivery person, size of the package, timeliness, etc.).”) at the property; and in response to determining that the person will likely retrieve the package (which is the detection of an individual approaching the package), determining to skip performing an automated action (to skip sending an alarm/alert) that would have been performed if the person was likely to retrieve another package for which the person is not authorized to retrieve from the two or more packages at the property, where Child discloses that multiple packages may be present during retrieval in P[0043]: “notify the user about information related to one or more delivered/picked-up packages (e.g., delivery person, size of the package, timeliness, etc.).”, where the system will still send an alert when an inappropriate package is moved/retrieved since data relating to the packages is known and disclosed by Child in P[0130]: “For example, based on the detection of an edge, corner, shape, size, and/or logo in a captured image, for example, object identification sub-module 1115 may identify the presence of a certain object such as a box or package”, where the package is a certain package and not a general package. When an inappropriate package is moved, alert(s)/alarm(s) will be sent out.
Furthermore, when only the appropriate packages are retrieved, the alarm/alert is not sent and is skipped.

Claims 17 and 20 recite features nearly identical to those recited in claim 2. Claims 17 and 20 are rejected for reasons analogous to those discussed above in conjunction with claim 2. Claim 18 recites features nearly identical to those recited in claim 3. Claim 18 is rejected for reasons analogous to those discussed above in conjunction with claim 3.

6. Claims 12 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Gutke in view of “IoT-Based Security with Facial Recognition Smart Lock System” by Tri-Nhut Do et al. (hereinafter “Do”). As outlined above, Gutke appears to include all features of claims 12 and 13. To the extent that Gutke may not explicitly disclose skipping a prompt in order to expedite access, regarding claims 12 and 13, skipping a prompt in order to expedite access is disclosed in smart locks with facial recognition capabilities that provide a method for manual entry/access as well as facial recognition for automatic entry/access, as disclosed by Do in §II, B, P[001]: “automated door security program” for facial recognition that has a manual method for gaining access as well, as shown in Fig. 2. Do discloses further in §IV, Conclusion: “In this paper, an IoT-based smart lock system using face recognition employed deep learning algorithm on YOLO v3 is proposed to provide security. The system is controlled to lock or unlock the door by an android application which is installed on smart devices, and credentials are used for the same. The user credentials are validated by the database.
If invalid credentials are provided in the application, an alert email is sent with the stranger image in attachment to the house owner along with a popup warning notification to the user in the application.”

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Gutke to skip prompts and provide automatic access via facial recognition, as taught by Do, to arrive at the claimed invention discussed above. Such a modification is the result of combining prior art elements according to known methods to yield predictable results. It is predictable that the proposed modification would have provided the benefit of saving time gaining access by reducing the number of prompts/authentication steps before gaining access.

7. Claim 21 is rejected under 35 U.S.C. 103 as being unpatentable over Gutke in view of US 10853757 B1 to Edward Hill et al. (hereinafter “Hill”). Regarding claim 21, wherein causing presentation of the indication of which package is to be retrieved comprises: projecting onto an area in which the package is physically located one or more laser marks, is not explicitly disclosed by Gutke. Gutke discloses providing an audible indication that provides instructions for package handling, but does not disclose using laser marks to highlight the appropriate package. However, Hill discloses projecting onto an area in which the package is physically located one or more laser marks in P[0046]: “For example, as a vehicle arrives at the destination address for the delivery of a certain package, a light projector (LED, laser or other) can be used to shine focused light, or a particular color light, on the location of the package within the cargo area to show the delivery person exactly where the “matched” package is in the vehicle.
The focused light can be altered to change colors, blink, flash, or shine a pattern to signal additional information to the delivery person, for example, priorities of delivery and warnings of weight, or to signify that the package of interest is behind or under another package. Information is directly overlaid on the package that to be picked up, without needing any other screen or sound interface that might consume time to read or hear and consequently prolong the delivery process.”, and further and more explicitly in P[0130]: “embodiments of package tracking systems described herein include light guidance features, wherein a light, laser or light projector shines onto or near the package to be picked up or at the area where the package is to be placed. Use of the laser or light source can provide further include functionality beyond light guidance. For example, consider an embodiment of a package tracking system that includes a light projector, that light projector can superimpose images or notifications on the package or across the area where the package is to be placed. For example, when coming to pick up a package, a resident can see a notification of an upcoming event to be held on the premises as a text image superimposed on the package being picked up. For delivery, the driver may see a text message projected onto the area where the package is to be placed, for example, warning of a storm approaching the area or indicating that the package is fragile and requires handling care.”

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Gutke to incorporate light guidance, as taught by Hill, to arrive at the claimed invention discussed above. Such a modification is the result of combining prior art elements according to known methods to yield predictable results.
It is predictable that the proposed modification would have provided the benefit of saving time on package deliveries/pickups.

Conclusion

8. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. US 11792455 B1 to Sai-Wai Fu et al. is directed toward a smart camera with recognition features that includes a movable/aimable spotlight, as shown in P[0027] and P[0060].

9. Any inquiry concerning this communication or earlier communications from the examiner should be directed to TY M BEATTY whose telephone number is (703)756-5370. The examiner can normally be reached Mon-Fri: 8AM-4PM EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Gregory Morse, can be reached at (571) 272-3838. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/TY MITCHELL BEATTY/
Examiner, Art Unit 2663

/GREGORY A MORSE/
Supervisory Patent Examiner, Art Unit 2698

Prosecution Timeline

Mar 01, 2023: Application Filed
Jun 12, 2025: Non-Final Rejection — §102, §103, §112
Sep 03, 2025: Response Filed
Dec 03, 2025: Final Rejection — §102, §103, §112
Feb 10, 2026: Response after Non-Final Action
Mar 09, 2026: Request for Continued Examination
Mar 11, 2026: Response after Non-Final Action
Mar 18, 2026: Non-Final Rejection — §102, §103, §112 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12597275: VEHICLE INTERIOR MONITORING SYSTEM
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12579653: AUTOMATED METHOD FOR TOOTH SEGMENTATION OF THREE DIMENSIONAL SCAN DATA USING TOOTH BOUNDARY CURVE AND COMPUTER READABLE MEDIUM HAVING PROGRAM FOR PERFORMING THE METHOD
Granted Mar 17, 2026 (2y 5m to grant)
Patent 12555212: OBJECT DETECTION DEVICE AND METHOD FOR DETECTING MALFUNCTION OF OBJECT DETECTION DEVICE
Granted Feb 17, 2026 (2y 5m to grant)
Patent 12511787: METHOD, DEVICE AND SYSTEM OF POINT CLOUD COMPRESSION FOR INTELLIGENT COOPERATIVE PERCEPTION SYSTEM
Granted Dec 30, 2025 (2y 5m to grant)
Patent 12511750: IMAGE PROCESSING METHOD AND APPARATUS BASED ON IMAGE PROCESSING MODEL, ELECTRONIC DEVICE, STORAGE MEDIUM, AND COMPUTER PROGRAM PRODUCT
Granted Dec 30, 2025 (2y 5m to grant)
Based on this examiner's 5 most recent grants.

Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 70%
With Interview: 99% (+42.3% lift)
Median Time to Grant: 3y 1m
PTA Risk: High
Based on 27 resolved cases by this examiner. Grant probability derived from career allow rate.
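The headline figures above follow from simple arithmetic on the examiner's career data. As a minimal sketch (the helper names are illustrative, and the "without interview" rate of ~56.7% is inferred from the stated 99% with-interview rate and the +42.3% lift, not reported directly):

```python
# Illustrative derivation of the report's summary statistics from raw
# career data (function names are hypothetical, not from any API).

def allow_rate(granted: int, resolved: int) -> float:
    """Career allow rate as a percentage of resolved cases."""
    return 100.0 * granted / resolved

def interview_lift(rate_with: float, rate_without: float) -> float:
    """Percentage-point gain in allowance for cases that had an interview."""
    return rate_with - rate_without

# Figures shown in this report: 19 grants out of 27 resolved cases.
print(round(allow_rate(19, 27)))             # prints 70
# 99% with an interview vs. an implied ~56.7% without.
print(round(interview_lift(99.0, 56.7), 1))  # prints 42.3
```

Note the small sample: with only 27 resolved cases, both the allow rate and the interview lift carry wide uncertainty.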
