Prosecution Insights
Last updated: April 19, 2026
Application No. 18/777,784

EVENT DETECTING APPARATUS, EVENT DETECTING METHOD, AND STORAGE MEDIUM

Final Rejection §102
Filed
Jul 19, 2024
Examiner
DANG, HUNG Q
Art Unit
2484
Tech Center
2400 — Computer Networks
Assignee
National University Of Singapore
OA Round
2 (Final)
Grant Probability: 68% (Favorable)
OA Rounds: 3-4
To Grant: 3y 1m
With Interview: 87%

Examiner Intelligence

Career Allow Rate: 68% (above average; 1257 granted / 1841 resolved; +10.3% vs TC avg)
Interview Lift: +18.3% (strong) for resolved cases with interview
Avg Prosecution: 3y 1m (typical timeline; 95 currently pending)
Total Applications: 1936 across all art units (career history)
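The headline figures above are internally consistent and can be reproduced from the raw counts. The rounding and the additive interview adjustment below are assumptions about how the page combines its numbers, not a disclosed methodology.

```python
# Reproduce the examiner stats above from the raw counts shown on the page.
# The rounding and the additive interview adjustment are assumptions.

granted = 1257
resolved = 1841
interview_lift_pts = 18.3  # percentage points, from the page

allow_rate_pct = granted / resolved * 100                     # ~68.28%
baseline = round(allow_rate_pct)                              # 68 (Grant Probability)
with_interview = round(allow_rate_pct + interview_lift_pts)   # 87 (With Interview)

print(baseline, with_interview)
```

Note that rounding the unrounded rate (68.28 + 18.3 = 86.58) reproduces the page's 87%, while rounding first (68 + 18.3) would not; this ordering is an inference from the displayed numbers.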

Statute-Specific Performance

§101: 4.2% (-35.8% vs TC avg)
§103: 54.1% (+14.1% vs TC avg)
§102: 23.6% (-16.4% vs TC avg)
§112: 11.6% (-28.4% vs TC avg)
Tech Center average is an estimate • Based on career data from 1841 resolved cases
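Solving for the baseline each row implies (examiner rate minus its delta) gives the same value for every statute, which suggests the page compares all four statutes against a single Tech Center estimate of about 40%. Interpreting "vs TC avg" as examiner rate minus baseline is an assumption:

```python
# Recover the Tech Center baseline implied by each statute row above.
# Interpreting "vs TC avg" as (examiner_rate - baseline) is an assumption.

rows = {
    "§101": (4.2, -35.8),
    "§103": (54.1, +14.1),
    "§102": (23.6, -16.4),
    "§112": (11.6, -28.4),
}
baselines = {s: round(rate - delta, 1) for s, (rate, delta) in rows.items()}
print(baselines)  # every statute implies the same 40.0% baseline
```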

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant's arguments filed 10/02/2025 have been fully considered but they are not persuasive. On pages 10-13, Applicant argues that, […] Buehler does not disclose, or even suggest, taking the duration of an action into account, as recited in claim 1. Specifically, Paragraph 48 of Buehler describes that “an alert can be created to test for theft in a case where a product is removed from a shelf ... and is again recognized by an RFID station at the exit of the store but no image of the person visiting a checkout counter has been recorded.” This rule can be defined as a sequence of situations such as: (1) a product is removed from a shelf, (2) the product is detected at the store exit, and (3) the person who has removed the product does not visit the cashier. Because the detection is based only on the existence or non-existence of these situations, if a person who has stolen a product merely passes in front of the cashier without stopping, the theft cannot be detected at all. In contrast, according to one or more example embodiments consistent with claim 1, if the second action-related situation is defined as “a person does not stay in front of the cashier for more than one minute,” the situation where a person who has stolen a product merely passes in front of the cashier can still be detected as theft. This is because the action-related situation of “a person merely passes in front of the cashier” corresponds to “a person stays for less than one minute,” and therefore is not regarded as opposite to the second action-related situation. Accordingly, one or more example embodiments consistent with claim 1 can detect events with higher accuracy by explicitly taking the duration of actions into consideration.
Buehler fails to disclose or suggest the aforementioned features of claim 1 and, therefore, claim 1 is not anticipated by Buehler for at least these reasons.

In response, Examiner respectfully disagrees and submits, without acquiescing to any of Applicant’s characterization of Buehler’s teachings, that the claim does not recite any limitation of “a person does not stay in front of the cashier for more than one minute”, “a person merely passes in front of the cashier”, or “a person stays for less than one minute”. Thus, arguments of whether Buehler teaches any of these features are irrelevant. Instead, Examiner respectfully submits that, at least, Buehler teaches “a situation in which a specific action is not taken by a specific subject during a specific duration”. Specifically, Buehler teaches a theft event is detected when there occurs a sequence of action-related situations comprising: (1) a person removing a product from a shelf, (2) the person carrying the product did not visit a checkout counter, and (3) the person with the product exited the store in that order during a specific duration starting from the time the situation (1) is detected and ending when the situation (3) is detected. Thus, if during such a specific duration, situation (2) occurs then it is determined that a theft event has occurred. Further, the situation (2) represents a situation in which a specific action, i.e. the action of visiting a checkout counter, is not taken by the specific subject, i.e. the person who removed the product from the shelf without putting it back. As such, Buehler clearly teaches the amended limitations; and Applicant’s arguments are not persuasive.

Claim Rejections - 35 USC § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C.
102 that form the basis for the rejections under this section made in this Office action: A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention. Claims 1, 3-4, 6-8, 10-11, 13-15, 17-18, and 20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Buehler (US 2007/0182818 A1 – hereinafter Buehler). Regarding claim 1, Buehler discloses an event detecting apparatus ([0087] – a computer) comprising: at least one memory that is configured to store instructions ([0086] – at least one memory to store software instructions); and at least one processor that is configured to execute the instructions ([0087] – one or more processors of the computer to execute the stored software) to: acquire one or more video data ([0019]; [0049]; [0057] – acquiring a plurality of video frames captured by a camera); generate, from the one or more video data, object relationship information that indicates two or more action-related relationships between objects ([0047]-[0061]; [0067] – analyzing the video frames to detect object relationship information that indicates two or more action-related relationships between objects such as a person removing a product and the person taking the product to exit without paying for the product, etc.); acquire an event information that indicates an event of interest by a sequence of action-related situations at least one of which represents a situation in which a specific action is taken by a specific subject ([0048] – acquiring event information such as rules that indicates an event of interest by a sequence of action-related situations at least one of which represents a situation in which a specific action is taken by a specific subject – for instance, a person removing a product and the person taking the product to exit without paying for the product, 
etc.); and determine whether or not the event of interest occurs based on the object relationship information and the event information ([0047]; [0068] – detecting an event based on the analysis and the rules to trigger alerts and warnings etc.), wherein it is determined that the event of interest occurs in a case where the object relationship information includes a sequence of the action-related relationships that matches the sequence of the action-related situations indicated by the event information ([0048] – it is determined that a theft event occurs in a case where a sequence of the action-related relationships is as follows: (1) a product is removed from a shelf by a particular person, (2) the person did not visit a checkout counter, and (3) the person with the product exited the store), wherein the event information includes a first action-related situation, a second action-related situation, and a third action-related situation ([0048] – (1) a product is removed from a shelf by a particular person, (2) the person did not visit a checkout counter, and (3) the person with the product exited the store), wherein each one of the first action-related situation and the third action-related situation represents a situation in which a specific action is taken by a specific subject during a specific duration ([0048] – it is determined that a theft event occurs in a case where a sequence of the action-related relationships is as follows: (1) a product is removed from a shelf by a particular person, i.e. the person took an action of removing the product off the shelf and (3) the person with the product exited the store, i.e.
the person took an action of taking the product exiting the store during a specific duration, which is a duration from the time the product was removed from the shelf to the time the person was detected exiting the store), wherein the second action-related situation represents a situation in which a specific action is not taken by a specific subject during a specific duration ([0048] – (2) the person did not visit a checkout counter, i.e. the person with the product did not visit the checkout counter during the specific duration from the time the product was removed from the shelf to the time the person was detected exiting the store), and wherein it is determined that the event of interest occurs in a case where the object relationship information includes a first action-related relationship matching the first action-related situation and a third action-related relationship matching the third action-related situation in this order and in a case where the object relationship information includes no action-related relationship that matches an action-related situation opposite to the second action-related situation ([0048] – the event occurs when the sequence of actions is in the following order: (1) a product is removed from a shelf by a particular person, (2) the person did not visit a checkout counter, and (3) the person with the product exited the store).

Regarding claim 3, Buehler also discloses the event detecting apparatus according to claim 1, wherein the temporal action-related relationship indicates a combination of an action, a subject of the action, and an object of the action, wherein the action-related situation indicates a combination of an action, a subject of the action, and an object of the action ([0068] – indicating an action, e.g. removing a product, taking the product to exit without paying, a subject of the action, e.g. a person, and an object of the action, e.g.
the product), and wherein the temporal action-related relationship is determined to match the action-related situation in a case where the action, the subject of the action, and the object of the action indicated by the temporal action-related relationship respectively match the action, the subject of the action, and the object of the action indicated by the action-related situation ([0068] – a match indicating a person removing a product, then the person taking the product to exit without paying).

Regarding claim 4, Buehler also discloses the event detecting apparatus according to claim 1, wherein the event information includes a first action-related situation and a second action-related situation in this order ([0068] – first situation: a person removing a product, and the second situation: the person taking the product to exit without paying), and wherein it is determined that the event of interest occurs in a case where the object relationship information includes a first action-related relationship matching the first action-related situation and a second action-related relationship matching the second action-related situation in this order ([0068] – a match indicating a person removing a product, then the person taking the product to exit without paying in that order).

Regarding claim 6, Buehler also discloses the event detecting apparatus according to claim 1, wherein the event information indicates two or more objects that are of a same type as each other and are different from each other in a distinguishable manner (Fig. 3 – person 320 and person 325 are of same type and different from each other in a distinguishable manner).
Regarding claim 7, Buehler also discloses the event detecting apparatus according to claim 1, wherein the object relationship information represents a sequence of scene graphs each of which represents relationships between objects that exist at a time or during a period of time ([0068] – object relationship information represents a sequence of scene graphs, e.g. two nodes of object connected by an action edge).

Claim 8 is rejected for the same reason as discussed in claim 1 above.
Claim 10 is rejected for the same reason as discussed in claim 3 above.
Claim 11 is rejected for the same reason as discussed in claim 4 above.
Claim 13 is rejected for the same reason as discussed in claim 6 above.
Claim 14 is rejected for the same reason as discussed in claim 7 above.
Claim 15 is rejected for the same reason as discussed in claim 1 above in view of Buehler also disclosing a non-transitory computer-readable storage medium storing a program that causes a computer to execute the recited steps ([0086]-[0087] – at least one memory to store software instructions that cause a computer to execute the steps).
Claim 17 is rejected for the same reason as discussed in claim 3 above.
Claim 18 is rejected for the same reason as discussed in claim 4 above.
Claim 20 is rejected for the same reason as discussed in claim 6 above.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action.
In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to HUNG Q DANG whose telephone number is (571)270-1116. The examiner can normally be reached IFT.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Thai Q Tran can be reached at 571-272-7382. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/HUNG Q DANG/Primary Examiner, Art Unit 2484
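The dispute above turns on a rule structure: a first and third action-related situation must occur in order, with no action matching the negation of the second (negative) situation in between, and the applicant's amendment adds a duration threshold so that a mere pass-by does not count as "visiting the cashier." A minimal sketch of that structure follows; all event names, fields, and the 60-second threshold are illustrative, not the claim language.

```python
# Minimal sketch of the rule structure argued over above. A theft event
# requires: (1) remove_from_shelf, then (3) exit_store, in that order,
# with no sufficiently long checkout stay in between (the negated second
# situation). Names, fields, and the 60 s threshold are illustrative.

from dataclasses import dataclass
from typing import List

@dataclass
class Obs:
    t: float               # timestamp, seconds
    subject: str           # tracked person id
    action: str            # detected action label
    duration: float = 0.0  # how long the action lasted, seconds

def theft_detected(observations: List[Obs], min_checkout_stay: float = 60.0) -> bool:
    by_subject: dict = {}
    for o in sorted(observations, key=lambda o: o.t):
        by_subject.setdefault(o.subject, []).append(o)
    for obs in by_subject.values():
        t_remove = next((o.t for o in obs if o.action == "remove_from_shelf"), None)
        if t_remove is None:
            continue
        t_exit = next((o.t for o in obs
                       if o.action == "exit_store" and o.t > t_remove), None)
        if t_exit is None:
            continue
        # Applicant's point: a mere pass-by (stay below the threshold)
        # does not defeat the "did not visit the cashier" situation.
        stayed = any(o.action == "checkout_stay"
                     and t_remove < o.t < t_exit
                     and o.duration >= min_checkout_stay
                     for o in obs)
        if not stayed:
            return True
    return False
```

Under this sketch, a five-second pass-by in front of the cashier still triggers the event, while a 90-second checkout stay suppresses it, which is the distinction the applicant draws against an existence-only rule.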

Prosecution Timeline

Jul 19, 2024
Application Filed
Jun 30, 2025
Non-Final Rejection — §102
Oct 02, 2025
Response Filed
Oct 19, 2025
Final Rejection — §102 (current)
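The reply windows stated in the final office action can be computed from its Oct 19, 2025 date: a three-month shortened statutory period, extendable under 37 CFR 1.136(a) up to the absolute six-month statutory cap. A sketch of that arithmetic follows; weekend and federal-holiday roll-forward rules, which can push a deadline to the next business day, are not modeled.

```python
# Compute the reply windows stated in the final office action from the
# Oct 19, 2025 date on the timeline. Weekend/holiday roll-forward rules
# are not modeled.

from datetime import date

def add_months(d: date, months: int) -> date:
    """Same day-of-month `months` later; day 19 never overflows."""
    m = d.month - 1 + months
    return date(d.year + m // 12, m % 12 + 1, d.day)

mailed = date(2025, 10, 19)
shortened_period_ends = add_months(mailed, 3)  # 2026-01-19, no extension fee
statutory_cap = add_months(mailed, 6)          # 2026-04-19, absolute limit
```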

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12594460
MANAGING BLOBS FOR TRACKING OF SPORTS PROJECTILES
2y 5m to grant Granted Apr 07, 2026
Patent 12588818
DETECTION OF A MOVABLE OBJECT WHEN 3D SCANNING A RIGID OBJECT
2y 5m to grant Granted Mar 31, 2026
Patent 12592258
METHOD AND APPARATUS FOR INTERACTIVE VIDEO EDITING PLATFORM TO CREATE OVERLAY VIDEOS TO ENHANCE ENTERTAINMENT VIDEO GAMES WITH EDUCATIONAL CONTENT
2y 5m to grant Granted Mar 31, 2026
Patent 12587693
ARTIFICIALLY INTELLIGENT AD-BREAK PREDICTION
2y 5m to grant Granted Mar 24, 2026
Patent 12574649
ENCODING AND DECODING METHOD, ELECTRONIC DEVICE, COMMUNICATION SYSTEM, AND STORAGE MEDIUM
2y 5m to grant Granted Mar 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 68%
With Interview: 87% (+18.3%)
Median Time to Grant: 3y 1m
PTA Risk: Moderate
Based on 1841 resolved cases by this examiner. Grant probability derived from career allow rate.
