Prosecution Insights
Last updated: April 19, 2026
Application No. 18/275,991

PERSON DETECTION DEVICE, PERSON DETECTION SYSTEM, AND PERSON DETECTION METHOD

Status: Final Rejection (§103)
Filed: Aug 04, 2023
Examiner: KONERU, SUJAY
Art Unit: 3624
Tech Center: 3600 — Transportation & Electronic Commerce
Assignee: NEC Corporation
OA Round: 2 (Final)
Grant Probability: 58% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 3y 2m
With Interview: 95%

Examiner Intelligence

Career Allow Rate: 58% (421 granted / 722 resolved), +6.3% vs TC avg
Interview Lift: +37.0% (strong), comparing resolved cases with vs. without an interview
Typical Timeline: 3y 2m average prosecution; 36 applications currently pending
Career History: 758 total applications across all art units
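As a quick arithmetic check, the headline figures above follow directly from the raw counts shown. The sketch below is illustrative only (it is not the analytics tool's actual code, and the variable names are our own):

```python
# Illustrative check of the examiner statistics shown above
# (not the analytics tool's actual code; names are our own).
granted = 421      # cases this examiner allowed
resolved = 722     # total resolved cases

allow_rate = granted / resolved          # ~0.583, displayed as 58%
print(f"Career allow rate: {allow_rate:.1%}")

# The "+6.3% vs TC avg" delta implies a Tech Center average near 52%.
implied_tc_avg = allow_rate - 0.063
print(f"Implied TC average: {implied_tc_avg:.1%}")
```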

Statute-Specific Performance

§101: 37.9% (-2.1% vs TC avg)
§103: 50.7% (+10.7% vs TC avg)
§102: 2.0% (-38.0% vs TC avg)
§112: 7.4% (-32.6% vs TC avg)

Tech Center averages are estimates. Based on career data from 722 resolved cases.

Office Action

§103
DETAILED ACTION

This Final Office Action is in response to Applicant's arguments filed on February 3, 2026. Applicant has amended claims 1, 4, 7-8, 11, 15, and 18 and canceled claims 2-3, 9-10, and 16-17. Currently, claims 1, 4-8, 11-15, and 18-21 are pending. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Amendments

The 35 U.S.C. 103 rejections of claims 1, 4-8, 11-15, and 18-21 are withdrawn in light of Applicant's amendments to claims 1, 4, 7-8, 11, 15, and 18. Applicant's amendments necessitated the new grounds of rejection set forth in this Office Action.

Response to Arguments

Applicant's remarks filed February 3, 2026 have been considered but are not persuasive. Applicant argues on p. 8 of the remarks that the 103 rejection is improper because the cited art does not teach the amended language. Examiner disagrees and notes that the Tillotson reference teaches the limitations at para [0053]-[0058]. Tillotson shows that identification of an animal or human can be based on a change of shape; because the examples given in the image-identification context include walking and gesturing, it would have been obvious to one of ordinary skill in the art that such a change of shape also shows a change in posture. Moreover, Tillotson identifies the object by comparing shapes to classified images of those shapes, with the determination made by matching features of the image; it would have been obvious to one of ordinary skill in the art, by matching the features of the different classifications (such as a gorilla, a bird, or a human), that each classification can be considered to show a difference falling within a range. Therefore, the 103 rejections are maintained.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103, which forms the basis for all obviousness rejections set forth in this Office Action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Claims 1, 4-8, 11-15, and 18-21 are rejected under 35 U.S.C. 103 as being unpatentable over Kano et al. (US 2022/0404502 A1) (hereinafter Kano) in view of Kojima et al. (US 2022/0101014 A1) (hereinafter Kojima), further in view of Tillotson (US 2018/0325096 A1).

Claims 1, 8, and 15: Kano discloses the following limitations of claims 1, 8, and 15: A person detection device (and corresponding system and method) (see para [0007], "In the laser radar according to this aspect, the object detection surface is set so as to widen toward the monitoring region, so that the laser light with which scanning is performed along the object detection surface as the rotary part rotates is less likely to be blocked by a facility or the like inside the monitoring region. Therefore, entry of an object such as a person into the monitoring region can be more reliably detected."
) comprising: a three dimensional model generator configured to generate… a three-dimensional model of a restricted area, based on reflected light associated with laser light with which the restricted area is irradiated (see para [0006]-[0008], showing use of a three-dimensional monitoring region and light detection to make the detection, where it would have been obvious to one of ordinary skill in the art that a monitoring region can be considered to be a restricted area, which functions the same way); and a person detector configured to detect a person in the restricted area by using the three-dimensional model (see para [0007], quoted above, where the monitoring region is three-dimensional as shown in para [0006]).

Kano, however, does not specifically disclose "in an area where water of a dam is discharged" or "an area in a periphery of the area." In analogous art, Kojima discloses these limitations (see para [0031], "Each surveillance area 100 is an area that is a target for security or surveillance. In the present embodiment, as an example, the surveillance area 100 is a site of a facility, but may be a roadway, a sidewalk, or the like. The facility may be a plant, may be a school, a house, a train station, an airport, a museum, a hospital, a store (as an example, a restaurant, a retail store), or the like, or may be a theme park, an amusement park, attraction facilities of these, or the like. An example of the plant includes: in addition to industrial plants relating to chemistry, biotechnology, and the like, plants for managing and controlling wellheads in a gas field, an oil field, and the like, and their surroundings; plants for managing and controlling power generation of hydroelectric power, thermal power, nuclear power, and the like; plants for managing and controlling energy harvesting from solar power, wind power, and the like; plants for managing and controlling water and sewerage, dams, and the like; and the like. Each surveillance area 100 may be run or used by companies or individuals (also referred to as “tenants”) different from each other.").

It would have been obvious to one of ordinary skill in the art at the time of the invention to combine the teachings of Kojima with Kano, because including the context of dams for the surveillance provides a commercial use for the surveillance technology (see Kojima, para [0001]-[0006]). Moreover, it would have been obvious to one of ordinary skill in the art to include the surveillance system taught by Kojima in the laser radar system of Kano, since the claimed invention is merely a combination of old elements, in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.

Kano and Kojima do not specifically disclose wherein the person detector compares a shape of an object in the restricted area with a plurality of reference shapes associated with postures different from one another. In analogous art, Tillotson discloses the following limitations: wherein the person detector: compares a shape of an object in the restricted area with a plurality of reference shapes associated with postures different from one another (see para [0053], "Regarding techniques based on feature/shape/image recognition, a shape of object O can be compared to one or more image features, one or more animal shapes, and/or compared to one or more images, to classify and/or otherwise determine whether object is an animal. For example, if the shape of object O matches at least one of the one or more animal shapes, then object O can be classified as an animal. As another example, an input image can be provided by camera(s)/sensor(s) 104 to control software 120. Then, control software 120 can extract features, such as lines, colors, patches of color, shapes, and textures, from the input image." and para [0057], "In other examples, determining whether object O is an animal that has entered into the protected area includes: determining whether an animal has entered into the protected area includes: determining whether an object that has entered into the protected area is an animal, a human, or a human-made object based on one or more images of the object; e.g., by comparing the shape of object O to one or more animal shapes, human shapes, and shapes of human-made objects. In these examples, control software 120 determines that object O is an animal after determining that object O is not a human or human-made object." and para [0054]-[0056], where para [0056] shows that identification of an animal or human can be based on a change of shape, and it would have been obvious to one of ordinary skill in the art that such a change of shape, in an image-identification context where examples of walking and gesturing are given, also shows a change in posture), calculates a difference between the shape of the object in the restricted area and at least one of the plurality of reference shapes (see para [0053]-[0058], where comparing object O based on similarity of matched features against the images stored in the database can be considered to show a difference, because all of the classifications differ from one another), and determines that the object is a person when the calculated difference falls within a predetermined range (see para [0053]-[0058], where it would have been obvious that the similarity comparison shows a difference within a range, because the matched features that result in a determination of a human or a gorilla can each be considered a range of the specific matched features needed to make such a classification).

It would have been obvious to one of ordinary skill in the art at the time of the invention to combine the teachings of Tillotson with Kano and Kojima, because using such shapes enables more effective detection of specific objects and protection of property (see Kojima, para [0001]-[0004]). Moreover, it would have been obvious to one of ordinary skill in the art to include the system for using lasers to detect animals taught by Tillotson in the Kano and Kojima combination, since the claimed invention is merely a combination of old elements, in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.

Claims 4, 11, and 18: Kano further discloses the following limitations: a motion detector configured to detect a motion of the object, based on a difference between a frequency component contained in the laser light and a frequency component contained in the reflected light (see para [0098]-[0104], especially "The controller 201 causes projection lights to be projected from the respective optical units 40 at the angles θ1 to θ6 shown in FIG. 7B, the reflected light corresponding to each projection light is received by each optical unit 40, and the distance to an object is calculated on the basis of a time of flight. In addition, the controller 201 calculates the angle at the position of the object about the rotation axis R10 in the X-Y plane, on the basis of the angle in the circumferential direction (rotational position) of the optical unit 40 at the timing at which the reflected light is received. Then, the controller 201 determines whether or not the object exists in the detection ranges RD1 to RD6, on the basis of the calculated distance and angle. Accordingly, it is recognized whether or not the object is positioned in the monitoring region RM shown in FIGS. 10A and 10B."), wherein the person detector determines whether the object is the person, based on the shape of the object and the motion of the object (see para [0114] and Figs. 14A-B, showing detection of a person based on the movement of the head or toe (shape)).

Claims 5-6, 12-13, and 19-20: Kano and Kojima do not specifically disclose wherein the person detector determines whether the object is a person, based on the shape of the object, when the object is still, and determines whether the object is the person, based on the motion of the object, when the object is moving. In analogous art, Tillotson discloses the following limitations: wherein the person detector determines whether the object is the person by comparing the shape of the object with a reference shape (see para [0057], quoted above); wherein the person detector determines whether the object is a person, based on the shape of the object, when the object is still, and determines whether the object is the person, based on the motion of the object, when the object is moving (see para [0055]-[0058], showing the determination can be made based on a single human object image or based on shape changes (motion) such as walking, gesturing, etc.);
wherein the person detector determines whether the object is the person, based on the shape of the object, when the object is still, and determines whether the object is the person, based on the shape of the object and the motion of the object, when the object is moving (see para [0055]-[0058], showing the determination can be made based on a single human object image or based on shape changes (motion) such as walking, gesturing, etc.).

It would have been obvious to one of ordinary skill in the art at the time of the invention to combine the teachings of Tillotson with Kano and Kojima, because using such shapes enables more effective detection of specific objects and protection of property (see Kojima, para [0001]-[0004]). Moreover, it would have been obvious to one of ordinary skill in the art to include the system for using lasers to detect animals taught by Tillotson in the Kano and Kojima combination, since the claimed invention is merely a combination of old elements, in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.

Claims 7, 14, and 21: Kano does not explicitly disclose that the information is used for a notice of warning for the person. In analogous art, Kojima discloses the following limitations: an output controller configured to control information relating to the person to an external dam maintenance system (see para [0036]-[0039], showing surveillance data communicated via a network, which can be carried out by a security company), wherein the information is used for a notice of warning for the person (see para [0072], "In addition, according to the surveillance control program, the surveillance processing unit 460 may generate the surveillance screen showing a situation or the like of the surveillance area 100 that is the target for surveillance. For example, when receiving, from the surveillance device 104, an alert that a suspicious person is found, or the like, the surveillance processing unit 460 may perform processing of displaying, on the surveillance screen, the alert and the captured image in which the suspicious person is shown. In addition, in response to receiving, from the surveillance terminal 140, the instruction from the security personnel or the surveillance personnel, the surveillance processing unit 460 performs, according to the surveillance control program, processing in accordance with the instruction.").

It would have been obvious to one of ordinary skill in the art at the time of the invention to include the surveillance system taught by Kojima in the laser radar system of Kano, since the claimed invention is merely a combination of old elements, in the combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Yano et al. (US 2022/0343538 A1) discloses a system for creating a model configured to hold at least one image of the registration target object in one or more postures and a reference model indicating a shape of a reference object; acquire information indicating a feature of the registration target object in a first posture; and correct, when a shape in the first posture that is indicated by the reference model is determined to be dissimilar based on a predetermined first condition, the reference model based on the information indicating the feature, to thereby create the model indicating the shape of the registration target object.

THIS ACTION IS MADE FINAL. Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to SUJAY KONERU, whose telephone number is (571) 270-3409. The examiner can normally be reached M-F, 8:30 AM to 5:00 PM. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Patricia Munson, can be reached at (571) 270-5396. The fax number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/SUJAY KONERU/
Primary Examiner, Art Unit 3624

Prosecution Timeline

Aug 04, 2023: Application Filed
Oct 31, 2025: Non-Final Rejection (§103)
Feb 03, 2026: Response Filed
Feb 16, 2026: Final Rejection (§103) (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12596979: PERSONALIZED RISK AND REWARD CRITERIA FOR WORKFORCE MANAGEMENT (granted Apr 07, 2026; 2y 5m to grant)
Patent 12596972: CONVERSATION-BASED MESSAGING METHOD AND SYSTEM (granted Apr 07, 2026; 2y 5m to grant)
Patent 12585868: SYSTEM TO TRACE CHANGES IN A CONFIGURATION OF A SERVICE ORDER CODE FOR SERVICE FEATURES OF A TELECOMMUNICATIONS NETWORK (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579553: REUSABLE DATA SCIENCE MODEL ARCHITECTURES FOR RETAIL MERCHANDISING (granted Mar 17, 2026; 2y 5m to grant)
Patent 12572990: METHODS AND IoT SYSTEMS FOR MONITORING WELDING OF SMART GAS PIPELINE BASED ON GOVERNMENT SUPERVISION (granted Mar 10, 2026; 2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 58%
With Interview: 95% (+37.0%)
Median Time to Grant: 3y 2m
PTA Risk: Moderate
Based on 722 resolved cases by this examiner. Grant probability derived from career allow rate.
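The "With Interview" figure above is consistent with the interview lift being applied as additive percentage points on top of the base grant probability (an assumption on our part; the page does not state its model, but 58% + 37.0% = 95% matches the displayed numbers):

```python
# Sketch of the interview adjustment (assumption: the +37.0% lift is
# additive in percentage points, matching the figures displayed above).
base_grant_probability = 58.0   # % (from career allow rate)
interview_lift = 37.0           # percentage points

with_interview = base_grant_probability + interview_lift
print(f"Grant probability with interview: {with_interview:.0f}%")  # 95%
```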
