DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Claims 5-6 are withdrawn from further consideration pursuant to 37 CFR 1.142(b), as being drawn to a nonelected Group and Species, there being no allowable generic or linking claim. Applicant timely traversed the restriction (election) requirement in the reply filed on 05/15/2023.
Applicant's election with traverse of Group I and Species III in the reply filed on 05/15/2023 is acknowledged. The traversal is on the ground(s) that “…features identified in the description of Species I on page 4 of the Restriction Requirement, such as the detector, are not identified in the description of Species II or Species III…it is noted that the specification clearly supports the use of certain elements…in Species II and III, as well as in Species I…Applicant’s election of a species for prosecution should not be limited to the description of said features provided in the Restriction Requirement, but rather, what is supported by the specification.” This is not found persuasive because each of the Species requires unique features not required in the other Species. If applicant is traversing on the ground that the inventions are not patentably distinct, applicant should submit evidence or identify such evidence now of record showing the inventions to be obvious variants or clearly admit on the record that this is the case. In either instance, if the Examiner finds one of the inventions unpatentable over the prior art, the evidence or admission may be used in a rejection under 35 U.S.C. 103 or pre-AIA 35 U.S.C. 103(a) of the other invention.
The requirement is still deemed proper and is therefore made FINAL.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claim(s) 1-4, 7-9, 16, 17, 19, 20, and 22 is/are rejected under 35 U.S.C. 103 as being unpatentable over KR 10-2008-0109307 to Chyun in view of Ronnau 2006/0150470 and Zirkle et al. 10,791,728 and WO 2014/166498 to Borchersen and Tracy 2007/0075860 and Argue et al. 2014/0168427 or Walz et al. 2017/0007091 and Weber-Grabau 2016/0245916.
In regard to claim 1, Chyun discloses a live catch trap for rodents (100) comprising: a trap body (110); a light-based sensor (sensor/detection unit 150 may be selected based on operational requirements and may be an infrared sensor and wherein sensor 150 is electrically connected to switch unit 135 which selectively activates a camera module 133 according to whether the sensor 150 detects the pest, wherein the camera module 133 is for collecting the image information of the pest) associated with the trap body, the light-based sensor (camera module 133) configured to monitor activity in the trap including rodent presence (camera module 133 collects video information of the pests such as type, population, and state of pests introduced into housing 110 and whether the pests are captured by the capture unit 120 or whether the pests have passed through the capture unit and whether the pests flow in and out through doorway 112), the light-based sensor is a photo sensor (camera module 133 contains a photo sensor, which is the specific component of the camera that converts light into electrical signals to create an image) arranged on the inner side of the cover/lid (cover case 113; see Fig. 5) of the trap body (110), which is operated on a predetermined periodic basis (Chyun discloses that camera module 133 can operate in a sleep mode, which refers to a state in which minimum power is applied and is maintained in a state similar to that of being inactive without performing normal operation and consuming minimal energy, and when the pest is detected by detection unit 150, then switch unit 135 activates camera module 133, wherein the predetermined periodic basis is the occasions on which the detection unit 150 detects a pest to activate the camera module 133) independent of any other operating parameter of the photo sensor (switch unit 135 for activating camera module 133 is electrically connected to the detection unit 150 and selectively activates the camera module 133 when detection unit 150 detects the pest and is not dependent upon any operating parameter of the camera module 133) for monitoring live rodent activity in the trap; a circuit module (130) comprising a circuit board (130a) and a switch unit (135; switch unit 135 electrically connected to detection unit 150 [may be selected based on operational requirements and may be an infrared sensor] to switch the camera module 133), wherein the circuit module is configured to receive and transmit data from the light-based sensor to determine a type of trap activity; and a communication transmitter (communication module 137) configured to send trap status information including the type of trap activity (camera module 133 obtains information of pest in housing 110, and collected information is transmitted to an external management center 160 via communication module 137) from the trap to a remote user, but does not disclose a microprocessor configured to evaluate data received from the light-based sensor to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents, or a wireless communication transmitter.
Ronnau discloses a microprocessor (see paras. 0040, 0149) configured to evaluate data received from the light-based sensor (see paras. 0041-42, 0057, 0059; paras. 0041 & 0057 indicate that the detection unit may include a camera cooperating with suitable analysis and recognition software/computer-based image analysis and pattern recognition and para. 0040 discloses that the detection units incorporate a microprocessor) to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents in the interior of the trap (see para. 0057 of Ronnau, which discloses that more complex tests, when species determination is concerned, require more nuanced situation images where modern digital camera technique is used combined with special image analysis and pattern recognition software, which is known from analyses of complicated biological subjects such as insects and plant seeds when identifying single individuals through biometric codes; Ronnau discloses that specific species may be determined) and a wireless communication transmitter (units 21,22 are connected with global server through radio link 32, detection and capture unit 25 is connected 31 through an incorporated GSM module with the global system server 54 through GSM link station 36; see paras. 0035-36, 0046-49, 0053-55, and 0147).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the live catch trap for rodents of Chyun such that it comprises a microprocessor configured to evaluate data received from the light-based sensor to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents in an interior of the trap, and a wireless communication transmitter in view of Ronnau in order to provide a means for analyzing the data from the light-based sensor at the trap, so that the data sent to the remote user is filtered and refined to provide meaningful information with respect to the status of the live catch trap and avoids false positives or irrelevant information that wastes the remote user’s time and resources, and a means for communicating with the remote user without the need to extend wiring between the trap and the remote user, which could be cumbersome in the usage environment, and to allow placement of the live catch trap at locations remote from the user without the limitations imposed by communication wiring. Chyun and Ronnau do not disclose the microprocessor utilizing pattern recognition to evaluate the received data for the presence of debris not identified as a rodent in an interior of the trap. Zirkle et al. disclose the microprocessor (electronics compartment 120 including a computer processor of a wireless communications module 150) evaluating received data from a sensor (140, which may be a contact switch, digital weight scale, tilt meter, flex meter, magnet switch, infrared sensor or some other suitable sensor) for the presence of stationary debris/bait or matter intended to be consumed by a rodent (amount of bait 130 remaining) not identified as a rodent (the amount of bait 130 is identified) in an interior of a bait station (100).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor of Chyun and Ronnau such that the microprocessor evaluates the received data for the presence of stationary debris not identified as a rodent in an interior of the trap in view of Zirkle et al. in order to monitor other features of the interior of the trap such as the amount of debris/bait available so as to alert the user as to when to replenish the debris/bait so that the amount of debris/bait is maintained at acceptable levels for proper attraction of the rodents and provide additional trap status information that is valuable to the remote user such as bait status so that the remote user can identify the exact type of maintenance required on the trap or exact type of attention needed by the trap. Chyun, Ronnau, and Zirkle et al. disclose the microprocessor to evaluate the received data for the presence of bait on a floor of the trap (120 of Chyun disposed on floor of 110; 130,135 of Zirkle et al. disposed on floor of 105), but do not disclose wherein the microprocessor utilizing pattern recognition to evaluate the received data for the presence of stationary debris/bait not identified as a rodent in an interior of the trap. Borchersen discloses a system for determining feed consumption of at least one animal wherein the microprocessor (processing means in the form of a computer 6) utilizing pattern recognition (the resulting images produced by camera 4 are evaluated according to pixels; the reduction in feed between subsequent images may be determined by identifying the feed in each image and calculating the difference in height of corresponding image areas, such as pixels, representing feed in subsequent images. The base level of the feeding area may be known, e.g. by having range images of the empty feeding area as a reference. 
As previously mentioned, it may be difficult to determine the exact amount of feed consumed by each animal) to evaluate the received data (data received from camera 4) for the presence of animals (identification of the animals may be part of the processing means, e.g. images showing the feeding area also show at least part of the feeding animals, and the animals can then be identified in the images by means of image processing) and the presence of stationary debris/feed (the feeding area for each cow can typically be assessed with image processing, and if images are acquired continuously while the animals are eating, the feed consumption of each animal can still be assessed by determining the reduction in feed between subsequent images) not identified as an animal (identified animals). Borchersen discloses the microprocessor evaluating the received data for the presence of stationary debris (amount of feed 2) not identified as a rodent (distinguishes feed 2 from cows) on a floor of the area (floor of the feeding area bounded by feed fence 7 upon which the feed 2 rests). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor utilizing pattern recognition to evaluate the received data of Chyun, Ronnau, and Zirkle et al. such that it evaluates the received data for the presence of stationary debris/bait not identified as a rodent on a floor of the trap in view of Borchersen in order to provide an alternative means for ascertaining the amount of bait remaining via visual analysis techniques that are not prone to physical failure, unlike the switches, digital scales, and meters taught by Zirkle et al., which also may require calibration in order to be accurate.
Chyun, Ronnau, Zirkle et al., and Borchersen do not disclose wherein the microprocessor utilizing pattern recognition to evaluate the received data for the presence of stationary debris not identified as rodent bait or matter intended to be consumed by a rodent on a floor of the trap. Tracy discloses an alarm for selectively detecting intrusions by persons wherein a light-based sensor (transmitter 30, sensors 40-70, filters 80-110, amplifiers 120-150; see Fig. 1) is associated with an area to be monitored (location or region 20), the light-based sensor configured to monitor activity in the area of interest including animal presence (dog 200, bird 220, human 230, larger animal such as a deer 240); a microprocessor (CPU 160) configured to evaluate data received from the light-based sensor to determine a type of activity (distinguish intrusions by persons from intrusions by others such as nonhuman animals and inanimate objects), the microprocessor utilizing pattern recognition (see para. 0026) to evaluate the received data for the presence of a human or nonhuman animal (200, 220, 230) and debris (inanimate objects such as falling leaves 210) in the area to be monitored; and a communication transmitter (alarm output 170) configured to send status information including the type of activity from the area to be monitored (20) to a remote user (user is alerted by alarm output 170, which can be a speaker, visual output or any other device for alerting others to the presence of an intruder).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the trap of Chyun, Ronnau, Zirkle et al., and Borchersen such that the microprocessor utilizes pattern recognition to evaluate the received data for the presence of debris not identified as rodent bait on a floor of the trap in view of Tracy in order to determine the presence of specific inanimate objects such as debris within the interior of the trap so as to fully inform the user of the status of the trap with respect to the presence of inanimate objects within the trap which may hinder the operation of the trap. Chyun, Ronnau, Zirkle et al., Borchersen, and Tracy disclose the microprocessor (Ronnau, Zirkle et al., Borchersen, and Tracy) configured to evaluate received data for the presence of moving debris (Tracy) not identified as a rodent (Chyun, Ronnau, Borchersen, Tracy) or rodent bait or matter intended to be consumed by a rodent (Zirkle et al., Borchersen, Tracy) on a floor of the trap (Chyun discloses that capture unit 120, the portion where the pest is captured, may be provided with a food containing a drug that can attract or kill the pest; Zirkle et al. disclose bait 130 and bait platform 135 disposed on the floor of the housing 105), but do not disclose the microprocessor configured to determine the presence of stationary debris on a floor of the trap. Argue et al. disclose a process for identifying floor (flooring space may be segmented into multiple floor space areas 510,520,530,540) messes such as dry product spill (see paras. 0028, 0031) through computer-implemented image processing techniques using an image processing module (110) and camera (120).
Walz discloses a floor cleaning device comprising a control unit (24) and optical recording units (26,28) comprising cameras (54,56) which create images of the floor surface portion (60), wherein the control unit (24) can compare the two images from the cameras (54,56) in order to check the cleaning result for the floor surface portion (60), and wherein, if the control unit (24) determines that the floor surface portion (60) has not been sufficiently well cleaned, the floor cleaning device (10) stops, reverses opposite the cleaning direction (22), and cleans the floor surface portion (60) again until it is determined that the floor surface portion (60) has been well cleaned. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor of Chyun, Ronnau, Zirkle et al., Borchersen, and Tracy such that it is configured to determine the presence of stationary debris on a floor of the trap in view of Argue et al. or Walz in order to provide a mechanism for detecting the presence of non-moving/stationary debris on a floor of the trap so as to monitor the bottom surface of the trap upon which the rodent will move and the location at which the bait is located, so that it may be determined whether there is debris present on the floor which may interfere with/obscure the visual detection of the rodents and the quantity of bait present. Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, and Argue et al. or Walz do not disclose wherein the light-based sensor is a photo sensor including an LED array and a photodiode array arranged on opposing sides of the trap body.
Weber-Grabau discloses wherein the light-based sensor is a photo sensor including an LED array (light source 700,705,710 includes linear arrays of LEDs) and a photodiode array (light sensors 800,805,810 comprise linear array of photodiodes 815) arranged on opposing sides of the trap body (interaction volume 102; light source 108 & light sensor 310 on opposing sides in Fig. 3; see para. 0067), light emitted by the LED array stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by insect presence in the trap body (see para. 0079), said microprocessor (104; see para. 0082) receiving an output from the photodiode array and being configured to use pattern recognition (see paras. 0100, 0177) to evaluate light blockage patterns in said output for correspondence with insect presence. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the light-based sensor which is a photo sensor of Weber-Grabau including an LED array and a photodiode array arranged on opposing sides of the trap body, light emitted by the LED array stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by rodent presence in the trap body, said microprocessor receiving an output from the photodiode array and being configured to use pattern recognition to evaluate light blockage patterns in said output for correspondence with rodent presence for the light-based sensor of Chyun and Ronnau so as to provide an alternatively configured light-based sensor that is equally adept at identifying specific rodents as they enter the trap.
In regard to claim 2, Chyun, Ronnau, Zirkle et al., Borchersen, and Tracy disclose wherein the microprocessor (130 of Chyun; microprocessor of Ronnau) is arranged within the trap (130 is within 113 of 110 of Chyun; see para. 0040 of Ronnau) and is configured to determine the trap status information, the trap status information including an indication of the presence of rodents (detection unit 150 detects rodent presence and triggers camera module 133 to collect image information in Chyun; detection sensors detect rodents and cameras record images of the rodents, wherein the images are analyzed by microprocessor in Ronnau) and the cleanliness of the trap interior (as taught by Zirkle et al. and Borchersen).
In regard to claim 3, Chyun discloses an activity sensor (detection unit 150 may be a contact sensor 151 or indirect sensor 153) associated with the trap body (111 of 110 in Fig. 2), said light-based sensor capturing data in response to a trap activity event indicating rodent or insect presence as detected by the activity sensor (switch unit 135 is electrically connected to detection unit 150 and selectively activates camera module 133 according to whether the detection unit 150 detects the pest).
In regard to claim 4, Chyun discloses wherein the activity sensor (151 or 153) includes one or more sensors selected from the group consisting of a motion detector (indirect sensor 153 may be an optical sensor), an accelerometer, a pressure sensor (contact sensor 151) and a temperature sensor (indirect sensor 153 may be infrared sensor).
In regard to claim 7, Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, Argue et al. or Walz, and Weber-Grabau disclose wherein light emitted by the LED array stimulates the photodiode array when the trap body is empty, said light being at least partly blocked by insect presence in the trap body (see para. 0079 of Weber-Grabau), said microprocessor (104 of Weber-Grabau; see para. 0082 of Weber-Grabau) receiving an output from the photodiode array and being configured to use pattern recognition (see paras. 0100, 0177 of Weber-Grabau) to evaluate light blockage patterns in said output for correspondence with insect presence.
In regard to claim 8, Chyun, Ronnau, Tracy, Argue et al. or Walz, and Weber-Grabau disclose wherein the photo sensor further includes an amplifier (120-150 of Tracy; see paras. 0097, 0100, 0122, 0145, 0161, 0175, 0177, 0181 of Weber-Grabau) and a high pass filter (80-110 of Tracy; see paras. 0072, 0088, 0098, 0107 of Weber-Grabau) for eliminating ambient light from the output provided to the microprocessor (160 of Tracy; 104 of Weber-Grabau; see para. 0082).
In regard to claim 9, Chyun discloses wherein the trap body includes a glue board (120).
In regard to claim 16, Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, Argue et al. or Walz, and Weber-Grabau disclose wherein the glue board (120 of Chyun; 130,135 of Zirkle et al.; feed 2 of Borchersen) is arranged between the LED array (camera 4 of Borchersen disposed above the feed 2 in the feeding area; light source 700,705,710 includes linear arrays of LEDs of Weber-Grabau) and the photodiode array (camera 4 of Borchersen disposed above the feed 2 in the feeding area; light sensors 800,805,810 comprise linear array of photodiodes 815 of Weber-Grabau), the LED array and the photodiode array spaced vertically above a floor of the trap (120 of Chyun is placed on the floor of 110; feed 2 disposed on floor of feeding area bounded by feed fence 7 of Borchersen; upper extents of light sources and light sensors of Weber-Grabau disposed over a lower extent of the interaction volume 102 to be monitored).
In regard to claim 17, Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, Argue et al. or Walz, and Weber-Grabau disclose the analog signal processor (160 of Tracy; 114 of Weber-Grabau) that can comprise a high-pass filter or a band-pass filter to suppress a constant offset or specific frequencies, such as a 100 or 120 Hz frequency entering the interaction volume (102 of Weber-Grabau) from an artificial light source (see para. 0088 of Weber-Grabau), respectively, and wherein the high pass filter has a cut-off frequency of 23 Hz (see para. 0122 of Weber-Grabau), but do not disclose a cut-off frequency of approximately 400 hertz. It would have been an obvious matter of design choice before the effective filing date of the claimed invention to modify the high pass filter of Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, Argue et al. or Walz, and Weber-Grabau such that it has a cut-off frequency of approximately 400 hertz since applicant has not disclosed that doing so is critical to the design or produces any unexpected results, and it appears that the invention of Chyun, Ronnau, Zirkle et al., Borchersen, and Weber-Grabau would perform equally as well by doing so, and because a person of ordinary skill in the art would readily design the system to filter out the extraneous light pollution at the given frequencies so that the sensor is able to gather the desired light detection data.
In regard to claim 19, Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, Argue et al. or Walz, and Weber-Grabau disclose wherein the LED array is operated with a waveform that has a frequency component greater than 1 kilohertz (“…the modulation frequency is greater than 100 Hz and in another embodiment the modulation frequency is 3 kHz,…” as stated in para. 0084 of Weber-Grabau).
In regard to claim 20, Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, Argue et al. or Walz, and Weber-Grabau disclose wherein the light blockage patterns evaluated are indicative of light blocked from the photodiode array (see Fig. 1 of Tracy; as taught by Weber-Grabau) by the rodent (as taught by Chyun and Ronnau).
In regard to claim 22, Chyun, Ronnau, Zirkle et al., Borchersen, and Tracy disclose wherein the microprocessor utilizes pattern recognition to evaluate the received data for the presence of insects (see para. 0057 of Ronnau, which discloses that more complex tests, when species determination is concerned, require more nuanced situation images where modern digital camera technique is used combined with special image analysis and pattern recognition software, which is known from analyses of complicated biological subjects such as insects and plant seeds when identifying single individuals through biometric codes; Ronnau discloses that specific species may be determined) in the interior of the trap.
Claim(s) 21 is/are rejected under 35 U.S.C. 103 as being unpatentable over KR 10-2008-0109307 to Chyun in view of Ronnau 2006/0150470 and Zirkle et al. 10,791,728 and WO 2014/166498 to Borchersen and Tracy 2007/0075860 and Argue et al. 2014/0168427 or Walz et al. 2017/0007091 and Weber-Grabau 2016/0245916 as applied to claim 4 above, and further in view of Kramer et al. 2013/0342344.
In regard to claim 21, Chyun does not disclose wherein the activity sensor is an accelerometer. Kramer et al. disclose a wireless mousetrap and system wherein the activity sensor is an accelerometer (26). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the activity sensor which is an accelerometer of Kramer et al. for the activity sensor which is a motion detector (indirect sensor 153 may be an optical sensor of Chyun), a pressure sensor (contact sensor 151 of Chyun), or a temperature sensor (indirect sensor 153 may be an infrared sensor of Chyun) in order to provide an alternative yet equally effective means for detecting pest activity based on the detection of forces experienced by the trap caused by movement due to rodent activity within the trap.
Claim(s) 22 is/are rejected under 35 U.S.C. 103 as being unpatentable over KR 10-2008-0109307 to Chyun in view of Ronnau 2006/0150470 and Zirkle et al. 10,791,728 and WO 2014/166498 to Borchersen and Tracy 2007/0075860 and Argue et al. 2014/0168427 or Walz et al. 2017/0007091 and Weber-Grabau 2016/0245916 as applied to claim 1 above, and further in view of Kates 7,504,956.
Alternatively in regard to claim 22, Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, and Argue et al. or Walz do not disclose wherein the microprocessor utilizes pattern recognition to evaluate the received data for the presence of insects in the interior of the trap. Kates discloses a system and method for pest detection wherein the microprocessor (162) utilizes pattern recognition (sensor assembly 120 with transmitter 122 transmitting beam 126 to receiver 124; detection system 140 with receivers 160a-d to detect their respective uninterrupted beams 142, 146, 150, 154 and not detect their respective broken beams, wherein receivers 160a-d are linked to processor 162, which can determine what type of creature is causing the one or more beams to be broken; see col. 6, lines 24-48) to evaluate the received data for the presence of insects (bug 144), rodents (148), pets (152), and humans (156). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the trap of Chyun, Ronnau, Zirkle et al., Borchersen, Tracy, and Argue et al. or Walz such that the microprocessor utilizes pattern recognition to evaluate the received data for the presence of insects in the interior of the trap in view of Kates in order to further inform the user of the kinds of animals which are entering into the interior of the trap.
Claim(s) 18, 23, 26 is/are rejected under 35 U.S.C. 103 as being unpatentable over KR 10-2008-0109307 to Chyun in view of Ronnau 2006/0150470, Zirkle et al. 10,791,728, WO 2014/166498 to Borchersen, Weber-Grabau 2016/0245916, Tracy 2007/0075860, and Argue et al. 2014/0168427 or Walz et al. 2017/0007091.
In regard to claim 23, Chyun discloses a live catch trap for rodents (100) comprising: a trap body (110) including a glue board (capture unit 120 is a portion where the pest is substantially captured and that a trap may be used and that the adhesive surface is formed on the upper surface of sticky material, when the pest passes through the upper surface, the pest is attached to and captured by the adhesive surface); a light-based sensor (sensor/detection unit 150 may be selected based on operational requirements and may be an infrared sensor and wherein sensor 150 is electrically connected to switch unit 135 which selectively activates a camera module 133 according to whether the sensor 150 detects the pest, wherein the camera module 133 is for collecting the image information of the pest) associated with the trap body, the light-based sensor (camera module 133) configured to monitor activity in the trap including rodent presence (camera module 133 collects video information of the pests such as type, population, and state of pests introduced into housing 110 and whether the pests are captured by the capture unit 120 or whether the pests have passed through the capture unit and whether the pests flow in and out through doorway 112); a circuit module (130) comprising a circuit board (130a) and a switch unit (135), wherein the circuit module is configured to receive and transmit data from the light-based sensor to determine a type of trap activity (camera module 133 obtains information of pest in housing 110, and collected information is transmitted to an external management center 160 via communication module 137); and a communication transmitter (communication module 137) configured to send trap status information including the type of trap activity from the trap to a remote user; the glue board (120) is arranged in view of the light-based sensor (133) and the light-based sensor is spaced vertically above (133 is mounted inside cover case 113 which is above housing 110) a floor of the trap (bottom/floor of 110) at a non-zero height (camera module 133 is mounted on an interior side of cover case 113 at essentially the internal height of the trap body 110) sufficient to prevent the view of the light sensor from being interrupted by contamination arranged on the floor of the trap (the high position of the camera module 133 renders contamination on the floor of the trap a non-factor), but does not disclose a microprocessor configured to evaluate data received from the light-based sensor to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents, or a wireless communication transmitter. Ronnau discloses a microprocessor (see paras. 0040, 0149) configured to evaluate data received from the light-based sensor (see paras. 0041-42, 0057, 0059; paras. 0041 & 0057 indicate that the detection unit may include a camera cooperating with suitable analysis and recognition software/computer-based image analysis and pattern recognition and para. 0040 discloses that the detection units incorporate a microprocessor) to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents in the interior of the trap (see para.
0057 of Ronnau, which discloses that more complex tests, when species determination is concerned, require more nuanced situation images where modern digital camera technique is used combined with special image analysis and pattern recognition software, which is known from analyses of complicated biological subjects such as insects and plant seeds when identifying single individuals through biometric codes; Ronnau discloses that specific species may be determined) and a wireless communication transmitter (units 21,22 are connected with the global server through radio link 32, detection and capture unit 25 is connected 31 through an incorporated GSM module with the global system server 54 through GSM link station 36; see paras. 0035-36, 0046-49, 0053-55, and 0147). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the live catch trap for rodents of Chyun such that it comprises a microprocessor configured to evaluate data received from the light-based sensor to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents in an interior of the trap, and a wireless communication transmitter in view of Ronnau in order to provide a means for analyzing the data from the light-based sensor at the trap so that the data that is sent to the remote user is filtered and highly developed to provide real information with respect to the status of the live catch trap, which avoids false positives or irrelevant information that wastes the remote user’s time and resources, and to provide a means for communicating with the remote user without the need to extend wiring between the trap and the remote user, which could be cumbersome in the usage environment, and to allow the placement of the live catch trap at locations which are remote from the user without the limitations imposed by the necessary use of communication wiring. 
Chyun and Ronnau do not disclose the microprocessor utilizing pattern recognition to evaluate the received data for the presence of stationary debris not identified as a rodent in an interior of the trap. Zirkle et al. disclose the microprocessor (electronics compartment 120 including a computer processor of a wireless communications module 150) evaluating received data from a sensor (140 which may be a contact switch, digital weight scale, tilt meter, flex meter, magnet switch, infrared sensor or some other suitable sensor) for the presence of stationary debris/bait or matter intended to be consumed by a rodent (amount of bait 130 remaining) not identified as a rodent (the amount of bait 130 is identified) in an interior of a bait station (100). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor of Chyun and Ronnau such that the microprocessor evaluates the received data for the presence of stationary debris not identified as a rodent in an interior of the trap in view of Zirkle et al. in order to monitor other features of the interior of the trap such as the amount of debris/bait available so as to alert the user as to when to replenish the debris/bait so that the amount of debris/bait is maintained at acceptable levels for proper attraction of the rodents and provide additional trap status information that is valuable to the remote user such as bait status so that the remote user can identify the exact type of maintenance required on the trap or exact type of attention needed by the trap. Chyun, Ronnau, and Zirkle et al. disclose a microprocessor (Ronnau, Zirkle et al.) configured to evaluate the received data for the presence of stationary debris (bait 130 of Zirkle et al.) on the floor of the trap (120 of Chyun on the floor of 110; 130,135 of Zirkle et al. 
on floor of 105), but do not disclose wherein the microprocessor utilizes pattern recognition to evaluate the received data for the presence of stationary debris not identified as a rodent in an interior of the trap. Borchersen discloses a system for determining feed consumption of at least one animal wherein the microprocessor (processing means in the form of a computer 6) utilizes pattern recognition (the resulting images produced by camera 4 are evaluated according to pixels; the reduction in feed between subsequent images may be determined by identifying the feed in each image and calculating the difference in height of corresponding image areas, such as pixels, representing feed in subsequent images; the base level of the feeding area may be known, e.g. by having range images of the empty feeding area as a reference; as previously mentioned, it may be difficult to determine the exact amount of feed consumed by each animal) to evaluate the received data (data received from camera 4) for the presence of animals (identification of the animals may be part of the processing means, e.g. images showing the feeding area also show at least part of the feeding animals, and the animals can then be identified in the images by means of image processing) and the presence of debris/feed (the feeding area for each cow can typically be assessed with image processing, and if images are acquired continuously while the animals are eating, the feed consumption of each animal can still be assessed by determining the reduction in feed between subsequent images) not identified as an animal (identified animals). Borchersen discloses the microprocessor evaluating the received data for the presence of stationary debris (amount of feed 2) not identified as a rodent (distinguishes feed 2 from cows) on a floor of the area (floor of the feeding area bounded by feed fence 7, upon which the feed 2 rests). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor utilizing pattern recognition to evaluate the received data of Chyun, Ronnau, and Zirkle et al. such that it evaluates the received data for the presence of debris not identified as a rodent on a floor of the trap in view of Borchersen in order to provide an alternative means for ascertaining the amount of bait remaining via visual analysis techniques that are not prone to physical failure, unlike the switches, digital scales, and meters taught by Zirkle et al., which also may require calibration, etc., in order to be accurate. Chyun, Ronnau, Zirkle et al., and Borchersen disclose the light-based sensor (133 of Chyun) is spaced vertically above (133 is mounted inside cover case 113) a floor of the trap (floor/bottom of 110 of Chyun), but do not disclose the light-based sensor is a photo sensor including an LED array and a photodiode array arranged on opposing lateral sides of the trap body, light emitted by the LED array in a lateral direction stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by rodent presence in the trap body, said microprocessor receiving an output from the photodiode array and being configured to use pattern recognition to evaluate light blockage patterns in said output for correspondence with rodent presence, or that the glue board is arranged between the LED array and the photodiode array in the lateral direction. Weber-Grabau discloses wherein the light-based sensor is a photo sensor including an LED array (light source 700,705,710 includes linear arrays of LEDs) and a photodiode array (light sensors 800,805,810 comprise a linear array of photodiodes 815) arranged on opposing lateral sides of the trap body (interaction volume 102; light source 108 & light sensor 310 on opposing sides in Fig. 3; see para. 
0067), light emitted by the LED array in a lateral direction (see Fig. 3) stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by insect presence in the trap body (see para. 0079), said microprocessor (104; see para. 0082) receiving an output from the photodiode array and being configured to use pattern recognition (see paras. 0100, 0177) to evaluate light blockage patterns in said output for correspondence with insect presence, and wherein the trap area to be monitored (space within trap body 102) is arranged between the LED array and the photodiode array in the lateral direction (see Fig. 3). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the light-based sensor which is a photo sensor of Weber-Grabau including an LED array and a photodiode array arranged on opposing lateral sides of the trap body, light emitted by the LED array in a lateral direction stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by rodent presence in the trap body, the trap area to be monitored (referring to the glueboard of Chyun) is arranged between the LED array and the photodiode array in the lateral direction, said microprocessor receiving an output from the photodiode array and being configured to use pattern recognition to evaluate light blockage patterns in said output for correspondence with rodent presence for the light-based sensor of Chyun, Ronnau, Zirkle et al., and Borchersen so as to provide an alternatively configured light-based sensor that is equally adept at identifying specific rodents as they enter the trap. 
Chyun, Ronnau, Zirkle et al., Borchersen and Weber-Grabau do not disclose wherein the microprocessor utilizes pattern recognition to evaluate the received data for the presence of stationary debris not identified as rodent bait or matter intended to be consumed by a rodent on a floor of the trap. Tracy discloses an alarm for selectively detecting intrusions by persons wherein a light-based sensor (transmitter 30, sensors 40-70, filters 80-110, amplifiers 120-150; see Fig. 1) is associated with an area to be monitored (location or region 20), the light-based sensor configured to monitor activity in the area of interest including animal presence (dog 200, bird 220, human 230, larger animal such as a deer 240); a microprocessor (CPU 160) configured to evaluate data received from the light-based sensor to determine a type of activity (distinguish intrusions by persons from intrusions by others such as nonhuman animals and inanimate objects), the microprocessor utilizing pattern recognition (see para. 0026) to evaluate the received data for the presence of a human or nonhuman animal (200, 220, 230) and debris (inanimate objects such as falling leaves 210) in the area to be monitored; and a communication transmitter (alarm output 170) configured to send status information including the type of activity from the area to be monitored (20) to a remote user (the user is alerted by alarm output 170, which can be a speaker, visual output, or any other device for alerting others to the presence of an intruder). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the trap of Chyun, Ronnau, Zirkle et al., Borchersen and Weber-Grabau such that the microprocessor utilizes pattern recognition to evaluate the received data for the presence of debris not identified as rodent bait on a floor of the trap in view of Tracy in order to determine the presence of specific inanimate objects such as debris within the interior of the trap so as to fully inform the user of the status of the trap with respect to the presence of inanimate objects within the trap which may hinder the operation of the trap. Chyun, Ronnau, Zirkle et al., Borchersen, Weber-Grabau, and Tracy disclose the microprocessor (Ronnau, Zirkle et al., Borchersen, and Tracy) configured to evaluate received data for the presence of moving debris (Tracy) not identified as a rodent (Chyun, Ronnau, Borchersen, Tracy) or rodent bait or matter intended to be consumed by a rodent (Zirkle et al., Borchersen, Tracy) on a floor of the trap (Chyun discloses capture unit 120, the portion where the pest is captured, may be provided with a food containing a drug that can attract or kill the pest; Zirkle et al. disclose bait 130 and bait platform 135 disposed on the floor of the housing 105), but do not disclose the microprocessor configured to determine the presence of stationary debris on a floor of the trap. Argue et al. disclose a process for identifying floor (flooring space may be segmented into multiple floor space areas 510,520,530,540) messes such as a dry product spill (see paras. 0028, 0031) through computer-implemented image processing techniques using an image processing module (110) and camera (120). 
Walz discloses a floor cleaning device comprising a control unit (24) and optical recording units (26,28) comprising cameras (54,56) which create images of the floor surface portion (60), wherein the control unit (24) can compare the two images from the cameras (54,56) in order to check the cleaning result for the floor surface portion (60), and wherein, if the control unit (24) determines that the floor surface portion (60) has not been sufficiently well cleaned, the floor cleaning device (10) stops, reverses opposite the cleaning direction (22), and cleans the floor surface portion (60) again until it is determined that the floor surface portion (60) has been well cleaned. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor of Chyun, Ronnau, Zirkle et al., Borchersen, Weber-Grabau, and Tracy such that it is configured to determine the presence of stationary debris on a floor of the trap in view of Argue et al. or Walz in order to provide a mechanism for detecting the presence of non-moving/stationary debris on a floor of the trap so as to monitor the bottom surface of the trap upon which the rodent will move and the location at which the bait is located so that it may be determined whether there is debris present on the floor which may interfere with/obscure the visual detection of the rodents and the quantity of bait present.
In regard to claim 18, Chyun, Ronnau, Zirkle et al., Borchersen, Weber-Grabau, Tracy, and Argue et al. or Walz disclose wherein the LED array and the photodiode array (as taught by Weber-Grabau) are operated on a predetermined periodic basis (Chyun discloses that camera module 133 can operate in a sleep mode which refers to a state in which minimum power is applied and is maintained in a state similar to that of being inactive without performing normal operation and consuming minimal energy and when the pest is detected by detection unit 150 then switch unit 135 activates camera module 133, wherein the predetermined periodic basis is the occasions where the detection unit 150 detects a pest to activate the camera module 133) independent of a periodic waveform operating the LED array (Chyun discloses switch unit 135 for activating camera module 133 is electrically connected to the detection unit 150 and selectively activates the camera module 133 when detection unit 150 detects the pest and is not dependent upon any operating parameter of the camera module 133) for monitoring live rodent activity in the trap (“…the intensity of light source 108 is modulated by the controller 104 with a periodic waveform…” as stated in para. 0083; see paras. 0083, 0084, 0158 of Weber-Grabau).
In regard to claim 26, Chyun, Ronnau, Zirkle et al., Borchersen, Weber-Grabau, Tracy, and Argue et al. or Walz disclose wherein the entire LED array (light source 700,705,710 includes linear arrays of LEDs as taught by Weber-Grabau) and the entire photodiode array (light sensors 800,805,810 comprise a linear array of photodiodes 815 as taught by Weber-Grabau) are spaced vertically above (the light-based sensor comprising camera module 133 of Chyun is mounted on the inside of the cover 113 as shown in Fig. 5, and this positions the camera module 133 above the interior of the trap of Chyun) the floor (floor of 111 of Chyun) of the trap.
Claim(s) 24-25 is/are rejected under 35 U.S.C. 103 as being unpatentable over KR 10-2008-0109307 to Chyun in view of Ronnau 2006/0150470 and Zirkle et al. 10,791,728, WO 2014/166498 to Borchersen, Kates 7,504,956, Tracy 2007/0075860, Argue et al. 2014/0168427 or Walz et al. 2017/0007091, and Weber-Grabau 2016/0245916.
In regard to claim 24, Chyun discloses a live catch trap for rodents (100) comprising: a trap body (110) including a glue board (capture unit 120 is a portion where the pest is substantially captured; a trap may be used, and the adhesive surface is formed on the upper surface of the sticky material such that, when the pest passes over the upper surface, the pest is attached to and captured by the adhesive surface); a light-based sensor (camera module 133, or sensor 150, which may be selected based on operational requirements and may be an infrared sensor) associated with the trap body, the light-based sensor (camera module 133) configured to monitor activity in the trap including rodent presence; a circuit module (130) comprising a circuit board (130a) and a switch unit (135), wherein the circuit module is configured to receive and transmit data from the light-based sensor to determine a type of trap activity; and a communication transmitter (communication module 137) configured to send trap status information including the type of trap activity from the trap to a remote user, the light-based sensor is a photo sensor (camera module 133 contains a photo sensor, which is the specific component of the camera that converts light into electrical signals to create an image) arranged on the inner side of the cover/lid (cover case 113; see Fig. 
5) of the trap body (110), the light sensor (camera module 133) is spaced vertically above the floor of the trap (floor of 111 of trap body 110) at a non-zero height (camera module 133 is mounted on an interior side of cover case 113, which is essentially the internal height of the trap body 110) sufficient to prevent the view of the light sensor from being interrupted by contamination arranged on the floor of the trap (the high position of the camera module 133 renders contamination on the floor of the trap a non-factor), but does not disclose a microprocessor configured to evaluate data received from the light-based sensor to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents, or a wireless communication transmitter. Ronnau discloses a microprocessor (see paras. 0040, 0149) configured to evaluate data received from the light-based sensor (see paras. 0041-42, 0057, 0059; paras. 0041 & 0057 indicate that the detection unit may include a camera cooperating with suitable analysis and recognition software/computer-based image analysis and pattern recognition, and para. 0040 discloses that the detection units incorporate a microprocessor) to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents in the interior of the trap (see para. 
0057 of Ronnau, which discloses that more complex tests, when species determination is concerned, require more nuanced situation images where modern digital camera technique is used combined with special image analysis and pattern recognition software, which is known from analyses of complicated biological subjects such as insects and plant seeds when identifying single individuals through biometric codes; Ronnau discloses that specific species may be determined) and a wireless communication transmitter (units 21,22 are connected with the global server through radio link 32, detection and capture unit 25 is connected 31 through an incorporated GSM module with the global system server 54 through GSM link station 36; see paras. 0035-36, 0046-49, 0053-55, and 0147). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the live catch trap for rodents of Chyun such that it comprises a microprocessor configured to evaluate data received from the light-based sensor to determine a type of trap activity, the microprocessor utilizing pattern recognition to evaluate the received data for the presence of rodents and insects in an interior of the trap, and a wireless communication transmitter in view of Ronnau in order to provide a means for analyzing the data from the light-based sensor at the trap so that the data that is sent to the remote user is filtered and highly developed to provide real information with respect to the status of the live catch trap, such as whether the pest type is an insect or a rodent, which avoids false positives or irrelevant information that wastes the remote user’s time and resources, and to provide a means for communicating with the remote user without the need to extend wiring between the trap and the remote user, which could be cumbersome in the usage environment, and to allow the placement of the live catch trap at locations which are remote from the user without the limitations imposed 
by the necessary use of communication wiring. Chyun and Ronnau do not disclose the microprocessor utilizing pattern recognition to evaluate the received data for the presence of debris not identified as a rodent in an interior of the trap. Zirkle et al. disclose the microprocessor (electronics compartment 120 including a computer processor of a wireless communications module 150) evaluating received data from a sensor (140, which may be a contact switch, digital weight scale, tilt meter, flex meter, magnet switch, infrared sensor, or some other suitable sensor) for the presence of debris/bait (amount of bait 130 remaining) not identified as a rodent (the amount of bait 130 is identified) in an interior of a bait station (100). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor of Chyun and Ronnau such that the microprocessor evaluates the received data for the presence of debris not identified as a rodent in an interior of the trap in view of Zirkle et al. in order to monitor other features of the interior of the trap such as the amount of debris/bait available so as to alert the user as to when to replenish the debris/bait so that the amount of debris/bait is maintained at acceptable levels for proper attraction of the rodents, and to provide additional trap status information that is valuable to the remote user, such as bait status, so that the remote user can identify the exact type of maintenance required on the trap or exact type of attention needed by the trap. Chyun and Ronnau do not disclose the microprocessor using pattern recognition to evaluate the received data for the presence of insects. 
Kates discloses a system and method for pest detection wherein the microprocessor (162) utilizes pattern recognition (sensor assembly 120 with transmitter 122 transmitting beam 126 to receiver 124; detection system 140 with receivers 160a-d to detect their respective uninterrupted beams 142, 146, 150, 154 and not detect their respective broken beams, wherein receivers 160a-d are linked to processor 162, which can determine what type of creature is causing the one or more beams to be broken; see col. 6, lines 24-48) to evaluate the received data for the presence of insects (bug 144), rodents (148), pets (152), and humans (156). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the trap of Chyun and Ronnau such that the microprocessor utilizes pattern recognition to evaluate the received data for the presence of insects in the interior of the trap in view of Kates in order to further inform the user of the kinds of animals which are entering into the interior of the trap. Chyun, Ronnau, Zirkle et al., and Kates disclose the microprocessor (Chyun, Ronnau, Zirkle et al., Kates) configured to determine the presence of bait/debris (Chyun discloses capture unit 120, the portion where the pest is captured, may be provided with a food containing a drug that can attract or kill the pest; Zirkle et al. disclose bait 130 and bait platform 135) on a floor of the trap (Chyun discloses capture unit 120 disposed on the floor of 110; Zirkle et al. disclose bait 130 and bait platform 135 disposed on the floor of the housing 105), but do not disclose wherein the microprocessor utilizes pattern recognition to evaluate the received data for the presence of debris not identified as a rodent in an interior of the trap. 
Borchersen discloses a system for determining feed consumption of at least one animal wherein the microprocessor (processing means in the form of a computer 6) utilizes pattern recognition (the resulting images produced by camera 4 are evaluated according to pixels; the reduction in feed between subsequent images may be determined by identifying the feed in each image and calculating the difference in height of corresponding image areas, such as pixels, representing feed in subsequent images; the base level of the feeding area may be known, e.g. by having range images of the empty feeding area as a reference; as previously mentioned, it may be difficult to determine the exact amount of feed consumed by each animal) to evaluate the received data (data received from camera 4) for the presence of animals (identification of the animals may be part of the processing means, e.g. images showing the feeding area also show at least part of the feeding animals, and the animals can then be identified in the images by means of image processing) and the presence of debris/feed (the feeding area for each cow can typically be assessed with image processing, and if images are acquired continuously while the animals are eating, the feed consumption of each animal can still be assessed by determining the reduction in feed between subsequent images) not identified as an animal (identified animals). 
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor utilizing pattern recognition to evaluate the received data of Chyun, Ronnau, Zirkle et al., and Kates such that it evaluates the received data for the presence of debris/bait not identified as a rodent in an interior of the trap in view of Borchersen in order to provide an alternative means for ascertaining the amount of bait remaining via visual analysis techniques that are not prone to physical failure, unlike the switches, digital scales, and meters taught by Zirkle et al., which also may require calibration, etc., in order to be accurate. Chyun, Ronnau, Zirkle et al., Kates, and Borchersen do not disclose wherein the microprocessor utilizes pattern recognition to evaluate the received data for the presence of stationary debris not identified as rodent bait or matter intended to be consumed by a rodent on a floor of the trap. Tracy discloses an alarm for selectively detecting intrusions by persons wherein a light-based sensor (transmitter 30, sensors 40-70, filters 80-110, amplifiers 120-150; see Fig. 1) is associated with an area to be monitored (location or region 20), the light-based sensor configured to monitor activity in the area of interest including animal presence (dog 200, bird 220, human 230, larger animal such as a deer 240); a microprocessor (CPU 160) configured to evaluate data received from the light-based sensor to determine a type of activity (distinguish intrusions by persons from intrusions by others such as nonhuman animals and inanimate objects), the microprocessor utilizing pattern recognition (see para. 
0026) to evaluate the received data for the presence of a human or nonhuman animal (200, 220, 230) and debris (inanimate objects such as falling leaves 210) in the area to be monitored; and a communication transmitter (alarm output 170) configured to send status information including the type of activity from the area to be monitored (20) to a remote user (the user is alerted by alarm output 170, which can be a speaker, visual output, or any other device for alerting others to the presence of an intruder). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the trap of Chyun, Ronnau, Zirkle et al., Kates, and Borchersen such that the microprocessor utilizes pattern recognition to evaluate the received data for the presence of debris not identified as rodent bait on a floor of the trap in view of Tracy in order to determine the presence of specific inanimate objects such as debris within the interior of the trap so as to fully inform the user of the status of the trap with respect to the presence of inanimate objects within the trap which may hinder the operation of the trap. Chyun, Ronnau, Zirkle et al., Kates, Borchersen, and Tracy disclose the microprocessor (Ronnau, Zirkle et al., Borchersen, and Tracy) configured to evaluate received data for the presence of moving debris (Tracy) not identified as a rodent (Chyun, Ronnau, Borchersen, Tracy) or rodent bait or matter intended to be consumed by a rodent (Zirkle et al., Borchersen, Tracy) on a floor of the trap (Chyun discloses capture unit 120, the portion where the pest is captured, may be provided with a food containing a drug that can attract or kill the pest; Zirkle et al. disclose bait 130 and bait platform 135 disposed on the floor of the housing 105), but do not disclose the microprocessor configured to determine the presence of stationary debris on a floor of the trap. Argue et al. 
disclose a process for identifying floor (flooring space may be segmented into multiple floor space areas 510,520,530,540) messes such as a dry product spill (see paras. 0028, 0031) through computer-implemented image processing techniques using an image processing module (110) and camera (120). Walz discloses a floor cleaning device comprising a control unit (24) and optical recording units (26,28) comprising cameras (54,56) which create images of the floor surface portion (60), wherein the control unit (24) can compare the two images from the cameras (54,56) in order to check the cleaning result for the floor surface portion (60), and wherein, if the control unit (24) determines that the floor surface portion (60) has not been sufficiently well cleaned, the floor cleaning device (10) stops, reverses opposite the cleaning direction (22), and cleans the floor surface portion (60) again until it is determined that the floor surface portion (60) has been well cleaned. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the microprocessor of Chyun, Ronnau, Zirkle et al., Kates, Borchersen, and Tracy such that it is configured to determine the presence of stationary debris on a floor of the trap in view of Argue et al. or Walz in order to provide a mechanism for detecting the presence of non-moving/stationary debris on a floor of the trap so as to monitor the bottom surface of the trap upon which the rodent will move and the location at which the bait is located so that it may be determined whether there is debris present on the floor which may interfere with/obscure the visual detection of the rodents and the quantity of bait present.
Also in regard to claim 24, Chyun, Ronnau, Zirkle et al., Borchersen, Kates, Tracy, and Argue et al. or Walz disclose the light-based sensor is a photo sensor (camera module 133 of Chyun contains a photo sensor, which is the specific component of the camera that converts light into electrical signals to create an image) arranged on the inner side of the cover/lid (cover case 113; see Fig. 5 of Chyun) of the trap body (110 of Chyun) for monitoring live rodent activity in the trap, and the microprocessor (microprocessor as taught by Ronnau) receiving an output (images from camera module 133 of Chyun) from the light-based sensor (camera module 133 of Chyun) and being configured to use pattern recognition (as taught by Ronnau) to evaluate patterns in the output for correspondence with rodent presence (as taught by Ronnau), but do not disclose the photo sensor including a laterally-facing LED array and a photodiode array arranged on opposing lateral sides of the trap body, light emitted by the LED array in a lateral direction stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by rodent presence in the trap body, said microprocessor receiving an output from the photodiode array and being configured to use pattern recognition to evaluate light blockage patterns in said output for correspondence with rodent presence. Weber-Grabau discloses wherein the light-based sensor is a photo sensor including an LED array (light source 700,705,710 includes linear arrays of LEDs) and a photodiode array (light sensors 800,805,810 comprise a linear array of photodiodes 815) arranged on opposing sides of the trap body (interaction volume 102; light source 108 & light sensor 310 on opposing sides in Fig. 3; see para. 0067), light emitted by the LED array stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by insect presence in the trap body (see para. 0079), said microprocessor (104; see para. 
0082) receiving an output from the photodiode array and being configured to use pattern recognition (see paras. 0100, 0177) to evaluate light blockage patterns in said output for correspondence with insect presence. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to substitute the light-based sensor of Weber-Grabau, which is a photo sensor including an LED array and a photodiode array arranged on opposing sides of the trap body, light emitted by the LED array stimulating the photodiode array when the trap body is empty, said light being at least partly blocked by rodent presence in the trap body, said microprocessor receiving an output from the photodiode array and being configured to use pattern recognition to evaluate light blockage patterns in said output for correspondence with rodent presence, for the light-based sensor of Chyun, Ronnau, Zirkle et al., Borchersen, Kates, Tracy, Argue et al. or Walz so as to provide an alternatively configured light-based sensor that is equally adept at identifying specific rodents as they enter the trap. Thus, Chyun, Ronnau, Zirkle et al., Borchersen, Kates, Tracy, Argue et al. or Walz, and Weber-Grabau disclose wherein the entire LED array (light source 700,705,710 includes linear arrays of LEDs as taught by Weber-Grabau) and the entire photodiode array (light sensors 800,805,810 comprise linear array of photodiodes 815 as taught by Weber-Grabau) are spaced vertically above the floor (floor of 111 of Chyun) of the trap (the light-based sensor comprising camera module 133 of Chyun is mounted on the inside of the cover 113 as shown in Fig. 5, which positions the camera module 133 above the interior of the trap of Chyun).
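For illustration only, the light-blockage evaluation attributed to Weber-Grabau above (an LED array stimulating an opposing photodiode array, with blockage patterns evaluated for correspondence with animal presence) can be sketched as follows. The array size, blocking threshold, and contiguous-run heuristic are hypothetical assumptions for the sketch, not taken from any cited reference:

```python
# Minimal sketch of light-blockage pattern evaluation across a photodiode
# array. BLOCKED_THRESHOLD and MIN_BLOCKED_RUN are hypothetical values,
# not drawn from the record.

BLOCKED_THRESHOLD = 0.3  # normalized intensity below this counts as blocked
MIN_BLOCKED_RUN = 3      # contiguous blocked diodes suggesting presence

def blocked_pattern(readings):
    """Map normalized photodiode readings (1.0 = fully lit) to a blockage mask."""
    return [r < BLOCKED_THRESHOLD for r in readings]

def indicates_presence(readings):
    """True if a contiguous run of blocked diodes is long enough to
    correspond with an animal interrupting the lateral light path."""
    run = best = 0
    for blocked in blocked_pattern(readings):
        run = run + 1 if blocked else 0
        best = max(best, run)
    return best >= MIN_BLOCKED_RUN

empty_trap = [1.0] * 8  # all diodes stimulated by the LED array
occupied = [1.0, 0.9, 0.1, 0.05, 0.1, 0.2, 0.95, 1.0]  # light partly blocked

print(indicates_presence(empty_trap))  # False
print(indicates_presence(occupied))    # True
```

The contiguity check stands in for the claimed pattern recognition: an empty trap yields no blocked run, while an occupant produces a run of adjacent blocked diodes.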
In regard to claim 25, Chyun, Ronnau, Zirkle et al., Borchersen, Kates, Tracy, Argue et al. or Walz, and Weber-Grabau disclose wherein the LED array and the photodiode array (light source 700,705,710 & light sensors 800,805,810,815 as taught by Weber-Grabau) are operated on a predetermined periodic basis (Chyun discloses that camera module 133 can operate in a sleep mode, i.e., a state in which minimum power is applied and the module is maintained in an inactive-like state, consuming minimal energy without performing normal operation; when a pest is detected by detection unit 150, switch unit 135 activates camera module 133, wherein the predetermined periodic basis is the occasions on which the detection unit 150 detects a pest and thereby activates the camera module 133) independent of a periodic waveform operating the LED array (Chyun discloses that switch unit 135, which activates camera module 133, is electrically connected to the detection unit 150 and selectively activates the camera module 133 when detection unit 150 detects the pest, and is not dependent upon any operating parameter of the camera module 133) for monitoring live rodent activity in the trap (“…the intensity of light source 108 is modulated by the controller 104 with a periodic waveform…” as stated in para. 0083; see paras. 0083, 0084, 0158 of Weber-Grabau).
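As a purely illustrative aside, the modulated operation quoted from para. 0083 of Weber-Grabau above can be sketched in a short simulation. The square-wave drive, the specific frequencies, and the lock-in style demodulation are hypothetical assumptions used only to show how a periodic LED drive waveform can be distinguished from ambient light; none of these details are asserted to appear in the cited references:

```python
# Sketch of an LED array driven with a periodic waveform, independent of the
# event-driven sensor wake-up discussed above. Frequencies and the square-wave
# choice are illustrative assumptions.

import math

MOD_FREQ_HZ = 1000.0      # hypothetical modulation frequency
SAMPLE_RATE_HZ = 16000.0  # hypothetical ADC sample rate

def led_drive(t):
    """Periodic (square) waveform driving the LED array: 1.0 = on, 0.0 = off."""
    return 1.0 if math.sin(2 * math.pi * MOD_FREQ_HZ * t) >= 0 else 0.0

def demodulate(samples, drive):
    """Lock-in style demodulation: correlate photodiode samples with the
    drive waveform so constant ambient light largely cancels out."""
    ref = [2 * d - 1 for d in drive]  # map 0/1 drive to -1/+1 reference
    return sum(s * r for s, r in zip(samples, ref)) / len(samples)

ts = [i / SAMPLE_RATE_HZ for i in range(160)]  # ten modulation periods
drive = [led_drive(t) for t in ts]
ambient = 0.5                                  # constant ambient offset
samples = [0.8 * d + ambient for d in drive]   # unblocked path tracks the drive

# The modulated signal correlates with the drive far more strongly than
# constant ambient light alone does.
print(demodulate(samples, drive) > demodulate([ambient] * len(drive), drive))  # True
```

The point of the sketch is only that the LED waveform is a free-running periodic signal, consistent with its independence from the event-triggered sensor activation discussed for Chyun.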
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DARREN W ARK whose telephone number is (571)272-6885. The examiner can normally be reached M-F 8:30-5.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kimberly Berona, can be reached at (571) 272-6909. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DARREN W ARK/Primary Examiner, Art Unit 3647
DWA