DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claims 1-19, 21-25, 28-31, 33-49 and 50 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument. Furthermore, with respect to applicant’s argument regarding the claim rejections under 35 U.S.C. § 112, the argument is not persuasive: generalizing the process of producing a 360-degree image, without providing specific support in the specification, cannot be excused as well-known art, since a 360-degree camera and a non-360-degree camera are designed to produce different outputs.
Claim Rejections - 35 USC § 112
Claims 1 and 21 are rejected under 35 U.S.C. 112(a) or 35 U.S.C. 112 (pre-AIA), first paragraph, as failing to comply with the enablement requirement. Claims 1, 20-21 and 25 contain subject matter which was not described in the specification in such a way as to enable one skilled in the art to which it pertains, or with which it is most nearly connected, to make and/or use the invention. The claimed “360-degree camera” does not appear to have sufficient support within the specification to enable one of ordinary skill in the art to make and/or use the invention.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-19, 21-25, 28-31, 33-49 and 50 are rejected under 35 U.S.C. 103 as being unpatentable over Harmen et al. (US 2022/002805) in view of Fransson et al. (US 2018/0180841).
Regarding claim 1, Harmen teaches:
1. A system for automatically obtaining 360° manhole video data, said system comprising: a housing; a 360° video camera in said housing;
[0054] FIG. 1B illustrates a top plan view of the example bottom unit 102. In this view, an example arrangement of four cameras 104, 104b, 104c, and 104d, which may be configured with wide angle optics such that they each have at least a partially overlapping view of adjacent camera(s). This provides 360 degree viewing coverage, e.g., as the bottom unit 102 is lowered down into a manhole, and facilitates use of stereo imaging techniques for depth imaging.
Harmen, 0054, emphasis added.
one or more lights disposed in said housing;
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0055, emphasis added.
a digital processor disposed in said housing;
[0069] It should be noted that various functions described herein may be implemented using processor executable instructions stored on a non-transitory storage medium or device. A non-transitory storage device may be, for example, an electronic, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a non-transitory storage medium include the following: a portable computer diskette, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), a solid-state drive, or any suitable combination of the foregoing. In the context of this document “non-transitory” media includes all media except non-statutory signal media.
Harmen, 0069, emphasis added.
a distance finder disposed in said housing;
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0055, emphasis added.
and digital storage disposed in said housing for storing digital data from said 360° video camera;
[0054] FIG. 1B illustrates a top plan view of the example bottom unit 102. In this view, an example arrangement of four cameras 104, 104b, 104c, and 104d, which may be configured with wide angle optics such that they each have at least a partially overlapping view of adjacent camera(s). This provides 360 degree viewing coverage, e.g., as the bottom unit 102 is lowered down into a manhole, and facilitates use of stereo imaging techniques for depth imaging.
Harmen, 0054, 0064, emphasis added.
wherein said digital processor is configured to automatically obtain manhole video data by automatically commencing acquisition of video data once said housing is maintained at the same vertical position from the bottom of the manhole, as determined by said distance finder, for a predetermined time;
[0056] The laser range finder 108 allows the bottom unit 102 to automate control of its height (alone or in combination with communication with another unit or system, such as top unit 101). This permits for easy operation of the unit 102 to control its decent at a given rate, stop the unit 102 at a programmed height or distance from the bottom, and ensure that the unit 102 traverses down into the manhole and back up again in a controlled fashion, e.g., to a predetermined height or at a predetermined rate.
[0079] Turning to the view offered by FIG. 4C, an example of the underlying points or vertices 401a of the model 401 is illustrated. Here, the model provides points in space, or vertices, that present the physical location of the infrastructure asset, e.g., obtained from stereo image data computations made available via software such as that described herein. By way of specific example, an HD video obtained from two or more of cameras 104-104d may be used to obtain metadata comprising 1,346,973 vertices and 2,682,393 faces for the manhole under inspection. It is noted that this is a non-limiting example of a dense point cloud provided by an embodiment. Each of the vertices represents a point in the collective stereo imagery, e.g., an overlapping point in frames from adjacent cameras as described herein. As such, the vertices comprise virtualized spatial information that may be related to one another via faces, as illustrated in FIG. 4D.
Harmen, 0056, 0064, 0079, emphasis added.
once said digital processor commences acquiring video data, prompting a user to begin lowering said housing;
[0056] The laser range finder 108 allows the bottom unit 102 to automate control of its height (alone or in combination with communication with another unit or system, such as top unit 101). This permits for easy operation of the unit 102 to control its decent at a given rate, stop the unit 102 at a programmed height or distance from the bottom, and ensure that the unit 102 traverses down into the manhole and back up again in a controlled fashion, e.g., to a predetermined height or at a predetermined rate.
Harmen, 0054-0056, emphasis added.
automatically stopping acquisition of said video data once said housing is a predetermined distance from said bottom of the manhole;
[0056] The laser range finder 108 allows the bottom unit 102 to automate control of its height (alone or in combination with communication with another unit or system, such as top unit 101). This permits for easy operation of the unit 102 to control its decent at a given rate, stop the unit 102 at a programmed height or distance from the bottom, and ensure that the unit 102 traverses down into the manhole and back up again in a controlled fashion, e.g., to a predetermined height or at a predetermined rate.
Harmen, 0054-0056, emphasis added.
and prompting said user to withdraw the housing from said manhole.
[0056] The laser range finder 108 allows the bottom unit 102 to automate control of its height (alone or in combination with communication with another unit or system, such as top unit 101). This permits for easy operation of the unit 102 to control its decent at a given rate, stop the unit 102 at a programmed height or distance from the bottom, and ensure that the unit 102 traverses down into the manhole and back up again in a controlled fashion, e.g., to a predetermined height or at a predetermined rate.
Harmen, 0054-0056, emphasis added.
However, Harmen fails to explicitly teach, but Fransson teaches: for a predetermined time;
[0029] wherein the focusing unit is arranged to [0030] receive input on whether the camera is operated in the day mode or in the night mode, and, [0031] select a focusing day mode if the camera if operated in the day mode, and a focusing night mode if the camera is operated in the night mode, wherein [0032] in the focusing day mode, the focusing unit is arranged to control the IR laser range meter to continuously measure a reference distance, and [0033] in the focusing night mode, the focusing unit is arranged to control the IR laser range meter to only measure the reference distance in response to a focus trigger signal being activated, and during a predetermined time period,
[0054] The focus trigger signal may be activated based on a movement of the camera being stopped, usually a movement of the camera's field of view in at least one of a pan or tilt direction. A common situation is that the pan-tilt motor has moved the camera and then stopped, so that the field of view is covering a new part of the scene. The focus setting from before the movement is in this situation often no longer correct, i.e. the camera is delivering unsharp images. In this situation it is important to quickly regain focus, and therefore a focus trigger signal may be activated when it is detected that movement of the pan-tilt motor has stopped. The focus trigger signal may therefore be based on input indicating that a movement, typically a pan-tilt movement, of the camera has stopped. The IR laser range meter then quickly, during the predetermined time period, measures a reference distance, and this distance is used to set the focus distance. An additional passive, contrast-based, focusing may then take place.
Fransson, 0029, 0054, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the teaching of Fransson with the system of Harmen in order to capture images based on a predetermined time; as Fransson explains, it is valuable to be able to quickly focus the camera so that sharp and useful images can be provided depicting any objects of interest in a scene (Fransson, 0002).
Furthermore, Harmen and Fransson fail to explicitly teach whether the camera is capable of automatically starting and stopping recording. Official Notice is taken that both the concept and the advantage of a camera automatically starting and stopping recording are well known and expected in the art. Thus, it would have been obvious to one skilled in the art, at the time of the applicant’s invention, to utilize said feature within the system taught by Harmen and Fransson, because such incorporation would minimize the process of pressing a record button or a stop-record button.
Note: The motivation that was applied to claim 1 above applies equally as well to claims 2-24 as presented below.
Regarding claim 2, Harmen and Fransson teach:
2. The system of claim 1, furthermore, Fransson teaches: wherein said predetermined time is about 3-10 seconds.
Fransson, 0029, 0054.
Regarding claim 3, Harmen and Fransson teach:
3. The system of claim 1, furthermore, Harmen teaches: wherein commencing acquisition of said video data comprises said digital processor causing said at least one light to turn on.
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0056, emphasis added.
Regarding claim 4, Harmen and Fransson teach:
4. The system of claim 1, furthermore, Harmen teaches: wherein said digital processor is configured to evaluate distance data received from said distance finder to determine the rate of descent of said module in said manhole, and, if the rate of descent is too fast, said digital processor is configured to cause a message to be transmitted to a user to slow the rate of descent.
[0056] The laser range finder 108 allows the bottom unit 102 to automate control of its height (alone or in combination with communication with another unit or system, such as top unit 101). This permits for easy operation of the unit 102 to control its decent at a given rate, stop the unit 102 at a programmed height or distance from the bottom, and ensure that the unit 102 traverses down into the manhole and back up again in a controlled fashion, e.g., to a predetermined height or at a predetermined rate.
Harmen, 0054-0056, emphasis added.
Regarding claim 5, Harmen and Fransson teach:
5. The system of claim 1, furthermore, Harmen teaches: wherein said predetermined distance is 0.2 m-0.5 m.
Harmen, 0048, 0055-0056.
Regarding claim 6, Harmen and Fransson teach:
6. The system of claim 1, furthermore, Harmen teaches: wherein said digital processor is configured to turn off said lights when the acquisition of video data is stopped.
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0056, emphasis added.
Regarding claim 7, Harmen and Fransson teach:
7. The system of claim 1, furthermore, Harmen teaches: further comprising: wherein said digital processor is configured to automatically wirelessly transmit said video data once the acquisition of video data has stopped.
[0046] In the example system 100 of FIG. 1, the tripod-based system 100 is illustrated in which a tripod 103 or another stand supports an upper unit 101 and a lower unit 102. The lower unit 102 is attached by a cable such that it may be lowered from the upper unit 101, e.g., down into a manhole or like infrastructure asset. The upper unit 101 and lower unit 102 may communicate via a suitable mechanism, e.g., wireless communication may be conducted between upper unit 101 and lower unit 102 to communicate control data as well as a sensor data such as image data (which may take the form of video data, HD video data, 4K video data, 8K video data, such as obtained by one or more 13 megapixel cameras, as further described herein) and laser data obtained during an inspection. Alternatively or in addition, the upper unit 101 and lower unit 102 may communicate over a suitable wire, such as USB, HDMI, a combination thereof, or any suitable wireline communication.
Harmen, 0046, 0055-0056, emphasis added.
Regarding claim 8, Harmen and Fransson teach:
8. The system of claim 1, furthermore, Harmen teaches: wherein multiple lights are disposed along said at least one edge.
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0056, emphasis added.
Regarding claim 9, Harmen and Fransson teach:
9. The system of claim 1, furthermore, Harmen teaches: wherein said distance finder is a laser distance finder.
[0047] As shown in FIG. 1B and FIG. 1C, the lower unit 102 includes a compliment of sensors, such as four cameras, one of which is indicated at 104, noting that more or fewer cameras may be utilized. The lower unit 102 may also include additional or other sensors, for example a laser range finder, a laser profiler for collecting laser point cloud data, etc. In one example, one or more sensors, such as a sonar unit, may be included in the lower unit 102 to determine the depth of the lower unit 102 or to determine if the lower unit 102 has reached the bottom of a shaft.
Harmen, 0047 and Fig. 2, emphasis added.
Regarding claim 10, Harmen and Fransson teach:
10. The system of claim 1, furthermore, Harmen teaches: wherein said at least one edge comprises two side edges in a bottom edge.
Harmen, Figs. 1B-1C.
Regarding claim 11, Harmen and Fransson teach:
11. The system of claim 10, wherein said distance finder is disposed along said bottom edge such that said distance finder is directed to the bottom of the manhole when the system is positioned in a manhole.
[0047] As shown in FIG. 1B and FIG. 1C, the lower unit 102 includes a compliment of sensors, such as four cameras, one of which is indicated at 104, noting that more or fewer cameras may be utilized. The lower unit 102 may also include additional or other sensors, for example a laser range finder, a laser profiler for collecting laser point cloud data, etc. In one example, one or more sensors, such as a sonar unit, may be included in the lower unit 102 to determine the depth of the lower unit 102 or to determine if the lower unit 102 has reached the bottom of a shaft.
Harmen, 0047, emphasis added.
Regarding claim 12, Harmen and Fransson teach:
12. The system of claim 10, furthermore, Harmen teaches: wherein at least one light is disposed on each of said side edges and said bottom edge.
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0056, emphasis added.
Regarding claim 13, Harmen and Fransson teach:
13. The system of claim 1, furthermore, Harmen teaches: further comprising: a power supply comprising a battery; and a port on said housing for charging said power supply.
[0052] As with the top unit 101, the bottom unit 102 may be battery powered, with a battery included in a compartment. Further, the bottom unit 102 may include an additional camera 104e and LED panel(s) 107 for lighting, as well as a laser range finder 108 for controlling the height of the bottom unit 102.
Harmen, 0052, emphasis added.
Regarding claim 14, Harmen and Fransson teach:
14. The system of claim 13, furthermore, Harmen teaches: wherein said port is a USB charging cable or a wall outlet charging adapter.
Harmen, 0046, 0071.
Regarding claim 15, Harmen and Fransson teach:
15. The system of claim 1, furthermore, Harmen teaches: wherein said housing further comprises an interface for connecting to an elongated member to facilitate extension down into a manhole.
[0046] In the example system 100 of FIG. 1, the tripod-based system 100 is illustrated in which a tripod 103 or another stand supports an upper unit 101 and a lower unit 102. The lower unit 102 is attached by a cable such that it may be lowered from the upper unit 101, e.g., down into a manhole or like infrastructure asset. The upper unit 101 and lower unit 102 may communicate via a suitable mechanism, e.g., wireless communication may be conducted between upper unit 101 and lower unit 102 to communicate control data as well as a sensor data such as image data (which may take the form of video data, HD video data, 4K video data, 8K video data, such as obtained by one or more 13 megapixel cameras, as further described herein) and laser data obtained during an inspection. Alternatively or in addition, the upper unit 101 and lower unit 102 may communicate over a suitable wire, such as USB, HDMI, a combination thereof, or any suitable wireline communication.
Harmen, 0046, Fig. 1, emphasis added.
Regarding claim 16, Harmen and Fransson teach:
16. The system of claim 15, furthermore, Harmen teaches: further comprising: an elongated member connect to said interface.
[0046] In the example system 100 of FIG. 1, the tripod-based system 100 is illustrated in which a tripod 103 or another stand supports an upper unit 101 and a lower unit 102. The lower unit 102 is attached by a cable such that it may be lowered from the upper unit 101, e.g., down into a manhole or like infrastructure asset. The upper unit 101 and lower unit 102 may communicate via a suitable mechanism, e.g., wireless communication may be conducted between upper unit 101 and lower unit 102 to communicate control data as well as a sensor data such as image data (which may take the form of video data, HD video data, 4K video data, 8K video data, such as obtained by one or more 13 megapixel cameras, as further described herein) and laser data obtained during an inspection. Alternatively or in addition, the upper unit 101 and lower unit 102 may communicate over a suitable wire, such as USB, HDMI, a combination thereof, or any suitable wireline communication.
Harmen, 0046, Fig. 1, emphasis added.
Regarding claim 17, Harmen and Fransson teach:
17. The system of claim 16, furthermore, Harmen teaches: further comprising: a stand for supporting said elongated member approximately in the center of the manhole, said stand being configured to enable a user to lower said housing.
[0046] In the example system 100 of FIG. 1, the tripod-based system 100 is illustrated in which a tripod 103 or another stand supports an upper unit 101 and a lower unit 102. The lower unit 102 is attached by a cable such that it may be lowered from the upper unit 101, e.g., down into a manhole or like infrastructure asset. The upper unit 101 and lower unit 102 may communicate via a suitable mechanism, e.g., wireless communication may be conducted between upper unit 101 and lower unit 102 to communicate control data as well as a sensor data such as image data (which may take the form of video data, HD video data, 4K video data, 8K video data, such as obtained by one or more 13 megapixel cameras, as further described herein) and laser data obtained during an inspection. Alternatively or in addition, the upper unit 101 and lower unit 102 may communicate over a suitable wire, such as USB, HDMI, a combination thereof, or any suitable wireline communication.
Harmen, 0046, Fig. 1, emphasis added.
Regarding claim 18, Harmen and Fransson teach:
18. The system of claim 1, further comprising: an external on-off switch for powering the digital processor.
Harmen, 0051 and Fig. 2 (computer 200).
Regarding claim 19, Harmen and Fransson teach:
19. The system of claim 1, furthermore, Harmen teaches: further comprising: an external on-off switch for powering said camera.
Harmen, 0051, Fig. 2 (computer 200).
Claim 21 recites all similar elements of claim 1. Therefore, the supporting rationale of the rejection of claim 1 applies equally as well to claim 21.
Claims 22-24 recite all similar elements of claims 3, 6 and 7. Therefore, the supporting rationale of the rejections of claims 3, 6 and 7 applies equally as well to claims 22-24.
Regarding claim 25, Harmen teaches:
25. A system for obtaining a 360° video of a manhole, said system comprising: a housing having at least a planar portion defining first and second opposing sides and one or more edges essentially orthogonal to said first and second opposing sides;
Harmen, 0054-0056 and Fig. 1.
a 360° video camera in said planar portion and operatively connected to said power supply,
[0054] FIG. 1B illustrates a top plan view of the example bottom unit 102. In this view, an example arrangement of four cameras 104, 104b, 104c, and 104d, which may be configured with wide angle optics such that they each have at least a partially overlapping view of adjacent camera(s). This provides 360 degree viewing coverage, e.g., as the bottom unit 102 is lowered down into a manhole, and facilitates use of stereo imaging techniques for depth imaging.
[0051] As shown in FIG. 1B (top plan view) and FIG. 1C (bottom plan view), the bottom unit 102 includes a power and data connector, which may be similar to power and data connector for top unit 101. By way of example, power and data connectors may be used to couple top unit 101 and bottom unit 102 to a similar system connection, such as a laptop or other computer. While the top unit 101 and bottom unit 102 communicate data wirelessly between one another, in one example, these units 101, 102 may also be wired to one another to exchange power, data, or a combination thereof.
Harmen, 0051, 0054-0056, emphasis added.
said camera comprising first and second lenses, said first lens protruding from said first opposing side and said second lens protruding from said second opposing side;
[0054] FIG. 1B illustrates a top plan view of the example bottom unit 102. In this view, an example arrangement of four cameras 104, 104b, 104c, and 104d, which may be configured with wide angle optics such that they each have at least a partially overlapping view of adjacent camera(s). This provides 360 degree viewing coverage, e.g., as the bottom unit 102 is lowered down into a manhole, and facilitates use of stereo imaging techniques for depth imaging.
Harmen, 0054-0056, 0064, emphasis added.
one or more lights disposed along said at least one edge;
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0056, 0064, emphasis added.
a digital processor disposed in said housing;
[0069] It should be noted that various functions described herein may be implemented using processor executable instructions stored on a non-transitory storage medium or device. A non-transitory storage device may be, for example, an electronic, electromagnetic, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a non-transitory storage medium include the following: a portable computer diskette, a hard disk, a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a portable compact disc read-only memory (CD-ROM), a solid-state drive, or any suitable combination of the foregoing. In the context of this document “non-transitory” media includes all media except non-statutory signal media.
Harmen, 0069, emphasis added.
a distance finder disposed in said housing and operatively connected to at least said digital processor;
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054-0056, emphasis added.
and digital storage disposed in said housing and operatively connected to at least said digital processor for storing digital data from said 360° video camera,
[0054] FIG. 1B illustrates a top plan view of the example bottom unit 102. In this view, an example arrangement of four cameras 104, 104b, 104c, and 104d, which may be configured with wide angle optics such that they each have at least a partially overlapping view of adjacent camera(s). This provides 360 degree viewing coverage, e.g., as the bottom unit 102 is lowered down into a manhole, and facilitates use of stereo imaging techniques for depth imaging.
Harmen, 0054-0056, 0064, emphasis added.
wherein said digital processor is configured to automatically obtain manhole video data;
wherein said digital processor is configured to commence automatic acquisition of video data once said housing is maintained at the same vertical position from the bottom of the manhole, as determined by said distance finder, for a predetermined time; wherein said digital processor is configured to automatically stop acquiring video data once the housing is a predetermined distance from said bottom of the manhole.
[0056] The laser range finder 108 allows the bottom unit 102 to automate control of its height (alone or in combination with communication with another unit or system, such as top unit 101). This permits for easy operation of the unit 102 to control its decent at a given rate, stop the unit 102 at a programmed height or distance from the bottom, and ensure that the unit 102 traverses down into the manhole and back up again in a controlled fashion, e.g., to a predetermined height or at a predetermined rate.
[0079] Turning to the view offered by FIG. 4C, an example of the underlying points or vertices 401a of the model 401 is illustrated. Here, the model provides points in space, or vertices, that present the physical location of the infrastructure asset, e.g., obtained from stereo image data computations made available via software such as that described herein. By way of specific example, an HD video obtained from two or more of cameras 104-104d may be used to obtain metadata comprising 1,346,973 vertices and 2,682,393 faces for the manhole under inspection. It is noted that this is a non-limiting example of a dense point cloud provided by an embodiment. Each of the vertices represents a point in the collective stereo imagery, e.g., an overlapping point in frames from adjacent cameras as described herein. As such, the vertices comprise virtualized spatial information that may be related to one another via faces, as illustrated in FIG. 4D.
Harmen, 0056, 0064, 0079, emphasis added.
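For illustration only (not taken from Harmen or Fransson, and not part of the grounds of rejection): a minimal sketch of the claimed start/stop behavior, i.e. commencing acquisition once the measured height has remained constant for a predetermined time and stopping once the housing is within a predetermined distance of the bottom. All parameter names and values here are hypothetical.

```python
# Hypothetical sketch of the claimed automatic start/stop acquisition logic.
# `readings` is a list of distance-to-bottom measurements (meters), sampled
# every `interval` seconds by a distance finder. Acquisition commences once
# the reading has been stable (within `tolerance`) for `stable_secs`, and
# stops once the housing is within `stop_distance` of the bottom.
def recording_state(readings, stable_secs=2.0, stop_distance=0.3,
                    tolerance=0.05, interval=0.5):
    needed = int(stable_secs / interval)  # consecutive stable samples required
    start = stop = None
    stable = 0
    for i, d in enumerate(readings):
        if start is None:
            # Reading is "stable" if within tolerance of the previous sample.
            if i > 0 and abs(d - readings[i - 1]) <= tolerance:
                stable += 1
            else:
                stable = 0
            if stable >= needed:
                start = i  # commence automatic acquisition of video data
        elif stop is None and d <= stop_distance:
            stop = i  # predetermined distance from bottom: stop acquiring
            break
    return start, stop
```

For example, with readings that hold at 5.0 m for five samples and then descend to 0.25 m, acquisition would start at the sample where stability has persisted for the required time and stop at the sample crossing the 0.3 m threshold.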
However, Harmen fails to explicitly teach the following limitation, which Fransson teaches: for a predetermined time;
[0029] wherein the focusing unit is arranged to [0030] receive input on whether the camera is operated in the day mode or in the night mode, and, [0031] select a focusing day mode if the camera if operated in the day mode, and a focusing night mode if the camera is operated in the night mode, wherein [0032] in the focusing day mode, the focusing unit is arranged to control the IR laser range meter to continuously measure a reference distance, and [0033] in the focusing night mode, the focusing unit is arranged to control the IR laser range meter to only measure the reference distance in response to a focus trigger signal being activated, and during a predetermined time period,
[0054] The focus trigger signal may be activated based on a movement of the camera being stopped, usually a movement of the camera's field of view in at least one of a pan or tilt direction. A common situation is that the pan-tilt motor has moved the camera and then stopped, so that the field of view is covering a new part of the scene. The focus setting from before the movement is in this situation often no longer correct, i.e. the camera is delivering unsharp images. In this situation it is important to quickly regain focus, and therefore a focus trigger signal may be activated when it is detected that movement of the pan-tilt motor has stopped. The focus trigger signal may therefore be based on input indicating that a movement, typically a pan-tilt movement, of the camera has stopped. The IR laser range meter then quickly, during the predetermined time period, measures a reference distance, and this distance is used to set the focus distance. An additional passive, contrast-based, focusing may then take place.
Fransson, 0029, 0054, emphasis added.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the teaching of Fransson with the system of Harmen in order to capture images based on a predetermined time; as such, it is valuable to be able to quickly focus the camera so that sharp and useful images can be provided depicting any objects of interest in a scene (Fransson, 0002).
Furthermore, Harmen and Fransson fail to explicitly teach a camera capable of automatically starting and stopping recording. Official Notice is taken that both the concept and the advantage of a camera automatically starting and stopping recording are well known and expected in the art. Thus, it would have been obvious to one skilled in the art, before the effective filing date of the claimed invention, to utilize said feature within the system taught by Harmen and Fransson, because such incorporation would minimize the need to manually interact with the record or stop-recording button.
Claims 28-30 recite elements similar to those of claims 2, 17, and 3, respectively. Therefore, the rationale supporting the rejection of claims 2, 17, and 3 applies equally to claims 28-30.
Regarding claim 31, Harmen teaches:
31. The system of claim 26, wherein said digital processor is configured to evaluate distance data received from said distance finder to determine the rate of descent of said module in said manhole, and, if the rate of descent is too fast, said digital processor is configured to cause a message to be transmitted to a user to slow descent.
[0056] The laser range finder 108 allows the bottom unit 102 to automate control of its height (alone or in combination with communication with another unit or system, such as top unit 101). This permits for easy operation of the unit 102 to control its decent at a given rate, stop the unit 102 at a programmed height or distance from the bottom, and ensure that the unit 102 traverses down into the manhole and back up again in a controlled fashion, e.g., to a predetermined height or at a predetermined rate.
Harmen, 0054, 0055-0056, emphasis added.
Regarding claim 33, Harmen teaches:
33. The system of claim 25, wherein said predetermined distance is 0.2 m-0.5 m.
Harmen, 0048, 0055-0056.
Regarding claim 34, Harmen teaches:
34. The system of claim 25, wherein said digital processor is configured to turn off said lights when the acquisition of video data is stopped.
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054, 0055-0056, emphasis added.
Regarding claim 35, Harmen teaches:
35. The system of claim 25, wherein said digital processor is configured to automatically transmit said inspection data once the acquisition of video data is stopped.
[0046] In the example system 100 of FIG. 1, the tripod-based system 100 is illustrated in which a tripod 103 or another stand supports an upper unit 101 and a lower unit 102. The lower unit 102 is attached by a cable such that it may be lowered from the upper unit 101, e.g., down into a manhole or like infrastructure asset. The upper unit 101 and lower unit 102 may communicate via a suitable mechanism, e.g., wireless communication may be conducted between upper unit 101 and lower unit 102 to communicate control data as well as a sensor data such as image data (which may take the form of video data, HD video data, 4K video data, 8K video data, such as obtained by one or more 13 megapixel cameras, as further described herein) and laser data obtained during an inspection. Alternatively or in addition, the upper unit 101 and lower unit 102 may communicate over a suitable wire, such as USB, HDMI, a combination thereof, or any suitable wireline communication.
Harmen, 0046, 0055-0056, emphasis added.
Regarding claim 36, Harmen teaches:
36. The system of claim 25, wherein multiple lights are disposed along said at least one edge.
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054, 0055-0056, emphasis added.
Regarding claim 37, Harmen teaches:
37. The system of claim 25, wherein said distance finder is a laser distance finder.
[0047] As shown in FIG. 1B and FIG. 1C, the lower unit 102 includes a compliment of sensors, such as four cameras, one of which is indicated at 104, noting that more or fewer cameras may be utilized. The lower unit 102 may also include additional or other sensors, for example a laser range finder, a laser profiler for collecting laser point cloud data, etc. In one example, one or more sensors, such as a sonar unit, may be included in the lower unit 102 to determine the depth of the lower unit 102 or to determine if the lower unit 102 has reached the bottom of a shaft.
Harmen, 0047 and Fig. 2, emphasis added.
Regarding claim 38, Harmen teaches:
38. The system of claim 25, wherein said at least one edge comprises two side edges in a bottom edge.
Harmen, Figs. 1B-1C.
Regarding claim 39, Harmen teaches:
39. The system of claim 38, wherein said distance finder is disposed along said bottom edge such that said distance finder is directed to the bottom of the manhole when the system is positioned in a manhole.
[0047] As shown in FIG. 1B and FIG. 1C, the lower unit 102 includes a compliment of sensors, such as four cameras, one of which is indicated at 104, noting that more or fewer cameras may be utilized. The lower unit 102 may also include additional or other sensors, for example a laser range finder, a laser profiler for collecting laser point cloud data, etc. In one example, one or more sensors, such as a sonar unit, may be included in the lower unit 102 to determine the depth of the lower unit 102 or to determine if the lower unit 102 has reached the bottom of a shaft.
Harmen, 0047, emphasis added.
Regarding claim 40, Harmen teaches:
40. The system of claim 38, wherein at least one light is disposed on each of said side edges and said bottom edge.
[0055] Referring to FIG. 1C, the bottom of the bottom unit 102 includes one or more LED panels 107, a camera 104e, as well as a laser range finder 108. The LED panel(s) 107 (similar LED panels may be included on the side(s) of the top unit 101 or bottom unit 102) permit the bottom unit 102 to illuminate dark interiors such as manholes to provide adequate lighting for visual image capture by the cameras 104, 104b, 104c, 104d, 104e. All or some of the LED panels may be automated, e.g., to adjust their brightness or output based on software control, such as using a feedback mechanism based on ambient light, time of day, type of mission, type of infrastructure, etc. Similarly, settings of cameras 104-104d may be adjusted, such as automating white balance in response to time, ambient light, infrastructure type, material construction, size or environmental condition, etc. In one example, camera and/or light settings may be automated, e.g., using presets for a mission type that are thereafter adjusted based on conditions encountered in the field.
Harmen, 0054, 0055-0056, emphasis added.
Regarding claim 41, Harmen teaches:
41. The system of claim 25, further comprising a power supply in said housing.
[0052] As with the top unit 101, the bottom unit 102 may be battery powered, with a battery included in a compartment. Further, the bottom unit 102 may include an additional camera 104e and LED panel(s) 107 for lighting, as well as a laser range finder 108 for controlling the height of the bottom unit 102.
Harmen, 0052, emphasis added.
Regarding claim 42, Harmen teaches:
42. The system of claim 41, further comprising: a port on said housing for charging said power supply.
[0052] As with the top unit 101, the bottom unit 102 may be battery powered, with a battery included in a compartment. Further, the bottom unit 102 may include an additional camera 104e and LED panel(s) 107 for lighting, as well as a laser range finder 108 for controlling the height of the bottom unit 102.
Harmen, 0052, emphasis added.
Regarding claim 43, Harmen teaches:
43. The system of claim 42, wherein said port is a USB charging cable or a wall outlet charging adapter.
Harmen, 0046, 0071.
Regarding claim 44, Harmen teaches:
44. The system of claim 25, wherein said housing further comprises an interface for connecting to an elongated member to facilitate extension down into a manhole.
[0046] In the example system 100 of FIG. 1, the tripod-based system 100 is illustrated in which a tripod 103 or another stand supports an upper unit 101 and a lower unit 102. The lower unit 102 is attached by a cable such that it may be lowered from the upper unit 101, e.g., down into a manhole or like infrastructure asset. The upper unit 101 and lower unit 102 may communicate via a suitable mechanism, e.g., wireless communication may be conducted between upper unit 101 and lower unit 102 to communicate control data as well as a sensor data such as image data (which may take the form of video data, HD video data, 4K video data, 8K video data, such as obtained by one or more 13 megapixel cameras, as further described herein) and laser data obtained during an inspection. Alternatively or in addition, the upper unit 101 and lower unit 102 may communicate over a suitable wire, such as USB, HDMI, a combination thereof, or any suitable wireline communication.