DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/12/2025 has been entered.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy of priority Application No. DE 10 2020 130 280.0, filed on 11/17/2020, has been received.
Information Disclosure Statement
The information disclosure statement (IDS) submitted on 10/29/2025 is in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statement is being considered by the examiner.
Response to Amendment
This action is in response to amendments and remarks filed on 12/12/2025. The examiner notes the following adjustments to the claims by the applicant:
Claims 1 and 15 are amended;
Claim 17 is new;
No additional claims are cancelled.
Therefore, Claims 1-6 and 8-17 remain pending, of which Claims 1 and 15 are independent claims.
In light of the instant amendments and arguments:
Further examination resulted in a new rejection of Claims 1-6 and 8-17 under 35 U.S.C. § 103, as detailed below.
THIS ACTION IS MADE FINAL. The new grounds of rejection were necessitated by applicant's amendment.
Response to Arguments
Applicant presents the following arguments regarding the previous Office action:
A. To overcome the 35 U.S.C. § 103 rejection, the applicant has amended each independent claim to include the additional underlined limitations: "evaluating, using the receiving apparatus, a last arriving signal echo of the plurality of signal echoes assigned to the same solid angle, wherein the last arriving signal echo assigned to the same solid angle is determined using the receiving time; and identifying the object based on the evaluated last arriving signal echo assigned to the same solid angle";
B. “The claimed method and industrial truck is an improvement on current LIDAR systems and is more reliable at identifying objects that are concealed by at least partially transparent obstacles. The claimed method and device solve this problem by measuring both the solid angle and the receiving time of each signal echo and identifying the object and an at least partially transparent object in front of the object on the basis of the measured solid angles and receiving times.”;
C. “Hoeferlin discloses locating an industrial truck with respect to its environment. Accordingly, Hoeferlin fails to disclose, teach, or suggest:
"recognizing an at least partially transparent object at least partially in front of the object by a plurality of signal echoes assigned the same solid angle; evaluating, using the receiving apparatus, a last arriving signal echo of the plurality of signal echoes assigned to the same solid angle, wherein the last arriving signal echo assigned to the same solid angle is determined using the receiving time; and identifying the object based on the evaluated last arriving signal echo assigned to the same solid angle...," as recited in amended independent claim 1”;
D. “Applicant respectfully disagrees as Kopelke is silent with regard to the second data set including any measurement of a solid angle, a receiving time, or a determination of a last arriving signal echo of the plurality of signal echoes assigned the same solid angle. Therefore, the prior art taken alone or in combination fails to disclose, teach, or suggest each element recited in amended independent claims 1 and 15.”
Applicant's arguments A., B., C. and D. appear to be directed to the instantly amended subject matter. Accordingly, they have been addressed in the rejections below.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-3, 8-10, and 12-17 are rejected under 35 U.S.C. § 103 as being unpatentable over the combination of Hoeferlin et al. (DE102016224764A1, henceforth Hoeferlin) and Armburst et al. (US 2023/0286565 A, henceforth Armburst).
Regarding Claim 1, Hoeferlin explicitly recites the limitations: a method for identifying objects in a warehouse using an industrial truck {forklift 100, Figs. 1-3}, the method comprising: emitting an identification signal into an environment using a transmitting apparatus positioned on the industrial truck {laser scanner 102 emitting laser beam 104, Figs. 1-3}; detecting, by a receiving apparatus positioned on the industrial truck, reflected identification signals {Fig. 4} by the object in the environment as signal echoes {“a detection unit for detecting an intensity of a reflection of the laser beam in an intensity curve”, ¶[0017], and detection unit 506, Fig. 5: “The laser scanner 102 essentially corresponds to the laser scanner in FIG 5”, ¶[0041]}; measuring, using the receiving apparatus, a solid angle and a receiving time for each of the signal echoes {the operation of laser scanner 102, Figs. 1-3, in terms of angular deflection and transit-time is described in ¶[0022], for which one skilled in the art will appreciate that tracking/measuring angles is an inherent feature of laser scanning systems; transit time plot of data, Fig. 4, registered by detection unit: “The intensity curve 400 is plotted in a diagram having on its abscissa a transit time between the emission of the laser beam and the reception of the reflected light.”, ¶[0032]}; monitoring, using the receiving apparatus, whether a same solid angle has been assigned to multiple signal echoes {laser scanner 102, Figs. 1-3, and detection unit 506, Fig. 
4, make up a complete laser scanning system (i.e., elements to emit the laser signal, receive reflected signals and track returning signal angle and time of arrival); see also ¶[0041]}; evaluating, using the receiving apparatus, a last arriving signal echo {application of filtering enables determination of last signal received: “the second peak 404 is discarded, i.e., not used for distance calculation, because the second peak 404 is lower than a minimum value 410. The reference value 408 and/or minimum value 410 may also be proportional to the time of flight, since a lower light intensity may be reflected from the point of impact due to the scattering of light over a greater distance.”, ¶[0035-0036]} of the plurality of signal echoes {“Local maxima 402, 404, 406 of the intensity curve or peaks 402, 404, 406 indicate distances to impact points of the laser beam.”, ¶[0032]} assigned to the same solid angle {local maximum 406 occurs at the largest time interval, Fig. 4, and reflects the signal bouncing off the wall, versus maxima 402, 404 that reflect off the pallets 108: “The distances to the impact points 110 are imaged by local maxima or peaks of the intensity profile in the intensity profile.”, ¶[0030]}, wherein the last arriving signal echo assigned to the same solid angle is determined using the receiving time {¶[0030] describes different transit times being associated with different distances from which the return signal originates; one skilled in the art will appreciate that, for a specific time, the laser beam 104 emitted by laser scanner 102, Figs.
1-2, will arrive back at the same angle emitted, but the amplitude of the returning signal and the delay time will vary depending on how far away the object is and the type of object, or objects, the signal bounces off}; and identifying the object based on the evaluated last arriving signal echo {identification of pallet 108 or openings in pallet, resulting from signals bouncing off wall 202, as represented in Figs. 1-3}.
Hoeferlin does not appear to explicitly recite the limitations: recognizing an at least partially transparent object at least partially in front of the object by a plurality of signal echoes assigned the same solid angle; and identifying the object based on the evaluated last arriving signal echo assigned to the same solid angle, wherein the industrial truck is automatically operated based on the at least partially transparent object and the identified object.
However, Armburst explicitly recites the limitations: recognizing an at least partially transparent object {robotic cart platform 440/autonomous cart 445, embodiment in Figs. 17-30, uses radar, ¶[0164], to detect objects including partially transparent objects that the vehicle can pass through, ¶[0165]; Fig. 33 shows the radar sensor detection fields detecting objects surrounding an autonomous cart} at least partially in front of the object by a plurality of signal echoes assigned the same solid angle; and identifying the object based on the evaluated last arriving signal echo assigned to the same solid angle {in Fig. 37, cart 445 approaches a transparent hanging curtain with a person working behind it, the cart will not pass through the curtain if the person is too close to the curtain, which is consistent with the description in ¶[0165] that discusses detecting a transparent object, and moving through such objects but also avoiding obstacles}, wherein the industrial truck is automatically operated based on the at least partially transparent object and the identified object {when necessary the cart will deviate from a planned route based on obstacle detection: “the RCP 440 can determine (e.g., calculate via radar sensor data and preselected memory settings) when the user wants the cart 445 to avoid a certain type of obstacle 262, 262a, (e.g., wall, moving person or other moving cart, etc.) go through a certain type of obstacle 262h (e.g., plastic hanging strips, cloth curtain, etc.), or go over a certain type of obstacle 262d laying on the floor surface 1a (e.g., debris, paper, liquid, etc.).
From its internal map 260′ of the environment 260 stored in the RCP 440 memory 103, the processor 102, 106 of the RCP 440 calculate or otherwise determines a planned route 149 for the cart 445 and navigates this route, and when necessary an alter route 149a, via its sensors 140, 150, 550, 640 to avoid fixed and temporary obstructions 262, 262a and reach a desired destination 172.”, ¶[0165]}.
Hoeferlin and Armburst are analogous art because they both deal with sensor systems for enabling an autonomous vehicle to operate safely in a work environment.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Hoeferlin and Armburst before them, to modify the teachings of Hoeferlin to include the teachings of Armburst to improve autonomous vehicle safety by providing the ability to detect semi-transparent objects that can obscure worker activity and to take evasive action when necessary.
Regarding Claim 2, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. In addition, Hoeferlin explicitly recites the limitation: wherein signal echoes which arrive in the receiving apparatus outside a predetermined time period after the emission of the identification signal are not evaluated for purposes of identifying the object {with regard to Fig. 4, the rolling-off of the far end of intensity curve 400 represents an end to the object detection period, corresponding to no additional object identification, i.e., no additional local maxima occur; “The distances to the impact points 110 are imaged by local maxima or peaks of the intensity profile in the intensity profile.”, ¶[0030]}.
Regarding Claim 3, the combination of Hoeferlin and Armburst discloses the limitations of Claim 2, as discussed supra. In addition, Hoeferlin explicitly recites the limitation: wherein signal echoes which arrive in the receiving apparatus outside a predetermined time period after the emission of the identification signal are not evaluated for purposes of identifying the object {with regard to Fig. 4, the rolling-off of the far end of intensity curve 400 represents an end to the object detection period, corresponding to no additional object identification, i.e., no additional local maxima occur, and is associated with the scanning signal impinging on wall 202; “The distances to the impact points 110 are imaged by local maxima or peaks of the intensity profile in the intensity profile.”, ¶[0030]}.
Regarding Claim 8, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. Hoeferlin does not appear to explicitly recite the limitation: wherein the second object is identified as one of a: (1) film; and (2) strip curtain.
However, Armburst explicitly recites the limitation: wherein the second object is identified as one of a: (1) film; and (2) strip curtain {“The radar sensors 640 also allow the cart 445 to go through or over certain obstacles along the navigation path 149 of the cart. For example, the radar sensors 640 can detect a thin flexible lightweight material, such as thin one-eighth inch (⅛”) transparent plastic strips 262ps hanging in a doorway 262dw (FIG. 36). These thin flexible lightweight materials are used for ventilation, dust or noise control, or to divide two areas of a room for privacy control, but are not intended to prevent a person or cart from passing through them.”, ¶[0165]}.
Regarding Claim 9, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. Hoeferlin does not appear to explicitly recite the limitation: wherein a speed of the industrial truck is adjusted if the second object is identified as one of: (1) film; and (2) strip curtain.
However, Armburst explicitly recites the limitation: wherein a speed of the industrial truck is adjusted if the second object is identified as one of: (1) film; and (2) strip curtain {“the radar devices 640 sense through the optical obstructions 262, 262a (e.g., walls forming corners or straddling doorways, people by corners or in doorways, etc.) to gather moving object data that includes obscured adjacent moving object data to allow additional time and distance for the cart 445 to avoid collisions with moving objects 262m, 262p by altering the navigation path 149 a or speed of the cart. The radar sensors 640 also allow the cart 445 to go through or over certain obstacles along the navigation path 149 of the cart. For example, the radar sensors 640 can detect a thin flexible lightweight material, such as thin one-eighth inch (⅛”) transparent plastic strips 262 ps hanging in a doorway 262 dw (FIG. 36).”, ¶[0164-0165]}.
Regarding Claim 10, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. Hoeferlin does not appear to explicitly recite the limitation: wherein the identifying of the object further considers individual movement of the industrial truck.
However, Armburst explicitly recites the limitation: wherein the identifying of the object further considers individual movement of the industrial truck {cart 45 in Fig. 8 and “When the “GO” button 185 is pressed, the cart 45 will autonomously move to the next predetermined destination 172 selected by the worker”, ¶[0111], and “the radar devices 640 sense through the optical obstructions 262, 262 a (e.g., walls forming corners or straddling doorways, people by corners or in doorways, etc.) to gather moving object data that includes obscured adjacent moving object data to allow additional time and distance for the cart 445 to avoid collisions with moving objects 262 m, 262 p by altering the navigation path 149 a or speed of the cart. The radar sensors 640 also allow the cart 445 to go through or over certain obstacles along the navigation path 149 of the cart. For example, the radar sensors 640 can detect a thin flexible lightweight material”, ¶[0164-0165]}.
Regarding Claim 12, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. Hoeferlin does not appear to explicitly recite the limitation: wherein steering of the industrial truck is accomplished by automatic orientation of the industrial truck based on the identified object.
However, Armburst explicitly recites the limitation: wherein steering of the industrial truck is accomplished by automatic orientation of the industrial truck based on the identified object {cart 45 in Fig. 8 and “When the “GO” button 185 is pressed, the cart 45 will autonomously move to the next predetermined destination 172 selected by the worker”, ¶[0111], and “the radar devices 640 sense through the optical obstructions 262, 262 a (e.g., walls forming corners or straddling doorways, people by corners or in doorways, etc.) to gather moving object data that includes obscured adjacent moving object data to allow additional time and distance for the cart 445 to avoid collisions with moving objects 262 m, 262 p by altering the navigation path 149 a or speed of the cart. The radar sensors 640 also allow the cart 445 to go through or over certain obstacles along the navigation path 149 of the cart. For example, the radar sensors 640 can detect a thin flexible lightweight material”, ¶[0164-0165]}.
Regarding Claim 13, the combination of Hoeferlin and Armburst discloses the limitations of Claim 12, as discussed supra. Hoeferlin does not appear to explicitly recite the limitation: further comprising generating at least one of an (1) optical and (2) an acoustic displays of an orientation of the industrial truck to be carried out by an operator of the industrial truck based on the identified object.
However, Armburst explicitly recites the limitation: generating at least one of an (1) optical and (2) an acoustic displays of an orientation of the industrial truck to be carried out by an operator of the industrial truck based on the identified object {the operator can visualize the location and orientation of Remote Cart Platform/RCP 40, as shown in Fig. 37 and described in ¶[0119], and, as described in ¶[0165], the RCP can be programmed to account for hanging, transparent strips, thus the operator will see motion associated with dealing with this type of obstacle on the display in Fig. 37}.
Regarding Claim 14, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. Hoeferlin does not appear to explicitly recite the limitation: wherein identifying the object further considers surrounding environmental elements.
However, Armburst explicitly recites the limitation: wherein identifying the object further considers surrounding environmental elements {cart 45 in Fig. 8 and “the radar devices 640 sense through the optical obstructions 262, 262 a (e.g., walls forming corners or straddling doorways, people by corners or in doorways, etc.) to gather moving object data that includes obscured adjacent moving object data to allow additional time and distance for the cart 445 to avoid collisions with moving objects 262 m, 262 p by altering the navigation path 149 a or speed of the cart. The radar sensors 640 also allow the cart 445 to go through or over certain obstacles along the navigation path 149 of the cart. For example, the radar sensors 640 can detect a thin flexible lightweight material”, ¶[0164-0165]}.
Regarding Claim 15, Hoeferlin teaches an industrial truck {forklift 100, Figs. 1-3} comprising: a transmitting apparatus configured to emit an identification signal into an environment {laser scanner 102 emitting laser beam 104, Figs. 1-3}; a receiving apparatus configured to detect the identification signal as a signal echo {Fig. 4} reflected by an object in the environment {“a detection unit for detecting an intensity of a reflection of the laser beam in an intensity curve”, ¶[0017], and detection unit 506, Fig. 5: “The laser scanner 102 essentially corresponds to the laser scanner in FIG. 5”, ¶[0041]}; an evaluation apparatus configured to: measure a solid angle and a receiving time for each of the signal echoes {the operation of laser scanner 102, Figs. 1-3, in terms of angular deflection and transit-time is described in ¶[0022], for which one skilled in the art will appreciate that tracking/measuring angles is an inherent feature of laser scanning systems; transit time plot of data, Fig. 4, registered by detection unit: “The intensity curve 400 is plotted in a diagram having on its abscissa a transit time between the emission of the laser beam and the reception of the reflected light.”, ¶[0032]}; evaluate the signal echo assigned to the same solid angle {¶[0030] describes different transit times being associated with different distances from which the return signal originates; one skilled in the art will appreciate that, for a specific time, the laser beam 104 emitted by laser scanner 102, Figs.
1-2, will arrive back at the same angle emitted, but the amplitude of the returning signal and the delay time will vary depending on how far away the object is and the type of object, or objects, the signal bounces off} which arrived last in the receiving apparatus {application of filtering enables determination of last signal received: “the second peak 404 is discarded, i.e., not used for distance calculation, because the second peak 404 is lower than a minimum value 410. The reference value 408 and/or minimum value 410 may also be proportional to the time of flight, since a lower light intensity may be reflected from the point of impact due to the scattering of light over a greater distance.”, ¶[0035-0036]} according to the receiving time {“Local maxima 402, 404, 406 of the intensity curve or peaks 402, 404, 406 indicate distances to impact points of the laser beam.”, ¶[0032]; local maximum 406 occurs at the largest time interval, Fig. 4, and reflects the signal bouncing off the wall, versus maxima 402, 404 that reflect off the pallets 108: “The distances to the impact points 110 are imaged by local maxima or peaks of the intensity profile in the intensity profile.”, ¶[0030]}; and identify the object based on the last arriving signal echo {identification of pallet 108 or openings in pallet, resulting from signals bouncing off wall 202, as represented in Figs. 1-3}.
Hoeferlin does not appear to explicitly recite the limitations: identify an at least partially transparent object in front of the object by a plurality of signal echoes assigned to a same solid angle; and identify the object based on the last arriving signal echo assigned to the same solid angle, and automatically operate the industrial truck based on the at least partially transparent object and the identified object.
However, Armburst explicitly recites the limitations: identify an at least partially transparent object {robotic cart platform 440/autonomous cart 445, embodiment in Figs. 17-30, uses radar, ¶[0164], to detect objects including partially transparent objects that the vehicle can pass through, ¶[0165]; Fig. 33 shows the radar sensor detection fields detecting objects surrounding an autonomous cart} in front of the object by a plurality of signal echoes assigned to a same solid angle {in Fig. 37, cart 445 approaches a transparent hanging curtain with a person working behind it, the cart will not pass through the curtain if the person is too close to the curtain, which is consistent with the description in ¶[0165] that discusses detecting a transparent object, and moving through such objects but also avoiding obstacles}; and identify the object based on the last arriving signal echo assigned to the same solid angle {in Fig. 37, cart 445 approaches a transparent hanging curtain with a person working behind it, the cart will not pass through the curtain if the person is too close to the curtain, which is consistent with the description in ¶[0165] that discusses detecting a transparent object, and moving through such objects but also avoiding obstacles}, and automatically operate the industrial truck based on the at least partially transparent object and the identified object {when necessary the cart will deviate from a planned route based on obstacle detection: “the RCP 440 can determine (e.g., calculate via radar sensor data and preselected memory settings) when the user wants the cart 445 to avoid a certain type of obstacle 262, 262a, (e.g., wall, moving person or other moving cart, etc.) go through a certain type of obstacle 262h (e.g., plastic hanging strips, cloth curtain, etc.), or go over a certain type of obstacle 262d laying on the floor surface 1a (e.g., debris, paper, liquid, etc.).
From its internal map 260′ of the environment 260 stored in the RCP 440 memory 103, the processor 102, 106 of the RCP 440 calculate or otherwise determines a planned route 149 for the cart 445 and navigates this route, and when necessary an alter route 149a, via its sensors 140, 150, 550, 640 to avoid fixed and temporary obstructions 262, 262a and reach a desired destination 172.”, ¶[0165]}.
Regarding Claim 16, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. Hoeferlin does not appear to explicitly recite the limitations: further comprising automatically operating the industrial truck to drive through the at least partially transparent object to engage the identified object.
However, Armburst explicitly recites the limitations: automatically operating the industrial truck to drive through the at least partially transparent object {robotic cart platform 440/autonomous cart 445, embodiment in Figs. 17-30, uses radar, ¶[0164], to detect objects including partially transparent objects that the vehicle can pass through, ¶[0165]; Fig. 33 shows the radar sensor detection fields detecting objects surrounding an autonomous cart} to engage the identified object {when necessary the cart will deviate from a planned route based on obstacle detection: “the RCP 440 can determine (e.g., calculate via radar sensor data and preselected memory settings) when the user wants the cart 445 to avoid a certain type of obstacle 262, 262a, (e.g., wall, moving person or other moving cart, etc.) go through a certain type of obstacle 262h (e.g., plastic hanging strips, cloth curtain, etc.), or go over a certain type of obstacle 262d laying on the floor surface 1a (e.g., debris, paper, liquid, etc.). From its internal map 260′ of the environment 260 stored in the RCP 440 memory 103, the processor 102, 106 of the RCP 440 calculate or otherwise determines a planned route 149 for the cart 445 and navigates this route, and when necessary an alter route 149a, via its sensors 140, 150, 550, 640 to avoid fixed and temporary obstructions 262, 262a and reach a desired destination 172.”, ¶[0165]}.
Regarding Claim 17, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. In addition, Hoeferlin explicitly recites the limitations: further comprising: determining an anticipated value based on a geometry of a load carrier when the last arriving signal echo assigned to the same solid angle only differs slightly from a signal noise {use of filtering (see Abstract and paragraphs 5-6, 12 and 53-60) and the ability to determine “a measure of variance” (see ¶[0012 & 0061]) provide the ability to distinguish between wanted and unwanted signals; in addition, basic signal processing techniques are well known in the art to restrict far-away or extraneous signals, and noise}; and identifying the object based on the anticipated value {“a high variance corresponds to an object with low recognition repeatability, while a low variance indicates a massive obstacle…A massive, flat wall, however, has an echo profile 400 with a few pulses lying close together 402, 404, 406 which give a small measurement variance”, ¶[0061]}.
Claims 4-6 and 11 are rejected under 35 U.S.C. § 103 as being unpatentable over the combination of Hoeferlin, Armburst, and Kopelke et al. (US 10,430,969 B2, henceforth Kopelke).
Regarding Claim 4, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. In addition, Hoeferlin explicitly recites the limitation: wherein identifying the object based on the evaluated signal echoes comprises generating {lateral scanning of laser 102, Figs. 1-3: “The laser scanner 102 emits a point-shaped laser beam 104, which is deflected laterally in a measuring plane 106 so that the laser beam 104 covers an angular range behind the forklift truck 100.”, ¶[0022]} a view of the object based on the signal echoes {Fig. 1 shows an unloaded, level vehicle, whereas Figs. 2&3 show a loaded, tilted vehicle, thus, providing two separate vehicle positions corresponding to two different laser scanning perspectives of the pallet stack 108 and wall 202, one of which is represented in Fig. 4}.
The combination of Hoeferlin and Armburst does not appear to explicitly recite the limitations: wherein identifying the object based on the evaluated signal echoes comprises comparing the view with a template, wherein the object is identified when there is sufficient correspondence between the view and the template.
However, Kopelke explicitly recites the limitations: wherein identifying the object based on the evaluated signal echoes comprises comparing the view with a template, wherein the object is identified when there is sufficient correspondence between the view and the template {“after this preliminary recognition of an object, a second template matching step is carried out….The additional data set is then compared with a predefined depth profile pattern (in other words: with a second template).”, ¶[0009]}.
Hoeferlin, Armburst, and Kopelke are analogous art because they all deal with sensor systems for enabling an autonomous vehicle to operate safely in a work environment.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention, having the teachings of Hoeferlin, Armburst and Kopelke before them, to modify the teachings of the combination of Hoeferlin and Armburst to include the teachings of Kopelke to accurately identify the type of object in the vicinity of a warehouse vehicle that may present an obstacle to the vehicle’s movement.
Regarding Claim 5, Hoeferlin in view of Armburst and Kopelke discloses the limitations of Claim 4, as discussed supra. The combination of Hoeferlin and Armburst does not appear to explicitly recite the limitations: wherein the generating the view of the object comprises creating a two-dimensional data set by projecting pixels onto a projection plane.
However, Kopelke explicitly recites the limitations: wherein generating a view of the object comprises creating a two-dimensional data set by projecting pixels onto a projection plane {“Capturing image data with a 3D camera attached to an industrial truck…creating a two-dimensional data set by projecting pixels onto a projection plane, comparing the two-dimensional data set with a predefined pattern representing an object and provisionally detecting an object at a position on the projection plane if a match is found during the comparison”, ¶[0005]}.
Regarding Claim 6, Hoeferlin in view of Armburst and Kopelke discloses the limitations of Claim 4, as discussed supra. The combination of Hoeferlin and Armburst does not appear to explicitly recite the limitations: creating a further data set which comprises a depth profile along a line at a position of the projection plane, comparing the further data set with a predefined depth profile pattern assigned to the object, and identifying the object when a sufficient correspondence is established during the comparing.
However, Kopelke explicitly recites limitations: creating a further data set which comprises a depth profile along a line at a position of the projection plane, comparing the further data set with a predefined depth profile pattern assigned to the object, and identifying the object when a sufficient correspondence is established during the comparing {“Capturing image data with a 3D camera attached to an industrial truck…creating a two-dimensional data set by projecting pixels onto a projection plane, comparing the two-dimensional data set with a predefined pattern representing an object and provisionally detecting an object at a position on the projection plane if a match is found during the comparison, creating another dataset that includes a depth profile along a line at the position of the projection plane, comparing the additional data set with a predefined depth profile pattern associated with the object and finally identifying the object if a match is found during the comparison.”, ¶[0005]}.
Regarding Claim 11, the combination of Hoeferlin and Armburst discloses the limitations of Claim 1, as discussed supra. In addition, Hoeferlin further explicitly recites the limitations: generating a first view of the object {lateral scanning of laser 102, Figs. 1-3: “The laser scanner 102 emits a point-shaped laser beam 104, which is deflected laterally in a measuring plane 106 so that the laser beam 104 covers an angular range behind the forklift truck 100.”, ¶[0022]} based on signal echoes detected in a first position of an industrial truck; generating a second view of the object {lateral scanning of laser 102, Figs. 1-3: “The laser scanner 102 emits a point-shaped laser beam 104, which is deflected laterally in a measuring plane 106 so that the laser beam 104 covers an angular range behind the forklift truck 100.”, ¶[0022]} based on signal echoes detected in a second position of the industrial truck {Fig. 1 shows an unloaded, level vehicle, whereas Figs. 2-3 show a loaded, tilted vehicle, thus, providing two separate vehicle positions corresponding to two different views, or scanned results such as Fig. 4, of the pallet stack 108 and wall 202}.
The combination of Hoeferlin and Armburst does not appear to explicitly recite the limitations: wherein the identifying of the object comprises comparing the first view with a first template, comparing the second view with a second template, and identifying the object if there is sufficient correlation between: (1) the first view and the first template; and (2) the second view and second template.
However, Kopelke explicitly recites limitations: wherein the identifying of the object comprises comparing the first view with a first template, comparing the second view with a second template {“after this preliminary recognition of an object, a second template matching step is carried out….The additional data set is then compared with a predefined depth profile pattern (in other words: with a second template).”, ¶[0009]}, and identifying the object if there is sufficient correlation between: (1) the first view and the first template; and (2) the second view and second template {“Capturing image data with a 3D camera attached to an industrial truck…creating a two-dimensional data set by projecting pixels onto a projection plane, comparing the two-dimensional data set with a predefined pattern representing an object and provisionally detecting an object at a position on the projection plane if a match is found during the comparison, creating another dataset that includes a depth profile along a line at the position of the projection plane, comparing the additional data set with a predefined depth profile pattern associated with the object and finally identifying the object if a match is found during the comparison.”, ¶[0005], and “The additional data set is then compared with a predefined depth profile pattern (in other words: with a second template).”, ¶[0009]}.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
CN 106918820 B – Detection of both near and far objects, such as detecting a primary object when rain is positioned between the detector and the object {see 23 relative to 24 in Fig. 1c; also, Pg. 7, Ln. 8 - Pg. 8, Ln. 12, discussion of detecting “abnormal obstacle 23”}.
EP 2703837 B1 – A laser pulse system capable of distinguishing between an intermediate “soft”, or opaque, object and a farther-away hard target {Abstract and Fig. 1}, which can be identified as having different times of flight {¶[0038]}.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to RICHARD EDWIN GEIST whose telephone number is (703)756-5854. The examiner can normally be reached Monday-Friday, 9am-6pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Christian Chace can be reached at (571) 272-4190. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/R.E.G./Examiner, Art Unit 3665
/CHRISTIAN CHACE/ Supervisory Patent Examiner, Art Unit 3665