DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1 and 3 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 4 (depending on claims 3 and 1) of copending Application No. 17975943 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because:
Regarding claim 1, 17975943 claim 4 teaches a computing system, comprising:
a processor (processor in claim 1); and
memory (memory in claim 1) that stores computer-executable instructions that, when executed by the processor, cause the processor to perform acts comprising:
receiving a stream of frames outputted by a sensor of a time-of-flight sensor system, the stream of frames comprises a series of frame sequences, wherein a frame sequence comprises a set of frames where the frames in the set have different frame types (receiving in claim 1), and wherein a frame type of a frame signifies sensor parameters of the time-of-flight sensor system when the frame is captured by the time-of-flight sensor system such that the different frame types signify different sensor parameters (frame types in claim 1);
identifying a pair of non-adjacent frames in the stream of frames (identifying in claim 1);
calculating computed optical flow data based on the pair of non-adjacent frames in the stream of frames (calculating in claim 1);
generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data (generating in claim 3); and
realigning the at least one differing frame based on the estimated optical flow data (realigning in claim 4).
Regarding claim 3, 17975943 claim 4 teaches the computing system of claim 2, wherein the set of frames in the frame sequence are captured by the time-of-flight sensor system over a period of time between 1 millisecond and 100 milliseconds (claim 4).
Claim 2 is provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 4 (depending on claims 3 and 1) of copending Application No. 17975943 in view of Kholodenko US 20160232684 A1 and Pazhayampallil US 20230213635 A1.
Regarding claim 2, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach the acts further comprising: computing object depth data based on realigned frames in the frame sequence; and outputting a point cloud comprising the object depth data.
Kholodenko teaches computing object depth data based on realigned frames in the frame sequence ([0064-65, 76-78, 81, 83]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 to include computing object depth data based on realigned frames in the frame sequence similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of allowing the depth images to be used by other systems (such as collision avoidance in a vehicle).
Pazhayampallil teaches outputting a point cloud comprising the object depth data ([0021]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 to include outputting a point cloud comprising the object depth data similar to Pazhayampallil with a reasonable expectation of success. This would have the predictable result of allowing the depth images to be used by other systems (such as collision avoidance in a vehicle).
Claims 4-10 and 13 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 4 (depending on claims 3 and 1) of copending Application No. 17975943 in view of Kholodenko US 20160232684 A1.
Regarding claim 4, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein the sensor parameters of the time-of-flight sensor system when the frame is captured comprise at least one of:
an illumination state of the time-of-flight sensor system, such that the time-of-flight sensor system either emits or is inhibited from emitting light for the frame;
a relative phase delay between a transmitter system and a receiver system of the time-of-flight sensor system for the frame (different phases in Fig. 5, [0061]); or
an integration time of the sensor of the time-of-flight sensor system for the frame.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that the sensor parameters of the time-of-flight sensor system when the frame is captured comprise at least one of: an illumination state of the time-of-flight sensor system, such that the time-of-flight sensor system either emits or is inhibited from emitting light for the frame; a relative phase delay between a transmitter system and a receiver system of the time-of-flight sensor system for the frame; or an integration time of the sensor of the time-of-flight sensor system for the frame similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of helping determine motion of objects in the environment.
Regarding claim 5, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises interpolating the estimated optical flow data for at least one intermediate frame between the pair of non-adjacent frames based on the computed optical flow data ([0081, 83]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises interpolating the estimated optical flow data for at least one intermediate frame between the pair of non-adjacent frames based on the computed optical flow data similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of helping determine motion of objects in the environment and decreasing blur in images.
Regarding claim 6, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises interpolating the estimated optical flow data for intermediate frames between the pair of non-adjacent frames based on the computed optical flow data ([0039, 81, 83]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises interpolating the estimated optical flow data for intermediate frames between the pair of non-adjacent frames based on the computed optical flow data similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of helping determine motion of objects in the environment and decreasing blur in images.
Regarding claim 7, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises extrapolating the estimated optical flow data for at least one successive frame subsequent to the pair of non-adjacent frames based on the computed optical flow data (transform all of the phase images, [0038, 81, 83]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises extrapolating the estimated optical flow data for at least one successive frame subsequent to the pair of non-adjacent frames based on the computed optical flow data similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of helping determine motion of objects in the environment and decreasing blur in images.
Regarding claim 8, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein the estimated optical flow data for the at least one differing frame is further generated based on timestamp information for the at least one differing frame (phase image transformations based on time of images, [0081]).
Regarding claim 9, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein the pair of non-adjacent frames in the stream comprises successive frames of the same frame type (Figs. 2, 5, [0038-40, 60]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that the pair of non-adjacent frames in the stream comprises successive frames of the same frame type similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of helping determine motion of objects in the environment and decreasing blur in images.
Regarding claim 10, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein the computed optical flow data is calculated for each pair of non-adjacent frames of the same frame type in successive frame sequences in the stream of frames ([0060]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that the computed optical flow data is calculated for each pair of non-adjacent frames of the same frame type in successive frame sequences in the stream of frames similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of helping determine motion of objects in the environment and decreasing blur in images.
Regarding claim 13, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Kholodenko teaches wherein the time-of-flight sensor system comprises the computing system (Fig. 1).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that the time-of-flight sensor system comprises the computing system similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of not requiring additional equipment to determine motion of objects in the environment and to decrease blur in images.
Claim 11 is provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 4 (depending on claims 3 and 1) of copending Application No. 17975943 in view of Plank US 20200182984 A1.
Regarding claim 11, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Plank teaches wherein the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive passive frames for which the time-of-flight sensor system is inhibited from emitting light ([0005, 88, 97]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive passive frames for which the time-of-flight sensor system is inhibited from emitting light similar to Plank with a reasonable expectation of success. This would help identify phase offset values (Plank: [0088]).
Claim 12 is provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 4 (depending on claims 3 and 1) of copending Application No. 17975943 in view of Sekiguchi US 20240129630 A1.
Regarding claim 12, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Sekiguchi teaches wherein the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive frames having relative phase delays that are 180 degrees out of phase (Fig. 6, [0072-75]).
Additionally, Kholodenko does teach “multiple phase images, corresponding to respective predefined phase shifts T_n” ([0049]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive frames having relative phase delays that are 180 degrees out of phase similar to Sekiguchi with a reasonable expectation of success. This would have the predictable result of helping obtain coarse estimates of motion using fewer frames and requiring less processing due to fewer frames.
Claim 14 is provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 4 (depending on claims 3 and 1) of copending Application No. 17975943 in view of Heidrich US 20190154834 A1.
Regarding claim 14, 17975943 claim 4 teaches the computing system of claim 1,
17975943 claim 4 does not explicitly teach but Heidrich teaches wherein an autonomous vehicle comprises the time-of-flight sensor system and the computing system (use in autonomous vehicles, [0144]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 4 such that an autonomous vehicle comprises the time-of-flight sensor system and the computing system similar to Heidrich with a reasonable expectation of success. This would have the predictable result of helping provide a collision avoidance system for the autonomous vehicle.
Claim 15 is provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 14 (depending on claim 12) of copending Application No. 17975943 (reference application). Although the claims at issue are not identical, they are not patentably distinct from each other because:
Regarding claim 15, 17975943 claim 14 teaches a method of mitigating motion misalignment of a time-of-flight sensor system, comprising:
receiving a stream of frames outputted by a sensor of the time-of-flight sensor system, the stream of frames comprises a series of frame sequences, wherein a frame sequence comprises a set of frames where the frames in the set have different frame types (receiving in claim 12), and wherein a frame type of a frame signifies sensor parameters of the time-of-flight sensor system when the frame is captured by the time-of-flight sensor system such that the different frame types signify different sensor parameters (frame types in claim 12);
identifying a pair of non-adjacent frames in the stream of frames (identifying in claim 12);
calculating computed optical flow data based on the pair of non-adjacent frames in the stream of frames (calculating in claim 12);
generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data (generating in claim 14); and
realigning the at least one differing frame based on the estimated optical flow data (realigning in claim 14).
Claim 16 is provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 14 (depending on claim 12) of copending Application No. 17975943 in view of Kholodenko US 20160232684 A1 and Pazhayampallil US 20230213635 A1.
Regarding claim 16, see the rejection of claim 2 above.
Claims 17-19 are provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 14 (depending on claim 12) of copending Application No. 17975943 in view of Kholodenko US 20160232684 A1.
Regarding claim 17, see the rejection of claim 5 above.
Regarding claim 18, see the rejection of claim 6 above.
Regarding claim 19, see the rejection of claim 7 above.
Claim 20 is provisionally rejected on the ground of nonstatutory double patenting as being unpatentable over claim 20 of copending Application No. 17975943 in view of Kholodenko US 20160232684 A1 and Pazhayampallil US 20230213635 A1.
Regarding claim 20, 17975943 claim 20 teaches a time-of-flight sensor system, comprising:
a receiver system comprising a sensor (receiver of claim 20); and
a computing system in communication with the receiver system (computing system of claim 20), comprising:
a processor (processor of claim 20); and
memory (memory of claim 20) that stores computer-executable instructions that, when executed by the processor, cause the processor to perform acts comprising:
receiving a stream of frames outputted by the sensor of the receiver system of the time-of-flight sensor system, the stream of frames comprises a series of frame sequences, wherein a frame sequence comprises a set of frames where the frames in the set have different frame types (receiving of claim 20), and wherein a frame type of a frame signifies sensor parameters of the time-of-flight sensor system when the frame is captured by the time-of-flight sensor system such that the different frame types signify different sensor parameters (frame types of claim 20);
identifying a pair of non-adjacent frames in the stream of frames (identifying of claim 20);
calculating computed optical flow data based on the pair of non-adjacent frames in the stream of frames (calculating of claim 20);
generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data (generating of claim 20); and
realigning the at least one differing frame based on the estimated optical flow data (realigning of claim 20);
17975943 claim 20 does not explicitly teach the acts further comprising: computing object depth data based on realigned frames in the frame sequence; and outputting a point cloud comprising the object depth data.
Kholodenko teaches computing object depth data based on realigned frames in the frame sequence ([0064-65, 76-78, 81, 83]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 20 to include computing object depth data based on realigned frames in the frame sequence similar to Kholodenko with a reasonable expectation of success. This would have the predictable result of allowing the depth images to be used by other systems (such as collision avoidance in a vehicle).
Pazhayampallil teaches outputting a point cloud comprising the object depth data ([0021]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified 17975943 claim 20 to include outputting a point cloud comprising the object depth data similar to Pazhayampallil with a reasonable expectation of success. This would have the predictable result of allowing the depth images to be used by other systems (such as collision avoidance in a vehicle).
This is a provisional nonstatutory double patenting rejection because the patentably indistinct claims have not in fact been patented.
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 3-10, 13, 15, and 17-19 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Kholodenko US 20160232684 A1.
Regarding claim 1, Kholodenko teaches a computing system, comprising:
a processor (102 and 120, [0028]); and
memory (122, [0028]) that stores computer-executable instructions that, when executed by the processor, cause the processor to perform acts comprising:
receiving a stream of frames outputted by a sensor of a time-of-flight sensor system, the stream of frames comprises a series of frame sequences, wherein a frame sequence comprises a set of frames where the frames in the set have different frame types (Fig. 5, [0017]), and wherein a frame type of a frame signifies sensor parameters of the time-of-flight sensor system when the frame is captured by the time-of-flight sensor system such that the different frame types signify different sensor parameters (Fig. 5, different time window parameters and phases, [0036, 38, 49]);
identifying a pair of non-adjacent frames in the stream of frames ([0038, 73]);
calculating computed optical flow data based on the pair of non-adjacent frames in the stream of frames ([0038, 73]);
generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data ([0037, 73, 81]); and
realigning the at least one differing frame based on the estimated optical flow data ([0064-65]).
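For illustration only, the pipeline recited in claim 1 (and the timestamp-based interpolation and extrapolation of claims 5-8) can be sketched as follows. This is a hypothetical sketch assuming an OpenCV-style dense optical flow; it is not a reproduction of Kholodenko's implementation, and all function and variable names are illustrative.

    import cv2
    import numpy as np

    def compute_flow(frame_a, frame_b):
        # Dense optical flow between a pair of non-adjacent frames of the
        # same frame type (grayscale uint8 arrays of equal size).
        return cv2.calcOpticalFlowFarneback(
            frame_a, frame_b, None, pyr_scale=0.5, levels=3, winsize=15,
            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

    def realign_frame(frame, flow, t, t_a, t_b):
        # Estimate flow for a differing frame at timestamp t by linearly
        # scaling the computed flow (interpolation when t_a < t < t_b,
        # extrapolation when t > t_b), then warp the frame to realign it.
        alpha = (t - t_a) / (t_b - t_a)
        h, w = frame.shape[:2]
        grid_x, grid_y = np.meshgrid(np.arange(w, dtype=np.float32),
                                     np.arange(h, dtype=np.float32))
        map_x = grid_x + alpha * flow[..., 0]
        map_y = grid_y + alpha * flow[..., 1]
        return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR)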
Regarding claim 3, Kholodenko teaches the computing system of claim 2, wherein the set of frames in the frame sequence are captured by the time-of-flight sensor system over a period of time between 1 millisecond and 100 milliseconds (Fig. 5, [0071]).
Regarding claim 4, Kholodenko teaches the computing system of claim 1, wherein the sensor parameters of the time-of-flight sensor system when the frame is captured comprise at least one of:
an illumination state of the time-of-flight sensor system, such that the time-of-flight sensor system either emits or is inhibited from emitting light for the frame;
a relative phase delay between a transmitter system and a receiver system of the time-of-flight sensor system for the frame (different phases in Fig. 5, [0061]); or
an integration time of the sensor of the time-of-flight sensor system for the frame.
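As context for the parameter list above, the per-frame-type sensor parameters recited in claim 4 could be modeled as a simple record. The following sketch is hypothetical; the names and example values are illustrative and are not taken from Kholodenko or the copending claims.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class FrameType:
        illumination_on: bool        # active frame vs. passive frame (emission inhibited)
        phase_delay_deg: float       # relative transmitter/receiver phase delay
        integration_time_us: float   # sensor integration time for the frame

    # A hypothetical frame sequence: four active phase frames plus one passive frame.
    SEQUENCE = [FrameType(True, p, 100.0) for p in (0.0, 90.0, 180.0, 270.0)]
    SEQUENCE.append(FrameType(False, 0.0, 100.0))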
Regarding claim 5, Kholodenko teaches the computing system of claim 1, wherein generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises interpolating the estimated optical flow data for at least one intermediate frame between the pair of non-adjacent frames based on the computed optical flow data ([0081, 83]).
Regarding claim 6, Kholodenko teaches the computing system of claim 1, wherein generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises interpolating the estimated optical flow data for intermediate frames between the pair of non-adjacent frames based on the computed optical flow data ([0039, 81, 83]).
Regarding claim 7, Kholodenko teaches the computing system of claim 1, wherein generating the estimated optical flow data for the at least one differing frame other than the pair of non-adjacent frames in the stream comprises extrapolating the estimated optical flow data for at least one successive frame subsequent to the pair of non-adjacent frames based on the computed optical flow data (transform all of the phase images, [0038, 81, 83]).
Regarding claim 8, Kholodenko teaches the computing system of claim 1, wherein the estimated optical flow data for the at least one differing frame is further generated based on timestamp information for the at least one differing frame (phase image transformations based on time of images, [0081]).
Regarding claim 9, Kholodenko teaches the computing system of claim 1, wherein the pair of non-adjacent frames in the stream comprises successive frames of the same frame type (Figs. 2, 5, [0038-40, 60]).
Regarding claim 10, Kholodenko teaches the computing system of claim 1, wherein the computed optical flow data is calculated for each pair of non-adjacent frames of the same frame type in successive frame sequences in the stream of frames ([0060]).
Regarding claim 13, Kholodenko teaches the computing system of claim 1, wherein the time-of-flight sensor system comprises the computing system (Fig. 1).
Regarding claim 15, Kholodenko teaches a method of mitigating motion misalignment of a time-of-flight sensor system, comprising:
receiving a stream of frames outputted by a sensor of the time-of-flight sensor system, the stream of frames comprises a series of frame sequences, wherein a frame sequence comprises a set of frames where the frames in the set have different frame types (Fig. 5, [0017]), and wherein a frame type of a frame signifies sensor parameters of the time-of-flight sensor system when the frame is captured by the time-of-flight sensor system such that the different frame types signify different sensor parameters (Fig. 5, [0036, 38, 49]);
identifying a pair of non-adjacent frames in the stream of frames ([0038, 73]);
calculating computed optical flow data based on the pair of non-adjacent frames in the stream of frames ([0038, 73]);
generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data ([0037, 73, 81]); and
realigning the at least one differing frame based on the estimated optical flow data ([0064-65]).
Regarding claim 17, see the rejection of claim 5 above.
Regarding claim 18, see the rejection of claim 6 above.
Regarding claim 19, see the rejection of claim 7 above.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2, 16, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Kholodenko US 20160232684 A1 in view of Pazhayampallil US 20230213635 A1.
Regarding claim 2, Kholodenko teaches the computing system of claim 1, the acts further comprising: computing object depth data based on realigned frames in the frame sequence ([0064-65, 76-78, 81, 83]);
Kholodenko does not explicitly teach but Pazhayampallil teaches outputting a point cloud comprising the object depth data ([0021]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kholodenko to include outputting a point cloud comprising the object depth data similar to Pazhayampallil with a reasonable expectation of success. This would have the predictable result of allowing the depth images to be used by other systems (such as collision avoidance in a vehicle).
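For illustration, outputting a point cloud comprising object depth data generally amounts to back-projecting the depth image through the camera intrinsics. The sketch below is a generic pinhole-model formulation offered for context only; it is not Pazhayampallil's specific implementation.

    import numpy as np

    def depth_to_point_cloud(depth, fx, fy, cx, cy):
        # Back-project an H x W depth image (meters) into an N x 3 point
        # cloud using pinhole intrinsics (fx, fy, cx, cy in pixels).
        h, w = depth.shape
        u, v = np.meshgrid(np.arange(w), np.arange(h))
        x = (u - cx) * depth / fx
        y = (v - cy) * depth / fy
        points = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
        return points[points[:, 2] > 0]  # discard pixels with no valid depth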
Regarding claim 20, Kholodenko teaches a time-of-flight sensor system, comprising:
a receiver system comprising a sensor (104 in Fig. 1, [0014]); and
a computing system in communication with the receiver system (102 and 120, [0028]), comprising:
a processor (102 and 120, [0028]); and
memory (122, [0028]) that stores computer-executable instructions that, when executed by the processor, cause the processor to perform acts comprising:
receiving a stream of frames outputted by the sensor of the receiver system of the time-of-flight sensor system, the stream of frames comprises a series of frame sequences, wherein a frame sequence comprises a set of frames where the frames in the set have different frame types (Fig. 5, [0017]), and wherein a frame type of a frame signifies sensor parameters of the time-of-flight sensor system when the frame is captured by the time-of-flight sensor system such that the different frame types signify different sensor parameters (Fig. 5, different time window parameters and phases, [0036, 38, 49]);
identifying a pair of non-adjacent frames in the stream of frames ([0038, 73]);
calculating computed optical flow data based on the pair of non-adjacent frames in the stream of frames ([0038, 73]);
generating estimated optical flow data for at least one differing frame other than the pair of non-adjacent frames in the stream of frames based on the computed optical flow data ([0037, 73, 81]); and
realigning the at least one differing frame based on the estimated optical flow data ([0064-65]);
computing object depth data based on realigned frames in the frame sequence ([0064-65, 76-78, 81, 83]);
Kholodenko does not explicitly teach but Pazhayampallil teaches outputting a point cloud comprising the object depth data ([0021]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kholodenko to include outputting a point cloud comprising the object depth data similar to Pazhayampallil with a reasonable expectation of success. This would have the predictable result of allowing the depth images to be used by other systems (such as collision avoidance in a vehicle).
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over Kholodenko US 20160232684 A1 in view of Plank US 20200182984 A1.
Regarding claim 11, Kholodenko teaches the computing system of claim 1,
Kholodenko does not explicitly teach but Plank teaches wherein the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive passive frames for which the time-of-flight sensor system is inhibited from emitting light ([0005, 88, 97]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kholodenko such that the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive passive frames for which the time-of-flight sensor system is inhibited from emitting light similar to Plank with a reasonable expectation of success. This would help identify phase offset values (Plank: [0088]).
Regarding claim 16, see the rejection of claim 2 above.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Kholodenko US 20160232684 A1 in view of Sekiguchi US 20240129630 A1.
Regarding claim 12, Kholodenko teaches the computing system of claim 1,
Kholodenko does not explicitly teach but Sekiguchi teaches wherein the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive frames having relative phase delays that are 180 degrees out of phase (Fig. 6, [0072-75]).
Additionally, Kholodenko does teach “multiple phase images, corresponding to respective predefined phase shifts T_n” ([0049]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kholodenko such that the pair of non-adjacent frames in the stream for which the computed optical flow data is calculated comprises successive frames having relative phase delays that are 180 degrees out of phase similar to Sekiguchi with a reasonable expectation of success. This would have the predictable result of helping obtain coarse estimates of motion using fewer frames and requiring less processing due to fewer frames.
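For background on why frames 180 degrees out of phase pair naturally, the standard four-phase continuous-wave time-of-flight demodulation (a textbook formulation, not drawn from Sekiguchi, with signs depending on the correlation convention) differences opposed phase samples so that the ambient (DC) offset cancels:

    \phi = \operatorname{atan2}\left(A_{90} - A_{270},\; A_{0} - A_{180}\right), \qquad d = \frac{c\,\phi}{4\pi f_{\mathrm{mod}}}

where A_theta is the correlation sample captured at relative phase delay theta, f_mod is the modulation frequency, and c is the speed of light; each difference of 180-degree-opposed samples removes the common background term.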
Claim 14 is rejected under 35 U.S.C. 103 as being unpatentable over Kholodenko US 20160232684 A1 in view of Heidrich US 20190154834 A1.
Regarding claim 14, Kholodenko teaches the computing system of claim 1,
Kholodenko does not explicitly teach but Heidrich teaches wherein an autonomous vehicle comprises the time-of-flight sensor system and the computing system (use in autonomous vehicles, [0144]).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Kholodenko such that an autonomous vehicle comprises the time-of-flight sensor system and the computing system similar to Heidrich with a reasonable expectation of success. This would have the predictable result of helping provide a collision avoidance system for the autonomous vehicle.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Shimizu US 20240163549 A1 teaches calculating a shift of a pixel per phase using optical flow calculations ([0289]).
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JOSEPH C FRITCHMAN whose telephone number is (571)272-5533. The examiner can normally be reached M-F 8:00 am - 5:00 pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Isam Alsomiri can be reached on 571-272-6970. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/J.C.F./Examiner, Art Unit 3645
/ISAM A ALSOMIRI/ Supervisory Patent Examiner, Art Unit 3645