DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Information Disclosure Statement
The information disclosure statements (IDS) submitted on 1/20/2026 and 2/19/2026 are in compliance with the provisions of 37 CFR 1.97. Accordingly, the information disclosure statements have been considered by the examiner.
Response to Arguments
Applicant’s response to the last Office Action, filed 1/20/2026, has been entered and made of record.
Applicant has amended claims 1, 5, and 15. Claims 4 and 16 are cancelled. Claims 21-22 are new. Claims 1-22 are currently pending.
Applicant's arguments filed 1/20/2026, with respect to the rejection of claims 1 and 15 under 35 U.S.C. 103 have been fully considered but they are not persuasive.
Regarding claims 1 and 15, applicant argues that Douillard “fails to disclose encoding, via the control system, the data of the three-dimensional environment into a first/second data structure at a first/second time, t1/t2, wherein the first/second data structure includes each voxelized object.” Applicant also argues that neither Douillard nor Sugio teaches the limitation of “the signature identifier is a unique signature identifier assigned to each voxel of the plurality of voxels comprising each voxelized object in the voxelized map.”
Examiner respectfully disagrees. Douillard processes 3D data to determine objects at a first time and, optionally, a second time (¶91). Douillard does not explicitly disclose that this is done by encoding; however, Sugio teaches encoding 3D data. The combination of Douillard and Sugio was used to reject claim 1. See the previous rejection of claim 1.
Douillard also teaches in ¶85 that individual objects are associated with a unique identifier, which may then be associated with or assigned to the individual voxels associated with the object.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are:
“A system for monitoring” in claim 15, described in ¶22-23.
“A control system operable to” in claims 15 and 17-19, described in ¶22-23.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-3, 5, 7-11, 13-15, and 17-22 are rejected under 35 U.S.C. 103 as being unpatentable over Douillard (U.S. Patent Pub. No. 2018/0364717) in view of Sugio (U.S. Patent Pub. No. 2023/0100085).
Regarding Claim 1, Douillard teaches a method for monitoring a three-dimensional environment, the method comprising (Fig. 1; Abstract: Systems, methods, and apparatuses described herein are directed to performing segmentation on voxels representing three-dimensional data to identify static and dynamic objects:) Douillard 2018/0364717
receiving, via a control system, data including position information associated with one or more objects in a three-dimensional environment (Fig. 1, 104; ¶23 the operation 102 may include receiving RADAR data (or other sensor data) and associating the RADAR data with the LIDAR data to generate a more detailed representation of an environment. An example of a LIDAR dataset is illustrated in an example 104, which may include LIDAR data (e.g., a point cloud) associated with various objects in an urban environment, such as cars, trucks, roads, buildings, bikes, pedestrians, etc;)
generating, via the control system, a voxelized map based on the data, wherein the voxelized map represents the one or more objects in the three-dimensional environment as a voxelized object comprising a plurality of voxels (Fig. 1, 124; ¶32 At operation 122, the process can include determining voxels associated with objects … An example 124 illustrates a top view representation of the voxel space 108′, which may correspond to the voxel space illustrated in the example 108. In some instances, the voxel space 108′ includes LIDAR data 126 and 128 representing objects in an environment. In some instances, the operation 122 can include clustering to determine that the LIDAR data points 126 are associated with an object 130, and to determine that the LIDAR data points 128 are associated with an object 13;)
determining, via the control system, a signature identifier associated with each voxelized object in the voxelized map (¶50 The object determination module 220 may assign an object identifier to all voxels associated with a particular object, and in some instances, the object identifier assigned or determined by the object determination module 220 may be propagated to LIDAR data associated with voxels comprising the particular object;)
processing, via the control system, the data of the three-dimensional environment into a first data structure at a first time, t1, wherein the first data structure includes each voxelized object (Fig. 8, 802; ¶91 At operation 802, the process can include determining that an object is associated with a first subset of voxels at a first time. In some instances, the operation 802 can include identifying one or more objects, as discussed herein. In some instances, the operation 802 can include receiving an object identifier identifying an object and identifying the voxels that are associated with the object identifier;)
processing, via the control system, the data of the three-dimensional environment into a second data structure at a second time, t2, wherein the second data structure includes each voxelized object (Fig. 8, 804; ¶91 Similarly, at operation 804, the process can include determining that the object is associated with a second subset of voxels at a second time;)
determining, via the control system, a coordinate position of each voxelized object in the first data structure and the second data structure based on the signature identifier associated with each voxelized object (¶92 At operation 806, the process can include determining that a position of at least a portion of the first subset of voxels is different than a position of at least a portion of the second subset of voxels;)
determining, via the control system, a change in coordinate position associated with each voxelized object based on a comparison of the first data structure and the second data structure; and (¶92 voxels that are not occupied by an object at a first time and that are occupied by the object at a second time (or vice versa) may represent motion of the object. This operation 806 can include comparing the positions and/or occupancy of the first subset of voxels with the second subset of voxels to identify the voxels that are not common between the subsets of voxels.)
determining, via the control system, a motion measurement of each voxelized object based on the change in coordinate position (¶93 At operation 808, the process can include determining that the object is a dynamic object. For example, by determining the occupancy and/or positions of the voxels associated with the object over time, the operation 808 can determine that the change in voxel occupancy corresponds to motion, for example, and that the object is a dynamic object. That is, in some instances, the differences in the positions and/or locations of voxels in the voxel space between first subset of voxels and the second subset of voxels can correspond to movement of the object. In some instances, the operation 808 include determining that the difference is above a threshold amount, to prevent jitter, noise, or errors in data capture or processing from triggering an identification of the object as a dynamic object. In some instances, the object is determined as a dynamic object based at least in part on the difference determined in the operation 806.)
wherein the signature identifier is a unique signature identifier assigned to each voxel of the plurality of voxels comprising each voxelized object in the voxelized map (Douillard, ¶85 At operation 704, the process can include determining that a first subset of voxels associated with the first LIDAR data is associated with an object ... As discussed above, individual objects may be associated with a unique object identifier, which may be associated and/or assigned to individual voxels and/or LIDAR data associated with the individual object.)
Douillard does not explicitly disclose encoding, via the control system, the data of the three-dimensional environment.
Sugio is in the same field of art of image analysis. Further, Sugio teaches encoding, via the control system, the data of the three-dimensional environment (¶143 Three-dimensional data encoding system 4601 generates encoded data or multiplexed data by encoding point cloud data, which is three-dimensional data. Three-dimensional data encoding system 4601 may be a three-dimensional data encoding device implemented by a single device or a system implemented by a plurality of devices.)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Douillard to encode the 3D data, as taught by Sugio; one of ordinary skill in the art would have been motivated to combine the references to improve coding efficiency (Sugio ¶8).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
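For illustration only (not part of the cited references or the claims), the following minimal Python sketch shows the general mechanism cited from Douillard ¶¶91-93: the data are captured into per-object voxel sets at a first time t1 and a second time t2, and the two structures are compared to flag objects whose voxel occupancy has changed. The names, the voxel size, and the example data are assumptions.

# Hypothetical illustration only: the names, the voxel size, and the data are
# assumptions and are not taken from Douillard, Sugio, or the claims.
VOXEL_SIZE = 0.5  # assumed voxel edge length in metres

def voxelize(points):
    """Map 3-D point positions to integer voxel coordinates."""
    return {tuple(int(c // VOXEL_SIZE) for c in p) for p in points}

def snapshot(objects):
    """Build a {object_identifier: set_of_voxel_coordinates} structure for one time."""
    return {obj_id: voxelize(pts) for obj_id, pts in objects.items()}

def changed_objects(struct_t1, struct_t2):
    """Flag each object whose voxel occupancy differs between the two structures."""
    result = {}
    for obj_id, voxels_t1 in struct_t1.items():
        voxels_t2 = struct_t2.get(obj_id, set())
        uncommon = voxels_t1 ^ voxels_t2  # voxels not common to both times
        result[obj_id] = len(uncommon) > 0
    return result

# Example: one object shifts by one voxel along x between t1 and t2.
data_t1 = {"obj-130": [(0.1, 0.2, 0.0), (0.6, 0.2, 0.0)]}
data_t2 = {"obj-130": [(0.6, 0.2, 0.0), (1.1, 0.2, 0.0)]}
print(changed_objects(snapshot(data_t1), snapshot(data_t2)))  # {'obj-130': True}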
Regarding Claim 2, Douillard in view of Sugio discloses the method of claim 1, further comprising obtaining, via one or more sensors arranged to monitor the three-dimensional environment, the data associated with the one or more objects (Douillard, ¶23 At operation 102, the process can include receiving a LIDAR dataset. Though illustrated in FIG. 1 as a LIDAR dataset, such a dataset may comprise any form of depth data from any one or more sensors as described in detail above. In some instances, the operation 102 may include receiving a plurality of LIDAR datasets from a plurality of LIDAR sensors operating in connection with a perception system of an autonomous vehicle. In some instances, the operation 102 may include combining or fusing data from two or more LIDAR sensors into a single LIDAR dataset (also referred to as a “meta spin”). In some instances, the operation 102 may include extracting a portion of the LIDAR data for processing, such as over a period of time. In some instances, the operation 102 may include receiving RADAR data (or other sensor data) and associating the RADAR data with the LIDAR data to generate a more detailed representation of an environment.)
Regarding Claim 3, Douillard in view of Sugio discloses the method of claim 2, wherein the one or more sensors are three-dimensional sensors (Douillard, ¶14 a three-dimensional dataset may include data captured by a LIDAR system,) and wherein the voxelized map is generated via three-dimensional point cloud data obtained by the one or more sensors (Douillard, ¶23 An example of a LIDAR dataset is illustrated in an example 104, which may include LIDAR data (e.g., a point cloud) associated with various objects in an urban environment, such as cars, trucks, roads, buildings, bikes, pedestrians, etc.)
Regarding Claim 5, Douillard in view of Sugio discloses the method of claim 1, wherein determining the unique signature identifier for each voxel of the plurality of voxels further comprises determining, via the control system, a number of adjacent voxels along a direction normal to each face of each voxel (Douillard, ¶162 cluster a group of voxels to determine a cluster of voxels, wherein individual voxels of the group of voxels are adjacent to at least one other voxel of the group of voxels; determine that a number of voxels associated with the cluster of voxels is below a threshold number of voxels; and determine that the cluster of voxels is not an object (identifier).)
Regarding Claim 7, Douillard in view of Sugio discloses the method of claim 1, further comprising determining, via the control system, a target portion for each voxelized object, and wherein the signature identifier is based on the target portion for each voxelized object (Douillard, ¶85 At operation 704, the process can include determining that a first subset of voxels associated with the first LIDAR data is associated with an object ... As discussed above, individual objects may be associated with a unique object identifier, which may be associated and/or assigned to individual voxels and/or LIDAR data associated with the individual object.)
Regarding Claim 8, Douillard in view of Sugio discloses the method of claim 7, wherein the target portion includes a center of gravity or a segment of the voxelized object (Douillard, ¶85 At operation 704, the process can include determining that a first subset of voxels associated with the first LIDAR data is associated with an object ... As discussed above, individual objects may be associated with a unique object identifier, which may be associated and/or assigned to individual voxels (target segment of object) and/or LIDAR data associated with the individual object.)
Regarding Claim 9, Douillard in view of Sugio discloses the method of claim 1, further comprising: analyzing, via the control system, the motion measurement associated with each voxelized object; and transmitting, via the control system, instructions to at least one of the one or more objects in the three-dimensional environment, wherein the instructions alter an action associated with the at least one of the one or more objects (Douillard, ¶78 FIG. 6 depicts an example process 600 for generating a trajectory for an autonomous vehicle based on object identification and segmentation, as described herein; ¶82 At operation 608, the process can include generating a sequence of commands to command the autonomous vehicle to drive along the trajectory generated in operation 606. In some instances, the commands generated in the operation 608 can be relayed to a controller onboard an autonomous vehicle to control the autonomous vehicle to drive the trajectory.)
Regarding Claim 10, Douillard in view of Sugio discloses the method of claim 1, further comprising: analyzing, via the control system, the motion measurement associated with each voxelized object (Douillard, ¶93 the differences in the positions and/or locations of voxels in the voxel space between first subset of voxels and the second subset of voxels can correspond to movement of the object. In some instances, the operation 808 include determining that the difference is above a threshold amount, to prevent jitter, noise, or errors in data capture or processing from triggering an identification of the object as a dynamic object;) and generating, via the control system, an alarm signal in response to the motion measurement for any of the voxelized objects exceeding a threshold value (Douillard, ¶ At operation 810, the process can include providing an indication of the dynamic object to a tracker and/or planner. For example, the object identifier associated with the dynamic object can be provided to a tracker and/or planner for subsequent operations, as discussed herein. For example, the operations can track the occupancy of voxels over time associated with the various objects to determine speed, positions, velocities, etc. of the tracked objects.); (Sugio, ¶654 Three-dimensional data creation device 810 may also transmit, to the following vehicle, meta-data on a risk avoidance behavior of the own vehicle such as hard breaking warning, before transmitting three-dimensional data of the space in which a change has occurred.)
Regarding Claim 11, Douillard in view of Sugio discloses the method of claim 1, wherein the step of determining a change in coordinate position associated with each voxelized object, further comprises comparing a coordinate position at time t1 of each voxel of the voxelized object in the first data structure with a coordinate position at time t2 of the corresponding voxel of the voxelized object in the second data structure (Douillard, ¶92 At operation 806, the process can include determining that a position of at least a portion of the first subset of voxels is different than a position of at least a portion of the second subset of voxels. That is, as an object moves through space, and accordingly, as the LIDAR data (for example) representing the object is updated throughout the voxel space over time, an occupancy of voxels associated with the object will change over time. For example, voxels that are not occupied by an object at a first time and that are occupied by the object at a second time (or vice versa) may represent motion of the object. This operation 806 can include comparing the positions and/or occupancy of the first subset of voxels with the second subset of voxels to identify the voxels that are not common between the subsets of voxels.)
Regarding Claim 13, Douillard in view of Sugio discloses the method of claim 1, wherein the motion measurement includes one or both of a speed and a velocity associated with the voxelized object (Douillard, ¶94 For example, the operations can track the occupancy of voxels over time associated with the various objects to determine speed, positions, velocities, etc. of the tracked objects.)
Regarding Claim 14, Douillard in view of Sugio discloses the method of claim 1, wherein the first data structure and the second data structure are each octrees (Sugio, ¶400 First, the three-dimensional data encoding device encodes geometry information (geometry) (S3001). For example, the three-dimensional data encoding is performed using octree representation.)
The reasons for combining Douillard and Sugio are similar to those stated in the rejection of claim 1.
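For illustration only (not Sugio's encoder), the following is a minimal sketch of the octree data structure referenced for claim 14: occupied voxel coordinates are stored by recursively subdividing a cubic volume into eight octants. The class and parameter names are assumptions.

# Hypothetical sketch only (not Sugio's encoder): a minimal octree that records
# occupied voxel coordinates by recursively splitting a cube into eight octants.
class OctreeNode:
    def __init__(self, origin, size):
        self.origin = origin      # (x, y, z) corner of the cube
        self.size = size          # edge length in voxels (power of two)
        self.children = None      # eight children once subdivided
        self.occupied = False     # leaf occupancy flag

    def insert(self, voxel):
        if self.size == 1:
            self.occupied = True
            return
        half = self.size // 2
        # Build the octant index from the three per-axis comparisons.
        idx = ((voxel[0] >= self.origin[0] + half) << 2
               | (voxel[1] >= self.origin[1] + half) << 1
               | (voxel[2] >= self.origin[2] + half))
        if self.children is None:
            self.children = [None] * 8
        if self.children[idx] is None:
            child_origin = tuple(o + half * ((idx >> (2 - d)) & 1)
                                 for d, o in enumerate(self.origin))
            self.children[idx] = OctreeNode(child_origin, half)
        self.children[idx].insert(voxel)

root = OctreeNode((0, 0, 0), 8)   # 8 x 8 x 8 voxel volume
root.insert((5, 1, 7))            # mark one voxel as occupied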
Regarding claim 15, claim 15 has been analyzed with regard to claim 1 and is rejected for the same reasons of obviousness set forth above. Douillard further teaches: A system for monitoring a three-dimensional environment, the system comprising: one or more sensors arranged to monitor the three-dimensional environment (Fig. 2, 200).
Claim 17 recites limitations similar to claim 11 and is rejected for the same rationale and reasoning.
Claim 18 recites limitations similar to claim 9 and is rejected for the same rationale and reasoning.
Claim 19 recites limitations similar to claim 10 and is rejected for the same rationale and reasoning.
Regarding Claim 20, Douillard in view of Sugio discloses the system of claim 15, wherein the one or more sensors include laser scanners, three-dimensional time-of-flight cameras, stereo vision cameras, three-dimensional LIDAR sensor or other radar-based sensors (Douillard, ¶14 a three-dimensional dataset may include data captured by a LIDAR system.)
Regarding Claim 21, Douillard in view of Sugio discloses wherein the control system includes a processing unit, a memory, and a network interface interconnected via a bus (Douillard, ¶106 FIG. 11 illustrates an environment 1100 in which the disclosures may be implemented in whole or in part. The environment 1100 depicts one or more computer systems 1102 that comprise a storage 1104, one or more processor(s) 1106, a memory 1108, and an operating system 1110. The storage 1104, the processor(s) 1106, the memory 1108, and the operating system 1110 may be communicatively coupled over a communication infrastructure 1112. Optionally, the computer system 1102 may interact with a user, or environment, via input/output (I/O) device(s) 1114.)
Regarding Claim 22, Douillard in view of Sugio discloses wherein the network interface is configured to communicate with the one or more sensors via one of a wired or wireless connection (Douillard, ¶124 Communications between the nodes may be made possible by a communications network; ¶125 The communications network can include wireline communications capability, wireless communications capability, or a combination of both, at any frequencies, using any type of standard, protocol or technology.)
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Douillard (U.S. Patent Pub. No. 2018/0364717) in view of Sugio (U.S. Patent Pub. No. 2023/0100085) in view of Teranishi (U.S. Patent Pub. No. 2024/0362860).
Regarding Claim 6, Douillard in view of Sugio teaches the method of claim 5.
Douillard in view of Sugio does not explicitly disclose wherein the unique signature identifier comprises a six-dimensional configuration.
Teranishi is in the same field of art of image analysis. Further, Teranishi teaches wherein the unique signature identifier comprises a six-dimensional configuration (¶157 region detection unit 114 classifies, on a per-voxel basis, the three-dimensional points included in the voxel based on positions of faces defining the voxel.)
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Douillard in view of Sugio to identify the voxel based on the faces of the voxel, as taught by Teranishi; one of ordinary skill in the art would have been motivated to combine the references to accurately calculate which group the 3D points belong to (Teranishi ¶7).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
Claim 12 is rejected under 35 U.S.C. 103 as being unpatentable over Douillard (U.S. Patent Pub. No. 2018/0364717) in view of Sugio (U.S. Patent Pub. No. 2023/0100085) in view of Tang (U.S. Patent Pub. No. 2024/0193788).
Regarding Claim 12, Douillard in view of Sugio teaches the method of claim 11.
Douillard in view of Sugio does not explicitly disclose averaging the change in coordinate position for each voxel along an x-axis, y-axis, and z-axis; and calculating the motion measurement along the x-axis, y-axis, and z-axis based on the average change in coordinate position.
Tang is in the same field of art of image analysis. Further, Tang teaches averaging the change in coordinate position for each voxel along an x-axis, y-axis, and z-axis; and calculating the motion measurement along the x-axis, y-axis, and z-axis based on the average change in coordinate position (See screenshot below.)
[Screenshot of Tang reproduced in the original Office action as media_image1.png (greyscale PNG, 762 x 585).]
Therefore, it would have been obvious to one having ordinary skill in the art before the effective filing date of the claimed invention to modify the invention of Douillard in view of Sugio to use the average coordinate values for the voxels, as taught by Tang; one of ordinary skill in the art would have been motivated to combine the references in order to improve pedestrian detection (Tang ¶3).
Thus, the claimed subject matter would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention.
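For illustration only (not Tang's method), the following minimal sketch shows the limitation of claim 12 in general terms: the per-voxel change in coordinate position is averaged along the x-, y-, and z-axes, and a motion measurement is computed from that average over the elapsed time. All names and values are assumptions.

# Hypothetical illustration only (not Tang's method): average the per-voxel
# displacement along each axis between two times, then divide by the elapsed
# time to obtain a per-axis velocity estimate.
def average_displacement(voxels_t1, voxels_t2):
    """voxels_t1 and voxels_t2 are lists of (x, y, z) positions of corresponding voxels."""
    n = len(voxels_t1)
    return tuple(sum(b[axis] - a[axis] for a, b in zip(voxels_t1, voxels_t2)) / n
                 for axis in range(3))

def motion_measurement(voxels_t1, voxels_t2, dt):
    dx, dy, dz = average_displacement(voxels_t1, voxels_t2)
    return (dx / dt, dy / dt, dz / dt)  # velocity estimate along x, y, and z

# Example: an object that moved roughly 1.0 m along x over 0.5 s.
v = motion_measurement([(0, 0, 0), (1, 0, 0)], [(1.0, 0.1, 0), (2.0, -0.1, 0)], dt=0.5)
print(v)  # approximately (2.0, 0.0, 0.0)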
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DUSTIN BILODEAU whose telephone number is (571)272-1032. The examiner can normally be reached 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Jennifer Mehmood can be reached at (571) 272-2976. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DUSTIN BILODEAU/Examiner, Art Unit 2664
/JENNIFER MEHMOOD/Supervisory Patent Examiner, Art Unit 2664