Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claim(s) 1-3, 12-20 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by US20250295480A1, hereinafter, “Jensen”.
Regarding claim 1, Jensen teaches the following as claimed: A data-generating apparatus (Figs. 1, 7; pars. 0028, 0073, 0151, 0177) configured to generate video data, the data-generating apparatus comprising: input processing circuitry configured to receive upper-jaw tooth row data showing a three-dimensional shape of a row of teeth in an upper-jaw, lower-jaw tooth row data showing a three-dimensional shape of a row of teeth in a lower jaw, and jaw motion data showing a jaw motion;
and generation processing circuitry configured to generate, based on the upper-jaw tooth row data, the lower-jaw tooth row data, and the jaw motion data, video data of at least one of the upper-jaw tooth row data and the lower-jaw tooth row data
“[0028] In one embodiment the relative jaw motion data set may be obtained by obtaining bite scans. A bite scan is a single discrete scan of the static occlusion of the upper and lower jaw of a patient. Accordingly, multiple (e.g. first, second, third, fourth, etc.) bite scans may represent frames, which in a sequence may be provided as a video stream, e.g. if a dynamic occlusion is obtained by recording it using an intraoral scanner.”
“[0073] In one embodiment, method may further comprise obtaining at least a first digital 3D representation of at least a part of the upper and a part of the lower jaw of the patient.”
“[0151] Alternatively or additionally, the monitoring information may be visualised by showing a color/heat map indicating where critical changes occur or some discrepancies in the jaw motion, while playing a 3D digital motion video of the upper and lower jaw.”
“[0177] Alternatively or additionally, the monitoring information may be visualised by showing a color/heat map indicating where critical changes occur or some discrepancies are present in the jaw motion while playing a 3D digital motion video of the upper and lower jaw.”
to which an indicator is added, the indicator indicating a positional relation between the row of teeth in the upper jaw and the row of teeth in the lower jaw, the positional relation changing in accordance with the jaw motion (par. 0083).
“[0083] Accordingly, by determining framework correspondences which are both in the primary and the secondary reference framework, a common coordinate system can be established, which subsequently allows for the primary model parameters and the secondary model parameters to be compared in order to determine the monitoring information. The correspondences may for example be landmarks, data points or other features.”
Regarding claim 2, Jensen also teaches: The data-generating apparatus according to claim 1, wherein the generation processing circuitry is configured to add the indicator to each frame of a video of the jaw motion (see par. 0083 above; e.g., the landmarks, data points, or other features are applied to each frame of video).
Regarding claim 3, Jensen also teaches: The data-generating apparatus according to claim 1, wherein the generation processing circuitry is configured to add the indicator that is different according to a distance between each point in an upper jaw point cloud constituting the row of teeth in the upper jaw and each point in a lower jaw point cloud constituting the row of teeth in the lower jaw (par. 0189).
“[0189] The scan data may comprise any of: 2D images, 3D point clouds, depth data, texture data, intensity data, color data, and/or combinations thereof. As an example, the scan data may comprise one or more point clouds, wherein each point cloud comprises a set of 3D points describing the three-dimensional dental object. … The depth information may be used to generate 3D point clouds comprising a set of 3D points in space, e.g., described by cartesian coordinates (x, y, z). The 3D point clouds may be generated by the processor or by another processing unit. Each 2D/3D point may furthermore comprise a timestamp that indicates when the 2D/3D point was recorded, i.e., from which image in the stack of 2D images the point originates. The timestamp is correlated with the z-coordinate of the 3D points, i.e., the z-coordinate may be inferred from the timestamp.”
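For illustration only (not part of the record), the per-point distance recited in claim 3 — a distance between each point in the upper jaw point cloud and each point in the lower jaw point cloud — can be sketched as a nearest-neighbor computation over two 3D point clouds of the kind described in Jensen par. 0189. The point data below are hypothetical; this is a minimal sketch, not an assertion about either party's implementation.

```python
# Illustrative sketch: per-point distances between an upper-jaw point cloud
# and a lower-jaw point cloud, as recited in claim 3. Each cloud is a list of
# (x, y, z) points; for each upper-jaw point we take the Euclidean distance
# to its nearest lower-jaw point. Sample coordinates are hypothetical.
import math

def nearest_distances(upper, lower):
    """For each point in `upper`, return its distance to the closest point in `lower`."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return [min(dist(p, q) for q in lower) for p in upper]

upper_cloud = [(0.0, 0.0, 1.0), (1.0, 0.0, 2.0)]
lower_cloud = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.5)]
print(nearest_distances(upper_cloud, lower_cloud))  # [1.0, 1.5]
```

Such per-point distances could then drive a distance-dependent indicator (e.g., a color map) of the kind the references describe.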
Regarding claim 12, Jensen further teaches (pars. 0013, 0028): The data-generating apparatus according to claim 1, wherein a video of the jaw motion includes at least one of a video showing the row of teeth in the upper jaw and the row of teeth in the lower jaw, when viewed from a side-surface side, and a video showing the row of teeth in the upper jaw and the row of teeth in the lower jaw, when viewed from an occlusal-surface side of the upper jaw or from an occlusal-surface side of the lower jaw.
“[0013] Accordingly, there exists a need for assessing occlusal changes faster and more accurate.”
“[0028] In one embodiment the relative jaw motion data set may be obtained by obtaining bite scans. A bite scan [i.e., surface side] is a single discrete scan of the static occlusion of the upper and lower jaw of a patient. Accordingly, multiple (e.g. first, second, third, fourth, etc.) bite scans may represent frames, which in a sequence may be provided as a video stream, e.g. if a dynamic occlusion is obtained by recording it using an intraoral scanner.”
Regarding claim 13, Jensen further teaches (pars. 0024, 0102, 0151): The data-generating apparatus according to claim 1, wherein the upper-jaw tooth row data includes at least one of three-dimensional data including position information about each point in an upper jaw surface point cloud constituting a surface of the row of teeth in the upper jaw, the three-dimensional data being acquired by a three-dimensional scanner, and three-dimensional data obtained by computed tomography of the row of teeth in the upper jaw, and the lower-jaw tooth row data includes at least one of three-dimensional data including position information about each point in a lower jaw surface point cloud constituting a surface of the row of teeth in the lower jaw, the three-dimensional data being acquired by the three-dimensional scanner, and three-dimensional data obtained by computed tomography of the row of teeth in the lower jaw.
“[0024] Thus, as will be disclosed and discussed further, the method disclosed enables early identification of occlusal disturbances by quantifying and monitoring mobility of the masticatory system specific jaw movement of a patient over time. By obtaining 3D information of a patient's dentition such as a 3D digital model of the upper and lower jaw it is possible to quantify mandibular movements.”
“[0102] The acquisition of multiple bite configurations may comprise a consecutive sequence of at least two 3D representations of bite configurations of the patient's jaws in respective occlusions.”
“[0151] Alternatively or additionally, the monitoring information may be visualised by showing a color/heat map indicating where critical changes occur or some discrepancies in the jaw motion, while playing a 3D digital motion video of the upper and lower jaw.”
Regarding claim 14, Jensen further teaches (Fig. 1; pars. 0075, 0077): The data-generating apparatus according to claim 1, wherein the jaw motion data includes at least one of data showing the jaw motion measured by a jaw motion measuring device, and data obtained by simulating the jaw motion based on the upper-jaw tooth row data and the lower-jaw tooth row data.
“[0075] In an embodiment, the CBCT scan of the upper and/or lower jaw may be used as the reference framework during the fitting procedure of the jaw motion data. The CBCT scan may improve the mapping of the jaw motion data to the model class as correspondences between jaw motion data and the CBCT scan may easily be established by an alignment process.
[0077] The digital 3D representation may additionally be obtained by importing the digital 3D representation of at least a part of the patient dentition into the software generated a priori and/or by other means such as the intraoral scanner used to obtain the relative jaw motion data sets.”
[Image: media_image1.png (greyscale, 708 × 610)]
Regarding claim 15, Jensen further teaches (par. 0151, e.g., color/heat map indication): The data-generating apparatus according to claim 1, wherein the generation processing circuitry is configured to add the indicator in a video of the jaw motion, the indicator indicating a positional relation between jaw joints, the positional relation changing in accordance with the jaw motion.
“[0151] Alternatively or additionally, the monitoring information may be visualised by showing a color/heat map indicating where critical changes occur or some discrepancies in the jaw motion, while playing a 3D digital motion video of the upper and lower jaw.”
Regarding claim 16, Jensen further teaches (par. 0165): The data-generating apparatus according to claim 1, wherein the generation processing circuitry is configured to add the indicator to a video of the jaw motion, the video showing a cross section of at least one of the row of teeth in the upper jaw and the row of teeth in the lower jaw.
“[0165] Also in some embodiments, respective parts of the primary and secondary border envelope are compared, this can for example be cross sections thereof, border movement during occlusion or border movement while opening and closing the jaws.”
Regarding claim 17: claim 17 is a method claim corresponding to the apparatus of claim 1 and is rejected for the same reasons set forth for claim 1 (see also Abstract).
Regarding claim 18: claim 18 is a non-transitory computer readable medium claim corresponding to the apparatus of claim 1 and is rejected for the same reasons set forth for claim 1 (see also par. 0058).
“[0058] In an embodiment of the disclosure, a computer program product embodied in the non-transitory computer readable medium is disclosed, the computer program product comprising instructions which, when executed by a computer, may cause the computer to carry out the method according to any of the presented embodiments.”
Claim 19 is a method claim corresponding to apparatus claim 3 and is thus rejected for the same reasons.
Claim 20 is a non-transitory computer readable medium claim corresponding to apparatus claim 3 and is thus rejected for the same reasons.
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
Claim(s) 4-11 are rejected under 35 U.S.C. 103 as being unpatentable over US20250295480A1 (“Jensen”), as applied to claims 1 and 3 above, and further in view of US20020150859A1, hereinafter, “Imgrund”.
Regarding claim 4, the rejections of claims 1 and 3 are incorporated herein. Jensen fails to teach the following limitations as further recited. However, Imgrund teaches (Figs. 4-6; par. 0052): The data-generating apparatus according to claim 3, wherein the distance is a first distance between a prescribed point constituting the row of teeth in the upper jaw and a point constituting the row of teeth in the lower jaw existing in a direction corresponding to a direction of the jaw motion starting from the prescribed point (e.g., depicted by rays 18 in Fig. 5), a second distance between the prescribed point constituting the row of teeth in the upper jaw and a point closest to the prescribed point in the lower jaw point cloud (pars. 0040-0041 disclose known 3D models from the cited prior art, e.g., a 3D mesh or point cloud, with points that define contiguous three-dimensional triangular surface segments) constituting the row of teeth in the lower jaw (e.g., depicted by closest points 24, 26 in Fig. 5), or a third distance between the prescribed point constituting the row of teeth in the upper jaw and a point constituting the row of teeth in the lower jaw existing in a direction substantially perpendicular to a plane of a planar model selected by a user from among a plurality of planar models (e.g., depicted by the perpendicular direction of rays 18 in Fig. 4).
“[0052] With reference to FIG. 4, from each vertex point 20 (corner point of a triangle making up one element of the surface), a ray or vector 18 is constructed running along a pre-defined direction. In a preferred implementation, this direction is parallel to the tooth axis, which is defined for each tooth as a central axis extending the length of the tooth through the center of the cusp. Other options are discussed further below. Each ray 18 is sent both to the outside and the inside of the tooth, as shown in FIG. 4. Then, we determine where the vector intersects the surface of opposing teeth 10'. These points of intersection (actually, small triangle surfaces) are shown as points 22 in FIG. 4. The distance from the vertex point 20 to the intersection 22 is calculated. According to the color scheme, the triangles associated to the vertex point 20 are assigned a color depending on the distance and the mathematical sign (positive/negative). For example, the surfaces associated with vertex point 20 with a negative value are colored red. All surfaces 20 with a positive value less than 0.45 mm are colored green. The surface where the distance is greater than or equal to 0.45 mm are not colored. The virtual models of the upper and lower arches are displayed (typically separately) with the color information on a user interface of the workstation implementing the program. See, for example, the views shown in FIGS. 1-3.”
[Image: media_image2.png (greyscale, 680 × 860)]
Jensen and Imgrund are analogous prior arts. Taking the combined teachings as a whole, one of ordinary skill in the art at the time of effective filing of the present invention would have found it obvious to incorporate the teaching of Imgrund into Jensen to arrive at the invention of claim 4. Doing so would facilitate evaluation of occlusal contacts using a computer and 3D virtual models of teeth (Imgrund, par. 0003).
Regarding claim 5, Imgrund in the combination further teaches: The data-generating apparatus according to claim 4, wherein the generation processing circuitry is configured to add the indicator that is different according to a distance selected by the user from among the first distance and the second distance (see par. 0052 above; e.g., the color scheme serves as an indicator that varies with distance).
Regarding claim 6, Imgrund in the combination further teaches: The data-generating apparatus according to claim 1, wherein the generation processing circuitry is configured to add the indicator that is different according to a distance between an upper jaw-side mesh generated based on an upper jaw point cloud constituting the row of teeth in the upper jaw and a lower jaw-side mesh generated based on a lower jaw point cloud constituting the row of teeth in the lower jaw (The rejections of claims 4-5 above cover this claim; e.g., a different color scheme (indicator) for different distances among point pairs along the upper and lower jaws, see pars. 0040-0041, 0052; Figs. 4-5 above).
Regarding claim 7, Imgrund in the combination further teaches: The data-generating apparatus according to claim 6, wherein the upper jaw-side mesh has an upper jaw-side plane having a triangular shape having vertices each represented by a corresponding point in the upper jaw point cloud constituting the row of teeth in the upper jaw, the lower jaw-side mesh has a lower jaw-side plane having a triangular shape having vertices each represented by a corresponding point in the lower jaw point cloud constituting the row of teeth in the lower jaw, and the distance is a distance between a first vertex of the upper jaw-side mesh and a second vertex of the lower jaw-side mesh, a distance between a first arbitrary point (Figs. 4-5 above depict point pairs as “arbitrary” points) on the upper jaw-side plane of the upper jaw-side mesh and a second arbitrary point on the lower jaw-side plane of the lower jaw-side mesh, a distance between the first vertex of the upper jaw-side mesh and the second arbitrary point on the lower jaw-side plane of the lower jaw-side mesh, or a distance between the first arbitrary point on the upper jaw-side plane of the upper jaw-side mesh and the second vertex of the lower jaw-side mesh (Rejections of claims 4-6 above cover this claim. E.g., different color scheme (indicator) for different distance among point pairs along the upper and lower jaws, see pars. 0040-0041, 0052; Figs. 4-5 above).
Regarding claim 8, Imgrund in the combination further teaches: The data-generating apparatus according to claim 7, wherein the first arbitrary point on the upper jaw-side plane of the upper jaw-side mesh includes a center of gravity (Par. 0052 cited above discusses “each tooth as a central axis extending the length of the tooth through the center of the cusp”), an incenter, or a circumcenter of an upper jaw-side triangle having vertices each represented by a corresponding point in the point cloud constituting the row of teeth in the upper jaw, and the second arbitrary point on the lower jaw-side plane of the lower jaw-side mesh includes a center of gravity, an incenter, or a circumcenter of a lower jaw-side triangle having vertices each represented by a corresponding point in the point cloud constituting the row of teeth in the lower jaw (Rejections of claims 4-7 above cover this claim. E.g., different color scheme (indicator) for different distance among point pairs along the upper and lower jaws, see pars. 0040-0041, 0052; Figs. 4-5 above).
Regarding claim 9, Imgrund in the combination further teaches: The data-generating apparatus according to claim 3, wherein the generation processing circuitry is configured to add a different color according to the distance as the indicator to each point in the upper jaw point cloud constituting the row of teeth in the upper jaw and each point in the lower jaw point cloud constituting the row of teeth in the lower jaw, or an upper jaw-side mesh generated based on the upper jaw point cloud constituting the row of teeth in the upper jaw and a lower jaw-side mesh generated based on the lower jaw point cloud constituting the row of teeth in the lower jaw (Rejections of claims 4-6 above cover this claim. E.g., different color scheme (indicator) for different distance among point pairs along the upper and lower jaws, see pars. 0040-0041, 0052; Figs. 4-5 above).
Regarding claim 10, Imgrund in the combination further teaches: The data-generating apparatus according to claim 3, wherein the generation processing circuitry is configured to add the distance as the indicator to a point designated by a user from among the upper jaw point cloud constituting the row of teeth in the upper jaw and the lower jaw point cloud constituting the row of teeth in the lower jaw, or a mesh designated by the user from among an upper jaw-side mesh generated based on the upper jaw point cloud constituting the row of teeth in the upper jaw and a lower jaw-side mesh generated based on the lower jaw point cloud constituting the row of teeth in the lower jaw (The rejections of claims 4-5 above cover this claim; e.g., the distances depicted by rays 18 (Figs. 4-5) serve as indicators for point pairs, see also par. 0052).
Regarding claim 11, Imgrund in the combination further teaches: The data-generating apparatus according to claim 3, wherein the generation processing circuitry is configured to add, as the indicator, information specifying a portion corresponding to a point at which the distance exceeds a threshold value from among the upper jaw point cloud constituting the row of teeth in the upper jaw and the lower jaw point cloud constituting the row of teeth in the lower jaw.
“[0092] The term "contrasting color or shading" in the claims, in reference to the display of contact information, is intended to encompass the situation in which a transition of color occurs between portions of the tooth which are not contacts and portions which are (i.e. distance to the opposing or adjacent tooth is less than a threshold), as well as the situation in which no transition occurs and the portions below the threshold are illustrated in a contrasting color or shading and the portions above the threshold are illustrated in the usual manner (e.g., as white objects or in natural color).”
Also, Imgrund (claim 12) teaches “wherein said portions of said teeth are displayed in a contrasting color relative to other portions of said teeth in which said calculations of distances results in distances greater than said threshold”.
Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to VU LE whose telephone number is (571)272-7332. The examiner can normally be reached M-F 8:00 - 17:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
Supervisor Vu Le can be reached at 571-272-7332. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/VU LE/ Supervisory Patent Examiner, Art Unit 2668