DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/30/25 has been entered.
Double Patenting
The nonstatutory double patenting rejection is based on a judicially created doctrine grounded in public policy (a policy reflected in the statute) so as to prevent the unjustified or improper timewise extension of the “right to exclude” granted by a patent and to prevent possible harassment by multiple assignees. A nonstatutory double patenting rejection is appropriate where the conflicting claims are not identical, but at least one examined application claim is not patentably distinct from the reference claim(s) because the examined application claim is either anticipated by, or would have been obvious over, the reference claim(s). See, e.g., In re Berg, 140 F.3d 1428, 46 USPQ2d 1226 (Fed. Cir. 1998); In re Goodman, 11 F.3d 1046, 29 USPQ2d 2010 (Fed. Cir. 1993); In re Longi, 759 F.2d 887, 225 USPQ 645 (Fed. Cir. 1985); In re Van Ornum, 686 F.2d 937, 214 USPQ 761 (CCPA 1982); In re Vogel, 422 F.2d 438, 164 USPQ 619 (CCPA 1970); In re Thorington, 418 F.2d 528, 163 USPQ 644 (CCPA 1969).
A timely filed terminal disclaimer in compliance with 37 CFR 1.321(c) or 1.321(d) may be used to overcome an actual or provisional rejection based on nonstatutory double patenting provided the reference application or patent either is shown to be commonly owned with the examined application, or claims an invention made as a result of activities undertaken within the scope of a joint research agreement. See MPEP § 717.02 for applications subject to examination under the first inventor to file provisions of the AIA as explained in MPEP § 2159. See MPEP § 2146 et seq. for applications not subject to examination under the first inventor to file provisions of the AIA. A terminal disclaimer must be signed in compliance with 37 CFR 1.321(b).
The filing of a terminal disclaimer by itself is not a complete reply to a nonstatutory double patenting (NSDP) rejection. A complete reply requires that the terminal disclaimer be accompanied by a reply requesting reconsideration of the prior Office action. Even where the NSDP rejection is provisional, the reply must be complete. See MPEP § 804, subsection I.B.1. For a reply to a non-final Office action, see 37 CFR 1.111(a). For a reply to a final Office action, see 37 CFR 1.113(c). A request for reconsideration, while not provided for in 37 CFR 1.113(c), may be filed after final for consideration. See MPEP §§ 706.07(e) and 714.13.
The USPTO Internet website contains terminal disclaimer forms which may be used. Please visit www.uspto.gov/patent/patents-forms. The actual filing date of the application in which the form is filed determines what form (e.g., PTO/SB/25, PTO/SB/26, PTO/AIA/25, or PTO/AIA/26) should be used. A web-based eTerminal Disclaimer may be filled out completely online using web-screens. An eTerminal Disclaimer that meets all requirements is auto-processed and approved immediately upon submission. For more information about eTerminal Disclaimers, refer to www.uspto.gov/patents/apply/applying-online/eterminal-disclaimer.
Claims 1-20 are rejected on the ground of nonstatutory double patenting as being unpatentable over claims 11-20 of U.S. Patent No. 12,144,661. Although the claims at issue are not identical, they are not patentably distinct from each other because the instant claims are broader than the patented claims and merely substitute synonymous terms, such as “scanning optics of the wand” or “non-contact optical component of the wand” in place of “the wand.” The correspondence between the claims is shown in the table below.
Current Claim
Patented Claim
1. An intraoral scanning system, the system comprising: an intraoral scanner; and a controller configured to operate the intraoral scanning system to: scan a subject's intraoral cavity using scanning optics of the intraoral scanner in an intraoral scanning mode and to display and/or store a model of the subject's dentition based on the scan; identify one or more of a user's fingers within a field of view of the scan; detect a non-contact finger gesture of one or more of the one or more user's fingers using the scanning optics of the intraoral scanner; identify the finger gesture; and execute a command control based on the finger gesture.
11. An intraoral scanning system, the system comprising: an intraoral scanner; and a controller configured to operate the intraoral scanning system to: scan a subject's intraoral cavity with a non-contact optical component of the intraoral scanner when the wand operating in an intraoral scanning mode and to display a model of the subject's dentition based on the scan; identify one or more of a user's fingers within a field of view of the scan; detect a non-contact finger gesture of the one or more of the user's fingers using the non-contact optical component of the intraoral scanner; scan, with the non-contact optical component of the intraoral scanner, the one or more of the user's fingers with the non-contact optical component of the intraoral scanner based on the detection; and execute a command control based on the user's finger gesture.
15. The system of claim 14, wherein switching to the finger gesture scanning mode disables the intraoral scanning mode.
20. An intraoral scanning system, the system comprising: an intraoral scanner; and a controller configured to operate the intraoral scanning system to: scan a subject's intraoral cavity with a non-contact optical component of the intraoral scanner in an intraoral scanning mode and to display a model of the subject's dentition based on the scan; identify one or more fingers using the same non-contact optical component of the intraoral scanner, to detect non-contact motion of one or more of a user's fingers relative to the non-contact optical component; scan, with the non-contact optical component of the intraoral scanner, a user's finger gesture with the intraoral scanner in the finger gesture scanning mode; and execute a command control based on the user's finger gesture.
11. An intraoral scanning system, the system comprising: a wand configured for intraoral scanning; a display; and a controller configured to operate the intraoral scanning system to: scan a subject's intraoral cavity with the wand in an intraoral scanning mode and to display a model of the subject's dentition based on the scan; switch from the intraoral scanning mode to a finger gesture scanning mode in which the system is adapted to detect non-contact motion of one or more of a user's fingers relative to the wand, wherein switching to the finger gesture scanning mode disables the intraoral scanning mode; scan, with a non-contact optical component, a user's finger gesture with the wand in the finger gesture scanning mode; and execute a command control based on the user's finger gesture.
2. The system of claim 1, wherein the controller is further configured to identify the command control from the scan of the user's fingers.
12. The system of claim 11, wherein the controller is further configured to identify the command control from the scan of the user's finger gesture.
12. The system of claim 11, wherein the controller is further configured to identify the command control from the scan of the user's finger gesture.
3. The system of claim 1, wherein the user's finger gesture comprises a position, a movement, or the position and the movement of the user's fingers.
13. The system of claim 11, wherein the user's finger gesture comprises a position, a movement, or the position and the movement of the user's fingers.
13. The system of claim 11, wherein the user's finger gesture comprises a position, a movement, or the position and the movement of the user's fingers.
4. The system of claim 1, wherein the controller is configured to switch from the intraoral scanning mode to a finger gesture scanning mode when the controller detects one or more fingers in the intraoral scanning mode.
14. The system of claim 11, wherein the controller is configured to switch from the intraoral scanning mode to a finger gesture scanning mode when the controller detects one or more fingers in the intraoral scanning mode.
14. The system of claim 11, wherein the controller is configured to switch from the intraoral scanning mode to the finger gesture scanning mode when the controller detects one or more fingers in the intraoral scanning mode.
5. The system of claim 1, wherein the controller is configured to manually switch from the intraoral scanning mode to a finger gesture scanning mode.
16. The system of claim 11, wherein the controller is configured to manually switch from the intraoral scanning mode to a finger gesture scanning mode.
15. The system of claim 11, wherein the controller is configured to manually switch from the intraoral scanning mode to the finger gesture scanning mode.
6. The system of claim 1, wherein the controller is configured to display the user's finger gesture, the command control, or the user's finger gesture and command control on the display.
17. The system of claim 11, wherein the controller is configured to display the user's finger gesture, the command control, or the user's finger gesture and command control on the display.
16. The system of claim 11, wherein the controller is configured to display the user's finger gesture, the command control, or the user's finger gesture and command control on the display.
7. The system of claim 1, wherein the command control comprises one of: zoom in, zoom out, translate, and rotate.
18. The system of claim 11, wherein the command control comprises one of: zoom in, zoom out, translate, and rotate.
17. The system of claim 11, wherein the command control comprises one of: zoom in, zoom out, translate, and rotate.
8. The system of claim 1, wherein the command control comprises a control modifying a displayed image of the subject's dentition.
19. The system of claim 11, wherein the command control comprises a control modifying a displayed image of the subject's dentition.
18. The system of claim 11, wherein the command control comprises a control modifying a displayed image of the subject's dentition.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-9 and 11-20 are rejected under 35 U.S.C. 103 as being unpatentable over Sabina et al., US Patent Publication 2016/0259515, in view of Dal Mutto et al., US Patent Publication 2015/0316996.
Regarding independent claim 1, Sabina et al. teaches an intraoral scanning system, the system comprising:
an intraoral scanner (depicted as scanner 150 of figure 1 and intraoral scanner 200 of figure 2 as given in paragraph 0057); and
a controller configured to operate the intraoral scanning system (depicted as computing device 105 of figure 1 as given in paragraph 0024) to:
scan a subject's intraoral cavity using the scanning optics of the intraoral scanner in an intraoral scanning mode (as given in paragraphs 0023, 0026, and 0028) and to display and/or store a model of the subject's dentition based on the scan (paragraph 0045 describes how a scan is used to create a 3D rendering of the subject’s dentition as shown in figure 4 and paragraph 0028 explains how the data is stored in the data store 110);
detect a finger gesture of one or more of the one or more user's fingers using the intraoral scanner (paragraph 0069 explains how touch sensor 230 of the intraoral scanner 200 of figure 2 may detect or scan for a user’s finger gesture); and
execute a command control based on the finger gesture (paragraph 0069 explains how the finger gesture may be used to control the intraoral scan application 108, based on the specific gesture made).
While all of the features are taught by Sabina et al., they are taught across various embodiments that are not explicitly said to be generic for all embodiments. However, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the embodiments as described. The rationale to combine would be so the invention can be practiced with modification and alteration within the spirit and scope of the desired invention (paragraph 0109 of Sabina et al.).
While Sabina et al. discusses finger input, Sabina et al. is more focused on the intraoral scanning and does not provide full details about the finger scanning. Sabina et al. does not explicitly teach finger gesture scanning in which the system is adapted to identify one or more of a user’s fingers within a field of view of the scan and detect a non-contact finger gesture of one or more of a user’s fingers and that the user’s finger gesture is scanned with non-contact scanning optics to identify the user’s finger gesture. Dal Mutto et al. teaches finger gesture scanning in which the system is adapted to identify one or more of a user’s fingers within a field of view of the scan (paragraph 0053 explains how a gesture is received by an acquisition system 15 that may include the non-contact optical component of a camera scanning a body part like a finger without contact to determine an input) and detect a non-contact finger gesture of one or more of a user’s fingers and that the user’s finger gesture is scanned with non-contact scanning optics (paragraph 0053 explains how a gesture is received by an acquisition system 15 scanning a body part like a finger without contact to determine an input where the optical components of cameras 14 and 16 of figure 1 and paragraph 0053 are not contacted) to identify the user’s finger gesture (paragraph 0053 explains how a gesture is received by an acquisition system 15 that may include the non-contact optical component of a camera scanning a body part like a finger without contact to determine an input). It would have been obvious to one of ordinary skill in the art before the effective filing date to include the concept of contactless scanning and recognition of gestures as taught by Dal Mutto et al. into the system of Sabina et al. The rationale to combine would be for better user experiences due to reduced lag and better responsiveness due to computational and energy efficiency of the system (paragraph 0004 of Dal Mutto et al.).
Regarding claim 2, Sabina et al. teaches the system of claim 1, wherein the controller is further configured to identify the command control from the scan of the user's fingers (paragraph 0069 explains how touch sensor 230 of the intraoral scanner 200 of figure 2 may detect a user’s finger gesture to control the intraoral scan application 108, based on the specific gesture made).
Regarding claim 3, Sabina et al. teaches the system of claim 1, wherein the user's finger gesture comprises a position, a movement, or the position and the movement of the user's fingers (paragraph 0069 describes the gestures detected to include movement and position as the various swipe gestures incorporate both).
Regarding claim 4, Sabina et al. teaches the system of claim 1, wherein the controller is configured to switch from the intraoral scanning mode to a finger gesture scanning mode when the controller detects one or more fingers in the intraoral scanning mode (paragraph 0045 explains that “touch input module 122 disables the touch sensor of the touch sensitive scanner 150 while a scan is being performed” such that the device switches between the modes and enters a mode that accepts touch input after the scan ends, as given in paragraph 0046).
Regarding claim 5, Sabina et al. teaches the system of claim 1, wherein the controller is configured to manually switch from the intraoral scanning mode to the finger gesture scanning mode (paragraph 0045 explains that “touch input module 122 disables the touch sensor of the touch sensitive scanner 150 while a scan is being performed” such that the device switches between the modes and enters a mode that accepts touch input after the scan ends, as given in paragraph 0046).
Regarding claim 6, Sabina et al. teaches the system of claim 1, wherein the controller is configured to display the user's finger gesture, the command control, or the user's finger gesture and command control on the display (paragraph 0102 describes the depiction of figure 7 of a display with an overlay of the gesture and instructions of the command control associated with the gesture).
Regarding claim 7, Sabina et al. teaches the system of claim 1, wherein the command control comprises one of: zoom in, zoom out, translate, and rotate (paragraph 0069 recites that “holding the button 240 in conjunction with a swipe up gesture, swipe down gesture or side swipe gesture may cause the 3D rendering to zoom in or out”).
Regarding claim 8, Sabina et al. teaches the system of claim 1, wherein the command control comprises a control modifying a displayed image of the subject's dentition (paragraph 0069 recites that “The touch sensor 230 and at least one button 240 may be used in conjunction to perform additional control of the user interface, medical images, and/or representations generated from the medical images. For example, upon holding the button 240 in conjunction with a swipe up gesture, swipe down gesture or side swipe gesture on the touch sensor 230, the intraoral scan application 108 may launch an overlay mode similar to that shown in FIG. 7. In another example, holding the button 240 in conjunction with a swipe up gesture, swipe down gesture or side swipe gesture may cause the 3D rendering to zoom in or out.”).
Regarding claim 9, Sabina et al. teaches the system of claim 1, further comprising a control configured to switch between intraoral scanning mode and the finger gesture scanning mode (paragraph 0045 explains that “touch input module 122 disables the touch sensor of the touch sensitive scanner 150 while a scan is being performed” such that the device switches between the modes and enters a mode that accepts touch input after the scan ends as given in paragraph 0046).
Regarding independent claim 11, Sabina et al. teaches an intraoral scanning system, the system comprising:
an intraoral scanner (depicted as scanner 150 of figure 1 and intraoral scanner 200 of figure 2 as given in paragraph 0057); and
a controller configured to operate the intraoral scanning system (depicted as computing device 105 of figure 1 as given in paragraph 0024) to:
scan a subject's intraoral cavity with a non-contact optical component of the intraoral scanner when operating in an intraoral scanning mode (as given in paragraphs 0023, 0026, and 0028) and to display a model of the subject's dentition based on the scan (paragraph 0045 describes how a scan is used to create a 3D rendering of the subject’s dentition as shown in figure 4);
scan, with the optical component of the intraoral scanner, the one or more of the user's fingers based on the detection of motion (paragraph 0069 explains how touch sensor 230 of the intraoral scanner 200 of figure 2 may detect or scan for a user’s finger gesture); and
execute a command control based on the user's finger gesture (paragraph 0069 explains how the finger gesture may be used to control the intraoral scan application 108, based on the specific gesture made).
While all of the features are taught by Sabina et al., they are taught across various embodiments that are not explicitly said to be generic for all embodiments. However, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the embodiments as described. The rationale to combine would be so the invention can be practiced with modification and alteration within the spirit and scope of the desired invention (paragraph 0109 of Sabina et al.).
While Sabina et al. discusses finger input, Sabina et al. is more focused on the intraoral scanning and does not provide full details about the finger scanning. Sabina et al. does not explicitly teach finger gesture scanning in which the system is adapted to identify one or more of a user’s fingers within a field of view of the scan and to detect a non-contact finger gesture of the one or more of the user’s fingers relative to the non-contact optical component and that the user’s finger gesture is scanned with a non-contact optical component of the scanner. Dal Mutto et al. teaches finger gesture scanning in which the system is adapted to identify one or more of a user’s fingers within a field of view of the scan (paragraph 0053 explains how a gesture is received by an acquisition system 15 that may include the non-contact optical component of a camera scanning a body part like a finger without contact to determine an input) and to detect a non-contact finger gesture of the one or more of the user’s fingers relative to the device (paragraph 0053 explains how a gesture is received by an acquisition system 15 scanning a body part like a finger without contact to determine an input where the optical components of cameras 14 and 16 of figure 1 and paragraph 0053 are not contacted) and that the user’s finger gesture is scanned with a non-contact optical component of the scanner (paragraph 0053 explains how a gesture is received by an acquisition system 15 that may include the non-contact optical component of a camera scanning a body part like a finger without contact to determine an input). It would have been obvious to one of ordinary skill in the art before the effective filing date to include the concept of contactless scanning and recognition of gestures as taught by Dal Mutto et al. into the system of Sabina et al. 
The rationale to combine would be for better user experiences due to reduced lag and better responsiveness due to computational and energy efficiency of the system (paragraph 0004 of Dal Mutto et al.).
Regarding claim 12, Sabina et al. teaches the system of claim 11, wherein the controller is further configured to identify the command control from the scan of the user's finger gesture (paragraph 0069 explains how touch sensor 230 of the intraoral scanner 200 of figure 2 may detect a user’s finger gesture to control the intraoral scan application 108, based on the specific gesture made).
Regarding claim 13, Sabina et al. teaches the system of claim 11, wherein the user's finger gesture comprises a position, a movement, or the position and the movement of the user's fingers (paragraph 0069 describes the gestures detected to include movement and position as the various swipe gestures incorporate both).
Regarding claim 14, Sabina et al. teaches the system of claim 11, wherein the controller is configured to switch from the intraoral scanning mode to a finger gesture scanning mode when the controller detects one or more fingers in the intraoral scanning mode (paragraph 0045 explains that “touch input module 122 disables the touch sensor of the touch sensitive scanner 150 while a scan is being performed” such that the device switches between the modes and enters a mode that accepts touch input after the scan ends, as given in paragraph 0046).
Regarding claim 16, Sabina et al. teaches the system of claim 11, wherein the controller is configured to manually switch from the intraoral scanning mode to the finger gesture scanning mode (paragraph 0045 explains that “touch input module 122 disables the touch sensor of the touch sensitive scanner 150 while a scan is being performed” such that the device switches between the modes and enters a mode that accepts touch input after the scan ends, as given in paragraph 0046).
Regarding claim 17, Sabina et al. teaches the system of claim 11, wherein the controller is configured to display the user's finger gesture, the command control, or the user's finger gesture and command control on the display (paragraph 0102 describes the depiction of figure 7 of a display with an overlay of the gesture and instructions of the command control associated with the gesture).
Regarding claim 18, Sabina et al. teaches the system of claim 11, wherein the command control comprises one of: zoom in, zoom out, translate, and rotate (paragraph 0069 recites that “holding the button 240 in conjunction with a swipe up gesture, swipe down gesture or side swipe gesture may cause the 3D rendering to zoom in or out”).
Regarding claim 19, Sabina et al. teaches the system of claim 11, wherein the command control comprises a control modifying a displayed image of the subject's dentition (paragraph 0069 recites that “The touch sensor 230 and at least one button 240 may be used in conjunction to perform additional control of the user interface, medical images, and/or representations generated from the medical images. For example, upon holding the button 240 in conjunction with a swipe up gesture, swipe down gesture or side swipe gesture on the touch sensor 230, the intraoral scan application 108 may launch an overlay mode similar to that shown in FIG. 7. In another example, holding the button 240 in conjunction with a swipe up gesture, swipe down gesture or side swipe gesture may cause the 3D rendering to zoom in or out.”).
Regarding independent claim 20, Sabina et al. teaches an intraoral scanning system, the system comprising:
an intraoral scanner (depicted as scanner 150 of figure 1 and intraoral scanner 200 of figure 2 as given in paragraph 0057); and
a controller configured to operate the intraoral scanning system (depicted as computing device 105 of figure 1 as given in paragraph 0024) to:
scan a subject's intraoral cavity with a non-contact optical component of the intraoral scanner in an intraoral scanning mode (as given in paragraphs 0023, 0026, and 0028) and to display a model of the subject's dentition based on the scan (paragraph 0045 describes how a scan is used to create a 3D rendering of the subject’s dentition as shown in figure 4);
identify one or more fingers using the same component of the intraoral scanner (paragraph 0045 explains that “touch input module 122 disables the touch sensor of the touch sensitive scanner 150 while a scan is being performed” such that the device switches between the modes and enters a mode that accepts touch input on the wand after the scan ends as given in paragraph 0046);
scan, with the optical component of the intraoral scanner, a user's finger gesture with the intraoral scanner in the finger gesture scanning mode (paragraph 0069 explains how touch sensor 230 of the intraoral scanner 200 of figure 2 may detect or scan for a user’s finger gesture); and
execute a command control based on the user's finger gesture (paragraph 0069 explains how the finger gesture may be used to control the intraoral scan application 108, based on the specific gesture made).
While all of the features are taught by Sabina et al., they are taught across various embodiments that are not explicitly said to be generic for all embodiments. However, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine the embodiments as described. The rationale to combine would be so the invention can be practiced with modification and alteration within the spirit and scope of the desired invention (paragraph 0109 of Sabina et al.).
While Sabina et al. discusses finger input, Sabina et al. is more focused on the intraoral scanning and does not provide full details about the finger scanning. Sabina et al. does not explicitly teach finger gesture scanning in which the system is adapted to identify one or more fingers in the intraoral scanning mode to detect non-contact motion of one or more of a user’s fingers relative to the non-contact optical component and that the user’s finger gesture is scanned with a non-contact optical component. Dal Mutto et al. teaches finger gesture scanning in which the system is adapted to identify one or more fingers in the intraoral scanning mode to detect non-contact motion of one or more of a user’s fingers relative to the device (paragraph 0053 explains how a gesture is received by an acquisition system 15 scanning a body part like a finger without contact to determine an input where the optical components of cameras 14 and 16 of figure 1 and paragraph 0053 are not contacted) and that the user’s finger gesture is scanned with a non-contact optical component (paragraph 0053 explains how a gesture is received by an acquisition system 15 that may include the non-contact optical component of a camera scanning a body part like a finger without contact to determine an input). It would have been obvious to one of ordinary skill in the art before the effective filing date to include the concept of contactless scanning and recognition of gestures as taught by Dal Mutto et al. into the system of Sabina et al. The rationale to combine would be for better user experiences due to reduced lag and better responsiveness due to computational and energy efficiency of the system (paragraph 0004 of Dal Mutto et al.).
Claim 10 is rejected under 35 U.S.C. 103 as being unpatentable over Sabina et al., US Patent Publication 2016/0259515, in view of Dal Mutto et al., US Patent Publication 2015/0316996, further in view of Pulido et al., US Patent 8,989,567.
Regarding claim 10, Sabina et al. and Dal Mutto et al. teach the system of claim 11 and Sabina et al. teaches further use of the intraoral scanner as the device (depicted as scanner 150 of figure 1 and intraoral scanner 200 of figure 2 as given in paragraph 0057). Sabina et al. and Dal Mutto et al. do not teach a system further comprising a removable sleeve configured to at least partially cover the portion of the intraoral scanner. Pulido et al. teaches a system further comprising a removable sleeve configured to at least partially cover the device (column 6, lines 60-65 explain how a removable sleeve is used to cover the device). It would have been obvious to one of ordinary skill in the art before the effective filing date to cover the intraoral scanning device of Sabina et al. as used with Dal Mutto et al. with a removable sleeve as taught by Pulido et al. The rationale to combine would be for maintaining sanitary conditions (column 6, lines 60-65 of Pulido et al.).
Allowable Subject Matter Over Prior Art
Claim 15 is objected to as being dependent upon a rejected base claim, but would be allowable over the prior art if rewritten in independent form to include all of the limitations of the base claim and any intervening claims, once the double patenting rejection is overcome.
The following is a statement of reasons for the indication of allowable subject matter: none of the prior art, taken alone or in combination, teaches the limitations of claim 15, especially “wherein switching to the finger gesture scanning mode disables the intraoral scanning mode” as recited.
Response to Arguments
Applicant's arguments filed 12/15/25 have been fully considered but they are not persuasive.
Applicant contends that the proposed amendments overcome the current rejection because the prior art does not teach the recited details, including that the device identifies the fingers within a field of view of the scan, rendering the claims allowable. The examiner disagrees. Since it is well known to (1) scan teeth optically, (2) scan fingers optically to determine a gesture input, and (3) use the same optics to scan different parts of a user's body, the combination of teachings would have been obvious before the effective filing date. Sabina teaches detection of teeth and fingers using the same device and Dal Mutto specifies the use of optical scanning of the finger. Thus, it would have been obvious to use the optical scanning of the fingers of Dal Mutto in the system of Sabina. Further amendments are necessary to explain the finger gesture scanning mode and what is different about that mode from the intraoral scanning mode. Merely storing different images to compare against the input is not sufficient for patentability, but the different software and techniques used in each mode may be. In response to applicant's arguments against the references individually, one cannot show nonobviousness by attacking references individually where the rejections are based on combinations of references. See In re Keller, 642 F.2d 413, 208 USPQ 871 (CCPA 1981); In re Merck & Co., 800 F.2d 1091, 231 USPQ 375 (Fed. Cir. 1986).
It is noted that applicant plans to address the double patenting issues at a later time; however, the double patenting rejection remains applicable to the current claim set.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. The most relevant prior art is made of record in the attached notice of references cited.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PARUL H GUPTA whose telephone number is (571)272-5260. The examiner can normally be reached Monday through Friday, from 10 AM to 7 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Ke Xiao can be reached on 571-272-7776. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PARUL H GUPTA/Primary Examiner, Art Unit 2627