Prosecution Insights
Last updated: April 19, 2026
Application No. 18/290,581

WIRELESS SCANNING SYSTEM AND WIRELESS SCANNING METHOD

Status: Non-Final OA (§103)
Filed: Jan 19, 2024
Examiner: AFRIFA-KYEI, ANTHONY D
Art Unit: 2686
Tech Center: 2600 — Communications
Assignee: Medit Corp.
OA Round: 3 (Non-Final)

Grant Probability: 65% (Moderate)
OA Rounds: 3-4
To Grant: 3y 0m
With Interview: 78%

Examiner Intelligence

Career Allow Rate: 65%, i.e. 353 granted of 546 resolved cases (+2.7% vs TC avg)
Interview Lift: +13.5% for resolved cases with an interview (moderate, roughly +14% lift with vs without)
Typical Timeline: 3y 0m average prosecution; 39 applications currently pending
Career History: 585 total applications across all art units

Statute-Specific Performance

§101: 3.4% (-36.6% vs TC avg)
§103: 71.3% (+31.3% vs TC avg)
§102: 11.9% (-28.1% vs TC avg)
§112: 8.4% (-31.6% vs TC avg)

Comparison baseline is the estimated Tech Center average; based on career data from 546 resolved cases.
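The headline numbers in this report are simple functions of the career counts shown above; a quick sketch (values taken from this report; the rounding behavior is an assumption) shows how they fit together:

```python
# Career counts reported above for this examiner.
granted = 353
resolved = 546

# "Career Allow Rate: 65%" is granted / resolved, rounded.
allow_rate = granted / resolved
print(f"Career allow rate: {allow_rate:.1%}")  # 64.7%, shown as 65%

# "With Interview: 78%" is approximately the 65% base grant
# probability plus the reported +13.5% interview lift.
base_grant_probability = 0.65
interview_lift = 0.135
print(f"With interview: {base_grant_probability + interview_lift:.1%}")
```

The statute-specific percentages above (3.4 + 71.3 + 11.9 + 8.4 ≈ 95) likewise appear to be shares of a common denominator drawn from the same 546 resolved cases.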

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

Status of Claims

In the amendment filed on December 22, 2025, claim 1 has been amended, claim 8 has been cancelled, and no new claim has been added. Therefore, claims 1-7 and 9-15 are pending for examination.

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/22/2025 has been entered.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1, 4, and 9 are rejected under 35 U.S.C.
103 as being unpatentable over Kucharczyk et al. (US 20220133445 A1) in view of Kobayashi et al. (US 20190191132 A1) and Troy et al. (US 20160318625 A1).

In regards to claim 1, Kucharczyk teaches a wireless scanning system comprising a wireless scanner configured to scan an object to acquire image data (Paragraphs 22, 25, 32, 55), i.e.:

The present invention may provide a method and system for providing feedback during intraoral scanning by images or light patterns projected onto an object such as teeth. The invention may therefore enable intraoral scanning in which a display/monitoring screen may not be needed and wherein unsuccessful registrations gaps/holes may be corrected by informing a user to repeat scans at corresponding locations in the intra-oral cavity. [P-22]

A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement. Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50. [P-25]

The invention may include producing and updating a 3D model through a scanning process by a user moving an intraoral camera head 30 over a location (e.g. a tooth surface 28). The scan may begin as shown in Step S100 wherein a memory of a computer system 100 may store a volume of the intra-oral cavity 52 as an empty volume which may be subsequently updated as scanning progresses. After placing the intra-oral camera head 30 over the tooth surface in Step S200, a projector 10 of the intra-oral camera 32 may project an initial default projection image over a region of the intra-oral cavity 52, to show that image data have not been acquired yet for processing. [P-32]

The computer system 100 also may include a communications interface 146 that enables software and data to be transferred between the computer system 100 and external devices. Such an interface may include a modem, a network interface (e.g., an Ethernet card or an IEEE 802.11 wireless LAN interface), a communications port (e.g., a Universal Serial Bus ("USB") port or a FireWire® port), a Personal Computer Memory Card International Association ("PCMCIA") interface, Bluetooth®, and the like. Software and data transferred via the communications interface 146 may be in the form of signals, which may be electronic, electromagnetic, optical or another type of signal that may be capable of being transmitted and/or received by the communications interface 146. Signals may be provided to the communications interface 146 via a communications path 148 (e.g., a channel). The communications path 148 may carry signals and may be implemented using wire or cable, fiber optics, a telephone line, a cellular link, a radio-frequency ("RF") link, or the like. The communications interface 146 may be used to transfer software or data or other information between the computer system 100 and a remote server or cloud-based storage. [P-55]

Although Kucharczyk does not explicitly disclose that the scanner is wireless, Kucharczyk teaches that the intraoral camera may be electrically paired with a separate projector, that the pairing may be achieved by using a common controller such as a computer processor, and that the computer system communicates with external devices (which may include the projector and the scanner) using wireless communication protocols (Paragraphs 25, 55, see above). It would therefore have been obvious to one of ordinary skill in the art that the scanner is wirelessly compatible in order to communicate with the computer processor and, in turn, the projector. In Kucharczyk, the wireless scanning (via the cited communications interface) is performed by an intraoral camera, and an image is thereafter projected onto the scanned area.

Kucharczyk then teaches a communication hub configured to receive the image data transmitted from the wireless scanner, the communication hub in this case being a controller/processor (Paragraph 25, quoted above).

Kucharczyk teaches a processing device which is connected to the communication hub and is configured to display the image data received by the communication hub, the processing device in this disclosure being a projector (Paragraph 25, quoted above).

Kucharczyk also teaches a pairer which is disposed in at least one of the wireless scanner or the communication hub and is configured to connect the wireless scanner and the communication hub to each other (Paragraph 25, quoted above).

Kucharczyk fails to teach that the wireless scanner includes a light projector configured, after a connection between the wireless scanner and the communication hub is established, to emit red light for one unit time, green light for one unit time, and blue light for N unit times, where N is an integer equal to or greater than 2. Kobayashi, on the other hand, teaches this limitation (Paragraphs 113, 116, 117), i.e.:

FIG. 14 is a view illustrating an outline of one example of the control example of the light sources 150 according to the present sixth embodiment. A control for the light sources 150 illustrated in FIG. 14 is an example in which an emitting period for mixed color (white (W) in the example of FIG. 14) is added into the example according to the third embodiment described above (FIG. 5). As illustrated in FIG. 14, for example, in a video display mode #2, white (W) is inserted between red (R) and green (G) and between green (G) and blue (B), and setting of a combination of a lighting ratio and a current value per unit time is set to a desired relationship. [P-113]

FIG. 16 is a view illustrating an outline of one example of the control example of the light sources 150 according to the present sixth embodiment. A control for the light sources 150 illustrated in FIG. 16 is an example in which an emitting period for mixed color (white (W) in the example of FIG. 15) is added into the example according to the fifth embodiment described above (FIG. 7). As illustrated in FIG. 16, for example, in a normal light source control mode of a case where an OSD exists on a screen by turning the OSD ON, white (W) is inserted between red (R) and green (G) and between green (G) and blue (B), and setting of a combination of a lighting ratio and a current value per unit time is set to a desired relationship. Controls other than the addition of the emitting period for the mixed color are similar to the controls that have already been explained in the fifth embodiment. Therefore, their explanation will be omitted. By providing the emitting period for the mixed color in this manner, it is possible to provide the projection video display apparatus in which the maximum brightness of projected video is larger (brighter) while obtaining the effects according to the fifth embodiment. [P-116]

As explained above, according to the projection video display apparatus 100 of the present embodiment, by adding the mixed color generated by mixing red (R), green (G) and blue (B) to the lighting ratios per the unit of time (Δt) in addition to red (R), green (G) and blue (B), it is possible to make the maximum brightness of the projected video larger (or brighter) in a case where the input video is bright video while securing darkness of the projected video in a case where the input video is dark video. [P-117]

Here, we see a projector system in which the emitted red, green, and blue time units are set to a desired ratio, illustrating a customizable configuration that, by obviousness, could be set to red light for one unit time, green light for one unit time, and blue light for N unit times, where N is an integer equal to or greater than 2.
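The claimed timing limitation is concrete enough to sketch. The following is hypothetical illustration code, not from either reference; the function name and the unit length are assumptions, since the claim fixes only the 1:1:N ratio of red:green:blue emission times:

```python
def pairing_light_schedule(n: int, unit_ms: int = 100) -> list[tuple[str, int]]:
    """Illustrative emission schedule for the claim 1 limitation:
    after the scanner-hub connection is established, emit red for one
    unit time, green for one unit time, and blue for N unit times,
    where N is an integer equal to or greater than 2."""
    if n < 2:
        raise ValueError("claim 1 requires N to be an integer >= 2")
    return [("red", unit_ms), ("green", unit_ms), ("blue", n * unit_ms)]

# Example: N = 3 with an assumed 100 ms unit time.
print(pairing_light_schedule(3))
# [('red', 100), ('green', 100), ('blue', 300)]
```

Any N ≥ 2 and any unit time satisfy the limitation; only the shape of the schedule is fixed by the claim.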
Thereby, it would have been obvious to one of ordinary skill in the art before the effective filing date to combine Kobayashi's teaching with Kucharczyk's teaching in order to effectively configure a desired output of the projector component.

Kucharczyk as modified fails to explicitly teach that the wireless scanner and the communication hub further comprise communication modules for data communication, respectively, and wherein the communication modules are configured to exchange unique information of the wireless scanner and unique information of the communication hub with each other to cause the wireless scanner and the communication hub to be paired with each other. Troy, on the other hand, teaches a method for pairing devices that includes the exchange of unique identifier information to pair the devices through an RFID protocol (Paragraphs 4, 16), i.e.:

The method includes identifying a selected control unit from the plurality of control units that will control a selected object from the plurality of objects, placing a hand-held scanner in close proximity to a first machine-readable tag on the selected control unit to acquire a first unique ID for only the selected control unit, placing the hand-held scanner in close proximity to a second machine-readable tag on the selected object to acquire a second unique ID for only the selected object, and associating the first unique ID with the second unique ID to pair the selected control unit with the selected object. [P-4]

To perform the pairing method, user 202 positions scanner 100 such that scanner detects control unit 201 on an associated seat 204. For an RFID scanner, this involves positioning scanner 100 such that the strongest RF signal detected by scanner 100 is a signal from control unit 201 of seat 204. For an optical (e.g., barcode or QR code) scanner 100, this involves directing a beam or view direction of scanner 100 such that scanner 100 reads an optical label on control unit 201 of seat 204. Once control unit 201 is scanned to acquire a unique ID, scanner 100 may give an indication (e.g., a visual, audio, or tactile alert) to user 202 to confirm control 201 has been scanned. [P-16]

Hence, Troy's device-pairing protocol is applicable to the limitation that the communication modules are configured to exchange unique information of the wireless scanner and unique information of the communication hub with each other to cause the wireless scanner and the communication hub to be paired with each other. It would have been obvious to one of ordinary skill in the art before the effective filing date of the invention to combine Troy's teaching with Kucharczyk's teaching, substituting Kucharczyk's pairing method with Troy's method, in order to enable a cost-effective yet secure method of pairing devices.

In regards to claim 4, Kucharczyk as modified teaches that the pairer is a light projector configured to emit a predetermined light pattern from inside of the wireless scanner to outside of the wireless scanner (Paragraph 32, quoted above; Figure 6), wherein the communication hub comprises a pattern sensor which senses the light pattern, and wherein the pattern sensor is configured to sense the light pattern to cause the wireless scanner and the communication hub to be paired with each other (Paragraph 35; Figure 6), i.e.:

As shown in Step 300, the inner rays 24 of the illumination beam may be substantially reflected off the current measurement surface 54 towards an image sensor 18 of the intra-oral camera 32 for further processing. Herein, 3D coordinates of the current measurement surface 54 may be extracted in Step S400 to determine if all or part of the current measurement surface 54 has been previously registered (Step S500). If the current measurement surface 54 or any portion of the current measurement surface has not been previously acquired, the corresponding new 3D data (e.g. xyz coordinates) of unregistered portions may be stored by determining if the new 3D data and/or accumulated 3D data are sufficient according to one or more predetermined recording criteria/conditions (Step S600), said predetermined recording criteria including, for example, whether or not the new 3D data and/or accumulated 3D data have a desired predetermined resolution, predetermined noise level, predetermined 3D point density, and/or inconsistencies between individual optical 3D measurements. If the predetermined recording criteria is satisfied, the new 3D data may be stored in an image stack for post processing or the 3D data may be used to create a real-time 3D reconstruction of the intraoral cavity as shown in Step S800. The outer region 42 of the projection image 48 corresponding to the new 3D data may then be updated in Step S1000 to relay to the user when the intra-oral camera 32 camera position changes to a new measurement surface that the previous measurement surface has been sufficiently recorded according to predetermined recording criteria. Herein, a need to constantly look at a monitor to track the progress of an ongoing intra-oral scan and make necessary adjustments may be eliminated or substantially eliminated as a visual feedback for the user may be shown in the vicinity of the intra-oral surfaces being scanned. [P-35]

Thus, Kucharczyk teaches that the communication hub comprises a pattern sensor which senses the light pattern, wherein the pattern sensor is configured to sense the light pattern to cause the wireless scanner and the communication hub to be paired with each other: the light pattern used in the scanning procedure by the intraoral camera is received, processed, and paired with the projector, such that, based on the scanned data, the projector may adequately superimpose an image.

In regards to claim 9, Kucharczyk teaches that the processing device is configured to generate a three-dimensional model of the object on the basis of the image data transmitted from the communication hub, and wherein the processing device is configured to display at least one of the image data or the three-dimensional model (Paragraphs 24, 33), i.e.:

The projector may illuminate a 3D measuring field as well as surfaces outside the measuring field. Herein, information about already acquired/scanned surfaces, surfaces not yet acquired as well as other 3D measurement information (such as scan-body registration) may be visually superimposed by using a projected pattern or color onto surfaces outside the measuring field. In this way the user can control the scan process while looking into the mouth of the patient. Said projection may be dynamic, wherein an illumination beam 20 from the projector 10 may be controlled to produce said one or more projection images 48 at preferably predetermined time intervals and wherein at least a part of the illumination beam 20 may be reflected into a monitoring beam 50 for 3D measurement. [P-24]

The projector 10 may be configured to provide a successful registration feedback signifying a successful registration of the current measurement surface 54 and a non-successful registration feedback signifying a non-successful registration of the current measurement surface 54, e.g. a blue colored rectangle or striped pattern for 3D measurement (the inner region 44 of the projection image 48, FIG. 5, which also corresponds to a current measurement surface 54) may be inside a larger red colored rectangle (outer region 42 of the projection image 48 corresponding to other surfaces 56 of the cavity), the red colored rectangle signifying a non-successful registration of intra-oral surfaces on which said feedback is projected. Upon any successful registration, corresponding surfaces 46 of the successful registration in the outer region 42 may receive a successful registration feedback, e.g. green light, signifying that said corresponding surfaces of the successful registration 46 have been successfully registered. [P-33]

Claims 2 and 3 are rejected under 35 U.S.C. 103 as being unpatentable over Kucharczyk et al. (US 20220133445 A1) in view of Kobayashi et al. (US 20190191132 A1) and Troy et al. (US 20160318625 A1), as applied to claim 1 above, and in further view of Lee et al. (US 20160307539 A1).

In regards to claim 2, Kucharczyk as modified teaches that the pairer is a light projector configured to emit a predetermined form of light from inside of the wireless scanner to outside of the wireless scanner (Paragraphs 24, 25; Figure 1), i.e.:

FIG. 1 illustrates a block diagram of a light guidance system 101 comprising at least one projector 10 having a light source 12, at least one image sensor 18 (e.g. a 3D image sensor) and a computer system 100 with at least one computer processor 122 (FIG. 7). The projector 10 and image sensor 18 may be in communication with the computer system 100. The projector 10 may be a projector housed within a camera such as an intraoral camera 32. Alternatively, the projector 10 may be a separate projector such as a digital light projector outside the intra-oral camera 32. Projectors 10 may work on a principle of filtering a light source 12 based on an image to be displayed. A lens may then be used to transfer the image to a surface on which the image may be displayed. Different kinds of projectors may be used, including Digital Light Processing (DLP) projectors which may be based on Digital Micromirror Device (DMD) technology wherein an array of microscopic mirrors may be configured to tilt either toward the light source 12 in the projector 10 or away from it in order to create a light or dark pixel on a projection surface. Other kinds of projectors may include Light Emitting Diode (LED) projectors, Laser projectors, Liquid Crystal on Silicon (LCoS) projectors and Liquid Crystal Display (LCD) projectors. One or more projectors 10 may be used for projecting one or more projection images 48 (FIG. 5) on a surface such as a tooth surface 28 (FIG. 4) and may be constructed and operated in accordance with at least one exemplary embodiment herein. [P-25; the passage continues with the P-24 text quoted above]

Kucharczyk fails to teach that the communication hub comprises an illuminance sensor which senses the light, and wherein the illuminance sensor is configured to sense at least one of a brightness of the light or a color change of the light to cause the wireless scanner and the communication hub to be paired with each other.
Lee on the other hand teaches the communication hub comprises an illuminance sensor which senses the light, and wherein the illuminance sensor configured to sense at least one of a brightness of the light or a color change of the light to cause the wireless scanner and the communication hub to be paired with each other (Paragraphs 4, 12), i.e. a device having a display; n terminals; and a server configured to connect the device having a display with the n terminals, wherein the server provides an ID to the device having a display according to an access request from the device, wherein the device having a display calculates a brightness value change pattern corresponding to the ID, and outputs the pattern to the display, and wherein each of the n terminals detects the pattern output to the display using a proximity sensor and a light sensor, then recognizes the ID, transmits the ID to the server, and requests for access. [P-4] the purpose of the present invention, as embodied and broadly described herein, there is also provided a mobile terminal, including: a proximity sensor configured to sense a proximity degree with a device having a display; a light sensor configured to sense a brightness value change pattern output to the display; and a controller configured to recognize an ID corresponding to the sensed brightness value change pattern, to transmit the recognized ID to a server, and to request for access, wherein the device having a display requests to the server for access to thus be provided with the ID, calculates a brightness value change pattern corresponding to the ID, and outputs the pattern to the display, and wherein the server connects the device having a display with the mobile terminal according to the access request from the controller. 
[P-12] In other words, after the sensor senses a brightness the sensed ID corresponding to the brightness is then transmitted to a communication paring hub, to which there after sends the ID corresponding to the brightness to the display device such that the brightness is accounted for. It would have been obvious to a person of ordinary skill in the art before the effective filing of the invention to combine Lee’s teaching with Kucharczyk modified’s teaching in order to optimize the transmission and display of detail of the scanned image brightness configuration. In regards to claim 3, Kucharcyzk modified teaches the light projector is configured to emit light having a form different from light emitted when scanning the object (Paragraphs 29, 33, Kucharcyzk), i.e. the intra-oral camera 32 may be configured so that the reflected monitoring beam 50 for 3D measurement includes all or substantially all portions of the illumination beam 20 that are configured for 3D measurement. Further portions of the illumination beam 20 configured for 3D measurement (inner rays of the illumination beam 24) may be configured to be structured illumination patterns and/or may be modulated before illumination. By using various structured illumination patterns, 3D surface profiles of tooth surfaces 28 may be measured. Moreover, by modulating with a predetermined frequency, only signals corresponding to that frequency may be detected by the image sensor 18 for further processing. [P-29] the projector 10 may be configured to provide a successful registration feedback signifying a successful registration of the current measurement surface 54 and a non-successful registration feedback signifying a non-successful registration of the current measurement surface 54 e.g. a blue colored rectangle or striped pattern for 3D measurement (the inner region 44 of the projection image 48, FIG. 
5, which also corresponds to a current measurement surface 54) may be inside a larger red colored rectangle (outer region 42 of the projection image 48 corresponding to other surfaces 56 of the cavity), the red colored rectangle signifying a non-successful registration of intra-oral surfaces on which said feedback is projected. Upon any successful registration, corresponding surfaces 46 of the successful registration in the outer region 42 may receive a successful registration feedback, e.g. green light, signifying that said corresponding surfaces of the successful registration 46 have been successfully registered. In an embodiment herein, the outer region 42 may receive an initial non-successful registration feedback signifying that scanning has not begun. Of course other implementations of the shapes and/or colors of the different components of the projection image 48 may be realized without departing from the scope of the invention.[P-33] The projector emits color coded light frequencies for the purpose of display, as opposed to the scanning process that emits reflected modulated light for the purpose of measurement. Claim(s) 5, 6, 10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kucharczyk et al. (US 20220133445 A1) in view of Kobayashi et al. (US 20190191132 A1) and Troy et al. (US 20160318625 A1), applied in claim 1 above, in further view of Mincher (US 8913955 B1) In regards to claim 5, Kucharczyk modified teaches the pairer is a magnetic unit configured to generate a predetermined magnetic field, wherein one of the wireless scanner and the communication hub comprises a magnetic sensor configured to sense the magnetic field generated from the magnetic unit (Paragraphs 25, 55), A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 
2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement. Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50[P-25] The computer system 100 also may include a communications interface 146 that enables software and data to be transferred between the computer system 100 and external devices. Such an interface may include a modem, a network interface (e.g., an Ethernet card or an IEEE 802.11 wireless LAN interface), a communications port (e.g., a Universal Serial Bus (“USB”) port or a FireWire® port), a Personal Computer Memory Card International Association (“PCMCIA”) interface, Bluetooth®, and the like. Software and data transferred via the communications interface 146 may be in the form of signals, which may be electronic, electromagnetic, optical or another type of signal that may be capable of being transmitted and/or received by the communications interface 146. Signals may be provided to the communications interface 146 via a communications path 148 (e.g., a channel). The communications path 148 may carry signals and may be implemented using wire or cable, fiber optics, a telephone line, a cellular link, a radio-frequency (“RF”) link, or the like. 
The communications interface 146 may be used to transfer software or data or other information between the computer system 100 and a remote server or cloud-based storage. [P-55] Kucharczyk teaches the pairing of the scanner device and projector being done by a controller/processing device which communicates between the components using electromagnetic methods including RF frequencies. However, Kucharczyk fails to specifically disclose that the magnetic sensor causes the wireless scanner and the communication hub to be paired with each other on the basis of a change in the magnetic field generated by the magnetic unit. Mincher on the other hand teaches a pairing method of devices, such that when the two devices are brought together in complementary fashion, the magnetic field from the first device is detected by the magnetic sensor in the second device, and vice versa. Upon detection of the magnetic field by the magnetic sensor, the pairing process is initiated within each of the devices. In other implementations, other techniques may be used to determine proximity and initiate the pairing process. Infrared signals, acoustic signals, capacitive proximity sensors, radio frequency identification tags, near field communication devices, and so forth may be used to determine proximity of the two devices to one another. Instead of a user pressing various buttons to start the pairing process, the user places the devices near one another, such as placing the second device atop the first device. The first and second devices may have physically complementary shapes such that at least a portion of one may rest or nest within another. For example, the first device may have a declivity or recess within which the second device may sit. The magnets and magnetic sensors may be arranged such that when placed in the recess, the magnetic sensors detect the corresponding magnetic field from a magnet in the other device and, based on that detection, may initiate the pairing process.
(Column 2, lines 1-29). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Mincher's teaching with Kucharczyk modified's teaching in order that one may utilize the inexpensive method of magnetic pairing such that the magnetic sensor causes the wireless scanner and the communication hub to be paired with each other on the basis of a change in the magnetic field generated by the magnetic unit. In regards to claim 6, Kucharczyk modified does not explicitly teach that the pairer is a near-field communication unit configured to generate a predetermined RF signal, wherein one of the wireless scanner and the communication hub comprises a near-field communication sensor configured to sense the RF signal generated from the near-field communication unit, and wherein the near-field communication sensor causes the wireless scanner and the communication hub to be paired with each other on the basis of the RF signal generated by the near-field communication unit (Paragraphs 25, 55), i.e. A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement.
Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50. [P-25] The computer system 100 also may include a communications interface 146 that enables software and data to be transferred between the computer system 100 and external devices. Such an interface may include a modem, a network interface (e.g., an Ethernet card or an IEEE 802.11 wireless LAN interface), a communications port (e.g., a Universal Serial Bus (“USB”) port or a FireWire® port), a Personal Computer Memory Card International Association (“PCMCIA”) interface, Bluetooth®, and the like. Software and data transferred via the communications interface 146 may be in the form of signals, which may be electronic, electromagnetic, optical or another type of signal that may be capable of being transmitted and/or received by the communications interface 146. Signals may be provided to the communications interface 146 via a communications path 148 (e.g., a channel). The communications path 148 may carry signals and may be implemented using wire or cable, fiber optics, a telephone line, a cellular link, a radio-frequency (“RF”) link, or the like. The communications interface 146 may be used to transfer software or data or other information between the computer system 100 and a remote server or cloud-based storage. [P-55] In other words, Kucharczyk teaches that the intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement. Said pairing may be achieved by using a common controller such as a computer processor 122.
Kucharczyk further elaborates the communicative method/interface by which the controller/computer utilizes a Bluetooth protocol or RF link to transmit the signals to external devices (which may include the camera and the projector), and which may thereby be used in communication protocols related to pairing the external devices (as described above). Hence, by using a near-field communication protocol (Bluetooth or RF link), Mincher also teaches the pairing of devices using a near-field communication sensor configured to sense the RF signal generated from the near-field communication unit, and wherein the near-field communication sensor causes the wireless device and the communication device to be paired with each other on the basis of the RF signal generated by the near-field communication unit (Column 2, lines 10-29; Column 4, lines 46-55), i.e. Upon detection of the magnetic field by the magnetic sensor, the pairing process is initiated within each of the devices. In other implementations, other techniques may be used to determine proximity and initiate the pairing process. Infrared signals, acoustic signals, capacitive proximity sensors, radio frequency identification tags, near field communication devices, and so forth may be used to determine proximity of the two devices to one another. Instead of a user pressing various buttons to start the pairing process, the user places the devices near one another, such as placing the second device atop the first device. The first and second devices may have physically complementary shapes such that at least a portion of one may rest or nest within another. [Col 2, ln 10-29] The pairing module 124 is configured to initiate a pairing process or procedure based at least in part upon receiving a signal. The pairing process comprises establishment of a wireless communication link between two or more devices. The pairing module 124 may be configured to establish frequencies for communication, exchange encryption information, and so forth.
Once the pairing process is complete, other modules in the device may communicate using the communication link. [Col 4, ln 46-55]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Mincher's teaching with Kucharczyk modified's teaching in order that one may utilize the inexpensive method of near-field communication pairing such that the near-field communication sensor causes the wireless scanner and the communication hub to be paired with each other on the basis of the RF signal generated by the near-field communication unit. In regards to claim 10, Kucharczyk modified fails to teach that at least one of the wireless scanner or the communication hub further comprises a pairing switch which is pressed in one direction to generate a predetermined pairing signal. Mincher on the other hand teaches a pairing switch which is pressed in one direction to generate a predetermined pairing signal (Column 1, lines 55-67), i.e. To establish a communication link between two devices it may be necessary to initiate some sort of setup procedure or "pairing" so the devices recognize one another and can be configured to exchange data with one another. For example, with Bluetooth the user typically has to initiate a pairing process by pressing a particular button, initiating a command, and so forth. This manual intervention may be inconvenient for a user, resulting in an undesired user experience. For example, the pairing initiation buttons may be concealed. [Col 1, ln 55-67] It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Mincher's teaching with Kucharczyk modified's teaching in order that one may utilize the inexpensive, effective method of actuating the pairing procedure. Claim(s) 7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kucharczyk et al. (US 20220133445 A1) in view of Kobayashi et al. (US 20190191132 A1) and Troy et al.
(US 20160318625 A1), as applied to claim 1 above, in further view of Holoubek (US 20050273356 A1). In regards to claim 7, Kucharczyk modified fails to teach that the pairer is an identifier attached to the communication hub, and wherein the wireless scanner is configured to acquire an image of the identifier to cause the wireless scanner and the communication hub to be paired with each other. Holoubek on the other hand teaches a scanner device that is paired with an external device using a pairing means that comprises an identifier attached to the external device that is in turn scanned by the scanning device to generate an image associated with the identifier, which is thereafter used to match and pair the scanning device with the external device (Claim 18), i.e. A system for processing a photograph, comprising: a first scanner device for scanning an identifier and a picture side of the photograph; a second scanner device for scanning the identifier and a back side of the photograph; and a control program operative on a microprocessor in electrical communication with the first and second scanner devices, said control program being operative to pair the scanned picture side of the photograph with the scanned identifier into a first database and pair the scanned back side of the photograph with the scanned identifier into a second database, the control program being further operative to: merge the picture side from the first database and the back side from the second database to create an image by matching the identifier associated with the picture and back side, the image containing the picture and back side of the photograph. [Cl 18] and is therefore applicable to the wireless scanner being configured to acquire an image of the identifier to cause the wireless scanner and the communication hub to be paired with each other.
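For illustration only (this sketch is not part of the Office Action record or of any cited reference, and every name in it is hypothetical), the identifier-based pairing idea described above can be sketched as follows: a scanner acquires an image of an identifier attached to another device, decodes it, and pairs with whichever registered device matches the decoded identifier.

```python
from typing import Optional

# Hypothetical registry of communication hubs, keyed by the identifier
# physically attached to each hub (e.g. a printed label or code).
KNOWN_HUBS = {"HUB-0042": "communication hub #42"}


def decode_identifier(image: str) -> str:
    """Stand-in for real image decoding; here the 'image' is simply the
    identifier text with incidental whitespace."""
    return image.strip()


def pair_by_identifier(image: str) -> Optional[str]:
    """Decode the acquired identifier image and pair with the matching hub.

    Returns the paired hub, or None if no registered hub matches."""
    return KNOWN_HUBS.get(decode_identifier(image))


assert pair_by_identifier(" HUB-0042 ") == "communication hub #42"
assert pair_by_identifier("HUB-9999") is None
```

In a real implementation the decode step would involve actual image capture and optical decoding; the sketch only shows the match-then-pair control flow at issue in the claim.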
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Holoubek's teaching with Kucharczyk modified's teaching in order to enable a cost-effective method of pairing devices accordingly. Claim(s) 11 and 15 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kucharczyk et al. (US 20220133445 A1) in view of Troy et al. (US 20160318625 A1), Holoubek (US 20050273356 A1) and Kobayashi et al. (US 20190191132 A1). In regards to claim 11, Kucharczyk teaches a wireless scanning system comprising a wireless scanner configured to scan an object to acquire image data (Paragraphs 22, 25, 32, 55), i.e. The present invention may provide a method and system for providing feedback during intraoral scanning by images or light patterns projected onto an object such as teeth. The invention may therefore enable intraoral scanning in which a display/monitoring screen may not be needed and wherein unsuccessful registrations gaps/holes may be corrected by informing a user to repeat scans at corresponding locations in the intra-oral cavity. [P-22] A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement.
Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50. [P-25] The invention may include producing and updating a 3D model through a scanning process by a user moving an intraoral camera head 30 over a location (e.g. a tooth surface 28). The scan may begin as shown in Step S100 wherein a memory of a computer system 100 may store a volume of the intra-oral cavity 52 as an empty volume which may be subsequently updated as scanning progresses. After placing the intra-oral camera head 30 over the tooth surface in Step S200, a projector 10 of the intra-oral camera 32 may project an initial default projection image over a region of the intra-oral cavity 52, to show that image data have not been acquired yet for processing. [P-32] The computer system 100 also may include a communications interface 146 that enables software and data to be transferred between the computer system 100 and external devices. Such an interface may include a modem, a network interface (e.g., an Ethernet card or an IEEE 802.11 wireless LAN interface), a communications port (e.g., a Universal Serial Bus (“USB”) port or a FireWire® port), a Personal Computer Memory Card International Association (“PCMCIA”) interface, Bluetooth®, and the like. Software and data transferred via the communications interface 146 may be in the form of signals, which may be electronic, electromagnetic, optical or another type of signal that may be capable of being transmitted and/or received by the communications interface 146. Signals may be provided to the communications interface 146 via a communications path 148 (e.g., a channel).
The communications path 148 may carry signals and may be implemented using wire or cable, fiber optics, a telephone line, a cellular link, a radio-frequency (“RF”) link, or the like. The communications interface 146 may be used to transfer software or data or other information between the computer system 100 and a remote server or cloud-based storage. [P-55] Though the wireless component of the scanner is not explicitly disclosed in Kucharczyk's teaching, Kucharczyk teaches that the intraoral camera may be electrically paired with a separate projector, that said pairing may be achieved by using a common controller such as a computer processor, and that the computer system communicates using wireless communication protocols with external devices (which may include the projector and the scanner) (Paragraphs 25, 55, see above). It would therefore be obvious to one of ordinary skill in the art that the scanner would have to be wirelessly compatible in order to communicate with the computer processor and, in turn, the projector. The wireless scanning (via the cited communication interface) in Kucharczyk is performed by an intraoral camera, and an image is thereafter projected onto the scanned area. Kucharczyk then teaches a communication hub configured to receive the image data transmitted from the wireless scanner; the communication hub in this case being a controller/processor (Paragraph 25), i.e. A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement.
Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50. [P-25] Kucharczyk teaches a processing device which is connected to the communication hub and is configured to display the image data received by the communication hub; the processing device in this disclosure being a projector (Paragraph 25), i.e. A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement. Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50. [P-25] Kucharczyk further teaches a pairer which is disposed in at least one of the wireless scanner or the communication hub and is configured to connect the wireless scanner and the communication hub to each other (Paragraph 25), i.e. A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback.
The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement. Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50. [P-25] Kucharczyk fails to explicitly teach that the wireless scanner and the communication hub further comprise communication modules for data communication, respectively, and wherein the communication modules are configured to exchange unique information of the wireless scanner and unique information of the communication hub with each other to cause the wireless scanner and the communication hub to be paired with each other. Troy on the other hand teaches a method for pairing devices that includes the exchange of unique identifier information to pair the devices through an RFID protocol (Paragraphs 4, 16), i.e. The method includes identifying a selected control unit from the plurality of control units that will control a selected object from the plurality of objects, placing a hand-held scanner in close proximity to a first machine-readable tag on the selected control unit to acquire a first unique ID for only the selected control unit, placing the hand-held scanner in close proximity to a second machine-readable tag on the selected object to acquire a second unique ID for only the selected object, and associating the first unique ID with the second unique ID to pair the selected control unit with the selected object. [P-4] To perform the pairing method, user 202 positions scanner 100 such that scanner detects control unit 201 on an associated seat 204. For an RFID scanner, this involves positioning scanner 100 such that the strongest RF signal detected by scanner 100 is a signal from control unit 201 of seat 204.
For an optical (e.g., barcode or QR code) scanner 100, this involves directing a beam or view direction of scanner 100 such that scanner 100 reads an optical label on control unit 201 of seat 204. Once control unit 201 is scanned to acquire a unique ID, scanner 100 may give an indication (e.g., a visual, audio, or tactile alert) to user 202 to confirm control 201 has been scanned. [P-16] Hence, Troy's device pairing protocol is applicable to the communication modules being configured to exchange unique information of the wireless scanner and unique information of the communication hub with each other to cause the wireless scanner and the communication hub to be paired with each other. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Troy's teaching with Kucharczyk's teaching, substituting Kucharczyk's pairing method with Troy's method in order to enable a cost-effective, yet secure method of pairing devices accordingly. Kucharczyk fails to teach that the pairer is an identifier attached to the communication hub, and wherein the wireless scanner is configured to acquire an image of the identifier to cause the wireless scanner and the communication hub to be paired with each other. Holoubek on the other hand teaches a scanner device that is paired with an external device using a pairing means that comprises an identifier attached to the external device that is in turn scanned by the scanning device to generate an image associated with the identifier, which is thereafter used to match and pair the scanning device with the external device (Claim 18), i.e.
A system for processing a photograph, comprising: a first scanner device for scanning an identifier and a picture side of the photograph; a second scanner device for scanning the identifier and a back side of the photograph; and a control program operative on a microprocessor in electrical communication with the first and second scanner devices, said control program being operative to pair the scanned picture side of the photograph with the scanned identifier into a first database and pair the scanned back side of the photograph with the scanned identifier into a second database, the control program being further operative to: merge the picture side from the first database and the back side from the second database to create an image by matching the identifier associated with the picture and back side, the image containing the picture and back side of the photograph. [Cl-18] and is therefore applicable to the wireless scanner being configured to acquire an image of the identifier to cause the wireless scanner and the communication hub to be paired with each other. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Holoubek's teaching with Kucharczyk's teaching in order to enable a cost-effective method of pairing devices accordingly. Kucharczyk further fails to teach that the wireless scanner includes a light projector configured, after a connection between the wireless scanner and the communication hub is established, to emit red light for one unit time, green light for one unit time, and blue light for N unit times, where N is an integer equal to or greater than 2. Kobayashi on the other hand teaches that the wireless scanner includes a light projector configured, after a connection between the wireless scanner and the communication hub is established, to emit red light for one unit time, green light for one unit time, and blue light for N unit times, where N is an integer equal to or greater than 2.
(Paragraphs 113, 116, 117), i.e. FIG. 14 is a view illustrating an outline of one example of the control example of the light sources 150 according to the present sixth embodiment. A control for the light sources 150 illustrated in FIG. 14 is an example in which an emitting period for mixed color (white (W) in the example of FIG. 14) is added into the example according to the third embodiment described above (FIG. 5). As illustrated in FIG. 14, for example, in a video display mode #2, white (W) is inserted between red (R) and green (G) and between green (G) and blue (B), and setting of a combination of a lighting ratio and a current value per unit time is set to a desired relationship. [P-113] FIG. 16 is a view illustrating an outline of one example of the control example of the light sources 150 according to the present sixth embodiment. A control for the light sources 150 illustrated in FIG. 16 is an example in which an emitting period for mixed color (white (W) in the example of FIG. 15) is added into the example according to the fifth embodiment described above (FIG. 7). As illustrated in FIG. 16, for example, in a normal light source control mode of a case where an OSD exists on a screen by turning the OSD ON, white (W) is inserted between red (R) and green (G) and between green (G) and blue (B), and setting of a combination of a lighting ratio and a current value per unit time is set to a desired relationship. Controls other than the addition of the emitting period for the mixed color are similar to the controls that have already been explained in the fifth embodiment. Therefore, their explanation will be omitted.
By providing the emitting period for the mixed color in this manner, it is possible to provide the projection video display apparatus in which the maximum brightness of projected video is larger (brighter) while obtaining the effects according to the fifth embodiment. [P-116] As explained above, according to the projection video display apparatus 100 of the present embodiment, by adding the mixed color generated by mixing red (R), green (G) and blue (B) to the lighting ratios per the unit of time (Δt) in addition to red (R), green (G) and blue (B), it is possible to make the maximum brightness of the projected video larger (or brighter) in a case where the input video is bright video while securing darkness of the projected video in a case where the input video is dark video. [P-117] Here, we see a projector system in which the radiated red, green, and blue time units are set to a desirable ratio, illustrating their customizable configuration, and which thereby, by obviousness, may be set/configured to emit red light for one unit time, green light for one unit time, and blue light for N unit times, where N is an integer equal to or greater than 2. Thereby, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine Kobayashi's teaching with Kucharczyk's teaching in order to effectively configure a desired output configuration of the projector component. In regards to claim 15, Kucharczyk modified teaches that the processing device is configured to generate a three-dimensional model of the object on the basis of the image data transmitted from the communication hub, and wherein the processing device is configured to display at least one of the image data or the three-dimensional model (Paragraphs 24, 33), i.e. The projector may illuminate a 3D measuring field as well as surfaces outside the measuring field.
Herein, information about already acquired/scanned surfaces, surfaces not yet acquired as well as other 3D measurement information (such as scan-body registration) may be visually superimposed by using a projected pattern or color onto surfaces outside the measuring field. In this way the user can control the scan process while looking into the mouth of the patient. Said projection may be dynamic, wherein an illumination beam 20 from the projector 10 may be controlled to produce said one or more projection images 48 at preferably predetermined time intervals and wherein at least a part of the illumination beam 20 may be reflected into a monitoring beam 50 for 3D measurement. The projector 10 may be configured to provide a successful registration feedback signifying a successful registration of the current measurement surface 54 and a non-successful registration feedback signifying a non-successful registration of the current measurement surface 54, e.g. a blue colored rectangle or striped pattern for 3D measurement (the inner region 44 of the projection image 48, FIG. 5, which also corresponds to a current measurement surface 54) may be inside a larger red colored rectangle (outer region 42 of the projection image 48 corresponding to other surfaces 56 of the cavity), the red colored rectangle signifying a non-successful registration of intra-oral surfaces on which said feedback is projected. Upon any successful registration, corresponding surfaces 46 of the successful registration in the outer region 42 may receive a successful registration feedback, e.g. green light, signifying that said corresponding surfaces of the successful registration 46 have been successfully registered. Claim(s) 12 and 13 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kucharczyk et al. (US 20220133445 A1) in view of Troy et al. (US 20160318625 A1), Holoubek (US 20050273356 A1) and Kobayashi et al.
(US 20190191132 A1) as applied to claim 11 above, and further in view of Lee et al. (US 20160307539 A1). In regards to claim 12, Kucharczyk modified teaches the pairer is a light projector configured to emit a predetermined form of light from inside of the wireless scanner to outside of the wireless scanner (Paragraphs 24, 25; Figure 1, Kucharczyk), i.e. FIG. 1 illustrates a block diagram of a light guidance system 101 comprising at least one projector 10 having a light source 12, at least one image sensor 18 (e.g. a 3D image sensor) and a computer system 100 with at least one computer processor 122 (FIG. 7). The projector 10 and image sensor 18 may be in communication with the computer system 100. The projector 10 may be a projector housed within a camera such as an intraoral camera 32. Alternatively, the projector 10 may be a separate projector such as a digital light projector outside the intra-oral camera 32. Projectors 10 may work on a principle of filtering a light source 12 based on an image to be displayed. A lens may then be used to transfer the image to a surface on which the image may be displayed. Different kinds of projectors may be used, including Digital Light Processing (DLP) projectors which may be based on Digital Micromirror Device (DMD) technology wherein an array of microscopic mirrors may be configured to tilt either toward the light source 12 in the projector 10 or away from it in order to create a light or dark pixel on a projection surface. Other kinds of projectors may include Light Emitting Diode (LED) projectors, Laser projectors, Liquid Crystal on Silicon (LCoS) projectors and Liquid Crystal Display (LCD) projectors. One or more projectors 10 may be used for projecting one or more projection images 48 (FIG. 5) on a surface such as a tooth surface 28 (FIG. 4) and may be constructed and operated in accordance with at least one exemplary embodiment herein. 
The projector may illuminate a 3D measuring field as well as surfaces outside the measuring field. Herein, information about already acquired/scanned surfaces, surfaces not yet acquired as well as other 3D measurement information (such as scan-body registration) may be visually superimposed by using a projected pattern or color onto surfaces outside the measuring field. In this way the user can control the scan process while looking into the mouth of the patient. Said projection may be dynamic, wherein an illumination beam 20 from the projector 10 may be controlled to produce said one or more projection images 48 at preferably predetermined time intervals and wherein at least a part of the illumination beam 20 may be reflected into a monitoring beam 50 for 3D measurement. [P-25] Kucharczyk fails to teach that the communication hub comprises an illuminance sensor which senses the light, and wherein the illuminance sensor is configured to sense at least one of a brightness of the light or a color change of the light to cause the wireless scanner and the communication hub to be paired with each other. Lee on the other hand teaches that the communication hub comprises an illuminance sensor which senses the light, and wherein the illuminance sensor is configured to sense at least one of a brightness of the light or a color change of the light to cause the wireless scanner and the communication hub to be paired with each other (Paragraphs 4, 12), i.e.
a device having a display; n terminals; and a server configured to connect the device having a display with the n terminals, wherein the server provides an ID to the device having a display according to an access request from the device, wherein the device having a display calculates a brightness value change pattern corresponding to the ID, and outputs the pattern to the display, and wherein each of the n terminals detects the pattern output to the display using a proximity sensor and a light sensor, then recognizes the ID, transmits the ID to the server, and requests for access. [P-4] the purpose of the present invention, as embodied and broadly described herein, there is also provided a mobile terminal, including: a proximity sensor configured to sense a proximity degree with a device having a display; a light sensor configured to sense a brightness value change pattern output to the display; and a controller configured to recognize an ID corresponding to the sensed brightness value change pattern, to transmit the recognized ID to a server, and to request for access, wherein the device having a display requests to the server for access to thus be provided with the ID, calculates a brightness value change pattern corresponding to the ID, and outputs the pattern to the display, and wherein the server connects the device having a display with the mobile terminal according to the access request from the controller. [P-12] In other words, after the sensor senses a brightness, the ID corresponding to that brightness is transmitted to a communication pairing hub, which thereafter sends the ID corresponding to the brightness to the display device such that the brightness is accounted for. It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to combine Lee’s teaching with Kucharczyk’s teaching in order to optimize the transmission and display of the scanned image’s brightness configuration.
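Lee’s brightness-based pairing, as summarized above, amounts to an encode/decode exchange: the display encodes a server-issued ID as a sequence of bright and dark frames, and the terminal’s light sensor decodes that sequence back into the ID before requesting access. The sketch below is purely illustrative; the function names, the 8-bit ID width, and the decision threshold are assumptions for this example and are not taken from Lee or Kucharczyk.

```python
# Hypothetical sketch of Lee's brightness-value-change-pattern pairing.
# All names and parameters here are illustrative assumptions.

ID_BITS = 8  # assumed ID width for this sketch

def encode_id_as_brightness(device_id: int) -> list[int]:
    """Map each bit of the ID to a display brightness (bright=255, dark=0),
    most-significant bit first."""
    return [255 if (device_id >> bit) & 1 else 0
            for bit in reversed(range(ID_BITS))]

def decode_brightness(samples: list[int], threshold: int = 128) -> int:
    """Recover the ID from sensed brightness samples by thresholding
    each sample back into a bit."""
    device_id = 0
    for sample in samples:
        device_id = (device_id << 1) | (1 if sample >= threshold else 0)
    return device_id

# Terminal side: sense the pattern, recover the ID, then (in Lee's scheme)
# transmit the ID to the server to request the connection.
pattern = encode_id_as_brightness(0xA7)
assert decode_brightness(pattern) == 0xA7
```

A real light sensor would see noisy, ambient-light-shifted values rather than clean 0/255 levels, which is why the decode step thresholds each sample instead of comparing exact values.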
In regards to claim 13, Kucharczyk modified teaches the pairer is a light projector configured to emit a predetermined light pattern from inside of the wireless scanner to outside of the wireless scanner (Paragraph 32; Figure 6, Kucharczyk), i.e. The invention may include producing and updating a 3D model through a scanning process by a user moving an intraoral camera head 30 over a location (e.g. a tooth surface 28). The scan may begin as shown in Step S100 wherein a memory of a computer system 100 may store a volume of the intra-oral cavity 52 as an empty volume which may be subsequently updated as scanning progresses. After placing the intra-oral camera head 30 over the tooth surface in Step S200, a projector 10 of the intra-oral camera 32 may project an initial default projection image over a region of the intra-oral cavity 52, to show that image data have not been acquired yet for processing. [P-32] wherein the communication hub comprises a pattern sensor which senses the light pattern wherein the pattern sensor is configured to sense the light pattern to cause the wireless scanner and the communication hub to be paired with each other (Paragraph 35; Figure 6), i.e. As shown in Step 300, the inner rays 24 of the illumination beam may be substantially reflected off the current measurement surface 54 towards an image sensor 18 of the intra-oral camera 32 for further processing. Herein, 3D coordinates of the current measurement surface 54 may be extracted in Step S400 to determine if all or part of the current measurement surface 54 has been previously registered (Step S500). If the current measurement surface 54 or any portion of the current measurement surface has not been previously acquired, the corresponding new 3D data (e.g.
xyz coordinates) of unregistered portions may be stored by determining if the new 3D data and/or accumulated 3D data are sufficient according to one or more predetermined recording criteria/conditions (Step S600) said predetermined recording criteria including for example, whether or not the new 3D data and/or accumulated 3D data have a desired predetermined resolution, predetermined noise level, predetermined 3D point density, and/or inconsistencies between individual optical 3D measurements. If the predetermined recording criteria is satisfied, the new 3D data may be stored in an image stack for post processing or the 3D data may be used to create a real-time 3D reconstruction of the intraoral cavity as shown in Step S800. The outer region 42 of the projection image 48 corresponding to the new 3D data may then be updated in Step S1000 to relay to the user when the intra-oral camera 32 camera position changes to a new measurement surface that the previous measurement surface has been sufficiently recorded according to predetermined recording criteria. Herein, a need to constantly look at a monitor to track the progress of an ongoing intra-oral scan and make necessary adjustments may be eliminated or substantially eliminated as a visual feedback for the user may be shown in the vicinity of the intra-oral surfaces being scanned. [P-35] Kucharczyk teaches the communication hub comprises a pattern sensor which senses the light pattern wherein the pattern sensor is configured to sense the light pattern to cause the wireless scanner and the communication hub to be paired with each other; the light pattern used in the scanning procedure by the intraoral camera is received, processed, and paired with the projector, such that the projector may adequately superimpose an image based on the data acquired from the scanning procedure. Claim(s) 14 is/are rejected under 35 U.S.C. 103 as being unpatentable over Kucharczyk et al.
(US 20220133445 A1) in view of Troy et al. (US 20160318625 A1), Holoubek (US 20050273356 A1) and Kobayashi et al. (US 20190191132 A1) as applied to claim 11 above, and further in view of Mincher (US 8913955 B1). In regards to claim 14, Kucharczyk modified teaches the pairer is a magnetic unit configured to generate a predetermined magnetic field, wherein one of the wireless scanner and the communication hub comprises a magnetic sensor configured to sense the magnetic field generated from the magnetic unit (Paragraphs 25, 55, Kucharczyk), i.e. A shape of the inner region 44 and/or outer region 42 may be pre-determined, for example, square, rectangle, circle etc. In an exemplary embodiment of the present invention as shown in FIG. 2, an intraoral camera 32 may be electrically paired with a separate projector 10a with the projector 10a projecting at least a portion of projection image 48 on a tooth surface 28 for user feedback. The intra-oral camera 32 itself may also have a projector 10 for projecting at least another portion of the projection image 48 for 3D measurement. Said pairing may be achieved by using a common controller such as a computer processor 122 to simultaneously control illumination of the tooth surface 28 by the illumination beam 20 from the projector 10a and intra-oral camera 32 and recording of the reflected monitoring beam 50. [P-25] The computer system 100 also may include a communications interface 146 that enables software and data to be transferred between the computer system 100 and external devices. Such an interface may include a modem, a network interface (e.g., an Ethernet card or an IEEE 802.11 wireless LAN interface), a communications port (e.g., a Universal Serial Bus (“USB”) port or a FireWire® port), a Personal Computer Memory Card International Association (“PCMCIA”) interface, Bluetooth®, and the like.
Software and data transferred via the communications interface 146 may be in the form of signals, which may be electronic, electromagnetic, optical or another type of signal that may be capable of being transmitted and/or received by the communications interface 146. Signals may be provided to the communications interface 146 via a communications path 148 (e.g., a channel). The communications path 148 may carry signals and may be implemented using wire or cable, fiber optics, a telephone line, a cellular link, a radio-frequency (“RF”) link, or the like. The communications interface 146 may be used to transfer software or data or other information between the computer system 100 and a remote server or cloud-based storage. [P-55] Kucharczyk teaches the pairing of the scanner device and projector being done by a controller/processing device which communicates between the components using electromagnetic methods including RF frequencies. However, Kucharczyk fails to specifically disclose the magnetic sensor causes the wireless scanner and the communication hub to be paired with each other on the basis of a change in the magnetic field generated by the magnetic unit. Mincher on the other hand teaches a pairing method of devices, such that when the two devices are brought together in complementary fashion, the magnetic field from the first device is detected by the magnetic sensor in the second device, and vice versa. Upon detection of the magnetic field by the magnetic sensor, the pairing process is initiated within each of the devices. In other implementations, other techniques may be used to determine proximity and initiate the pairing process. Infrared signals, acoustic signals, capacitive proximity sensors, radio frequency identification tags, near field communication devices, and so forth may be used to determine proximity of the two devices to one another.
Instead of a user pressing various buttons to start the pairing process, the user places the devices near one another, such as placing the second device atop the first device. The first and second devices may have physically complementary shapes such that at least a portion of one may rest or nest within another. For example, the first device may have a declivity or recess within which the second device may sit. The magnets and magnetic sensors may be arranged such that when placed in the recess, the magnetic sensors detect the corresponding magnetic field from a magnet in the other device and, based on that detection, may initiate the pairing process. (Column 2, lines 1-29). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the teaching of Mincher with Kucharczyk modified’s teaching in order to utilize an inexpensive magnetic pairing method such that the magnetic sensor causes the wireless scanner and the communication hub to be paired with each other on the basis of a change in the magnetic field generated by the magnetic unit.
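The proximity-triggered pairing Mincher describes reduces to a threshold check on a magnetic-field reading: when the sensed field deviates from its baseline by more than some trigger level, the device begins the pairing handshake. The minimal sketch below is an illustration only; the sensor values, the trigger threshold, and the callback name are all assumptions rather than anything disclosed in Mincher or Kucharczyk.

```python
# Hypothetical sketch of magnetic-field-triggered pairing initiation.
# The threshold and all names are illustrative assumptions.

FIELD_THRESHOLD = 0.5  # arbitrary units; assumed trigger level

def monitor_field(readings, baseline: float, on_detect) -> bool:
    """Call on_detect() once when the sensed field deviates from the
    baseline by at least FIELD_THRESHOLD; return whether it triggered."""
    for value in readings:
        if abs(value - baseline) >= FIELD_THRESHOLD:
            on_detect()
            return True
    return False

paired = []
def start_pairing():
    # In a real device this would begin the wireless handshake in which
    # the scanner and hub exchange their unique information.
    paired.append(True)

# Device brought into proximity: the field jumps from ~0.1 to ~1.2.
triggered = monitor_field([0.1, 0.12, 1.2], baseline=0.1, on_detect=start_pairing)
assert triggered and paired
```

Gating the handshake on a field *change* rather than an absolute reading is what lets the same logic tolerate ambient magnetic background, which matches the claim language about pairing "on the basis of a change in the magnetic field."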
Response to Arguments The applicant has amended independent claim 1 to incorporate the subject matter of previous claim 8, such that the claim now reads, “A wireless scanning system comprising: a wireless scanner configured to scan an object to acquire image data; a communication hub configured to receive the image data transmitted from the wireless scanner; a processing device which is connected to the communication hub and is configured to display the image data received by the communication hub; and a pairer which is disposed in at least one of the wireless scanner or the communication hub and is configured to connect the wireless scanner and the communication hub to each other, wherein the wireless scanner includes a light projector configured, after a connection between the wireless scanner and the communication hub is established, to emit red light for one unit time, green light for one unit time, and blue light for N unit times, where N is an integer equal to or greater than 2, wherein the wireless scanner and the communication hub further comprise communication modules for data communication, respectively, and wherein the communication modules are configured to exchange unique information of the wireless scanner and unique information of the communication hub with each other to cause the wireless scanner and the communication hub to be paired with each other.” Furthermore, the applicant argues that Troy teaches acquiring unique IDs and associating them, and respectfully submits that Troy is fundamentally different from the claimed invention in terms of configuration, operation, and field of endeavor. Specifically, Troy discloses a method for pairing aircraft control units with objects using a separate "hand-held scanner".
Specifically, Troy teaches placing a hand-held scanner in close proximity to a "first machine-readable tag" on a control unit to acquire a first ID, and then placing the hand-held scanner in close proximity to a "second machine-readable tag" on a selected object to acquire a second ID. The system then associates these IDs. However, the applicant’s claim establishes no specific field of endeavor, in either the preamble or the body of the claim, that would distinguish the claimed scanning system from the aircraft scanning system described in Troy’s disclosure. In addition, the applicant’s claim language fails to describe the structural components of the scanning system in relation to each other, as the applicant’s Figures 1A-C simply illustrate a wireless scanner 10 component separate from a communication hub 20 and a processing device 30, all making up “a scanning system”. Thereby, one of ordinary skill in the art may interpret a hand-held scanner that scans and securely pairs with a tagged communication device/object through the exchange of unique information as a scanner pairing with a tagged device/object making up an overall scanning system. Furthermore, when combined with Kucharczyk’s teaching of a pairer which is disposed in at least one of the wireless scanner or the communication hub and is configured to connect the wireless scanner and the communication hub to each other [P-25], one of ordinary skill in the art would have found it obvious to substitute Troy’s pairing method into Kucharczyk’s system, thereby providing a more secure method of pairing the scanner and the communication hub. Conclusion Any inquiry concerning this communication or earlier communications from the examiner should be directed to ANTHONY D AFRIFA-KYEI whose telephone number is (571)270-7826. The examiner can normally be reached Monday-Friday 10am-7pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, BRIAN ZIMMERMAN can be reached at 571-272-3059. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /ANTHONY D AFRIFA-KYEI/ Examiner, Art Unit 2686 /BRIAN A ZIMMERMAN/ Supervisory Patent Examiner, Art Unit 2686

Prosecution Timeline

Jan 19, 2024
Application Filed
May 30, 2025
Non-Final Rejection — §103
Sep 02, 2025
Response Filed
Sep 30, 2025
Final Rejection — §103
Dec 22, 2025
Request for Continued Examination
Jan 13, 2026
Response after Non-Final Action
Jan 20, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12582360
MEANS TO ACCURATELY PREDICT, ALARM AND HENCE AVOID SPORT INJURIES AND METHODS THEREOF
2y 5m to grant Granted Mar 24, 2026
Patent 12583387
VEHICLE IMAGE RECOGNITION MODULE
2y 5m to grant Granted Mar 24, 2026
Patent 12578251
Engine Health Monitoring System
2y 5m to grant Granted Mar 17, 2026
Patent 12555663
MEDICAL DEVICE AUDIBLE AND VISUAL ALARM SYNCHRONIZATION
2y 5m to grant Granted Feb 17, 2026
Patent 12545179
SYSTEMS, APPARATUSES, AND METHODS FOR ROADSIDE SAFETY EMERGENCIES
2y 5m to grant Granted Feb 10, 2026
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
65%
Grant Probability
78%
With Interview (+13.5%)
3y 0m
Median Time to Grant
High
PTA Risk
Based on 546 resolved cases by this examiner. Grant probability derived from career allow rate.
