DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 102
2. The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale or otherwise available to the public before the effective filing date of the claimed invention.
3. Claims 1, 12, and 22 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Li et al. (US 2005/0231691 A1, hereinafter referred to as “Li”).
Regarding claim 1, Li discloses a control method of a projection device (¶0001 discloses this invention relates generally to image projectors, and more particularly, to modifying projected images), comprising the following steps:
transmitting a first original command by a terminal device (¶0044 and ¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu using the remote control unit);
projecting an adjustment image (Fig. 12) by the projection device in response to the first original command corresponding to an image correction operation of the projection device (¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu. The keystone adjustment pattern is then projected), wherein the adjustment image comprises at least one pattern array (¶0042 discloses keystone adjustment pattern that is projected) and at least one adjustment reference point (¶0042 and ¶0045 discloses the keystone adjustment pattern includes corners of the pattern that is to be adjusted);
transmitting a second original command by the terminal device (Fig. 13 and ¶0045 discloses the user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit); and
adjusting a position of the at least one adjustment reference point of the adjustment image by the projection device in response to the second original command corresponding to adjusting the position of the at least one adjustment reference point of the adjustment image (Fig. 13 and ¶0045 discloses as the user presses the SELECT button, the four-arrow cluster appears on one particular corner signaling the fact that the corner is selected for adjustment. The user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit), wherein the at least one adjustment reference point is located in the corresponding at least one pattern array (¶0042 and ¶0045 discloses the keystone adjustment pattern includes corners of the pattern that is to be adjusted).
Regarding claim 12, Li discloses a projection system (¶0001 discloses this invention relates generally to image projectors, and more particularly, to modifying projected images), comprising:
a terminal device, configured to transmit a first original command (¶0044 and ¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu using the remote control unit) and a second original command (Fig. 13 and ¶0045 discloses the user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit); and
a projection device, coupled to the terminal device (¶0044 and ¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu using the remote control unit);
wherein the projection device is configured to project an adjustment image (Fig. 12) in response to the first original command corresponding to an image correction operation of the projection device (¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu. The keystone adjustment pattern is then projected), wherein the adjustment image comprises at least one pattern array (¶0042 discloses keystone adjustment pattern that is projected) and at least one adjustment reference point (¶0042 and ¶0045 discloses the keystone adjustment pattern includes corners of the pattern that is to be adjusted); and
the projection device is configured to adjust a position of the at least one adjustment reference point of the adjustment image in response to the second original command corresponding to adjusting the position of the at least one adjustment reference point of the adjustment image (Fig. 13 and ¶0045 discloses as the user presses the SELECT button, the four-arrow cluster appears on one particular corner signaling the fact that the corner is selected for adjustment. The user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit), wherein the at least one adjustment reference point is located in the corresponding at least one pattern array (¶0042 and ¶0045 discloses the keystone adjustment pattern includes corners of the pattern that is to be adjusted).
Regarding claim 22, Li discloses a projection device (¶0001 discloses this invention relates generally to image projectors), comprising:
a projection module (projector optics 24), configured to project an adjustment image (Figs. 1, 3, 12);
a communication interface (¶0044 discloses remote control unit of the projector implying a communication between the projector and the remote), configured to receive a first original command (¶0044 and ¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu using the remote control unit) and a second original command (Fig. 13 and ¶0045 discloses as the user presses the SELECT button, the four-arrow cluster appears on one particular corner signaling the fact that the corner is selected for adjustment. The user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit); and
a processor (¶0033 discloses projector control system 22), coupled to the projection module (projector optics 24) and the communication interface (¶0044 and ¶0045 discloses the keystone adjustment pattern is then projected in response to operation of the remote control), wherein the processor is configured to:
control the projection module to project the adjustment image (Fig. 12) in response to the first original command corresponding to an image correction operation of the projection device (¶0044 and ¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu using the remote control unit), wherein the adjustment image comprises at least one pattern array (¶0042 discloses keystone adjustment pattern that is projected) and at least one adjustment reference point (¶0042 and ¶0045 discloses the keystone adjustment pattern includes corners of the pattern that is to be adjusted); and
control the projection module to adjust a position of the at least one adjustment reference point of the adjustment image in response to the second original command corresponding to adjusting the position of the at least one adjustment reference point of the adjustment image (Fig. 13 and ¶0045 discloses as the user presses the SELECT button, the four-arrow cluster appears on one particular corner signaling the fact that the corner is selected for adjustment. The user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit), wherein the at least one adjustment reference point is located in the corresponding at least one pattern array (¶0042 and ¶0045 discloses the keystone adjustment pattern includes corners of the pattern that is to be adjusted).
Claim Rejections - 35 USC § 103
4. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
5. Claims 2 and 13 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Tokuyama (US 2023/0015137 A1, hereinafter referred to as “Tokuyama”).
Regarding claim 2, Li does not disclose the control method of the projection device according to claim 1, wherein the at least one pattern array comprises at least one grid pattern array, and the at least one adjustment reference point is adjusted to any grid pattern of the at least one grid pattern array, or the at least one pattern array comprises at least one dot pattern array, and the at least one adjustment reference point is adjusted to any dot pattern of the at least one dot pattern array.
However, in the same field of endeavor, Tokuyama discloses wherein the at least one pattern array comprises at least one grid pattern array (Fig. 5D and the projection image 23 is divided into a grid), and the at least one adjustment reference point is adjusted to any grid pattern of the at least one grid pattern array (Fig. 5D and ¶0046 discloses the ‘point correction’ is geometric correction in which the projection image 23 is divided into a grid as shown in FIG. 5D and in which the user individually corrects each of the vertices and the division points on the projection image 23, as control points), or the at least one pattern array comprises at least one dot pattern array, and the at least one adjustment reference point is adjusted to any dot pattern of the at least one dot pattern array.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Tokuyama for the purpose of performing image correction on curved or irregular surfaces.
Regarding claim 13, this system claim recites limitations similar to those of method claim 2 and is therefore rejected on similar grounds.
6. Claims 3, 10, 14, 21, and 25 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Zhang (US 11,206,372 B1, hereinafter referred to as “Zhang”).
Regarding claim 3, Li discloses the control method of the projection device according to claim 1 further comprising the following steps: receiving the first original command (¶0044 and ¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu using the remote control unit), and receiving the first standard command to control the projection device to project the adjustment image (¶0045 discloses the user first enters the keystone adjustment mode by selecting the corresponding option on the on-screen display menu. The keystone adjustment pattern is then projected); receiving the second original command (Fig. 13 and ¶0045 discloses the user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit); and controlling the projection device to adjust the position of the at least one adjustment reference point of the adjustment image (Fig. 13 and ¶0045 discloses as the user presses the SELECT button, the four-arrow cluster appears on one particular corner signaling the fact that the corner is selected for adjustment. The user then selects any one of the four directions by pressing on the four corresponding directional arrows on the directional pad on the remote control unit).
Li does not disclose …inputting the first original command into a natural language model by a cloud server;… and inputting the second original command into the natural language model by the cloud server; wherein the step of projecting… by the projection device comprises: generating a first standard command according to the first original command by the natural language model, and receiving the first standard command to control the projection device… by the cloud server; wherein… the projection device comprises: generating a second standard command according to the second original command by the natural language model, and receiving the second standard command to control the projection device… by the cloud server.
However, in the same field of endeavor, Zhang discloses …inputting the first original command (col. 2, lines 17-18 discloses input the voice instruction to be analyzed to the cloud service system) into a natural language model by a cloud server (col. 14, lines 35-38 discloses the cloud has powerful computing capabilities and strong scalability, and has … Natural Language Processing (NLP) models);… and inputting the second original command (col. 2, lines 17-18 discloses input the voice instruction to be analyzed to the cloud service system) into the natural language model by the cloud server (col. 14, lines 35-38 discloses the cloud has powerful computing capabilities and strong scalability, and has … Natural Language Processing (NLP) models); wherein the step of projecting… by the projection device comprises: generating a first standard command according to the first original command by the natural language model (col. 14, lines 35-42 discloses the cloud has powerful computing capabilities and strong scalability, and has... Natural Language Processing (NLP) models…; in addition, it can update and optimize various parameters in real time, process the voice analysis and response in real time, and convert the results into executable commands and returned the same to the video conference device 10), and receiving the first standard command to control the projection device… by the cloud server (col. 14, lines 57-60 discloses cloud service system 20 may convert the analysis result into an executable command and feedback it to the projection processor 131, so that the projection processor 131 may perform an action matching the executable command); wherein… the projection device comprises: generating a second standard command according to the second original command by the natural language model (col. 14, lines 35-42 discloses the cloud has powerful computing capabilities and strong scalability, and has... Natural Language Processing (NLP) models…; in addition, it can update and optimize various parameters in real time, process the voice analysis and response in real time, and convert the results into executable commands and returned the same to the video conference device 10), and receiving the second standard command to control the projection device… by the cloud server (col. 14, lines 57-60 discloses cloud service system 20 may convert the analysis result into an executable command and feedback it to the projection processor 131, so that the projection processor 131 may perform an action matching the executable command).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Zhang for the purpose of performing hands-free operation during set-up and calibration of the image.
Regarding claim 10, Li discloses the control method of the projection device according to claim 1, wherein the step of adjusting the position of the at least one adjustment reference point of the adjustment image by the projection device further comprises: calculating and adjusting an adjustment angle (¶0006 discloses ‘Method for pre-compensating an asymmetrical picture in a projection system for displaying a picture,’ describes a system where the projection angle, and the trapezoidal error, is compensated for by the user entering positional information into the system via a keyboard) and a deformation amount of the adjustment image (¶0043 discloses the pattern will appear as warped rectangle 100 when the keystone effect is present, and will appear to be perfectly (or substantially) rectangular when the keystone effect is corrected)...
Li does not disclose …according to the second original command.
However, in the same field of endeavor, Zhang discloses …according to the second original command (col. 14, lines 35-42 discloses the cloud has powerful computing capabilities and strong scalability, and has... Natural Language Processing (NLP) models…; in addition, it can update and optimize various parameters in real time, process the voice analysis and response in real time, and convert the results into executable commands and returned the same to the video conference device 10).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Zhang for the purpose of performing hands-free operation during set-up and calibration of the image.
Regarding claim 14, this system claim recites limitations similar to those of method claim 3 and is therefore rejected on similar grounds.
Regarding claim 21, Li does not disclose the projection system according to claim 14, wherein the natural language model is stored in the cloud server or connected to the cloud server through a wireless network.
However, in the same field of endeavor, Zhang discloses wherein the natural language model is stored in the cloud server or connected to the cloud server through a wireless network (col. 14, lines 35-37 discloses the cloud has powerful computing capabilities and strong scalability, and has … Natural Language Processing (NLP) models).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Zhang for the purpose of performing hands-free operation during set-up and calibration of the image.
Regarding claim 25, Li discloses the projection device according to claim 22, wherein the processor (¶0033 discloses projector control system 22) is configured to: calculate and adjust an adjustment angle (¶0006 discloses ‘Method for pre-compensating an asymmetrical picture in a projection system for displaying a picture,’ describes a system where the projection angle, and the trapezoidal error, is compensated for by the user entering positional information into the system via a keyboard) and a deformation amount of the adjustment image (¶0043 discloses the pattern will appear as warped rectangle 100 when the keystone effect is present, and will appear to be perfectly (or substantially) rectangular when the keystone effect is corrected)...
Li does not disclose …according to the second original command.
However, in the same field of endeavor, Zhang discloses …according to the second original command (col. 14, lines 35-42 discloses the cloud has powerful computing capabilities and strong scalability, and has... Natural Language Processing (NLP) models…; in addition, it can update and optimize various parameters in real time, process the voice analysis and response in real time, and convert the results into executable commands and returned the same to the video conference device 10).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Zhang for the purpose of performing hands-free operation during set-up and calibration of the image.
7. Claims 4, 6-9, 15, and 17-20 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Zhang and further in view of Lin et al. (US 2020/0105258 A1, hereinafter referred to as “Lin”).
Regarding claim 4, Li as modified does not disclose the control method of the projection device according to claim 3, wherein the steps of transmitting the first original command by the terminal device and transmitting the second original command by the terminal device comprise: transmitting the first original command to the projection device by the terminal device, and transmitting the first original command to the cloud server by the projection device; and transmitting the second original command to the projection device by the terminal device, and transmitting the second original command to the cloud server by the projection device.
However, in the same field of endeavor, Lin discloses wherein the steps of transmitting the first original command by the terminal device and transmitting the second original command by the terminal device comprise: transmitting the first original command to the projection device by the terminal device (¶0084 discloses voice assistant 102a is integrated into the projector 106a and the projector 106a can thus integrally perform the operations originally respectively performed by the voice assistant 102a and the projector 106a of FIG. 1), and transmitting the first original command to the cloud server by the projection device (¶0085 discloses the voice assistant 102a extracts a plurality of first keywords from the voice signal AS1 and transmits the first keywords to the cloud service server 710); and transmitting the second original command to the projection device by the terminal device (¶0084 discloses voice assistant 102a is integrated into the projector 106a and the projector 106a can thus integrally perform the operations originally respectively performed by the voice assistant 102a and the projector 106a of FIG. 1), and transmitting the second original command to the cloud server by the projection device (¶0085 discloses the voice assistant 102a extracts a plurality of first keywords from the voice signal AS1 and transmits the first keywords to the cloud service server 710).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Li in view of Lin for the purpose of achieving a faster response during interactive operations.
Regarding claim 6, Li as modified does not disclose the control method of the projection device according to claim 3, further comprising the following steps: transmitting the first standard command to the terminal device by the cloud server, so that the terminal device controls the projection device according to the first standard command.
However, in the same field of endeavor, Lin discloses transmitting the first standard command to the terminal device by the cloud server (¶0084 discloses voice assistant 102a is integrated into the projector 106a and the projector 106a can thus integrally perform the operations originally respectively performed by the voice assistant 102a and the projector 106a of FIG. 1), so that the terminal device controls the projection device according to the first standard command (¶0047 discloses the management server 108 may access/control the projector 106a (it means that the management server 108 reads information of the projector 106a to control the projector 106a) in response to the first alias AL1 and adjust the projector 106a as a first operating state).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Li in view of Lin so that the projector can be optimized solely for the projection function without additional circuitry.
Regarding claim 7, Li as modified does not disclose the control method of the projection device according to claim 6, wherein the step of controlling the projection device according to the first standard command by the terminal device comprises: converting the first standard command into a first projector control code according to projection device information to control the projection device by the terminal device.
However, in the same field of endeavor, Lin discloses wherein the step of controlling the projection device according to the first standard command by the terminal device comprises: converting the first standard command into a first projector control code according to projection device information to control the projection device by the terminal device (¶0084 discloses voice assistant 102a is integrated into the projector 106a; ¶0045 discloses the cloud service platform 104a may transmit the first alias AL1 of the projector 106a and the second control command CMD2 corresponding to ‘increase brightness’ to the management server 108; ¶0047 discloses the management server 108 may access/control the projector 106a (it means that the management server 108 reads information of the projector 106a to control the projector 106a) in response to the first alias AL1 and adjust the projector 106a as a first operating state).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Li in view of Lin so that the projector can be optimized solely for the projection function without additional circuitry.
Regarding claim 8, Li as modified does not disclose the control method of the projection device according to claim 3, wherein the step of receiving the first standard command by the cloud server further comprises: converting the first standard command into a first projector control code according to projection device information by the cloud server.
However, in the same field of endeavor, Lin discloses wherein the step of receiving the first standard command by the cloud server further comprises: converting the first standard command into a first projector control code according to projection device information by the cloud server (¶0045 discloses the cloud service platform 104a may transmit the first alias AL1 of the projector 106a and the second control command CMD2 corresponding to ‘increase brightness’ to the management server 108; ¶0047 discloses the management server 108 may access/control the projector 106a (it means that the management server 108 reads information of the projector 106a to control the projector 106a) in response to the first alias AL1 and adjust the projector 106a as a first operating state).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Li in view of Lin so that manufacturer-specific control code can be executed on the projector.
Regarding claim 9, Li as modified does not disclose the control method of the projection device according to claim 8, wherein the step of controlling the projection device according to the first standard command further comprises: receiving the first projector control code from the cloud server by the projection device; or receiving the first projector control code from the cloud server and transmitting the first projector control code to the projection device by the terminal device.
However, in the same field of endeavor, Lin discloses wherein the step of controlling the projection device according to the first standard command further comprises: receiving the first projector control code from the cloud server by the projection device; or receiving the first projector control code from the cloud server and transmitting the first projector control code to the projection device by the terminal device (¶0084 discloses voice assistant 102a is integrated into the projector 106a; ¶0045 discloses the cloud service platform 104a may transmit the first alias AL1 of the projector 106a and the second control command CMD2 corresponding to ‘increase brightness’ to the management server 108; ¶0047 discloses the management server 108 may access/control the projector 106a (it means that the management server 108 reads information of the projector 106a to control the projector 106a) in response to the first alias AL1 and adjust the projector 106a as a first operating state).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Li in view of Lin so that the projector can be optimized solely for the projection function without additional circuitry.
Regarding claims 15 and 17-20, these system claims recite limitations similar to those of method claims 4 and 6-9, respectively, and are therefore rejected on similar grounds.
8. Claims 5 and 16 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Zhang and further in view of Bui et al. (US 2020/0160042 A1, hereinafter referred to as “Bui”).
Regarding claim 5, Li as modified does not disclose the control method of the projection device according to claim 3, wherein the steps of inputting the first original command and the second original command into the natural language model comprise: inputting the first original command and a rule command into the natural language model by the cloud server, so that the natural language model outputs the first standard command according to the first original command and the rule command; and inputting the second original command and the rule command into the natural language model by the cloud server, so that the natural language model outputs the second standard command according to the second original command and the rule command.
However, in a similar field of endeavor, Bui discloses wherein the steps of inputting the first original command and the second original command into the natural language model comprise: inputting the first original command and a rule command (¶0040 discloses the multimodal selection system can train a natural language processing neural network to determine a verbal command based on verbal input) into the natural language model by the cloud server (¶0126 discloses the natural language processing neural network 210, the dispatcher 212, and the computer vision neural networks 214-218 are maintained remotely from the client device 108 (e.g., in a cloud-based fashion on the server(s) 104)), so that the natural language model outputs the first standard command according to the first original command and the rule command (¶0040 discloses a verbal command includes an instruction from verbal input to perform or execute a particular function relative to one or more objects portrayed in a digital image); and inputting the second original command and the rule command (¶0040 discloses the multimodal selection system can train a natural language processing neural network to determine a verbal command based on verbal input) into the natural language model by the cloud server (¶0126 discloses the natural language processing neural network 210, the dispatcher 212, and the computer vision neural networks 214-218 are maintained remotely from the client device 108 (e.g., in a cloud-based fashion on the server(s) 104)), so that the natural language model outputs the second standard command according to the second original command and the rule command (¶0040 discloses a verbal command includes an instruction from verbal input to perform or execute a particular function relative to one or more objects portrayed in a digital image).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to further modify Li in view of Bui for the purpose of distinguishing device-specific intents from generic speech.
Regarding claim 16, this system claim recites limitations similar to those of method claim 5 and is therefore rejected on similar grounds.
9. Claims 11 and 26 are rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Kurota et al. (US 2019/0219907 A1, hereinafter referred to as “Kurota”).
Regarding claim 11, Li discloses the control method of the projection device according to claim 1, but does not disclose wherein the adjustment image further comprises an outer frame and an inner frame, the at least one pattern array is located between the outer frame and the inner frame, the at least one adjustment reference point is a vertex of the outer frame, and the step of adjusting the position of the at least one adjustment reference point of the adjustment image by the projection device further comprises: adjusting the adjustment image such that a correction frame of a projection target is located between the outer frame and the inner frame; or adjusting the adjustment image such that the outer frame overlaps the correction frame of the projection target.
However, in a similar field of endeavor, Kurota discloses wherein the adjustment image further comprises an outer frame and an inner frame (¶0110 discloses outer trapezoid 606 and the inner trapezoid 605), the at least one pattern array is located between the outer frame and the inner frame (¶0093 discloses the shape pattern 607 is displayed in the operation target area including the end point 602 selected as the correction target point), the at least one adjustment reference point is a vertex of the outer frame (¶0079 discloses correction target point refers to an end point at any of four corners of the area of the projection image on which four-point keystone correction is to be performed), and the step of adjusting the position of the at least one adjustment reference point of the adjustment image by the projection device further comprises: adjusting the adjustment image such that a correction frame of a projection target is located between the outer frame and the inner frame (¶0077 discloses hatched area between the outer trapezoid 606 and the inner trapezoid 605 is an area in which a video ceases to be displayed as a result of four-point keystone correction); or adjusting the adjustment image such that the outer frame overlaps the correction frame of the projection target.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Kurota for the purpose of visibly locating the true corrected boundary.
Regarding claim 26, Li discloses the projection device according to claim 22, but does not disclose wherein the adjustment image further comprises an outer frame and an inner frame, the at least one pattern array is located between the outer frame and the inner frame, the at least one adjustment reference point is a vertex of the outer frame, and the processor is configured to: control the projection module to adjust the adjustment image such that a correction frame of a projection target is located between the outer frame and the inner frame, or adjust the adjustment image such that the outer frame overlaps the correction frame of the projection target.
However, in a similar field of endeavor, Kurota discloses wherein the adjustment image further comprises an outer frame and an inner frame (¶0110 discloses outer trapezoid 606 and the inner trapezoid 605), the at least one pattern array is located between the outer frame and the inner frame (¶0093 discloses the shape pattern 607 is displayed in the operation target area including the end point 602 selected as the correction target point), the at least one adjustment reference point is a vertex of the outer frame (¶0079 discloses correction target point refers to an end point at any of four corners of the area of the projection image on which four-point keystone correction is to be performed), and the processor (¶0052 discloses image processor 140) is configured to: control the projection module to adjust the adjustment image such that a correction frame of a projection target is located between the outer frame and the inner frame (¶0077 discloses hatched area between the outer trapezoid 606 and the inner trapezoid 605 is an area in which a video ceases to be displayed as a result of four-point keystone correction), or adjust the adjustment image such that the outer frame overlaps the correction frame of the projection target.
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Kurota for the purpose of visibly locating the true corrected boundary.
10. Claim 23 is rejected under 35 U.S.C. 103 as being unpatentable over Li in view of Lin.
Regarding claim 23, Li discloses the projection device according to claim 22, but does not disclose wherein the communication interface is further configured to transmit the first original command and the second original command to a cloud server, and configured to receive a first projector control code and a second projector control code transmitted from the cloud server.
However, in the same field of endeavor, Lin discloses wherein the communication interface (¶0092 discloses the projector 106a may be provided with a wireless network medium and/or a wired network medium (e.g., a network card or a relevant dongle; it is not limited to transmission using Bluetooth, Wi-Fi, Zigbee, or another wireless transmission medium, and it is also not limited to transmission using optical fibers or another wired transmission interface)) is further configured to transmit the first original command and the second original command to a cloud server (¶0084 discloses voice assistant 102a is integrated into the projector 106a; ¶0085 discloses the voice assistant 102a extracts a plurality of first keywords from the voice signal AS1 and transmits the first keywords to the cloud service server 710), and configured to receive a first projector control code and a second projector control code transmitted from the cloud server (¶0044 discloses the cloud service platform 104a may analyze the first control command CMD1 according to the first semantic analyzing program, acquire/retrieve or generate the corresponding second control command CMD2 according to the first control command CMD1, and transmit the first alias AL1 of the projector 106a and the corresponding second control command CMD2 to the management server 108; and ¶0047 discloses the management server 108 may access/control the projector 106a (it means that the management server 108 reads information of the projector 106a to control the projector 106a) in response to the first alias AL1 and adjust the projector 106a as a first operating state).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Li in view of Lin for the purpose of achieving a faster response during interactive operations.
Allowable Subject Matter
11. Claim 24 is objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to PRIYANK J SHAH, whose telephone number is (571) 270-3732. The examiner can normally be reached Monday through Friday, 10:00 a.m. to 6:00 p.m.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, LunYi Lao, can be reached at (571) 272-7671. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/PRIYANK J SHAH/Primary Examiner, Art Unit 2621