DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d).
Status of Claims
Claims 1-10 are pending in this application.
Oath/Declaration
The receipt of Oath/Declaration is acknowledged.
Drawings
The receipt of Drawings is acknowledged.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claim(s) 1-10 is/are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Tsujimoto (US PG Pub. No. 2014/0355039 A1).
Referring to Claim 1, Tsujimoto teaches an image processing apparatus (See Tsujimoto, Figs. 1-2, Image Output System) comprising:
a processor (See Tsujimoto, Fig. 1, CPU of Control Unit 10) that causes the image processing apparatus to function as (See Tsujimoto, Sect. [0071], the CPU or the MPU of the control unit 10 loads and executes a control program previously stored in a ROM (not illustrated) into a RAM (not illustrated) to operate the multifunction device 1 as the image output device of the image output system):
a receiving unit (See Tsujimoto, Fig. 2, The image obtaining unit 33) configured to receive image data (See Tsujimoto, Sect. [0089], The image obtaining unit 33 may be used to obtain the image data.);
a generating unit (See Tsujimoto, Fig. 2, Control Unit 10) configured to generate a code image (See Tsujimoto, Sect. [0124] lines 4-6, the control unit 10 of the multifunction device 1 receives the instruction for generation of the code pattern from the user);
a printing unit (See Tsujimoto, Fig. 12, The image forming unit 13) configured to print, on a sheet, an image based on the received image data (See Tsujimoto, Sect. [0068], The image forming unit 13 is an image forming means based on an electro-photographic system, an inkjet system, a thermal transfer system or the like, and performs a printing processing (image output) based on image data that the image reading unit 12 read from a document or printing data that the communication unit 14 receives from outside through a network N. The image forming unit 13 forms images (characters/photographs/graphics) on a recording sheet, such as recording paper, based on the image data or the printing data.), and the generated code image (See Tsujimoto, Sect. [0096] lines 18-21, the code pattern may be displayed on the operation panel 11 (display unit 11b) of the multifunction device 1 and may be printed on paper so as to be attached onto an appropriate position of the multifunction device 1.);
a reading unit (See Tsujimoto, Fig. 2, Image reading unit 12) configured to read the printed image and the printed code image on the sheet to generate image data (See Tsujimoto, Fig. 2, Sect. [0067], The image reading unit 12 is, for example, a scanner including a charged coupled device (CCD) and reads characters, images and the like, which are printed on a document, as image data. Further, the image reading unit 12 may has a function of transferring sequentially each sheet of documents, which are placed on a predetermined document tray, up to positions read by the scanner.); and
a transmitting unit (See Tsujimoto, Fig. 2, Control Unit 20) configured to transmit the generated image data to cause an external apparatus to store the generated image data in a storage location which is based on the read code image (See Tsujimoto, Sect. [0102], When the multifunction device 1 is specified (YES in S9), the control unit 20 of the authentication server 2 reads the printing data stored in the electronic data DB 21a in association with the user (user ID) authenticated in step S3 by the storage processing unit 24. Further, the control unit 20 transmits the printing data read from the electronic data DB 21a to the multifunction device 1 specified in step S8 (S12), and instructs the multifunction device 1 to execute the printing processing. Thereby, the printing processing, which was instructed to be executed by the user using the information processing terminal 3, can execute.).
Referring to Claim 2, Tsujimoto teaches the image processing apparatus according to claim 1 (See Tsujimoto, Figs. 1-2, Image Output System), wherein the processor further causes the image processing apparatus to function as (See Tsujimoto, Sect. [0071], the CPU or the MPU of the control unit 10 loads and executes a control program previously stored in a ROM (not illustrated) into a RAM (not illustrated) to operate the multifunction device 1 as the image output device of the image output system):
an analyzing unit (See Tsujimoto, Fig. 2, Control Unit 10) configured to analyze the read code image (See Tsujimoto, Sect. [0124], the control unit 10 of the multifunction device 1 receives the instruction for generation of the code pattern from the user, the control unit 10 requests the generation of the code pattern to the authentication server 2, and obtains the generated code pattern from the authentication server 2. Since the authentication server 2 generates the code pattern, it is possible to manage the code pattern of each multifunction device 1 at the authentication server 2 at one time, and the multifunction device 1 needs not to have the function of generating the code pattern. Further, when the code pattern are generated using the information (information of the multifunction device) which is likely to be appropriately changed, the user can request the multifunction device 1 to execute the printing processing using the latest code pattern.), and
wherein the transmitting unit transmits the generated image data to cause the external apparatus to store the generated image data in the storage location obtained by analyzing the read code image (See Tsujimoto, Sect. [0132], the control unit 10 codes the predetermined information (multifunction device ID) of the multifunction device and the received printing conditions to generate the code pattern (S33). Further, the control unit 10 displays the generated code pattern on the operation panel 11 (S34), and ends the processing. Further, the predetermined information of the multifunction device, that is, the information for each multifunction device 1 is stored in the storage unit (not illustrated) in advance.).
Referring to Claim 3, Tsujimoto teaches the image processing apparatus according to claim 1 (See Tsujimoto, Figs. 1-2, Image Output System), wherein the processor further causes the image processing apparatus to function as (See Tsujimoto, Sect. [0071], the CPU or the MPU of the control unit 10 loads and executes a control program previously stored in a ROM (not illustrated) into a RAM (not illustrated) to operate the multifunction device 1 as the image output device of the image output system):
an input unit (See Tsujimoto, Fig. 2, input unit 11a) configured to input connection information for connecting to the external apparatus (See Tsujimoto, Sect. [0066] lines 3-6 and Sect. [0069], The input unit 11a includes various input keys required to operate the multifunction device 1 by a user and receives information based on the input key operated by the user and transmits the received information to the control unit 10 for communicating with the communication unit 14 which includes a network card, a modem and the like, and can be connected to the communication network N such as a public-line network, a local area network (LAN), and the Internet, and communicates with external devices, such as the authentication server 2 and the information processing terminal 3 through the network N.).
Referring to Claim 4, Tsujimoto teaches the image processing apparatus according to claim 3 (See Tsujimoto, Figs. 1-2, Image Output System), wherein the input unit is an operation panel (See Tsujimoto, Sect. [0066] lines 1-6, The operation panel 11 is a touch panel in which an input unit 11a and a display unit 11b are integrally configured. The input unit 11a includes various input keys required to operate the multifunction device 1 by a user and receives information based on the input key operated by the user and transmits the received information to the control unit 10.).
Referring to Claim 5, Tsujimoto teaches the image processing apparatus according to claim 3 (See Tsujimoto, Figs. 1-2, Image Output System), wherein the connection information is an IP address of the external apparatus (See Tsujimoto, Sect. [0029] lines 7-12, network identification information of the IP address of the image output device is managed by the authentication device side, when the processing request of the image output device is transmitted from an external device (device to be used by the user) to the authentication device).
Referring to Claim 6, Tsujimoto teaches the image processing apparatus according to claim 5 (See Tsujimoto, Figs. 1-2, Image Output System), wherein the IP address is obtained based on the read code image (See Tsujimoto, Sect. [0103] lines 1-7, when the destination information (IP address) of the multifunction device 1 is, for example, included in the code pattern (information of the multifunction device) photographed by the image obtaining unit 33 of the information processing terminal 3, the authentication server 2 can obtain the destination of the multifunction device 1 from the code pattern.).
Referring to Claim 7, Tsujimoto teaches the image processing apparatus according to claim 1 (See Tsujimoto, Figs. 1-2, Image Output System), wherein the image is an image for an assignment (See Tsujimoto, Fig. 10, Sect. [0121], The printing request screen illustrated on the lower side of FIG. 10 displays file names, registered dates for each of the plurality of printing data previously registered in the authentication server 2 by the logged-in user, and check boxes for selecting each printing data, in association with one another. Further, the check box may be configured to select the plurality of printing data. Further, the printing request screen illustrated on the lower side of FIG. 10 displays a description of the fact that the user photographs the pattern displayed on the multifunction device 1 by using the image obtaining unit 33 of the information processing terminal 3, the photographing button, and the print button.).
Referring to Claim 8, Tsujimoto teaches the image processing apparatus according to claim 1 (See Tsujimoto, Figs. 1-2, Image Output System), wherein the code image is generated based on information which is different from the received image data (See Tsujimoto, Sect. [0128], the user who desires to use the multifunction device 1 operates the operation panel 11 of the multifunction device 1 to set the desired printing conditions, and operates, for example, a `code pattern generating` button displayed on the operation panel 11. When the code pattern generating button is operated, the control unit 10 of the multifunction device 1 codes the predetermined information of the multifunction device and the printing conditions set by the user, and displays the generated code pattern on the operation panel 11 (display unit 11b) to notify (present) the user. The printing conditions are the setting conditions relating to the processing which can be executed by the multifunction device 1, such as execution of N-up printing, execution of color printing, and execution of black and white printing.).
Referring to Claim 9, arguments analogous to the rejection of claim 1 are applicable herein. The structural elements of “An image processing apparatus” in claim 1 perform all of the operations of “A control method” in claim 9. Thus, “A control method” in claim 9 is rejected for the reasons set forth in the rejection of claim 1.
Referring to Claim 10, arguments analogous to the rejection of claim 1 are applicable herein. Thus, the non-transitory computer-readable storage medium of claim 10 is explicitly/inherently taught as evidenced by (See Nakajima, Fig. 1, HDD 160, Sect. [0049] and [0199], the HDD 160 as an example of a large-capacity and non-volatile storage device. Alternatively, a non-volatile memory such as a solid-state drive (SSD) may be used as long as the non-volatile memory is a large-capacity and non-volatile storage device… Additionally, the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g. non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) of the present invention, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processors. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.) and the various memories stored therein.
Cited Art
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Nakajima et al. (US PG Pub. No. 2016/0105582 A1) discloses the following: If a storage period of data held in a holding unit elapses and the data is deleted from the holding unit after a user logs out of a job processing apparatus, the user cannot easily recognize that there is deleted data. A method for controlling the job processing apparatus includes holding in the holding unit a job associated with a user, storing in a first storing unit identification information for identifying a user associated with a job deleted from the holding unit, storing in a second storing unit identification information for identifying a user associated with a job of which holding in the holding unit has failed, notifying based on the identification information stored in the first storing unit a user that the job is deleted, and notifying based on the identification information stored in the second storing unit a user that the holding of the job has failed.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to DARRYL V DOTTIN whose telephone number is (571)270-5471. The examiner can normally be reached M-F 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Abderrahim Merouan can be reached on 571-270-5254. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/DARRYL V DOTTIN/Primary Examiner, Art Unit 2683