Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
DETAILED ACTION
Claim Rejections - 35 USC § 102
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1, 2, 6, 9-11, 15, and 18 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Cha (US 20180372332 A1), hereinafter Cha.
Regarding claim 1, Cha discloses a cooking apparatus comprising:
a plurality of burners for heating food (“The heating unit 280 may include a burner (a gas burner or an electric burner, not illustrated) and/or an induction coil (not illustrated) which heats a cooking container 1 placed on the upper plate 200a of the cook top 200 under the control of the controller 110” paragraph [0155]);
a communicator (“a communicator 120” paragraph [0066]); and
at least one processor (“The controller 110 may include the processor 111” paragraph [0069]) configured to:
based on information on the cooking apparatus being requested from an external apparatus through the communicator based on a streaming request for an image of the cooking apparatus photographed through a camera, control the communicator to transmit the information on the cooking apparatus including information on a burner that is being used among the plurality of burners to the external apparatus (“When the communicator 120 of the range hood 100 and the wireless router (not illustrated) are connected, the controller 110 may transmit a photographed (stored) video (or image) to the wireless router (not illustrated) via the communicator 120. The wireless router (not illustrated) may transmit the received video (or image) to a communicator (not illustrated) of the portable device 410. A controller (not illustrated) of the portable device 410 may receive a video (or image) which is transmitted from the wireless router (not illustrated)” paragraph [0220] and “Referring to (c) and (d) of FIG. 8B, the controller of the portable device 410 may display a video (or image, one of 474a and 474a′) which is photographed from the range hood 100 on the application screen 474 in response to receiving the user input 460e” paragraph [0263]).
Regarding claim 2, Cha discloses the cooking apparatus of claim 1, wherein the information on the burner that is being used comprises:
information on a location where the burner that is being used is located on the cooking apparatus (The information on location is provided visually as seen in Figure 8B).
Regarding claim 6, Cha discloses the cooking apparatus of claim 1, wherein the external apparatus is a server that can communicate with a display apparatus wherein the streaming request is initiated, a range hood including the camera, and the cooking apparatus (“when the communicator 120 of the range hood 100 is connected to a communicator (not illustrated) of a server 440, at least one of a video and image photographed by a camera of the range hood 100 may be received via the communicator (not illustrated) of the server 440” paragraph [0228]), and wherein the server is configured to:
transmit the information on the burner that is being used that was received from the cooking apparatus to the range hood, and transmit a streaming image of the cooking apparatus received from the range hood and the information on the burner that is being used to the display apparatus (Figure 2C).
Regarding claim 9, Cha discloses the cooking apparatus of claim 2, wherein the information on the burner that is being used further comprises:
at least one of identification information of the burner that is being used, information on an operation level of the burner that is being used, or information on a time when the burner that is being used started operating (Visual identification information as in Figure 8B).
Regarding claim 10, Cha discloses a method of controlling a cooking apparatus comprising a plurality of burners for heating food (“The heating unit 280 may include a burner (a gas burner or an electric burner, not illustrated) and/or an induction coil (not illustrated) which heats a cooking container 1 placed on the upper plate 200a of the cook top 200 under the control of the controller 110” paragraph [0155]), the method comprising:
based on a streaming request for an image of the cooking apparatus photographed through a camera, receiving a request for information on the cooking apparatus from an external apparatus (“Referring to (c) and (d) of FIG. 8B, the controller of the portable device 410 may display a video (or image, one of 474a and 474a′) which is photographed from the range hood 100 on the application screen 474 in response to receiving the user input 460e” paragraph [0263]); and
transmitting the information on the cooking apparatus including information on a burner that is being used among the plurality of burners to the external apparatus in response to the request for information (“When the communicator 120 of the range hood 100 and the wireless router (not illustrated) are connected, the controller 110 may transmit a photographed (stored) video (or image) to the wireless router (not illustrated) via the communicator 120. The wireless router (not illustrated) may transmit the received video (or image) to a communicator (not illustrated) of the portable device 410. A controller (not illustrated) of the portable device 410 may receive a video (or image) which is transmitted from the wireless router (not illustrated)” paragraph [0220]).
Regarding claim 11, Cha discloses the method of claim 10, wherein the information on the burner that is being used comprises:
information on a location where the burner that is being used is located on the cooking apparatus (The information on location is provided visually as seen in Figure 8B).
Regarding claim 15, Cha discloses the method of claim 10, wherein the external apparatus is a server that can communicate with a display apparatus wherein the streaming request is initiated, a range hood including the camera, and the cooking apparatus (“when the communicator 120 of the range hood 100 is connected to a communicator (not illustrated) of a server 440, at least one of a video and image photographed by a camera of the range hood 100 may be received via the communicator (not illustrated) of the server 440” paragraph [0228]), and wherein the server is configured to:
transmit the information on the burner that is being used that was received from the cooking apparatus to the range hood, and transmit a streaming image of the cooking apparatus received from the range hood and the information on the burner that is being used to the display apparatus (Figure 2C).
Regarding claim 18, Cha discloses the method of claim 11, wherein the information on the burner that is being used further comprises:
at least one of identification information of the burner that is being used, information on an operation level of the burner that is being used, or information on a time when the burner that is being used started operating (Visual identification information as in Figure 8B).
Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 3-5 and 12-14 are rejected under 35 U.S.C. 103 as being unpatentable over Cha, in view of Kobayashi (US 20080013787 A1), hereinafter Kobayashi.
Regarding claims 3-5, Cha discloses the cooking apparatus of claim 2.
Cha does not disclose wherein the information on the location where the burner that is being used is located comprises:
information on coordinates of vertexes of a quadrangle surrounding the burner that is being used among a plurality of quadrangles respectively surrounding the plurality of burners;
wherein the at least one processor is further configured to:
based on an entire horizontal length and an entire vertical length of the cooking apparatus, a horizontal length and a vertical length of the quadrangle surrounding the burner that is being used, and a distance between the vertexes of the quadrangle surrounding the burner that is being used and the vertexes of the cooking apparatus, calculate the coordinates of a left upper vertex and a right lower vertex of the burner that is being used;
wherein the at least one processor is further configured to:
based on receiving the information on the coordinates wherein the cooking apparatus is located in an entire image photographed through the camera from the external apparatus through the communicator, calculate the coordinates of the vertexes of the quadrangle surrounding the burner that is being used based on the information on the coordinates wherein the cooking apparatus is located.
However, Kobayashi teaches wherein the information on the location where the subject is located comprises:
information on coordinates of vertexes of a quadrangle surrounding the subject among a plurality of quadrangles respectively surrounding the plurality of subjects;
wherein the at least one processor is further configured to:
based on an entire horizontal length and an entire vertical length of the frame, a horizontal length and a vertical length of the quadrangle surrounding the subject, and a distance between the vertexes of the quadrangle surrounding the subject and the vertexes of the frame, calculate the coordinates of a left upper vertex and a right lower vertex of the subject;
wherein the at least one processor is further configured to:
based on receiving the information on the coordinates wherein the frame is located in an entire image photographed through the camera from the external apparatus through the communicator, calculate the coordinates of the vertexes of the quadrangle surrounding the subject based on the information on the coordinates wherein the frame is located (“FIG. 3 shows an example of an image in which the face detector 30 detects human faces. The face detector 30 confines face areas A1, A2 and A3 of the detected human faces with rectangles whose four sides are parallel to four sides of a rectangular image frame G respectively. The face area data represents coordinates of two diagonal vertices of each rectangle in a coordinate system whose origin is located at an appropriate point in the image frame G and whose axes are parallel to horizontal and vertical lines of the image frame G. For example, the face area A1 is represented by coordinates (X11, Y11) and coordinates (X12, Y12), the face area A2 is represented by coordinates (X21, Y21) and coordinates (X22, Y22), and the face area A3 is represented by coordinates (X31, Y31) and coordinates (X32, Y32)” paragraph [0065] and “the face area data represents coordinates locating two diagonal vertices of each of the rectangular face areas confining the detected faces. That is, the face area data of each face area consist of an abscissa (X) and an ordinate (Y) of an upper left vertex and an abscissa (X) and an ordinate (Y) of a lower right vertex of that face area” paragraph [0086]).
In view of Kobayashi’s teachings, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the coordinate system taught in Kobayashi in the cooking apparatus disclosed by Cha, because Kobayashi states “a primary object of the present invention is to provide an imaging apparatus that captures image data from a subject and processes and records the image data, an image processor for processing the image data after being recorded by the imaging apparatus, an image filing method for the image data, an image processing method and an image processing program, which facilitate processing the image data in an optimum way” (paragraph [0006]). Therefore, including the teachings of Kobayashi would enable optimum processing of the image data of Cha.
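The vertex-coordinate scheme recited in claims 3-5 and taught by Kobayashi can be illustrated with a short sketch. All names and values below are hypothetical and offered only to clarify the geometry; they are not part of either reference.

```python
def burner_box(frame_w, frame_h, box_w, box_h, dx, dy):
    """Return the left upper and right lower vertex coordinates of the
    quadrangle surrounding a burner, in a coordinate system whose origin
    is the cooktop's upper-left vertex (cf. Kobayashi's two diagonal
    vertices per face area).

    frame_w, frame_h -- entire horizontal/vertical length of the cooktop
    box_w, box_h     -- horizontal/vertical length of the quadrangle
    dx, dy           -- distance from the cooktop's upper-left vertex to
                       the quadrangle's upper-left vertex
    """
    x1, y1 = dx, dy                  # left upper vertex
    x2, y2 = dx + box_w, dy + box_h  # right lower vertex
    # The quadrangle must lie within the cooktop.
    assert 0 <= x1 <= x2 <= frame_w and 0 <= y1 <= y2 <= frame_h
    return (x1, y1), (x2, y2)
```

For example, `burner_box(60, 50, 20, 20, 5, 25)` yields `((5, 25), (25, 45))`, two diagonal vertices analogous to Kobayashi's (X11, Y11) and (X12, Y12).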
Regarding claims 12-14, Cha discloses the method of claim 11, wherein the cooking apparatus is located in an entire image photographed through the camera from the external apparatus (Figure 8B).
Cha does not disclose wherein the information on the location where the burner that is being used is located comprises:
information on coordinates of vertexes of a quadrangle surrounding the burner that is being used among a plurality of quadrangles respectively surrounding the plurality of burners;
based on an entire horizontal length and an entire vertical length of the cooking apparatus, a horizontal length and a vertical length of the quadrangle surrounding the burner that is being used, and a distance between the vertexes of the quadrangle surrounding the burner that is being used and the vertexes of the cooking apparatus, calculating the coordinates of a left upper vertex and a right lower vertex of the burner that is being used;
receiving the information on the coordinates; and
calculating the coordinates of the vertexes of the quadrangle surrounding the burner that is being used based on the information on the coordinates wherein the cooking apparatus is located.
However, Kobayashi teaches wherein the information on the location where the subject is located comprises:
information on coordinates of vertexes of a quadrangle surrounding the subject among a plurality of quadrangles respectively surrounding the plurality of subjects;
based on an entire horizontal length and an entire vertical length of the frame, a horizontal length and a vertical length of the quadrangle surrounding the subject, and a distance between the vertexes of the quadrangle surrounding the subject and the vertexes of the frame, calculating the coordinates of a left upper vertex and a right lower vertex of the subject;
receiving the information on the coordinates wherein the frame is located in an entire image photographed through the camera from the external apparatus; and
calculating the coordinates of the vertexes of the quadrangle surrounding the subject based on the information on the coordinates wherein the frame is located (“FIG. 3 shows an example of an image in which the face detector 30 detects human faces. The face detector 30 confines face areas A1, A2 and A3 of the detected human faces with rectangles whose four sides are parallel to four sides of a rectangular image frame G respectively. The face area data represents coordinates of two diagonal vertices of each rectangle in a coordinate system whose origin is located at an appropriate point in the image frame G and whose axes are parallel to horizontal and vertical lines of the image frame G. For example, the face area A1 is represented by coordinates (X11, Y11) and coordinates (X12, Y12), the face area A2 is represented by coordinates (X21, Y21) and coordinates (X22, Y22), and the face area A3 is represented by coordinates (X31, Y31) and coordinates (X32, Y32)” paragraph [0065] and “the face area data represents coordinates locating two diagonal vertices of each of the rectangular face areas confining the detected faces. That is, the face area data of each face area consist of an abscissa (X) and an ordinate (Y) of an upper left vertex and an abscissa (X) and an ordinate (Y) of a lower right vertex of that face area” paragraph [0086]).
In view of Kobayashi’s teachings, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include the coordinate system taught in Kobayashi in the method disclosed by Cha, because Kobayashi states “a primary object of the present invention is to provide an imaging apparatus that captures image data from a subject and processes and records the image data, an image processor for processing the image data after being recorded by the imaging apparatus, an image filing method for the image data, an image processing method and an image processing program, which facilitate processing the image data in an optimum way” (paragraph [0006]). Therefore, including the teachings of Kobayashi would enable optimum processing of the image data of Cha.
Claims 7, 8, 16, and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Cha, in view of Park (US 20210102707 A1), hereinafter Park.
Regarding claims 7 and 8, Cha discloses the cooking apparatus of claim 6, wherein the display apparatus is configured to:
display the burner that is being used in the streaming image of the cooking apparatus based on the information on the burner that is being used (Figure 8B);
wherein the range hood is configured to:
control the camera to photograph the burner that is being used based on the information on the burner that is being used (“The camera 150 may photograph a still image and a video in the direction from a bottom surface 10b of the range hood toward the upper plate 200a of the cook top 200 under the control of the controller 110” paragraph [0096]); and
transmit a streaming image regarding the burner that is being used that was photographed to the server (Figure 2C).
Cha does not disclose:
wherein the display apparatus is configured to display the burner that is being used in an enlarged size;
wherein the range hood is configured to control the camera to photograph the burner that is being used in an enlarged size.
However, Park teaches:
wherein the display apparatus is configured to display the burner that is being used in an enlarged size;
wherein the range hood is configured to control the camera to photograph the burner that is being used in an enlarged size (“user interface 600 includes a new user interface element 602 having an enlarged image 604 of the cookware 1′ displayed for a user. The enlarged image may be obtained through a zoom function of the imaging sensor 160 or image processing at the controller 150 to enlarge a portion of the image data 502. The user interface element 602 may also include a cooktop data display element 606 that includes text, image, or color reference of cooktop data for the burner 1. For example, the cooktop data may include temperature, time cooking, and other suitable data” paragraph [0045]).
In view of Park’s teachings, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include enlargement as taught in Park in the cooking apparatus disclosed by Cha, because an enlarged image is easier to view. Therefore, enlarging the images as taught by Park would simplify viewing the images of Cha.
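Park's enlargement of the burner region (paragraph [0045]) amounts to cropping the region identified by the burner information and scaling it up, whether by optical zoom or by image processing. A minimal sketch of the image-processing route, with hypothetical names, using nearest-neighbor replication on a 2-D list of pixel values:

```python
def enlarge_region(image, x1, y1, x2, y2, scale):
    """Crop the rectangle with left upper vertex (x1, y1) and right
    lower vertex (x2, y2) from `image` (a list of rows of pixel values)
    and enlarge it by an integer `scale` factor via nearest-neighbor
    replication, analogous to a digital zoom on the burner region."""
    crop = [row[x1:x2] for row in image[y1:y2]]
    out = []
    for row in crop:
        wide = [px for px in row for _ in range(scale)]  # widen each row
        out.extend(list(wide) for _ in range(scale))     # repeat each row
    return out
```

For example, enlarging the full 2x2 image `[[1, 2], [3, 4]]` by a factor of 2 yields the 4x4 image `[[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]`.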
Regarding claims 16 and 17, Cha discloses the method of claim 15, further comprising:
displaying the burner that is being used in the streaming image of the cooking apparatus based on the information on the burner that is being used (Figure 8B);
controlling the camera to photograph the burner that is being used based on the information on the burner that is being used (“The camera 150 may photograph a still image and a video in the direction from a bottom surface 10b of the range hood toward the upper plate 200a of the cook top 200 under the control of the controller 110” paragraph [0096]); and
transmitting a streaming image regarding the burner that is being used to the server (Figure 2C).
Cha does not disclose:
displaying the burner that is being used in an enlarged size in the streaming image of the cooking apparatus based on the information on the burner that is being used;
controlling the camera to photograph the burner that is being used in an enlarged size based on the information on the burner that is being used.
However, Park teaches:
displaying the burner that is being used in an enlarged size in the streaming image of the cooking apparatus based on the information on the burner that is being used;
controlling the camera to photograph the burner that is being used in an enlarged size based on the information on the burner that is being used (“user interface 600 includes a new user interface element 602 having an enlarged image 604 of the cookware 1′ displayed for a user. The enlarged image may be obtained through a zoom function of the imaging sensor 160 or image processing at the controller 150 to enlarge a portion of the image data 502. The user interface element 602 may also include a cooktop data display element 606 that includes text, image, or color reference of cooktop data for the burner 1. For example, the cooktop data may include temperature, time cooking, and other suitable data” paragraph [0045]).
In view of Park’s teachings, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the invention to include enlargement as taught in Park in the method disclosed by Cha, because an enlarged image is easier to view. Therefore, enlarging the images as taught by Park would simplify viewing the images of Cha.
Conclusion
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure:
Lee (US 9497378 B1)
Cha (US 20190261459 A1)
Williams (US 20200378610 A1)
Obara (JP 2021055908 A) “the control circuit 10 extracts an image of the detection target region from the total region AR0 of the image acquired in step S10. As the detection target area, the first area AR1, the second area AR2, and the third area AR3 are set corresponding to each of the stove units 4A, 4B, and 4C”
Any inquiry concerning this communication or earlier communications from the examiner should be directed to LOGAN P JONES whose telephone number is (303)297-4309. The examiner can normally be reached Mon-Fri 8:30-5:00 EST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Michael Hoang can be reached at (571) 272-6460. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/LOGAN P JONES/Examiner, Art Unit 3762