DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 1-5 and 13-19 are rejected under 35 U.S.C. 103 as being unpatentable over Kitabayashi et al. (US 20230262204 A1) in view of Peterson (US 20190371009 A1).
Regarding claim 1, Kitabayashi discloses an electronic device (Kitabayashi fig. 1 – 40, [0022], “terminal apparatus”), comprising:
a display (Kitabayashi fig. 1 – 420; [0026], “display device”);
a transceiver (Kitabayashi fig. 1 – 50; [0026], “communication network”);
one or more memories (Kitabayashi fig. 2; [0030], “a nonvolatile memory and a volatile memory”); and
at least one processor, comprising processing circuitry, electrically connected to the display, the transceiver and the one or more memories (Kitabayashi figs. 1-2; [0022]), wherein the at least one processor, individually or collectively, is configured to:
obtain a captured image for one or more real objects using one or more cameras (Kitabayashi [0019], “The room of the user who uses the projector 10 is an example of the real space”; [0024], “camera 30 is directed to the projection target object SC … The camera 30 performs imaging … the image data representing the captured image is referred to as captured image data”);
identify a projection area included as at least part of the captured image based on feature information included in the captured image (Kitabayashi [0021], “a predetermined region of the projector 10 in the room … The projection target object SC (feature information that identifies the target) is included in the projection region”);
obtain a first image corresponding to the projection area from the captured image (Kitabayashi [0033], “the projection region sets the projection target object SC in the projection range indicated by the first guide image GA1 as shown in FIG. 4.”);
generate an edit screen by superposing the second image and the first image (Kitabayashi figs. 4 and 7; [0040], “first guide image GA1 (exemplary first image) … FIG. 7 is a diagram showing a display example of the superimposed image GA4 (exemplary second image)”);
control the display to display a user interface representation including the edit screen (Kitabayashi fig. 7; [0040] “As shown in FIG. 7 (display a user interface representation including the edit screen) …the user touches the user interface image GA5.”; [0042], “The user edits, using the image drawing tool (using the edit screen), a projection image for decorating the projection target object SC while referring to the position and the shape of the projection target object SC reflected in the reference image.”);
generate a projection screen in response to one or more inputs to the user interface representation (Kitabayashi fig. 8; [0043], “images drawn by the user in order to decorate the projection target object SC … FIG. 8 represents a color added to the images of the stars such as yellow. The output unit 210e acquires image data representing the projection image edited based on the reference image from the terminal apparatus 40 as the projection target image.”); and
control the transceiver to transmit the projection screen to an external electronic device connected to the electronic device (Kitabayashi [0025], “The image storage apparatus 60 is connected to the communication network 50 … the image data stored in the image storage apparatus 60 include projection image data edited by the image editing system 1 (control the transceiver to transmit the projection screen to an external electronic device connected to the electronic device)”).
Kitabayashi does not disclose:
generate a second image by vectorizing the first image.
However, Peterson discloses
generate a second image by vectorizing the first image (Peterson [0201], “the image editing system 1002 operates in connection with one or more applications to facilitate the vectorization of raster images”).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Kitabayashi with the vectorization teachings of Peterson so that the first image is vectorized into a second image. This would have been done to facilitate easy and clean editing of images.
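For illustration only, the vectorization Peterson relies on can be sketched as a boundary trace over a binary raster. The sketch below is hypothetical: Peterson's actual pipeline builds an edge map and fits curves, whereas this minimal example only emits unit line segments along pixel boundaries.

```python
# Minimal raster-to-vector sketch (illustrative; not Peterson's algorithm).
# A binary raster is converted to vector line segments by emitting a unit
# segment wherever a filled pixel borders an empty one.

def vectorize(raster):
    """Return boundary edges of filled cells as ((x1, y1), (x2, y2)) segments."""
    h, w = len(raster), len(raster[0])

    def filled(x, y):
        return 0 <= x < w and 0 <= y < h and bool(raster[y][x])

    segments = []
    for y in range(h):
        for x in range(w):
            if not raster[y][x]:
                continue
            if not filled(x, y - 1):  # top edge exposed
                segments.append(((x, y), (x + 1, y)))
            if not filled(x, y + 1):  # bottom edge exposed
                segments.append(((x, y + 1), (x + 1, y + 1)))
            if not filled(x - 1, y):  # left edge exposed
                segments.append(((x, y), (x, y + 1)))
            if not filled(x + 1, y):  # right edge exposed
                segments.append(((x + 1, y), (x + 1, y + 1)))
    return segments

# A single filled pixel yields its four boundary edges as vector segments.
print(vectorize([[0, 0, 0], [0, 1, 0], [0, 0, 0]]))
```

The resulting segments can be scaled and edited independently of the raster, which is the editing benefit cited in the rationale above.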
Regarding claim 2, Kitabayashi in view of Peterson discloses the electronic device of claim 1, wherein the first image comprises a raster image including one or more captured objects corresponding to the one or more real objects, and the second image comprises a vector image including one or more vector objects corresponding to the one or more captured objects (Peterson [0067], “FIGS. 2A-2F illustrate further detail for generating enhanced digital images by transforming raster-based elements in raster images (corresponding to the one or more real objects) to vector drawing segments (vector objects corresponding to the one or more captured objects)”; [0201], “the image editing system 1002 operates in connection with one or more applications to facilitate the vectorization of raster images”).
Regarding claim 3, Kitabayashi in view of Peterson discloses the electronic device of claim 2, wherein the one or more vector objects include one or more vector lines or one or more vector planes (Peterson [0048], “As used herein, the term ‘vector drawing’ refers to a digital image includ[ing] a series of mathematical curves and lines.”).
Regarding claim 4, Kitabayashi in view of Peterson discloses the electronic device of claim 2, wherein the at least one processor, individually or collectively, is configured to:
process the one or more vector objects as one or more content mapping areas (Peterson [0029], “the image transformation system generates enhanced digital images by transforming raster elements in raster images to vector drawing segments. In one or more embodiments, the image transformation system generates an edge map by detecting edges of a raster element displayed in a raster image (e.g., an object that a user wants to convert to vector form).”); and
generate one or more content layers to which content is mapped in the one or more content mapping areas in response to one or more inputs, and wherein the projection screen includes the generated one or more content layers (Kitabayashi [0043], “The projector 10 projects the projection image represented by the projection image data output from the information processing apparatus 20 onto the projection region. As a result, as shown in FIG. 9, the images of the stars for decorating the projection target object SC are displayed around the projection target object SC. Horizontal-line hatching in FIG. 9 represents a color added to the images of the stars such as yellow.”).
Regarding claim 5, Kitabayashi in view of Peterson discloses the electronic device of claim 4, wherein the projection screen is generated by superposing the one or more content layers (Kitabayashi [0035], “the display controller 210b generates a superimposed image GA4 shown in FIG. 5 based on the captured image data. FIG. 5 is a diagram showing a relation among the superimposed image GA4”; [0043], “As a result, as shown in FIG. 9, the images of the stars for decorating the projection target object SC are displayed around the projection target object SC.”; figs. 5 and 9 disclose exemplary superposing of content layers).
Regarding claim 13, Kitabayashi in view of Peterson discloses the electronic device of claim 1, wherein an entire frame of the projection screen is configured by synthesizing respective frames of one or more content layers of the projection screen, and wherein the entire frame of the projection screen is reproduced based on the respective frames of the one or more content layers (Kitabayashi fig. 9; [0043], “As a result, as shown in FIG. 9, the images of the stars for decorating the projection target object SC are displayed around the projection target object SC. Horizontal-line hatching in FIG. 9 represents a color added to the images of the stars such as yellow.”; [0068], “a superimposed image obtained by superimposing an image obtained by imaging, with the camera, the projection region where the first guide image is projected”).
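For illustration only, the frame synthesis addressed in claims 5 and 13 can be sketched as alpha-compositing content layers over a base frame. The Pillow calls below are real, but the file names and single-layer usage are assumptions, not taken from Kitabayashi.

```python
# Illustrative sketch: synthesize an entire projection frame by compositing
# the respective frames of one or more content layers over a base image.
from PIL import Image

def synthesize_frame(base_path, layer_paths):
    """Alpha-composite RGBA layers (same size as the base) in order, bottom-up."""
    frame = Image.open(base_path).convert("RGBA")
    for path in layer_paths:
        layer = Image.open(path).convert("RGBA")
        frame = Image.alpha_composite(frame, layer)  # later layers drawn on top
    return frame

# Hypothetical usage: decorate a captured scene with a star layer (cf. fig. 9).
synthesize_frame("captured_scene.png", ["stars_layer.png"]).save("projection_frame.png")
```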
Claim 14 recites an electronic device that performs functions corresponding to those performed by the electronic device of claim 1. As such, the mapping and rejection of claim 1 above are considered applicable to the electronic device of claim 14.
Claim 15 recites an image mapping method that corresponds to the functions performed by the electronic device of claim 1. As such, the mapping and rejection of claim 1 above are considered applicable to the image mapping method of claim 15.
Claim 16 recites an image mapping method that corresponds to the functions performed by the electronic device of claim 2. As such, the mapping and rejection of claim 2 above are considered applicable to the image mapping method of claim 16.
Claim 17 recites an image mapping method that corresponds to the functions performed by the electronic device of claim 3. As such, the mapping and rejection of claim 3 above are considered applicable to the image mapping method of claim 17.
Claim 18 recites an image mapping method that corresponds to the functions performed by the electronic device of claim 4. As such, the mapping and rejection of claim 4 above are considered applicable to the image mapping method of claim 18.
Claim 19 recites an image mapping method that corresponds to the functions performed by the electronic device of claim 5. As such, the mapping and rejection of claim 5 above are considered applicable to the image mapping method of claim 19.
Claims 6-9 and 20 are rejected under 35 U.S.C. 103 as being unpatentable over Kitabayashi in view of Peterson and further in view of Kim (US 20240231577 A1).
Regarding claim 6, Kitabayashi in view of Peterson discloses the electronic device of claim 1, but does not disclose wherein the user interface representation includes a drawing area and a content selection area, and wherein the edit screen is displayed in the drawing area, and content options mappable to a content mapping area are displayed in the content selection area.
However, Kim discloses
the user interface representation includes a drawing area and a content selection area, and wherein the edit screen is displayed in the drawing area, and content options mappable to a content mapping area are displayed in the content selection area (Kim fig. 7A; [0104], “The clip editing UI may be set differently according to the type of the selected clip. Specifically, when the type of clip is a video clip, the electronic device may configure and provide a clip editing UI 500, by including a trim/split menu 501, a pan/zoom menu 502, an audio control menu 503, a clip graphic menu 504”; 702 - a drawing area; 500 - a content selection area; 706 - an edit screen is displayed in the drawing area; 501, 504 - content options mappable to a content mapping area are displayed in the content selection area).
It would have been obvious to a person of ordinary skill in the art before the effective filing date of the claimed invention to modify Kitabayashi, as modified by Peterson, with Kim to display a detailed editor user interface. This would have been done to provide users with an editing interface comprising multiple functions for detailed editing of content.
Regarding claim 7, Kitabayashi in view of Peterson and further in view of Kim discloses the electronic device of claim 6, wherein the at least one processor, individually or collectively, is configured to generate a content layer where content associated with at least one of the content options is mapped to the content mapping area in response to receiving an input for mapping the content in the content mapping area (Kim fig. 7A; [0149], “as shown in FIG. 7A, the layer interface may include a layer rotation input unit 706 for rotation of the layer media 704, a layer size input unit 708 for size change, and a position movement input unit (not shown) that receives a user's drag gesture to change the position of the layer media 704 on the main media 702.”).
Regarding claim 8, Kitabayashi in view of Peterson and further in view of Kim discloses the electronic device of claim 7, wherein the at least one processor, individually or collectively, is configured to generate the content layer whenever it receives the input for mapping the content associated with at least one of the content options in the content mapping area (Kim fig. 7A; [0149], “as shown in FIG. 7A, the layer interface may include a layer rotation input unit 706 for rotation of the layer media 704, a layer size input unit 708 for size change, and a position movement input unit (not shown) that receives a user's drag gesture to change the position of the layer media 704 on the main media 702.”).
Regarding claim 9, Kitabayashi in view of Peterson and further in view of Kim discloses the electronic device of claim 7, wherein the at least one processor, individually or collectively, is configured to:
generate a new projection screen by adding the generated content layer to a pre-generated projection screen (Kitabayashi fig. 8; [0043], “images drawn by the user in order to decorate the projection target object SC … FIG. 8 represents a color added to the images of the stars such as yellow. The output unit 210e acquires image data representing the projection image edited based on the reference image from the terminal apparatus 40 as the projection target image (generate a new projection screen by adding the generated content layer to a pre-generated projection screen).”); and
control the transceiver to transmit the new projection screen to the external electronic device in response to the generation of the new projection screen (Kitabayashi [0025], “The image storage apparatus 60 is connected to the communication network 50 … the image data stored in the image storage apparatus 60 include projection image data edited by the image editing system 1 (control the transceiver to transmit the new projection screen to the external electronic device)”).
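For illustration only, transmitting the new projection screen to a connected external device can be sketched as a length-prefixed TCP send. The host, port, and framing below are assumptions for illustration; Kitabayashi describes transfer over the communication network 50 without specifying a protocol.

```python
# Illustrative sketch: send frame bytes to an external device, prefixed with
# a 4-byte big-endian length so the receiver knows where the frame ends.
import socket
import struct

def send_projection_screen(host, port, frame_bytes):
    with socket.create_connection((host, port)) as conn:
        conn.sendall(struct.pack("!I", len(frame_bytes)) + frame_bytes)

# Hypothetical usage with placeholder address and file:
# send_projection_screen("192.168.0.42", 9000, open("projection_frame.png", "rb").read())
```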
Claim 20 recites an image mapping method that corresponds to the functions performed by the electronic device of claim 6. As such, the mapping and rejection of claim 6 above are considered applicable to the image mapping method of claim 20.
Allowable Subject Matter
Claims 10-12 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.
The following is a statement of reasons for the indication of allowable subject matter:
Regarding claim 10, Park et al. (US 20200344517 A1) discloses
control the transceiver to transmit a message for controlling the external electronic device to output a standby screen (Park [0044], “The server 200 may be a configuration for obtaining a standby screen to be transmitted to the electronic device”).
However, none of the prior art of record, alone or in combination, discloses
wherein the first image comprises a raster image including a captured image for the one or more real objects where the standby screen is projected.
Claims 11-12 contain allowable subject matter by virtue of their dependence from claim 10.
Conclusion
See the notice of references cited (PTO-892) for prior art made of record, including art that is not relied upon but considered pertinent to applicant's disclosure.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JITESH PATEL whose telephone number is (571)270-3313. The examiner can normally be reached 8am - 5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Said A. Broome, can be reached at (571) 272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JITESH PATEL/Primary Examiner, Art Unit 2612