DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Amendment
The examiner acknowledges receipt of arguments/amendments filed 11/20/25. The arguments set forth are addressed herein below. Claims 1-18 and 21-22 remain pending; Claims 1 and 14 are currently amended, Claims 19-20 are canceled, and Claims 21-22 are newly added.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-3, 5-16, 18, and 21-22 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Wong (US 2018/0043259).
Claim 1: Wong discloses a method performed on an augmented reality (AR) wearable device (Figs. 5, 8, ¶ 42, 72-77, 88-90), the method comprising: capturing, by an image capturing device of the AR wearable device, an image corresponding to a user view of a real-world scene; accessing, in a memory of the AR wearable device, data indicating a card within the image and a location of the card, the card comprising a code (¶ 38, 41, 48, 50-52, 62, the “app,” which is a program in memory, is accessed to determine the identity of the card and the card’s location, ¶ 78, 100); receiving, from a host device (gaming server) via a wireless network (¶ 82-84), a card assignment generated by the host device, the card assignment dynamically (innovative manner) mapping a plurality of codes printed (¶ 48, 50-51, 54, 87, 99-100) on a plurality of cards to a plurality of overlays (animations), each card of the plurality of cards comprising a different code (¶ 38 “The Gamer then uses the App, which controls the camera on the Gaming Device, to scan and identify the physical card in play. Once the physical card has been identified, the identity information is sent to the Gaming Server via a data network to retrieve from a database the gaming rules and 3D model with pre-defined sequence of animated movements associated with the physical card in play.”, ¶ 41 “Through identification techniques including, but not limited to, optical character recognition of the title of the trading card or serial number, graphical recognition of the Game Component (105), or scanning special barcode (e.g., a QR code) by the App (102) running on the Gaming Device (100), the Game Component (105) represented by the physical trading card in play is identified. The App can service one or more types of TCG(s) depending on the business objectives of the operator of the Gaming Server (200). 
This information is sent to the Gaming Server (200) via the data network (202) where certain predetermined animated sequence of the 3D model associated with the Gaming Component (105) is fetched from the database (201) and returned to the Gaming Device (100). The 3D animation will be shown on the screen (103) of the Gaming Device (100) where the Game Component (e.g., the monster) (105) appears to “come alive” and arise from the flat physical trading card shown on screen (106) as captured by the camera (101).”, ¶ 50-51 – discloses that each card has a unique serial number or barcode affixed thereto, ¶ 54, 62-64, 69); determining an overlay for the card based on the card assignment (see, above) and the code; adjusting a shape of the overlay based on the location of the card and a user view; and displaying, on a display of the AR wearable device, the overlay for the card, wherein a position of the overlay is based on the location of the card and the user view (¶ 38, 41, 48, 50-52, 94, 117), the overlay covering a face of the card (Figs. 11-17, ¶ 98-115).
Claims 14 and 21: Wong discloses an augmented reality (AR) wearable device (¶ 42) comprising: at least one processor; and a memory (non-transitory computer-readable storage medium) storing instructions that, when executed by the at least one processor (Figs. 5, 8, ¶ 72-77, 88-90), configure the AR wearable device to perform operations comprising: capturing, by an image capturing device of the AR wearable device, an image corresponding to a user view of a real-world scene; accessing, in a memory of the AR wearable device, data indicating a card within the image and a location of the card, the card comprising a code (¶ 38, 41, 48, 50-52, 62, the “app,” which is a program in memory, is accessed to determine the identity of the card and the card’s location, ¶ 78, 100); receiving, from a host device (gaming server) via a wireless network (¶ 82-84), a card assignment generated by the host device, the card assignment dynamically (innovative manner) mapping a plurality of codes printed (¶ 48, 50-51, 54, 87, 99-100) on a plurality of cards to a plurality of overlays (animations), each card of the plurality of cards comprising a different code (¶ 38 “The Gamer then uses the App, which controls the camera on the Gaming Device, to scan and identify the physical card in play. Once the physical card has been identified, the identity information is sent to the Gaming Server via a data network to retrieve from a database the gaming rules and 3D model with pre-defined sequence of animated movements associated with the physical card in play.”, ¶ 41 “Through identification techniques including, but not limited to, optical character recognition of the title of the trading card or serial number, graphical recognition of the Game Component (105), or scanning special barcode (e.g., a QR code) by the App (102) running on the Gaming Device (100), the Game Component (105) represented by the physical trading card in play is identified. The App can service one or more types of TCG(s) depending on the business objectives of the operator of the Gaming Server (200). 
This information is sent to the Gaming Server (200) via the data network (202) where certain predetermined animated sequence of the 3D model associated with the Gaming Component (105) is fetched from the database (201) and returned to the Gaming Device (100). The 3D animation will be shown on the screen (103) of the Gaming Device (100) where the Game Component (e.g., the monster) (105) appears to “come alive” and arise from the flat physical trading card shown on screen (106) as captured by the camera (101).”, ¶ 50-51 – discloses that each card has a unique serial number or barcode affixed thereto, ¶ 54, 62-64, 69); determining an overlay for the card based on the card assignment (see above) and the code; adjusting a shape of the overlay based on the location of the card and a user view; and displaying, on a display of the AR wearable device, the overlay for the card, wherein a position of the overlay is based on the location of the card and the user view (¶ 38, 41, 48, 50-52), the overlay covering a face of the card (Figs. 11-17, ¶ 98-115).
Claims 2, 15, and 22: Wong discloses determining a gesture performed by a user of the AR wearable device related to the card; determining a rule associated with the gesture; and performing the rule (¶ 52, 89, 97, 100, 108, 112, 117).
Claims 3 and 16: Wong teaches wherein the gesture is play the card and the rule comprises: in response to an end of hand condition, determine which user of a plurality of users, comprising the user, won the hand, and adjusting a score (¶ 39, 45, 52, 56, 61-67, 101, emphasis on ¶ 61-67).
Claims 5 and 18: Wong teaches: determining a gesture performed by the user is laying out a plurality of cards for a game (¶ 67, scanning of the cards indicates that the user has laid out the preferred cards for play of the game); and in response to a number of the plurality of cards being a same number as a number of cards to play the game (¶ 67, e.g., after all of the user’s cards in play are scanned), determining a card assignment, wherein the card assignment assigns each of the plurality of cards an overlay of a plurality of overlays, and wherein each of the plurality of cards is identified based on each of the plurality of cards comprising a different code (¶ 38, 41, 48, 50-52, 67).
Claim 6: Wong discloses wherein the overlay is a video that the AR wearable device plays on the card, the overlay is an image, or the overlay is an animated image (¶ 38-39, 50-52).
Claim 7: Wong discloses wherein the overlay is a first overlay and wherein the method further comprises: in response to determining that a condition of a rule is satisfied, changing the first overlay to a second overlay (¶ 61-64, 67, 94, 105-106).
Claim 8: Wong discloses wherein the code is on at least one face of the card or on at least one edge of the card (¶ 48, 86-88, Fig. 7).
Claim 9: Wong discloses receiving an indication of a selection of a game from the user of the AR wearable device (selection from a website); downloading the game, the game comprising the overlay; and running the game (see above, ¶ 38, 41, 58, 61-64).
Claim 10: Wong discloses wherein the user is a first user and the AR wearable device is a first AR wearable device, and wherein the method further comprises: receiving from a second AR wearable device an indication of a request from a second user to join the game; and adding the second user and the second AR wearable device to the game (¶ 38, 43-50, 52-64, Figs. 2-4).
Claim 11: Wong teaches sending the card assignment (the specific TCG (trading card game (app))) to the second AR wearable device (¶ 38, 43-50, 52-64, Figs. 2-4) (the second user of the second AR wearable device receives or downloads the specific TCG (trading card game (app)) to join the first user).
Claim 12: Wong teaches wherein the user played the card and wherein the method further comprises: sending to the second AR wearable device an indication (attack animation - ¶ 61-64, 67, 94, 105-106) that the user played the card and an indication of the code (“come alive” animation - ¶ 61-64, 67, 94, 105-106) (see above, ¶ 13-14, 39, 55, 64, 97, 104-105, Figs. 10, 13, 16-18).
Claim 13: Wong discloses sending an indication to join a game to a computing device; receiving an acceptance to join the game from the computing device; and running the game comprising the overlay (see above, ¶ 39, 58, 61-62).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
Claims 4 and 17 are rejected under 35 U.S.C. 103 as being unpatentable over Wong (US 2018/0043259) in view of Watanabe (US 8,152,637).
Claims 4 and 17: Wong teaches the above, but lacks explicitly suggesting wherein the gesture is play the card and the rule comprises: in response to a card being incorrectly played condition, displaying on a display of the AR wearable device an indication that the card is being incorrectly played. Wong at least teaches that various modifications can be applied without departing from the overall scope of the invention (¶ 15). Furthermore, an analogous art of Watanabe teaches a similarly structured method and/or apparatus comprising wherein a gesture is play of a card and a rule comprises: in response to a card being incorrectly played condition, displaying on a display of the AR device an indication that the card is being incorrectly played (Col. 25:26-Col. 26:60, Figs. 41 and 43). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the AR wearable device means of Wong with the indication means of Watanabe to address situations in which a card should be re-recognized due to movement, etc. (Watanabe – Col. 25:28-45). Such a modification helps prevent user error when using the game.
Response to Arguments
Applicant's arguments filed 11/20/25 have been fully considered but they are not persuasive.
Applicant argues that Wong fails to teach “a card assignment generated by the host device, the card assignment dynamically mapping a plurality of codes printed on a plurality of cards to a plurality of overlays”.
The examiner respectfully disagrees. The examiner notes that “dynamically” is not defined in the claims and one of ordinary skill in the art can interpret “dynamically” to mean in an innovative manner. Wong, given its broadest reasonable interpretation, teaches a card assignment generated by the host device, the card assignment dynamically (innovative manner) mapping a plurality of codes printed on a plurality of cards to a plurality of overlays (animations), each card of the plurality of cards comprising a different code (¶ 38 “The Gamer then uses the App, which controls the camera on the Gaming Device, to scan and identify the physical card in play. Once the physical card has been identified, the identity information is sent to the Gaming Server via a data network to retrieve from a database the gaming rules and 3D model with pre-defined sequence of animated movements associated with the physical card in play.” (here a pre-defined sequence can refer to a form of dynamically mapping since multiple sequences can be mapped to a particular code for a physical card), ¶ 41 “Through identification techniques including, but not limited to, optical character recognition of the title of the trading card or serial number, graphical recognition of the Game Component (105), or scanning special barcode (e.g., a QR code) by the App (102) running on the Gaming Device (100), the Game Component (105) represented by the physical trading card in play is identified. The App can service one or more types of TCG(s) depending on the business objectives of the operator of the Gaming Server (200). This information is sent to the Gaming Server (200) via the data network (202) where certain predetermined animated sequence of the 3D model associated with the Gaming Component (105) is fetched from the database (201) and returned to the Gaming Device (100). 
The 3D animation will be shown on the screen (103) of the Gaming Device (100) where the Game Component (e.g., the monster) (105) appears to “come alive” and arise from the flat physical trading card shown on screen (106) as captured by the camera (101).”, ¶ 50-51 – discloses that each card has a unique serial number or barcode affixed thereto, ¶ 54, 62-64, 69). The above-cited portions clearly disclose dynamic mapping of a plurality of codes printed on a plurality of cards to a plurality of overlays (animations).
Applicant argues that Wong teaches that the barcodes are printed or affixed to protective plastic sleeves, which is not “a plurality of codes printed on a plurality of cards”.
However, Wong additionally teaches that the barcodes can be printed on or affixed to the plurality of cards themselves, which is adequate disclosure for “a plurality of codes printed on a plurality of cards” (¶ 48, 50-51, 54, 87, 99-100).
Applicant argues that Wong fails to teach “the overlay covering a face of the card”.
However, the examiner respectfully disagrees. Wong teaches the overlay/animation covering a face of the card (Figs. 11-17, ¶ 98-115). As seen in at least Figs. 11-17, the animations/overlays are shown on top of, or covering, a face of the card. The examiner further notes that being “alongside the physical trading card” or “as if they are in the same space as the physical game components” does not teach away from the overlays covering a face of the card, but rather suggests how they cover a face of the card. Furthermore, covering a face of the card does not require covering the entirety of the face, but encompasses covering portions of the face as well.
At least based on the above, the rejection is maintained and has been amended to address the newly added limitations.
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRAMAR HARPER whose telephone number is (571)272-6177. The examiner can normally be reached from 7:30am to 5:30pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kang Hu can be reached at (571) 270-1344. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/TRAMAR HARPER/Primary Examiner, Art Unit 3715