DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/12/2025 has been entered.
Response to Arguments
Applicant’s arguments with respect to claims 1-4 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 2, and 4 are rejected under 35 U.S.C. 103 as being unpatentable over Anabuki (US 2005/0159916) in view of Shimoda (US 2016/0320863).
Consider claim 1. Anabuki teaches an electronic device (Fig. 1 and paragraph 0029) comprising:
an image sensor (Fig. 4 and paragraph 0030, image sensing unit 1);
memory storing instructions (Fig. 1 and paragraph 0035);
and one or more processors communicatively coupled to the image sensor and the memory (Fig. 1 and paragraph 0029),
wherein the one or more instructions, when executed by the one or more processors individually or collectively, cause the electronic device to:
transmit a control signal to an external electronic device to cause the external electronic device to display a screen including at least one object (Fig. 4 and paragraph 0051, The image display unit 11 receives the mixed reality space image from the image composition unit 10, and displays it, thus presenting the mixed reality space image to the user and allowing the user to experience mixed reality. Fig. 4 shows a real object being displayed to the user),
and a plurality of markers (Fig. 6A and paragraph 0039, When the image sensing unit 1 senses markers 602 to 604 set on the real space, a real space image 610 includes images of the markers 602 to 604, as shown in FIG. 6B. Hence, the positions of the markers 602 to 604 on this image 610 are detected);
detect the plurality of markers displayed on the screen via the image sensor (Fig. 6A and paragraph 0039, a real space image 610 includes images of the markers 602 to 604, as shown in FIG. 6B. Hence, the positions of the markers 602 to 604 on this image 610 are detected);
identify a first coordinate corresponding to a position of the target image based on a position of the detected plurality of markers (paragraph 0093, The positional relationship between the real and virtual markers may be determined by comparing, e.g., the absolute values of the coordinate values of the real and virtual marker positions to have the image center as an origin);
determine an accuracy error indicating a positional difference between the first coordinate and the second coordinate by calculating a difference between the first coordinate and the second coordinate (paragraph 0075, the CPU 501 compares the virtual and real marker positions on the real space image to calculate correction values of the position and orientation data that allow these positions to match, and adds the calculated correction values to the position and orientation data by executing the program that serves as the position/orientation/field angle correction unit 7 (step S910));
store accuracy correction data in the memory based on the determined accuracy error (paragraph 0082, predetermined reference value),
the accuracy correction data being used to correct a coordinate of the user's finger and determine the coordinate of the user's finger in the AR environment based on the stored accuracy correction data (paragraph 0083, If the average value is larger than the predetermined reference value, the flow advances to step S914 to calculate the correction amounts of the field angle data, and to add these correction amounts to the field angle data (step S914)).
Anabuki does not specifically disclose the at least one object including a target image indicating a touch position,
and a user's finger touching the target image,
and a second coordinate of a virtual finger in an augmented reality (AR) environment based on the detected user's finger.
In an analogous art, Shimoda teaches the at least one object including a target image indicating a touch position (Fig. 2 and paragraph 0057, The user U11, while watching the output image displayed on the image display unit 21, performs touch operation for the virtual image V11 by stretching a hand to the virtual image V11 which is displayed as if existing in a real space. That is, the user performs operation such as pushing of a button displayed on the virtual image V11),
and a user's finger touching the target image (Fig. 2 and paragraph 0057, The user U11, while watching the output image displayed on the image display unit 21, performs touch operation for the virtual image V11 by stretching a hand to the virtual image V11 which is displayed as if existing in a real space),
and a second coordinate of a virtual finger in an augmented reality (AR) environment based on the detected user's finger (paragraph 0183, The touch coordinates calculation unit 111 calculates, and supplies to the touch correction unit 112, the coordinates of the user's touch position on the virtual image V21).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system and method of Shimoda with the system and method of Anabuki in order to improve the operability of the display apparatus (Shimoda, paragraph 0093).
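As mapped above, the claimed calibration reduces to computing an offset between the marker-derived target coordinate (the first coordinate) and the virtual-finger coordinate (the second coordinate), storing that offset, and applying it to later finger coordinates. The following Python sketch is illustrative only; the names and the uniform two-dimensional offset model are assumptions of the illustration and are not taken from Anabuki or Shimoda.

    # Illustrative sketch of the claimed accuracy-correction flow; all names
    # and the uniform-offset model are hypothetical, not from the references.
    from dataclasses import dataclass

    @dataclass
    class Point:
        x: float
        y: float

    def accuracy_error(first: Point, second: Point) -> Point:
        # Positional difference between the marker-derived target coordinate
        # (first) and the virtual-finger coordinate (second).
        return Point(first.x - second.x, first.y - second.y)

    def correct(finger: Point, correction: Point) -> Point:
        # Apply the stored accuracy-correction data to a finger coordinate.
        return Point(finger.x + correction.x, finger.y + correction.y)

    # Calibration: the user touches the displayed target image.
    first = Point(120.0, 80.0)    # target position derived from the markers
    second = Point(118.5, 83.0)   # virtual-finger position from the image sensor
    correction = accuracy_error(first, second)   # stored as correction data

    # Use: subsequent finger coordinates are corrected with the stored data.
    print(correct(Point(200.0, 150.0), correction))   # Point(x=201.5, y=147.0)

Applying correct() to the calibration touch itself returns the target coordinate exactly, which is the sense in which the stored data corrects the coordinate of the user's finger.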
Consider claim 4. Anabuki teaches a method for operating an electronic device, the method comprising:
transmitting a control signal to an external electronic device to cause the external electronic device to display a screen including at least one object (Fig. 4 and paragraph 0051, The image display unit 11 receives the mixed reality space image from the image composition unit 10, and displays it, thus presenting the mixed reality space image to the user and allowing the user to experience mixed reality. Fig. 4 shows a real object being displayed to the user),
and a plurality of markers (Fig. 6A and paragraph 0039, When the image sensing unit 1 senses markers 602 to 604 set on the real space, a real space image 610 includes images of the markers 602 to 604, as shown in FIG. 6B. Hence, the positions of the markers 602 to 604 on this image 610 are detected);
detecting the plurality of markers displayed on the screen via an image sensor (Fig. 6A and paragraph 0039, a real space image 610 includes images of the markers 602 to 604, as shown in FIG. 6B. Hence, the positions of the markers 602 to 604 on this image 610 are detected);
identifying a first coordinate corresponding to a position of the target image based on a position of the detected plurality of markers (paragraph 0093, The positional relationship between the real and virtual markers may be determined by comparing, e.g., the absolute values of the coordinate values of the real and virtual marker positions to have the image center as an origin);
determining an accuracy error indicating a positional difference between the first coordinate and the second coordinate by calculating a difference between the first coordinate and the second coordinate (paragraph 0075, the CPU 501 compares the virtual and real marker positions on the real space image to calculate correction values of the position and orientation data that allow these positions to match, and adds the calculated correction values to the position and orientation data by executing the program that serves as the position/orientation/field angle correction unit 7 (step S910));
storing accuracy correction data in memory based on the determined accuracy error (paragraph 0082, predetermined reference value),
the accuracy correction data being used to correct a coordinate of the user's finger and determining the coordinate of the user's finger in the AR environment based on the stored accuracy correction data (paragraph 0083, If the average value is larger than the predetermined reference value, the flow advances to step S914 to calculate the correction amounts of the field angle data, and to add these correction amounts to the field angle data (step S914)).
Anabuki does not specifically disclose the at least one object including a target image indicating a touch position, and a user's finger touching the target image,
and a second coordinate of a virtual finger in an augmented reality (AR) environment based on the detected user's finger.
In an analogous art, Shimoda teaches the at least one object including a target image indicating a touch position (Fig. 2 and paragraph 0057, The user U11, while watching the output image displayed on the image display unit 21, performs touch operation for the virtual image V11 by stretching a hand to the virtual image V11 which is displayed as if existing in a real space. That is, the user performs operation such as pushing of a button displayed on the virtual image V11),
and a user's finger touching the target image (Fig. 2 and paragraph 0057, The user U11, while watching the output image displayed on the image display unit 21, performs touch operation for the virtual image V11 by stretching a hand to the virtual image V11 which is displayed as if existing in a real space),
and a second coordinate of a virtual finger in an augmented reality (AR) environment based on the detected user's finger (paragraph 0183, The touch coordinates calculation unit 111 calculates, and supplies to the touch correction unit 112, the coordinates of the user's touch position on the virtual image V21).
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system and method of Shimoda with the system and method of Anabuki in order to improve the operability of the display apparatus (Shimoda, paragraph 0093).
Consider claim 2. Shimoda further teaches the electronic device of claim 1, wherein the first coordinate corresponding to the position of the target image is identified when the user's finger touches the target image (paragraph 0183, The touch coordinates calculation unit 111 calculates, and supplies to the touch correction unit 112, the coordinates of the user's touch position on the virtual image V21).
Claim 3 is rejected under 35 U.S.C. 103 as being unpatentable over Anabuki (US 2005/0159916) in view of Shimoda (US 2016/0320863) as applied to claim 1 above, and further in view of Barth (US 2019/0004667).
Consider claim 3. Anabuki in view of Shimoda does not specifically disclose the electronic device of claim 1, wherein the one or more instructions, when executed by the one or more processors individually or collectively, cause the electronic device to: display a guide message to encourage the user to touch the target image in response to detecting the at least one object via the image sensor. However, Barth in at least Fig. 6 and paragraph 0086 discloses a guided calibration procedure wherein the user is asked to press buttons 1 to 5 sequentially. Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to combine the system and method of Barth with the system and method of Anabuki in view of Shimoda in order to improve the user experience by making the instructions clear and precise.
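Barth's guided procedure extends the single-target calibration illustrated above to several touch points prompted in sequence. The following is a minimal Python sketch, assuming tuple (x, y) coordinates and caller-supplied detect_touch and show_message callbacks; none of these names or the averaging scheme are taken from Barth.

    # Hypothetical sketch of a guided multi-point calibration: the user is
    # prompted to touch each target in order, and the per-target errors are
    # averaged into a single stored correction offset.
    def guided_calibration(targets, detect_touch, show_message):
        errors = []
        for i, target in enumerate(targets, start=1):
            show_message(f"Please touch target {i}")   # the guide message
            touch = detect_touch()                     # virtual-finger coordinate
            errors.append((target[0] - touch[0], target[1] - touch[1]))
        n = len(errors)
        return (sum(dx for dx, _ in errors) / n, sum(dy for _, dy in errors) / n)

Averaging over several targets is one plausible way to reduce the influence of any single noisy touch; Barth's disclosure is not limited to this scheme.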
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHAYCE R BIBBEE whose telephone number is (571)270-7222. The examiner can normally be reached Mon-Thurs 8:00-6:00.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Matthew Eason, can be reached at 571-270-7230. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHAYCE R BIBBEE/Examiner, Art Unit 2624