DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
Information Disclosure Statement
2. The information disclosure statement (IDS) submitted on 03/26/2024 is in compliance with the provisions of 37 CFR 1.97 and has been considered by the Examiner.
Claim Rejections - 35 USC § 103
3. The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
This application currently names joint inventors. In considering patentability of the claims the examiner presumes that the subject matter of the various claims was commonly owned as of the effective filing date of the claimed invention(s) absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and effective filing dates of each claim that was not commonly owned as of the effective filing date of the later invention in order for the examiner to consider the applicability of 35 U.S.C. 102(b)(2)(C) for any potential 35 U.S.C. 102(a)(2) prior art against the later invention.
4. Claims 1-2, 4-12, 15-17 and 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bannai (“Bannai”) [US-7,532,224-B2] in view of Ohshima et al. (“Ohshima”) [US-6,972,734-B1].
Regarding claim 1, Bannai discloses an information processing system (Bannai- Fig. 1 shows the connection between the worker mixed reality apparatus 10 a and the instructor mixed reality apparatus 10 b; col 2, lines 24-26, at least discloses a method and an apparatus that enable a second user to share a mixed reality space image including a virtual object superimposed in a space where a first user exists […] a technique that enables the second user to acquire a mixed reality space image from an arbitrary viewpoint) comprising:
a processor; and a memory storing a program that, when executed by the processor (Bannai- col 20, lines 8-27, at least discloses a system or device with a storage medium that stores program code (software) for implementing the functions of the above-described embodiments, and causing a computer (or a CPU, MPU or the like) of the system or device to read the program code from the storage medium and then to execute the program code), causes the processor to
generate a first image representing a view from a viewpoint of a first user on a virtual three-dimensional space shared by a plurality of users (Bannai- col 3, lines 1-12, at least discloses an information processing apparatus that enables a further user to share a mixed reality space image [a first image] including a virtual object superimposed in a space where a first user exists; col 6, lines 43-61, at least discloses When the instructor selects the worker's camera, the image captured by the worker mixed reality apparatus 10 a [a first image] can be directly sent to the instructor mixed reality apparatus 10 b […] FIG. 2A shows the state of a worker 40 wearing the HMD 20 a working in the mixed reality space, with a real work object 42 and a CG 3D virtual object 43 representing the work object stereoscopically displayed [a view from a viewpoint of a first user on a virtual three-dimensional space]. The worker 40 can share the virtual object 43 with the instructor; Figs. 6A-6B and col 15, lines 41-46, at least disclose two remote participants 50 b and 50 c can share the worker's mixed reality space shown in FIG. 6A [shared by a plurality of users]),
synthesize an indication user interface (UI) used by each user to indicate a point
in the three-dimensional space with the first image (Bannai- Fig. 2A-2C and col 6, line 44 to col 7, line 12, at least disclose When the instructor selects the worker's camera, the image captured by the worker mixed reality apparatus 10 a [the first image] can be directly sent to the instructor mixed reality apparatus 10 b […] The worker 40 can share the virtual object 43 with the instructor […] The virtual object 43 (i.e., as an exemplary model) placed in the world coordinate system can be seen in a positional relationship as if it is placed next to the real work object 42 when observed through the HMD […] FIG. 2B shows a space where a remote instructor 50 is present. The instructor 50, having the HMD 20 b, can look at a stereoscopic image displayed on the display unit 23 b [an indication user interface (UI)] of the HMD 20 b. The stereoscopic image includes the work object 42 seen from the viewpoint of the camera 70 a, its 3D CG model 43, the pointer 41 a [point in the three-dimensional space] showing the stylus position of the worker 40, and the pointer 41 b [point in the three-dimensional space] showing the stylus position of the instructor 50; this suggests “a stereoscopic image displayed on the display unit 23 b” as “an indication user interface” that is used by worker 40 and instructor 50 to indicate pointer 41 a and pointer 41 b in the mixed reality space; col 7, lines 44-49, at least discloses the instructor 50 can depress the second button of the stylus to switch the image on the display unit 23 b to the image seen from the viewpoint of the worker 40. In this case, the image that the worker 40 can look through the HMD 20 a perfectly agrees with the image that the instructor 50 can look through the HMD 20 b),
determine visibility of an indicated position of a second user, the indicated position being a point in the three-dimensional space that is indicated by the second user through the indication UI (Bannai- col 5, lines 47-54, at least discloses A display unit 23 b includes a left eye display 23 b (L) and a right eye display 23 b (R) to display the images sent from the video decoding section 32 b. A position and orientation measuring section 11 b can input the position and orientation of the instructor's HMD 20 b [position of a second user] from a three-dimensional position and orientation sensor 21 b and the three-dimensional position and orientation of a stylus 41 b [indicated position being a point in the three-dimensional space that is indicated by the second user]; col 7, lines 5-12, at least discloses The instructor 50, having the HMD 20 b, can look at a stereoscopic image displayed on the display unit 23 b of the HMD 20 b. The stereoscopic image includes the work object 42 seen from the viewpoint of the camera 70 a, its 3D CG model 43, the pointer 41 a showing the stylus position of the worker 40, and the pointer 41 b showing the stylus position of the instructor 50; col 9, lines 64-67, at least discloses the three-dimensional position and orientation information (xs, ys, zs, αs, βs, γs) of the instructor's stylus 41 b can be entered into the position and orientation measuring section 11 b; col 10, lines 2-7, at least discloses The position and orientation of the stylus 41 b can be converted into the relative position and orientation seen from the instructor's viewpoint [determine visibility of an indicated position]. As a result, the instructor's stylus relative position (xd, yd, zd, αd, βd, γd)=(xs−xh, ys−yh, zs−zh, αs−αh, βs−βh, γs−γh) can be calculated), and
display the indication UI of the second user in the first image according to a determination result of the visibility (Bannai- Fig. 2B shows display unit 23 b includes a stylus 41 b; col 5, lines 47-54, at least discloses A display unit 23 b includes a left eye display 23 b (L) and a right eye display 23 b (R) to display the images sent from the video decoding section 32 b. A position and orientation measuring section 11 b can input the position and orientation of the instructor's HMD 20 b from a three-dimensional position and orientation sensor 21 b and the three-dimensional position and orientation of a stylus 41 b; col 6, lines 65-67, at least discloses A pointer 41 b represents the stylus of the instructor superimposed on the CG virtual object; col 10, lines 2-7, at least discloses The position and orientation of the stylus 41 b can be converted into the relative position and orientation seen from the instructor's viewpoint [determination result of the visibility]).
Bannai does not explicitly disclose determine visibility of an indicated position of a second user when seen from the viewpoint of the first user; change a method for displaying the indication UI of the second user in the first image.
However, Ohshima discloses
determine visibility of an indicated position of a second user when seen from the viewpoint of the first user (Ohshima- Fig. 12C and col 12, lines 20-24, at least disclose since the pointer display [an indicated position] can be seen from a position directly opposite thereto [seen from the viewpoint of the first user], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display [visibility of an indicated position of a second user]);
change a method for displaying the indication UI of the second user in the first image (Ohshima- col 11, lines 53-61, at least discloses FIGS. 11A and 11B show an example of a pointer display for setting the sight. When one point in the space shared by a plurality of players must be pointed, the pointed point must be clearly recognized by not only the player who is pointing but also other players. Furthermore, when a pointer display is large or thick, contents displayed in the field of view are complicated very much and may disturb progress of the game. Also, since one point in the space is pointed, it is not easy to recognize the distance to that point; Figs. 12A-12C and col 12, lines 9-30, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily. More specifically, as shown in FIGS. 12A to 12C, the player who is pointing can see the pointer display, as shown in FIG. 12A, and can easily recognize the pointed point in the space. By contrast, since a player who is located at a position nearly in front of the player who is pointing can see the pointer display, as shown in FIG. 12B, he or she can recognize that a pointer display is made toward his or her side [method for displaying]. Also, since the pointer display can be seen from a position directly opposite thereto [method for displaying], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's pointer display, which can be seen from different positions, to the indication user interface of the second user taught by Bannai, in order to determine visibility of an indicated position of a second user when seen from the viewpoint of the first user, the indicated position being a point in the three-dimensional space that is indicated by the second user through the indication UI, and change a method for displaying the indication UI of the second user in the first image according to a determination result of the visibility.
Doing so would allow a given operator to easily recognize the state of another operator when a plurality of operators make predetermined operation while sharing a mixed reality space.
Regarding claim 2, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein the visibility of the indicated position of the second user is determined to be good in a case where the indicated position of the second user is at a place seen from the viewpoint of the first user (Ohshima- col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily […] since the pointer display can be seen from a position directly opposite thereto [the indicated position of the second user is at a place seen from the viewpoint of the first user], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display [the indicated position of the second user is at a place seen from the viewpoint of the first user]), and determined to be poor in a case where the indicated position of the second user is at a place not seen from the viewpoint of the first user (Ohshima- col 3, lines 24-27, at least discloses when a plurality of pointing rods are present in a space, the displayed images are confusing. Also, since each rod is formed by planes, the visibility of overlapping portions is poor).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's teaching that the pointed point can be recognized more easily to Bannai's system, so that the visibility of the indicated position of the second user is determined to be good in a case where the indicated position of the second user is at a place seen from the viewpoint of the first user, and determined to be poor in a case where the indicated position of the second user is at a place not seen from the viewpoint of the first user.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
Regarding claim 4, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein the visibility of the indicated position of the second user is determined to be good in a case where the indicated position of the second user is on a front side of an object (Ohshima- Fig. 16 shows the opponent player 3000 [second user] is on a front side of puck 1500 [an object]; col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily […] since the pointer display can be seen from a position directly opposite thereto [the indicated position of the second user is at a place seen from the viewpoint of the first user], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display) and determined to be poor in a case where the indicated position of the second user is on a back side of the object, using depth information on the object included in the first image (Ohshima- Fig. 16 shows the opponent player 3000 is on a back side of puck 1500 when viewed from the left player 2000; col 3, lines 24-27, at least discloses when a plurality of pointing rods are present in a space, the displayed images are confusing. Also, since each rod is formed by planes, the visibility of overlapping portions is poor).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's teaching that the pointed point can be recognized more easily to Bannai's system, so that the visibility of the indicated position of the second user is determined to be good in a case where the indicated position of the second user is on a front side of an object and determined to be poor in a case where the indicated position of the second user is on a back side of the object, using depth information on the object included in the first image.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
Regarding claim 5, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein
the indication UI includes a pointer representing an indicated position that is a point in the three-dimensional space indicated by a user and a ray representing an indicated direction that is a direction indicated by the user (Bannai- Fig. 2B shows pointer 41 b representing an indicated position that is a point in the three-dimensional space; col 6, lines 65-67, at least discloses pointer 41 b represents the stylus of the instructor superimposed on the CG virtual object; Ohshima- col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily […] since the pointer display can be seen from a position directly opposite thereto [direction], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display), and
a ray of the second user is displayed in a case where the visibility of the indicated
position of the second user is determined to be poor (Ohshima- col 3, lines 24-27, at least discloses when a plurality of pointing rods are present in a space, the displayed images are confusing. Also, since each rod is formed by planes, the visibility of overlapping portions is poor), and
the ray of the second user is hidden in a case where the visibility of the indicated position of the second user is determined to be good (Ohshima- col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily […] since the pointer display can be seen from a position directly opposite thereto, as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's teaching that the pointed point can be recognized more easily to Bannai's system, so that the indication UI includes a pointer representing an indicated position that is a point in the three-dimensional space indicated by a user and a ray representing an indicated direction that is a direction indicated by the user, a ray of the second user is displayed in a case where the visibility of the indicated position of the second user is determined to be poor, and the ray of the second user is hidden in a case where the visibility of the indicated position of the second user is determined to be good.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
Regarding claim 6, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein
the indication UI includes a pointer representing an indicated position that is a point in the three-dimensional space indicated by a user and a ray representing an indicated direction that is a direction indicated by the user (Bannai- Fig. 2B shows pointer 41 b representing an indicated position that is a point in the three-dimensional space; col 6, lines 65-67, at least discloses pointer 41 b represents the stylus of the instructor superimposed on the CG virtual object; Ohshima- col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily […] since the pointer display can be seen from a position directly opposite thereto [direction], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display),
a method for displaying a pointer of the second user is changed in a case where
the visibility of the indicated position of the second user is determined to be poor (Ohshima- col 3, lines 24-27, at least discloses when a plurality of pointing rods are present in a space, the displayed images are confusing. Also, since each rod is formed by planes, the visibility of overlapping portions is poor), and
a ray of the second user is hidden in a case where the visibility of the indicated
position of the second user is determined to be good (Ohshima- col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily […] since the pointer display can be seen from a position directly opposite thereto, as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's teaching that the visibility of overlapping portions is poor to Bannai's system, so that a method for displaying a pointer of the second user is changed in a case where the visibility of the indicated position of the second user is determined to be poor.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
Regarding claim 7, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein control is performed so as to display the indication UI of the second user only when the first user performs a predetermined operation (Bannai- col 6, line 53 to col 7, line 12, at least discloses The worker 40 can share the virtual object 43 with the instructor. The instructor can use the virtual object 43 to instruct the contents or examples of the work […] A pointer 41 b represents the stylus of the instructor superimposed on the CG virtual object […] The stereoscopic image includes the work object 42 seen from the viewpoint of the camera 70 a, its 3D CG model 43, the pointer 41 a showing the stylus position of the worker 40, and the pointer 41 b showing the stylus position of the instructor 50).
Regarding claim 8, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein
in a case where indication UIs of the plurality of users are superimposed on each
other on the first image (Bannai- Fig. 2C shows the styluses 41 a and 41 b of the users are superimposed on each other), a method for displaying the indication UIs superimposed on each other is changed (Ohshima- col 12, lines 16-24, at least disclose By contrast, since a player who is located at a position nearly in front of the player who is pointing can see the pointer display, as shown in FIG. 12B, he or she can recognize that a pointer display is made toward his or her side [method for displaying]. Also, since the pointer display can be seen from a position directly opposite thereto [method for displaying], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's changed pointer display to Bannai's system, so that, in a case where indication UIs of the plurality of users are superimposed on each other on the first image, a method for displaying the indication UIs superimposed on each other is changed.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
Regarding claim 9, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein
the visibility of the indicated position of the second user is determined to be poor in a case where a distance between the viewpoint of the first user and the indicated position of the second user is longer than a threshold (Bannai- col 13, lines 35-38, at least discloses In step S578, it is determined based on the comparison result whether there is any virtual object within a predetermined distance (i.e., whether the distance is within a threshold); Ohshima- col 3, lines 24-27, at least discloses when a plurality of pointing rods are present in a space, the displayed images are confusing. Also, since each rod is formed by planes, the visibility of overlapping portions is poor), and determined to be good in a case where the distance is shorter than the threshold (Bannai- col 13, lines 35-38, at least discloses In step S578, it is determined based on the comparison result whether there is any virtual object within a predetermined distance (i.e., whether the distance is within a threshold); Ohshima- col 11, lines 27-28, at least discloses A direction that minimizes the distance to the other player is computed (step S203); col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's teachings regarding distance to the pointed point to Bannai's system, so that the visibility of the indicated position of the second user is determined to be poor in a case where a distance between the viewpoint of the first user and the indicated position of the second user is longer than a threshold, and determined to be good in a case where the distance is shorter than the threshold.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
Regarding claim 10, Bannai in view of Ohshima, discloses the information processing system according to claim 9, and further discloses wherein the threshold is determined on a basis of a distance between a viewpoint of the second user and the indicated position of the second user (Ohshima- Fig. 16 shows distance between a viewpoint of the second user and the puck 1500 as indicated position of the second user; Bannai- col 13, lines 35-38, at least discloses In step S578, it is determined based on the comparison result whether there is any virtual object within a predetermined distance (i.e., whether the distance is within a threshold); Ohshima- col 11, lines 27-28, at least discloses A direction that minimizes the distance to the other player is computed (step S203)).
Regarding claim 11, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein
the visibility of the indicated position of the second user is determined to be good
in a case where an angle formed by a visual line of the first user and a line connecting the
viewpoint of the first user and the indicated position of the second user falls within a predetermined angle range, and determined to be poor in a case where the angle falls outside the predetermined angle range (Ohshima- col 11, lines 1-13, at least discloses In a game in which a player set his or her sight on a target like in this embodiment, since the visual axis direction is normally close to the line-of-sight direction, the visual axis information is useful upon planning the game strategy. In order to implement such changes in display, for example, as shown in FIG. 9, deviation angle a between the direction of player 1 with respect to player 2 and the visual axis direction is computed from location information of the individual players and visual axis information of player 1, and display is changed in correspondence with changes in angle a; col 12, lines 9-24, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily […] since the pointer display can be seen from a position directly opposite thereto, as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display; col 3, lines 24-27, at least discloses when a plurality of pointing rods are present in a space, the displayed images are confusing. Also, since each rod is formed by planes, the visibility of overlapping portions is poor).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply Ohshima's deviation angle to Bannai's system, so that the visibility of the indicated position of the second user is determined to be good in a case where an angle formed by a visual line of the first user and a line connecting the viewpoint of the first user and the indicated position of the second user falls within a predetermined angle range, and determined to be poor in a case where the angle falls outside the predetermined angle range.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
Regarding claim 12, Bannai in view of Ohshima, discloses the information processing system according to claim 11, and further discloses wherein
the predetermined angle range is determined on a basis of a view angle of the first user (Ohshima- col 11, lines 1-13, at least discloses In a game in which a player set his or her sight on a target like in this embodiment, since the visual axis direction is normally close to the line-of-sight direction, the visual axis information is useful upon planning the game strategy. In order to implement such changes in display, for example, as shown in FIG. 9, deviation angle a between the direction of player 1 with respect to player 2 and the visual axis direction is computed from location information of the individual players and visual axis information of player 1, and display is changed in correspondence with changes in angle a).
Regarding claim 15, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein an annotation is added to the indication UI of the second user according to the determination result of the visibility (Bannai- col 19, lines 43-46, at least discloses As an alternative to the CG of a face, the identifier can be text data representing another participant's name or a color identifier. When plural remote participants share the same camera viewpoint, their identifiers should be displayed in a way that discriminates one from the other; col 10, lines 2-7, at least discloses The position and orientation of the stylus 41 b can be converted into the relative position and orientation seen from the instructor's viewpoint [determination result of the visibility]).
Regarding claim 16, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein a message for letting the first user know indication by the second user through the indication UI is displayed according to the determination result of the visibility (Bannai- col 19, lines 43-46, at least discloses As an alternative to the CG of a face, the identifier can be text data representing another participant's name or a color identifier. When plural remote participants share the same camera viewpoint, their identifiers should be displayed in a way that discriminates one from the other; col 10, lines 2-7, at least discloses The position and orientation of the stylus 41 b can be converted into the relative position and orientation seen from the instructor's viewpoint [determination result of the visibility]).
Regarding claim 17, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein a method for displaying an object at which the indicated position of the second user is located is changed according to the determination result of the visibility (Bannai- Fig. 2B shows display unit 23 b includes a stylus 41 b; col 5, lines 47-54, at least discloses A display unit 23 b includes a left eye display 23 b (L) and a right eye display 23 b (R) to display the images sent from the video decoding section 32 b. A position and orientation measuring section 11 b can input the position and orientation of the instructor's HMD 20 b from a three-dimensional position and orientation sensor 21 b and the three-dimensional position and orientation of a stylus 41 b; col 6, lines 65-67, at least discloses A pointer 41 b represents the stylus of the instructor superimposed on the CG virtual object; col 10, lines 2-7, at least discloses The position and orientation of the stylus 41 b can be converted into the relative position and orientation seen from the instructor's viewpoint [determination result of the visibility]; Ohshima- col 11, lines 53-61, at least discloses FIGS. 11A and 11B show an example of a pointer display for setting the sight. When one point in the space shared by a plurality of players must be pointed, the pointed point must be clearly recognized by not only the player who is pointing but also other players. Furthermore, when a pointer display is large or thick, contents displayed in the field of view are complicated very much and may disturb progress of the game. Also, since one point in the space is pointed, it is not easy to recognize the distance to that point; Figs. 12A-12C and col 12, lines 9-30, at least disclose By making such pointer display, a portion (irrespective of whether it is in a virtual or real space) that overlaps the pointer display is not hidden by the pointer display, and the distance to the pointed point and the pointed direction can be recognized more easily. More specifically, as shown in FIGS. 12A to 12C, the player who is pointing can see the pointer display, as shown in FIG. 12A, and can easily recognize the pointed point in the space. By contrast, since a player who is located at a position nearly in front of the player who is pointing can see the pointer display, as shown in FIG. 12B, he or she can recognize that a pointer display is made toward his or her side [method for displaying]. Also, since the pointer display can be seen from a position directly opposite thereto [method for displaying], as shown in FIG. 12C, a target player can recognize it in a game application in which a sight is set on an opponent player using the pointer display).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ohshima, and apply the method for displaying an object to Bannai’s teachings, in order that a method for displaying an object at which the indicated position of the second user is located is changed according to the determination result of the visibility.
The same motivation that was utilized in the rejection of claim 1 applies equally to this claim.
The method of claim 19 is similar in scope to the functions performed by the system of claim 1 and therefore claim 19 is rejected under the same rationale.
Regarding claim 20, all claim limitations are set forth as in claim 1, embodied in a non-transitory computer readable medium that stores a program, and the claim is rejected under the same rationale as claim 1. Bannai in view of Ohshima, discloses a non-transitory computer readable medium that stores a program, wherein the program causes a computer to execute the information processing method of claim 1 (Bannai- col 20, lines 8-27, at least discloses a system or device with a storage medium that stores program code (software) for implementing the functions [...] and causing a computer (or a CPU, MPU or the like) of the system or device to read the program code from the storage medium and then to execute the program code).
5. Claims 3, 14 and 18 are rejected under 35 U.S.C. 103 as being unpatentable over Bannai in view of Ohshima, further in view of Ookubo et al. (machine translation of JPH03119478A cited by applicant, hereinafter "Ookubo").
Regarding claim 3, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses a method for displaying the indication UI of the second user in the first image (see Claim 1 rejection for detailed analysis).
The prior art does not explicitly disclose, but Ookubo discloses
displaying the indication UI of the user in the first image is changed at a time point in a case where a same determination result is continued for a predetermined time after the determination result of the visibility is changed (Ookubo- page 5, 1st paragraph, at least discloses The A-pointer generating unit 11 detects the change in the pointed position between time t-1 and time t based on the operation of user A, and converts it into coordinate values (XA, YA) that represent the change in position on the screen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ookubo, and apply detecting the change in the pointed position between time t-1 and time t to Bannai/Ohshima’s teachings, in order that the method for displaying the indication UI of the second user in the first image is changed at a time point in a case where a same determination result is continued for a predetermined time after the determination result of the visibility is changed.
Doing so would smoothly perform collaborative work by a group such as office work, design work, software development, etc. by superimposing and displaying a pointer held by each user on the display devices of all users.
Regarding claim 14, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein the indication UI of the second user is changed according to the determination result of the visibility (see Claim 1 rejection for detailed analysis).
The prior art does not explicitly disclose, but Ookubo discloses
at least one of a size, a shape, and transparency of the indication (Ookubo- Fig. 2 shows the pointer display method for each user; page 5, 1st paragraph, at least discloses The pointer display information registration unit 16 stores pointer display information for each user, that is, the size, shape, color, etc. of the pointer).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ookubo, and apply the size and shape of the pointer to Bannai/Ohshima’s teachings, in order that at least one of a size, a shape, and transparency of the indication UI of the second user is changed according to the determination result of the visibility.
The same motivation that was utilized in the rejection of claim 3 applies equally to this claim.
Regarding claim 18, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses that a method for displaying the indication UI is changed (see Claim 1 rejection for detailed analysis).
The prior art does not explicitly disclose, but Ookubo discloses
a time at which the indicated position of the second user is determined to be seen from the viewpoint of the first user is measured (Ookubo- page 5, 1st paragraph, at least discloses The A-pointer generating unit 11 detects the change in the pointed position between time t-1 and time t based on the operation of user A, and converts it into coordinate values (XA, YA) that represent the change in position on the screen), and a method for displaying the indication UI is changed according to the measured time (Ookubo- page 5, 1st paragraph, at least discloses The A-pointer generating unit 11 detects the change in the pointed position between time t-1 and time t based on the operation of user A, and converts it into coordinate values (XA, YA) that represent the change in position on the screen).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai to incorporate the teachings of Ookubo, and apply the change in the pointed position between time t-1 and time t to Bannai/Ohshima’s teachings, in order that a time at which the indicated position of the second user is determined to be seen from the viewpoint of the first user is measured, and a method for displaying the indication UI is changed according to the measured time.
The same motivation that was utilized in the rejection of claim 3 applies equally to this claim.
6. Claim 13 is rejected under 35 U.S.C. 103 as being unpatentable over Bannai in view of Ohshima, further in view of Hoover et al. (US-2022/0179220-A1, hereinafter "Hoover").
Regarding claim 13, Bannai in view of Ohshima, discloses the information processing system according to claim 1, and further discloses wherein a determination is performed as to whether the viewpoint of the first user is on a front side or a back side of an object at the indicated position of the second user (Ohshima- Fig. 16 shows two players 2000 and 3000 confront each other via a table 1000 while holding with their hands control boxes (260L and 260R), which are used as mallets. The two players can directly observe the surface of the table 1000 even with the HMDs 210L and 210R. The two players move the real control boxes held in their hands to hit the virtual puck 1500 displayed by each other's image processing systems. The viewpoint of HMDs 210L is on a front side or a back side of the virtual puck 1500),
the indicated position of the second user is determined to be at a place seen from the viewpoint of the first user in a case where the viewpoint of the first user is on the front side of the object (Ohshima- Fig. 16 shows two players 2000 and 3000 confront each other via a table 1000. The indicated position of the opponent player 3000 is at a place seen from the viewpoint of the player 2000 in a case where the viewpoint of the player 2000 is on the front side of the virtual puck 1500) and an angle formed by a visual line of the first user and a line connecting the viewpoint of the first user and the indicated position of the second user falls within a predetermined angle range (Ohshima- col 11, lines 1-13, at least discloses In a game in which a player set his or her sight on a target like in this embodiment, since the visual axis direction is normally close to the line-of-sight direction, the visual axis information is useful upon planning the game strategy. In order to implement such changes in display, for example, as shown in FIG. 9, deviation angle a between the direction of player 1 with respect to player 2 and the visual axis direction is computed from location information of the individual players and visual axis information of player 1, and display is changed in correspondence with changes in angle a), and the indicated position of the second user is determined to fall within the view of the first user but is determined to be hidden from the viewpoint of the first user since the indicated position of the second user is occluded by the object in a case where the viewpoint of the first user is on the back side of the object and the angle falls within the predetermined angle range (Ohshima- col 10, lines 16-28, at least discloses As a virtual object to be superposed on the arm to which the interactive input device is attached, an object that simply covers the arm portion may be used. Instead, when an object that represents a function in the game is used, an extra effect can be obtained in addition to hiding of the attached devices; col 11, lines 1-13, at least discloses In a game in which a player set his or her sight on a target like in this embodiment, since the visual axis direction is normally close to the line-of-sight direction, the visual axis information is useful upon planning the game strategy. In order to implement such changes in display, for example, as shown in FIG. 9, deviation angle a between the direction of player 1 with respect to player 2 and the visual axis direction is computed from location information of the individual players and visual axis information of player 1, and display is changed in correspondence with changes in angle a).
The prior art does not explicitly disclose, but Hoover discloses
the viewpoint of the first user is on a basis of a surface normal of the object (Hoover- ¶0032, at least discloses The AR system may further orient the virtual object to align the surface normal of the virtual object with user's direction of gaze; ¶0107, at least discloses an object may have an origin point, a down vector in the direction of gravity, and a forward vector in the direction of the surface normal of the object. For example, the surface normal of a display (e.g., a virtual TV) may indicate the direction from which displayed images can be viewed (rather than the direction from which the back of the display can be seen)).
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified Bannai/Ohshima to incorporate the teachings of Hoover, and apply the surface normal to Bannai/Ohshima’s teachings, in order that a determination is performed, on a basis of a surface normal of the object, as to whether the viewpoint of the first user is on a front side or a back side of an object at the indicated position of the second user.
Doing so would produce new environments where physical and virtual objects co-exist and interact in real time.
Conclusion
7. The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. They are as recited in the attached PTO-892 form.
8. Any inquiry concerning this communication or earlier communications from the examiner should be directed to MICHAEL LE whose telephone number is (571)272-5330. The examiner can normally be reached 9am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Kent Chang, can be reached at (571) 272-7667. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MICHAEL LE/Primary Examiner, Art Unit 2614