DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Election/Restrictions
Applicant’s election without traverse of claims 1-10 of Group I in the reply filed on 11/10/2025 is acknowledged.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 5-10, 21, and 24-30 are rejected under 35 U.S.C. 103 as being unpatentable over Lyren et al. (US 2016/0109940, hereinafter Lyren) in view of Ramsby et al. (US 2015/0268821, hereinafter Ramsby).
Regarding claim 1, Lyren teaches a head-mountable device (fig. 7A shows user 730 using wearable electronic device 740; [0039]: Consider an example in which two users at a geographical location wear wearable electronic devices (WEDs) that capture, record, and transmit information concerning the users, the ambient environment, and people and objects at the geographical location; [0124]: a wearable electronic device 710; [0195]: “wearable electronic device” is a portable electronic device that is worn on or attached to a person. Examples of such devices include, but are not limited to head-mounted displays; claim 9: second electronic glasses) comprising:
a first camera (claim 9: second electronic glasses including a camera) configured to capture first view data (fig. 6A and [0116]: a second user 650 with a second electronic device 660 capturing the image of the building that is in the user’s line of sight; claim 9: second electronic glasses that include a processor in communication with the wireless network, a display, and a camera that captures an image of the object without the person since the person is located behind the object and not within a field of view of a second user wearing the second electronic glasses);
a first display (claim 9: second electronic glasses including a display) configured to provide a first graphical user interface ([0178]: an electronic device 1000 that includes one or more interfaces 1140 such as graphical user interface) showing a first view of an object (building corresponds to the object; second user views the side of the building on an area of the display of the second electronic glasses; [0030]: a user views an area on a display of the first electronic device and interacts with the first electronic device through a user interface to select an object being displayed as the target; [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view), the first view being based on the first view data (the building 630 is in the line of sight of the second electronic device and thus viewable to the second user; fig. 6A and [0116]: A second user 650 with a second electronic device 660 has an obstructed view or line of sight 670 to the target 620 since a side 675 of the building 630 blocks or impedes the second user 650 and/or second electronic device 660 from viewing, sensing, detecting, and/or perceiving the target 620); and
a communication interface (wireless communication 1060, fig. 10) configured to receive second view data (information about the target and the building as viewed from the line of sight of the first user 600 using first electronic device 610 is shared with the second electronic device 660 of the second user 650; [0043]: A second electronic device (in communication with the first electronic device); [0063]: second user located in the building wears a second WED that communicates with the first WED; [0108]: the first electronic device transmits the target and/or area of the target to a second electronic device (such as a WED, WEG, HPED, computer, server, electronic scope, network location, electronic binoculars, smartphone, network, memory, database, etc.). As another example, the target and/or area of the target are stored to memory or provided to a network location and retrieved or received with the second electronic device; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. 
By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device; [0165]: the electronic system that includes the first and second electronic devices and computer that communicate with each other over the network) from an additional head-mountable device (Another user 730 wears a wearable electronic device 740, [0124]; [0039]: Consider an example in which two users at a geographical location wear wearable electronic devices (WEDs) that capture, record, and transmit information concerning the users, the ambient environment, and people and objects at the geographical location; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. 
By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device), the additional head-mountable device comprising a second display (claim 9: first electronic glasses including a display) configured to provide a second graphical user interface ([0178]: an electronic device 1000 that includes one or more interfaces 1140 such as graphical user interface) showing a second view of the object (first user views the building and target that are in the line of sight of the first user 600 and are displayed on an area of the display of the first electronic glasses 610; [0030]: a user views an area on a display of the first electronic device and interacts with the first electronic device through a user interface to select an object being displayed as the target; [0120]: As shown in FIG. 6A, the first user 600 and first electronic device 610 view a front orientation or front profile of the target 620 since the target (which is a person) faces the first user and first electronic device. Viewing the target with this orientation, the first electronic device captures an image, photo, or video of the target and determines other information), the second view data indicating a feature (target 620, fig. 6A) of the second view of the object (first user views the building and target that are in the line of sight of the first user 600 and are displayed on graphical user interface area of the display of the first electronic glasses 610, fig. 6A and [0120]),
wherein the first display is further configured to provide the first graphical user interface showing an indicator (image or model 680 of the target 620 as shown in fig. 6B indicates the target 620 that was originally obstructed from the second user 650’s view) located at the object and being based on the second view data (as shown in fig. 6B, an image or model 680 of the target 620, which was originally obstructed from the line of sight of the second user 650, is shared by the first electronic device 610 used by the first user 600 and is displayed on an area of the display 670 of the second electronic device 660; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device; [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view that includes an image or model 680 of the target and additional information 690 about the target; [0121]: As shown in FIG. 6B, the second electronic device 660 displays a side orientation or side profile of the target 620 since a side of the target (which is a person) faces the second user and second electronic device. 
The second electronic device displays an image of the target in an orientation or profile that matches an orientation or a profile of what the second electronic device and/or second user would see if the view or line of sight to the target were not obstructed; [0173]: The second electronic device switches views from displaying the 3D images of the person, the weapon, and the location over the object in the field of view of the second user and/or second electronic device to displaying real time images from the first electronic glasses that show the person holding the weapon at the location in the field of view of the first user and/or first electronic device).
Lyren does not explicitly teach the display configured to provide a graphical user interface for showing an object.
Ramsby teaches the display configured to provide a graphical user interface for showing an object ([0012]: The HMD device 100 includes a see-through display 102 and a controller 104 … The see-through display 102 may be configured to visually augment an appearance of a real-world, physical environment to a wearer viewing the physical environment through the transparent display. For example, the appearance of the physical environment may be augmented by graphical content that is presented via the transparent display 102 to create a mixed reality environment. In one example, the display may be configured to display one or more UI objects on a graphical user interface. In some embodiments, the UI objects presented on the graphical user interface may be virtual objects overlaid in front of the real-world environment. Likewise, in some embodiments, the UI objects presented on the graphical user interface may incorporate elements of real-world objects of the real-world environment seen through the transparent display 102; claim 19: A head-mounted display device, comprising: a display configured to display a graphical user interface including one or more user interface objects). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Lyren with Ramsby’s teaching of a display configured to provide a graphical user interface for displaying objects, because such a combination can provide suitable feedback to a user as the user interacts with the user interface via eye gaze (Ramsby, [0043]).
Regarding claim 21, Lyren teaches a head-mountable device (fig. 7A shows user 730 using wearable electronic device 740; [0039]: Consider an example in which two users at a geographical location wear wearable electronic devices (WEDs) that capture, record, and transmit information concerning the users, the ambient environment, and people and objects at the geographical location; [0124]: a wearable electronic device 710; [0195]: “wearable electronic device” is a portable electronic device that is worn on or attached to a person. Examples of such devices include, but are not limited to head-mounted displays; claim 9: second electronic glasses) comprising:
a first camera (claim 9: second electronic glasses including a camera) configured to capture first view data (fig. 6A and [0116]: a second user 650 with a second electronic device 660 capturing the image of the building that is in the user’s line of sight; claim 9: second electronic glasses that include a processor in communication with the wireless network, a display, and a camera that captures an image of the object without the person since the person is located behind the object and not within a field of view of a second user wearing the second electronic glasses);
a display (claim 9: second electronic glasses including a display) configured to provide a graphical user interface ([0178]: an electronic device 1000 that includes one or more interfaces 1140 such as graphical user interface) showing a first view of an object (building corresponds to the object; second user views the side of the building on an area of the display of the second electronic glasses; [0030]: a user views an area on a display of the first electronic device and interacts with the first electronic device through a user interface to select an object being displayed as the target; [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view), the first view being based on the first view data (the building 630 is in the line of sight of the second electronic device and thus viewable to the second user; fig. 6A and [0116]: A second user 650 with a second electronic device 660 has an obstructed view or line of sight 670 to the target 620 since a side 675 of the building 630 blocks or impedes the second user 650 and/or second electronic device 660 from viewing, sensing, detecting, and/or perceiving the target 620); and
a communication interface (wireless communication 1060, fig. 10) configured to receive second view data (information about the target and the building as viewed from the line of sight of the first user 600 using first electronic device 610 is shared with the second electronic device 660 of the second user 650; [0043]: A second electronic device (in communication with the first electronic device); [0063]: second user located in the building wears a second WED that communicates with the first WED; [0108]: the first electronic device transmits the target and/or area of the target to a second electronic device (such as a WED, WEG, HPED, computer, server, electronic scope, network location, electronic binoculars, smartphone, network, memory, database, etc.). As another example, the target and/or area of the target are stored to memory or provided to a network location and retrieved or received with the second electronic device; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. 
By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device; [0165]: the electronic system that includes the first and second electronic devices and computer that communicate with each other over the network) from an additional head-mountable device (Another user 730 wears a wearable electronic device 740, [0124]; [0039]: Consider an example in which two users at a geographical location wear wearable electronic devices (WEDs) that capture, record, and transmit information concerning the users, the ambient environment, and people and objects at the geographical location; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device), the additional head-mountable device comprising a second camera (claim 9: first electronic glasses including a camera) configured to capture a second view of the object (claim 9: first electronic glasses that include a camera that captures an image of an object and a person in a field of view of a first user wearing the first electronic glasses), the second view data indicating a feature (target 620, fig. 6A) of the second view of the object (first user views the building and target that are in the line of sight of the first user 600 and are displayed on graphical user interface area of the display of the first electronic glasses 610, fig. 6A and [0120]),
wherein the display is further configured to provide the graphical user interface showing an indicator (image or model 680 of the target 620 as shown in fig. 6B indicates the target 620 that was originally obstructed from the second user 650’s view) located at the object and being based on the second view data (as shown in fig. 6B, an image or model 680 of the target 620, which was originally obstructed from the line of sight of the second user 650, is shared by the first electronic device 610 used by the first user 600 and is displayed on an area of the display 670 of the second electronic device 660; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device; [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view that includes an image or model 680 of the target and additional information 690 about the target; [0121]: As shown in FIG. 6B, the second electronic device 660 displays a side orientation or side profile of the target 620 since a side of the target (which is a person) faces the second user and second electronic device. 
The second electronic device displays an image of the target in an orientation or profile that matches an orientation or a profile of what the second electronic device and/or second user would see if the view or line of sight to the target were not obstructed; [0173]: The second electronic device switches views from displaying the 3D images of the person, the weapon, and the location over the object in the field of view of the second user and/or second electronic device to displaying real time images from the first electronic glasses that show the person holding the weapon at the location in the field of view of the first user and/or first electronic device).
Lyren does not explicitly teach the display configured to provide a graphical user interface for showing an object.
Ramsby teaches the display configured to provide a graphical user interface for showing an object ([0012]: The HMD device 100 includes a see-through display 102 and a controller 104 … The see-through display 102 may be configured to visually augment an appearance of a real-world, physical environment to a wearer viewing the physical environment through the transparent display. For example, the appearance of the physical environment may be augmented by graphical content that is presented via the transparent display 102 to create a mixed reality environment. In one example, the display may be configured to display one or more UI objects on a graphical user interface. In some embodiments, the UI objects presented on the graphical user interface may be virtual objects overlaid in front of the real-world environment. Likewise, in some embodiments, the UI objects presented on the graphical user interface may incorporate elements of real-world objects of the real-world environment seen through the transparent display 102; claim 19: A head-mounted display device, comprising: a display configured to display a graphical user interface including one or more user interface objects). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Lyren with Ramsby’s teaching of a display configured to provide a graphical user interface for displaying objects, because such a combination can provide suitable feedback to a user as the user interacts with the user interface via eye gaze (Ramsby, [0043]).
Regarding claim 28, Lyren teaches a head-mountable device (fig. 7A shows user 730 using wearable electronic device 740; [0039]: Consider an example in which two users at a geographical location wear wearable electronic devices (WEDs) that capture, record, and transmit information concerning the users, the ambient environment, and people and objects at the geographical location; [0124]: a wearable electronic device 710; [0195]: “wearable electronic device” is a portable electronic device that is worn on or attached to a person. Examples of such devices include, but are not limited to head-mounted displays; claim 9: second electronic glasses) comprising:
a first camera (claim 9: second electronic glasses including a camera) configured to capture first view data (fig. 6A and [0116]: a second user 650 with a second electronic device 660 capturing the image of the building that is in the user’s line of sight; claim 9: second electronic glasses that include a processor in communication with the wireless network, a display, and a camera that captures an image of the object without the person since the person is located behind the object and not within a field of view of a second user wearing the second electronic glasses);
a display (claim 9: second electronic glasses including a display) configured to provide a graphical user interface ([0178]: an electronic device 1000 that includes one or more interfaces 1140 such as graphical user interface) showing a first view of a first feature (landscape at the base of the building corresponds to the first feature; landscape at the base of the side of the building as viewed from line of sight 670 of the second user 650 using a second electronic device 660, fig. 6A) and a second feature (window of the building corresponds to the second feature; rectangular window on the side of the building as viewed from line of sight 670 of the second user 650 using a second electronic device 660, fig. 6A) of an object (building corresponds to the object; second user views the side of the building on an area of the display of the second electronic glasses; [0030]: a user views an area on a display of the first electronic device and interacts with the first electronic device through a user interface to select an object being displayed as the target; [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view), the first view being based on the first view data (the building 630 is in the line of sight of the second electronic device and thus viewable to the second user; fig. 6A and [0116]: A second user 650 with a second electronic device 660 has an obstructed view or line of sight 670 to the target 620 since a side 675 of the building 630 blocks or impedes the second user 650 and/or second electronic device 660 from viewing, sensing, detecting, and/or perceiving the target 620); and
a communication interface (wireless communication 1060, fig. 10) configured to receive second view data (information about the target and the building as viewed from the line of sight of the first user 600 using first electronic device 610 is shared with the second electronic device 660 of the second user 650; [0043]: A second electronic device (in communication with the first electronic device); [0063]: second user located in the building wears a second WED that communicates with the first WED; [0108]: the first electronic device transmits the target and/or area of the target to a second electronic device (such as a WED, WEG, HPED, computer, server, electronic scope, network location, electronic binoculars, smartphone, network, memory, database, etc.). As another example, the target and/or area of the target are stored to memory or provided to a network location and retrieved or received with the second electronic device; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. 
By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device; [0165]: the electronic system that includes the first and second electronic devices and computer that communicate with each other over the network) from an additional head-mountable device (Another user 730 wears a wearable electronic device 740, [0124]; [0039]: Consider an example in which two users at a geographical location wear wearable electronic devices (WEDs) that capture, record, and transmit information concerning the users, the ambient environment, and people and objects at the geographical location; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device), the additional head-mountable device comprising a second camera (claim 9: first electronic glasses including a camera) configured to capture a second view (claim 9: first electronic glasses that include a camera that captures an image of an object and a person in a field of view of a first user wearing the first electronic glasses) of the second feature (windows of the building correspond to the second feature; as shown in fig. 6A, user 600 using the first electronic device views the front side of the building with the front orientation or front profile of the target 620 in one of the windows) of the object (fig. 6A; [0120]: As shown in FIG. 6A, the first user 600 and first electronic device 610 view a front orientation or front profile of the target 620 since the target (which is a person) faces the first user and first electronic device. Viewing the target with this orientation, the first electronic device captures an image, photo, or video of the target and determines other information (such as an identity of the target with a facial recognition program, a GPS location of the target, a distance to the target, an identity or existence of weapons or other objects with or near the target with an object recognition program, and an activity of the target)),
wherein the display is further configured to provide the graphical user interface showing, based on the second view data (as shown in fig. 6B, an image or model 680 of the target 620, which was originally obstructed from the line of sight of the second user 650, is shared by the first electronic device 610 used by the first user 600 and is displayed on an area of the display 670 of the second electronic device 660; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device), an indicator (image or model 680 of the target 620 as shown in fig. 6B indicates the target 620 that was originally obstructed from the second user 650’s view) located at the object (as shown in fig. 6B, an image or model 680 of the target 620, which was originally obstructed from the line of sight of the second user 650, is shared by the first electronic device 610 used by the first user 600 and is displayed on an area of the display 670 of the second electronic device 660; [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view that includes an image or model 680 of the target and additional information 690 about the target; [0121]: As shown in FIG. 6B, the second electronic device 660 displays a side orientation or side profile of the target 620 since a side of the target (which is a person) faces the second user and second electronic device. 
The second electronic device displays an image of the target in an orientation or profile that matches an orientation or a profile of what the second electronic device and/or second user would see if the view or line of sight to the target were not obstructed; [0173]: The second electronic device switches views from displaying the 3D images of the person, the weapon, and the location over the object in the field of view of the second user and/or second electronic device to displaying real time images from the first electronic glasses that show the person holding the weapon at the location in the field of view of the first user and/or first electronic device).
Lyren does not explicitly teach the display configured to provide a graphical user interface for showing an object.
Ramsby teaches the display configured to provide a graphical user interface for showing an object ([0012]: The HMD device 100 includes a see-through display 102 and a controller 104 … The see-through display 102 may be configured to visually augment an appearance of a real-world, physical environment to a wearer viewing the physical environment through the transparent display. For example, the appearance of the physical environment may be augmented by graphical content that is presented via the transparent display 102 to create a mixed reality environment. In one example, the display may be configured to display one or more UI objects on a graphical user interface. In some embodiments, the UI objects presented on the graphical user interface may be virtual objects overlaid in front of the real-world environment. Likewise, in some embodiments, the UI objects presented on the graphical user interface may incorporate elements of real-world objects of the real-world environment seen through the transparent display 102; claim 19: A head-mounted display device, comprising: a display configured to display a graphical user interface including one or more user interface objects). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system of Lyren with Ramsby’s teaching of a display configured to provide a graphical user interface for displaying objects, because such a combination can provide suitable feedback to a user as the user interacts with the user interface via eye gaze (Ramsby, [0043]).
Regarding claim 29, the combination of Lyren and Ramsby teaches the head-mountable device of claim 28, wherein the second view does not include the first feature (landscape at the base of the building corresponds to the first feature) of the object (as shown in fig. 6A, the first user 600, viewing along the line of sight 640 using the first electronic device 610, views the front side of the building and therefore cannot view the landscape as seen by the second user 650 at the base of the side of the building; information about the target and the building as viewed from the line of sight of the first user 600 using the first electronic device 610 is shared with the second electronic device 660 of the second user 650; [0043]: A second electronic device (in communication with the first electronic device); [0063]: second user located in the building wears a second WED that communicates with the first WED; [0108]: the first electronic device transmits the target and/or area of the target to a second electronic device (such as a WED, WEG, HPED, computer, server, electronic scope, network location, electronic binoculars, smartphone, network, memory, database, etc.). As another example, the target and/or area of the target are stored to memory or provided to a network location and retrieved or received with the second electronic device; [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. 
By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device; [0165]: the electronic system that includes the first and second electronic devices and computer that communicate with each other over the network).
Regarding claim 30, the combination of Lyren and Ramsby teaches the head-mountable device of claim 28, wherein the graphical user interface does not include an indicator located at the first feature of the object (image or model 680 of the target 620 as shown in fig. 6B indicates the target 620 that was originally obstructed from the second user 650’s view; as shown in fig. 6B, the image or model 680 of the target 620 is not located at the landscape at the base of the building).
Regarding claim 2, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein: the first display is an opaque display (Lyren – claim 9: second electronic glasses includes a display; Ramsby – [0011]: It will be appreciated an HMD device may take any other suitable form in which a transparent, semi-transparent, and/or non-transparent display is supported in front of a viewer's eye or eyes); and the second display is a translucent display providing a view to a physical environment (Lyren – claim 9: first electronic glasses includes a display; Ramsby – [0011]: It will be appreciated an HMD device may take any other suitable form in which a transparent, semi-transparent, and/or non-transparent display is supported in front of a viewer's eye or eyes).
Regarding claim 5, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein: the first display has a first size (Lyren - wearable electronic device 740 with display 780 is shown in fig. 7D; it is inherent that display 780 has a size); and the second display has a second size (Lyren - wearable electronic device 710 with display 760 is shown in fig. 7C; it is inherent that display 760 has a size). Further, the sizes of display 760 and display 780 might be equal, or one of the two displays might be larger than the other. However, it would have been prima facie obvious for the size of display 760 to be smaller than the size of display 780. Whether the second display 760 is smaller than, larger than, or equal in size to the first display 780 is solely a matter of aesthetic design choice, and would not be sufficient to distinguish over the prior art. See MPEP 2144.04.
Regarding claim 6, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein: the first graphical interface has a first size (it is inherent that the graphical user interface provided by display 780 of the wearable electronic device 740 has a first size; Lyren - wearable electronic device 740 with display 780 is shown in fig. 7D; Lyren - [0030]: a user views an area on a display of the first electronic device and interacts with the first electronic device through a user interface to select an object being displayed as the target; Ramsby – [0012]: the display may be configured to display one or more UI objects on a graphical user interface. In some embodiments, the UI objects presented on the graphical user interface may be virtual objects overlaid in front of the real-world environment. Likewise, in some embodiments, the UI objects presented on the graphical user interface may incorporate elements of real-world objects of the real-world environment seen through the transparent display 102; Ramsby – claim 19: A head-mounted display device, comprising: a display configured to display a graphical user interface including one or more user interface objects); and the second graphical interface has a second size (it is inherent that the graphical user interface provided by display 760 of the wearable electronic device 710 has a second size; Lyren - wearable electronic device 710 with display 760 is shown in fig. 7C; Lyren - [0030]: a user views an area on a display of the first electronic device and interacts with the first electronic device through a user interface to select an object being displayed as the target; Ramsby – [0012]: the display may be configured to display one or more UI objects on a graphical user interface. In some embodiments, the UI objects presented on the graphical user interface may be virtual objects overlaid in front of the real-world environment. 
Likewise, in some embodiments, the UI objects presented on the graphical user interface may incorporate elements of real-world objects of the real-world environment seen through the transparent display 102; Ramsby – claim 19: A head-mounted display device, comprising: a display configured to display a graphical user interface including one or more user interface objects). Further, the sizes of the graphical user interfaces provided by display 760 and display 780 might be equal, or the graphical user interface provided by one of the two displays might be larger than that provided by the other. However, it would have been prima facie obvious for the size of the graphical user interface provided by display 760 to be smaller than the size of the graphical user interface provided by display 780. Whether the graphical user interface provided by the second display 760 is smaller than, larger than, or equal in size to the graphical user interface provided by the first display 780 is solely a matter of aesthetic design choice, and would not be sufficient to distinguish over the prior art. See MPEP 2144.04.
Regarding claim 7, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein: the second view shows a second side (Lyren - front side of the building as seen from line of sight 640, fig. 6A/side 770B, fig. 7C) of the object (first user views the front side of the building that is in the line of sight of the first user 600 of the first electronic glasses 610; Lyren - [0120]: As shown in FIG. 6A, the first user 600 and first electronic device 610 view a front orientation or front profile of the target 620 since the target (which is a person) faces the first user and first electronic device. Viewing the target with this orientation, the first electronic device captures an image, photo, or video of the target and determines other information); and the first view shows a first side (Lyren - side of the building as seen from line of sight 670, fig. 6A/side 770D, fig. 7D) of the object and at least a portion of the second side of the object (Lyren - depending on the orientation of user 650 as shown in fig. 6A, front side or back side of the building might be viewable from the location of user 650 /side 770B, fig. 7D), wherein the indicator (Lyren - image or model 680 of the target 620 as shown in fig. 6B indicates the target 620 that was originally obstructed from the second user 650’s view) is applied to the portion of the second side of the object in the first view (Lyren - as shown in fig. 6B, information regarding an image or model 680 of the target 620, which was originally obstructed from the line of sight of the second user 650, is shared by the first electronic device 610 used by the first user 600 and is displayed on an area of the display 670 of the second electronic device 660; Lyren - [0117]: The first electronic device 610 shares information with the second electronic device 660 so the second electronic device 660 can display or provide information about the target. 
By way of example, this information includes an image or model of the target, a location of the target, an orientation of the target, an object with the target, an activity of the target, a direction to the target, a distance to the target, a view of the target, and other information determined with the first electronic device and/or obtained from another electronic device; Lyren - [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view that includes an image or model 680 of the target and additional information 690 about the target; Lyren - [0121]: As shown in FIG. 6B, the second electronic device 660 displays a side orientation or side profile of the target 620 since a side of the target (which is a person) faces the second user and second electronic device. The second electronic device displays an image of the target in an orientation or profile that matches an orientation or a profile of what the second electronic device and/or second user would see if the view or line of sight to the target were not obstructed; Lyren - [0173]: The second electronic device switches views from displaying the 3D images of the person, the weapon, and the location over the object in the field of view of the second user and/or second electronic device to displaying real time images from the first electronic glasses that show the person holding the weapon at the location in the field of view of the first user and/or first electronic device).
Regarding claim 8, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein the indicator (Lyren - image or model 680 of the target 620 as shown in fig. 6B indicates the target 620 that was originally obstructed from the second user 650’s view) comprises at least one of a highlighting, glow, shadow, reflection, outline, border, text (Lyren - additional information 690 related to the target as shown in fig. 6B), icons (Lyren - image or model 680 of the target), symbols, emphasis, duplication, aura, or animation (Lyren - [0118]: FIG. 6B shows the second electronic device 660 with a display 670 that displays a field of view that includes an image or model 680 of the target and additional information 690 about the target).
Regarding claim 9, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein the object is a virtual object (Lyren - building is a physical object; Lyren - fig. 6A shows a building 630 as viewed by multiple users; Lyren - [0039]: A first one of the users looks at a window of a nearby building and sees a person holding an object. The WED of the first user records this event and determines a location of the person with respect to the first user. A second one of the users also looks at the window of the building or looks in the direction of the window but is unable to see the person holding the object since the second user is located away from the first user and sees the window at a different angle; Lyren – [0193]: “virtual image” or “virtual object” is computer or processor generated image or object. This image or object often appears to a user in the real, physical world (such as a virtual 3D dimensional object that the user views in the real world)). However, it would have been prima facie obvious for the object of Lyren to be a virtual object. Whether the object is a physical object or a virtual object is solely a matter of aesthetic design choice, and would not be sufficient to distinguish over the prior art. See MPEP 2144.04.
Regarding claim 10, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein the object is a physical object in a physical environment (Lyren - building is a physical object in a physical environment; Lyren - fig. 6A shows a building 630 as viewed by multiple users; Lyren - [0039]: A first one of the users looks at a window of a nearby building and sees a person holding an object. The WED of the first user records this event and determines a location of the person with respect to the first user. A second one of the users also looks at the window of the building or looks in the direction of the window but is unable to see the person holding the object since the second user is located away from the first user and sees the window at a different angle; Lyren – [0121]: a placement location of the image of the target in the building matches the actual physical location in the building where the target is located).
Claims 24-27 are similar in scope to claims 7-10, and therefore the examiner provides similar rationale to reject these claims.
Claim(s) 3-4 and 22-23 is/are rejected under 35 U.S.C. 103 as being unpatentable over Lyren, in view of Ramsby, and further in view of Scientific Imaging, Inc. (“Choosing a Lens to meet FOV and WD requirements”, hereinafter Scientific Imaging).
Regarding claim 3, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein the additional head-mountable device further comprises a second camera (Lyren - wearable electronic device 710 comprising a camera as shown in fig. 7A; Lyren - claim 9: first electronic glasses including a camera).
The combination of Lyren and Ramsby does not explicitly teach the first camera has a resolution that is greater than a resolution of the second camera.
The combination of Lyren and Ramsby further teaches that the distance between the first user 730, using a first wearable electronic device 740 comprising a first camera, and the object is greater than the distance between the second user 700, using a second wearable electronic device 710 comprising a second camera, and the object, as shown in fig. 7A ([0125]); however, they do not explicitly teach a relationship between the distance from a wearable electronic device to the object and the resolution of that device’s camera.
Scientific Imaging teaches that the working distance is the distance from the object plane to the front of the lens of the camera, and that increasing the working distance results in a larger field of view and lower optical resolution (page 3, Notes section). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to apply Scientific Imaging’s knowledge of the relationship between working distance and optical resolution and modify the first camera of the wearable electronic device 740 to have a higher optical resolution, since it is farther from the object than the second camera of the wearable electronic device 710, because such a system would be able to capture better images from a farther distance.
Regarding claim 4, the combination of Lyren and Ramsby teaches the head-mountable device of claim 1, wherein the additional head-mountable device further comprises a second camera (Lyren - wearable electronic device 710 comprising a camera as shown in fig. 7A; Lyren - claim 9: first electronic glasses including a camera).
The combination of Lyren and Ramsby does not explicitly teach the first camera has a field of view that is greater than a field of view of the second camera.
The combination of Lyren and Ramsby further teaches that the distance between the first user 730, using a first wearable electronic device 740 comprising a first camera, and the object is greater than the distance between the second user 700, using a second wearable electronic device 710 comprising a second camera, and the object, as shown in fig. 7A ([0125]); however, they do not explicitly teach a relationship between the distance from a wearable electronic device to the object and the field of view of that device’s camera.
Scientific Imaging teaches that the working distance is the distance from the object plane to the front of the lens of the camera, and that increasing the working distance results in a larger field of view and lower optical resolution (page 3, Notes section). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to implement Scientific Imaging’s knowledge of the relationship between working distance and field of view into the electronic system of the combination of Lyren and Ramsby such that the first camera of the wearable electronic device 740 has a larger field of view, since it is farther from the object than the second camera of the wearable electronic device 710, because such a system would be able to capture images with more details of the environment surrounding the camera.
Claims 22-23 are similar in scope to claims 3-4, and therefore the examiner provides similar rationale to reject these claims.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to JWALANT B AMIN whose telephone number is (571)272-2455. The examiner can normally be reached Monday-Friday, 10am - 6:30pm CST.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Said Broome can be reached at 571-272-2931. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/JWALANT AMIN/Primary Examiner, Art Unit 2612