DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Claim Interpretation
The following is a quotation of 35 U.S.C. 112(f):
(f) Element in Claim for a Combination. – An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The following is a quotation of pre-AIA 35 U.S.C. 112, sixth paragraph:
An element in a claim for a combination may be expressed as a means or step for performing a specified function without the recital of structure, material, or acts in support thereof, and such claim shall be construed to cover the corresponding structure, material, or acts described in the specification and equivalents thereof.
The claims in this application are given their broadest reasonable interpretation using the plain meaning of the claim language in light of the specification as it would be understood by one of ordinary skill in the art. The broadest reasonable interpretation of a claim element (also commonly referred to as a claim limitation) is limited by the description in the specification when 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is invoked.
As explained in MPEP § 2181, subsection I, claim limitations that meet the following three-prong test will be interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph:
(A) the claim limitation uses the term “means” or “step” or a term used as a substitute for “means” that is a generic placeholder (also called a nonce term or a non-structural term having no specific structural meaning) for performing the claimed function;
(B) the term “means” or “step” or the generic placeholder is modified by functional language, typically, but not always linked by the transition word “for” (e.g., “means for”) or another linking word or phrase, such as “configured to” or “so that”; and
(C) the term “means” or “step” or the generic placeholder is not modified by sufficient structure, material, or acts for performing the claimed function.
Use of the word “means” (or “step”) in a claim with functional language creates a rebuttable presumption that the claim limitation is to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites sufficient structure, material, or acts to entirely perform the recited function.
Absence of the word “means” (or “step”) in a claim creates a rebuttable presumption that the claim limitation is not to be treated in accordance with 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph. The presumption that the claim limitation is not interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, is rebutted when the claim limitation recites function without reciting sufficient structure, material or acts to entirely perform the recited function.
Claim limitations in this application that use the word “means” (or “step”) are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action. Conversely, claim limitations in this application that do not use the word “means” (or “step”) are not being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, except as otherwise indicated in an Office action.
This application includes one or more claim limitations that do not use the word “means,” but are nonetheless being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, because the claim limitation(s) uses a generic placeholder that is coupled with functional language without reciting sufficient structure to perform the recited function and the generic placeholder is not preceded by a structural modifier. Such claim limitation(s) is/are: “database system configured to store” in claims 1 and 18; “augmented reality glasses configured to display” in claims 1 and 24; “executable program logic further configured to control” in claims 1, 11, and 19; “positioning system configured for sensing” in claim 2; “GPS receiver configured to determine” in claim 3; “program logic is configured to display” in claims 4 and 6; “augmented reality system is configured to provide” in claim 6; “robot-control module configured to control” in claim 7; and “position reporting module configured to monitor” in claim 21.
Because this/these claim limitation(s) is/are being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, it/they is/are being interpreted to cover the corresponding structure described in the specification as performing the claimed function, and equivalents thereof.
If applicant does not intend to have this/these limitation(s) interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph, applicant may: (1) amend the claim limitation(s) to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph (e.g., by reciting sufficient structure to perform the claimed function); or (2) present a sufficient showing that the claim limitation(s) recite(s) sufficient structure to perform the claimed function so as to avoid it/them being interpreted under 35 U.S.C. 112(f) or pre-AIA 35 U.S.C. 112, sixth paragraph.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless –
(a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.
Claims 1-5 and 7-24 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Schmirler (US 20200336706 A1).
Regarding claim 1, Schmirler teaches An automation system for controlling an automated production process of an industrial plant ([0002] The subject matter disclosed herein relates generally to industrial automation systems, and, more particularly, to visualization of industrial data), the industrial plant including a plurality of equipment for performing the production process (Fig. 1 industrial devices 120 [0045] Industrial devices 120 may include both input devices that provide data relating to the controlled industrial systems to the industrial controllers 118, and output devices that respond to control signals generated by the industrial controllers 118 to control aspects of the industrial systems.), the plurality of equipment being spatially distributed in the industrial plant, the automation system comprising: ([0044] FIG. 1 is a block diagram of an example industrial control environment 100. In this example, a number of industrial controllers 118 are deployed throughout an industrial plant environment to monitor and control respective industrial systems or processes relating to product manufacture, machining, motion control, batch processing, material handling, or other such industrial functions. Industrial controllers 118 typically execute respective control programs to facilitate monitoring and control of industrial devices 120 making up the controlled industrial systems)
an augmented reality system, including: ([0054] FIG. 2 is a conceptual diagram illustrating presentation of augmented or virtual reality presentations 204 to a wearable appliance 206 or computing device worn by a user.)
a database system configured to store data and spatial coordinates associated with respective equipment of the plurality of equipment; ([0081] This collected plant data 610 can be stored in memory associated with the VR/AR presentation system 302 (e.g., memory 322) and used by rendering component 308 to populate virtual and augmented reality presentations with live or historical data.)
augmented reality glasses configured to display the data; and ([0055] Data used to populate the presentations 204 can be obtained by the VR/AR presentation system from the relevant industrial devices and delivered as part of the VR/AR presentations 204. In some scenarios, wearable appliance 206 can also obtain at least a portion of the industrial data directly from the industrial devices via the industrial network by virtue of a communication stack that interfaces the wearable appliance 206 to the various devices on the network.)
executable program logic coupled to the augmented reality glasses ([0110] the user can send a request to presentation system 302 (e.g., via a gesture or verbal command recognizable to the wearable appliance) for additional information about the control cabinet, including electrical schematics or line diagrams for the cabinet, ladder logic programming associated with an industrial controller mounted within the cabinet, diagnostic data for any of the devices, etc.), the executable program logic configured to receive an up-to-date spatial position of a user wearing the augmented reality glasses from a positioning system and the data of equipment that are located in a proximity of the up-to-date spatial position of the user from the database system ([0090] the location and orientation component 410 of wearable appliance 206 can be configured to determine a current geographical location of the appliance 206. In some embodiments, location and orientation component 410 can leverage global positioning system (GPS) technology to determine the user's absolute location, or may be configured to exchange data with positioning sensors located within the plant facility in order to determine the user's relative location within the plant.), the executable program logic further configured to control the augmented reality glasses for display of at least some of the received data. (Fig. 8-11)
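By way of illustration only, and not as part of the claim mapping or the record, the position-driven retrieval and display recited in claim 1 can be sketched as follows; every identifier, coordinate, and data value below is a hypothetical assumption, drawn from neither the application nor Schmirler.
```python
# Illustrative sketch only: position-driven retrieval and display of
# equipment data near the user's up-to-date spatial position.
import math
from dataclasses import dataclass

@dataclass
class Equipment:
    name: str
    x: float
    y: float
    data: dict  # stand-in for data stored in the claimed database system

# Hypothetical stand-in for the claimed database system.
PLANT_DB = [
    Equipment("pump_17", x=12.0, y=4.5, data={"status": "running", "rpm": 1450}),
    Equipment("valve_03", x=40.0, y=22.0, data={"status": "closed"}),
]

def equipment_near(user_x: float, user_y: float, radius: float = 10.0):
    """Return equipment within `radius` meters of the user's position."""
    return [
        eq for eq in PLANT_DB
        if math.hypot(eq.x - user_x, eq.y - user_y) <= radius
    ]

def render_on_glasses(user_x: float, user_y: float) -> None:
    # Stand-in for controlling the AR glasses to display received data.
    for eq in equipment_near(user_x, user_y):
        print(f"[AR overlay] {eq.name}: {eq.data}")

# Up-to-date position as it would arrive from the positioning system.
render_on_glasses(10.0, 5.0)
```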
Regarding claim 2, Schmirler teaches The automation system of claim 1, wherein the augmented reality system further includes the positioning system, the positioning system configured for sensing a spatial position of the user in the plant. ([0090] the location and orientation component 410 of wearable appliance 206 can be configured to determine a current geographical location of the appliance 206. In some embodiments, location and orientation component 410 can leverage global positioning system (GPS) technology to determine the user's absolute location, or may be configured to exchange data with positioning sensors located within the plant facility in order to determine the user's relative location within the plant.)
Regarding claim 3, Schmirler teaches The automation system of claim 2, wherein the positioning system includes a spatial mapping mesh covering at least a portion of the plant, an indoor positioning system, a cellular positioning system, or a GPS receiver configured to determine the spatial position. ([0090] the location and orientation component 410 of wearable appliance 206 can be configured to determine a current geographical location of the appliance 206. In some embodiments, location and orientation component 410 can leverage global positioning system (GPS) technology to determine the user's absolute location, or may be configured to exchange data with positioning sensors located within the plant facility in order to determine the user's relative location within the plant.)
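For illustration only, an absolute GPS fix of the kind Schmirler's paragraph [0090] describes can be related to a plant-local position; the sketch below uses a standard equirectangular approximation and wholly hypothetical coordinates.
```python
# Illustrative sketch only: converting a GPS fix into a plant-relative
# position, one common way a "GPS receiver configured to determine the
# spatial position" could be realized. All values are hypothetical.
import math

PLANT_ORIGIN = (42.3314, -83.0458)  # hypothetical lat/lon of the plant origin
EARTH_RADIUS_M = 6_371_000.0

def gps_to_plant_xy(lat: float, lon: float) -> tuple[float, float]:
    """Equirectangular approximation, adequate over plant-scale distances."""
    lat0, lon0 = PLANT_ORIGIN
    x = math.radians(lon - lon0) * EARTH_RADIUS_M * math.cos(math.radians(lat0))
    y = math.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

print(gps_to_plant_xy(42.3315, -83.0456))  # meters east/north of the origin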
Regarding claim 4, Schmirler teaches The automation system of claim 3, wherein the spatial mapping mesh comprises spatial anchors, wherein the spatial anchors are interconnected, and wherein each spatial anchor relates a coordinate system associated with the augmented reality glasses for the display of the data to a spatial coordinate system associated with the spatial coordinates of one or more of the plurality of equipment stored in the database system, wherein the program logic is configured to display the data of equipment that is located in a proximity of the up-to-date spatial position of the user via the augmented reality glasses at predefined positions relative to the spatial anchors. ([0138] one or more of the video capture devices 1414 may be a time-of-flight (TOF) optical scanner or sensor, which generates distance information (e.g., point cloud or depth map information) for objects and surfaces within the scanner's field of view. In such embodiments, monitoring component 316 can be configured to correlate object recognition results with the distance information, and generate a notification directed to a wearable appliance or an industrial controller in response to determining that a result of this correlation satisfies a defined criterion. [0167] VR/AR presentation system 302 can generate this documentation in a three-dimensional format. In some scenarios, the system 302 can collect the data required to generate these plant mappings from multiple wearable appliances 206 associated with multiple users distributed throughout the plant environment, and update the plant documentation as more data is received.)
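For illustration only, the role of a spatial anchor in relating the glasses' coordinate system to the plant's coordinate system can be sketched as a rigid 2-D transform; the anchor class and all values below are hypothetical assumptions, not structure disclosed by either document.
```python
# Illustrative sketch only: a spatial anchor that maps points from the
# glasses' local frame into the plant's frame (rotation + translation).
import math

class SpatialAnchor:
    def __init__(self, plant_xy, glasses_xy, heading_rad):
        # The anchor is observed at `glasses_xy` in the glasses frame and
        # is known to sit at `plant_xy` in the plant frame, with the plant
        # frame rotated by `heading_rad` relative to the glasses frame.
        self.plant_xy = plant_xy
        self.glasses_xy = glasses_xy
        self.heading = heading_rad

    def glasses_to_plant(self, gx: float, gy: float) -> tuple[float, float]:
        dx, dy = gx - self.glasses_xy[0], gy - self.glasses_xy[1]
        c, s = math.cos(self.heading), math.sin(self.heading)
        return (self.plant_xy[0] + c * dx - s * dy,
                self.plant_xy[1] + s * dx + c * dy)

anchor = SpatialAnchor(plant_xy=(100.0, 50.0), glasses_xy=(0.0, 0.0),
                       heading_rad=math.pi / 2)
print(anchor.glasses_to_plant(1.0, 0.0))  # -> (100.0, 51.0)
```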
Regarding claim 5, Schmirler teaches The automation system of claim 1, wherein the augmented reality system is configured to create an avatar for at least one remote user, the remote user being remote to the industrial plant, wherein the program logic is configured to display the avatar in a proximity of the up-to-date spatial position of the user wearing the augmented reality glasses via the augmented reality glasses, wherein the augmented reality system is configured to provide a bidirectional visual and/or acoustic communication channel between the user wearing the augmented reality glasses and the remote user. (Fig. 8 [0050] The system can render a scaled down view of the factory floor area, which affords the user an external overview of the area. This external view can include real-time avatars representing human operators, superimposed production statistics and status data, and other information. [0090] Rendering component 308 can also render human icons 808 a and 808 b representing human operators present in the production area. Returning briefly to FIG. 7, in some embodiments the locations and orientations of the human icons 808 a and 808 b within the VR/AR presentation can be determined based on location and orientation data 606 received by VR/AR presentation system 302 from the wearable appliances 206 associated with each user)
Regarding claim 7, Schmirler teaches The automation system of claim 5,
wherein the automation system comprises a robot with a camera and/or a microphone, the robot being located in the industrial plant, ([0044] FIG. 1 is a block diagram of an example industrial control environment 100. [0045] Industrial devices 120 may include both input devices that provide data relating to the controlled industrial systems to the industrial controllers 118, and output devices that respond to control signals generated by the industrial controllers 118 to control aspects of the industrial systems. Example input devices can include telemetry devices (e.g., temperature sensors, flow meters, level sensors, pressure sensors, etc.), manual operator control devices (e.g., push buttons, selector switches, etc.), safety monitoring devices (e.g., safety mats, safety pull cords, light curtains, etc.), and other such devices. Output devices may include motor drives, pneumatic actuators, signaling devices, robot control inputs, valves, and the like. [0051] The presentation system 302 can also be configured to work in conjunction with video capture devices (e.g., 360-degree cameras, webcams, swivel-based IP cameras, etc.) installed at one or more locations within the plant environment.)
wherein the augmented reality system comprises a robot-control module configured to control movement and/or orientation of the robot such that a position and/or the orientation of the robot reflects a position and/or orientation of the avatar within the augmented reality displayed via the AR glasses, and ([0105] the user may speak a request for a current status of a particular asset (e.g., an industrial robot, a production line, a motor, a stamping press, etc.), which is received by the user's wearable appliance 402 and relayed to the VR/AR presentation system 302.)
wherein the robot-control module is configured to automatically update the position and/or orientation of the avatar and the robot in accordance with a change in a position and/or orientation of the remote user or in response to a control command submitted by the remote user. ([0105] The presentation system 302 can translate the spoken request into a query for the desired information about the specified asset, retrieve the relevant subset of plant data 610, and render the requested information as a VR/AR presentation on the user's wearable appliance 206)
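For illustration only, the claimed robot-control behavior, keeping a telepresence robot's pose synchronized with the remote user's avatar, can be sketched as follows; the transport stub and all names are hypothetical, not an interface disclosed by Schmirler.
```python
# Illustrative sketch only: a robot-control module that mirrors the remote
# user's avatar pose on a physical robot located in the plant.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    yaw: float  # orientation in radians

def send_robot_command(pose: Pose) -> None:
    # Hypothetical transport stub standing in for the robot's motion API.
    print(f"[robot] move to ({pose.x:.1f}, {pose.y:.1f}), yaw {pose.yaw:.2f}")

class RobotControlModule:
    def __init__(self):
        self.avatar_pose = Pose(0.0, 0.0, 0.0)

    def on_remote_user_update(self, pose: Pose) -> None:
        # Triggered by a change in the remote user's position/orientation
        # or by an explicit control command submitted by the remote user.
        self.avatar_pose = pose   # update the avatar shown in the AR glasses
        send_robot_command(pose)  # keep the robot's pose matching the avatar

module = RobotControlModule()
module.on_remote_user_update(Pose(5.0, 2.0, 1.57))
```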
Regarding claim 8, Schmirler teaches The automation system of claim 1, the database system being a graph database system, a spatial database system and/or a streaming database system. ([0005] the augmented reality representation from a virtual view of the industrial facility to a video presentation; and streaming, by the system to the wearable device as the video presentation, a subset of the video data received from a video capture device, of the video capture devices, corresponding to the camera icon.)
Regarding claim 9, Schmirler teaches The automation system of claim 1, further comprising:
a distributed control system coupled to the database system, the distributed control system, comprising:
a memory configured to store one or more process parameters in association with one or more of the equipment and to store process control software, and
a processor configured to execute the process control software for automatically controlling the production process. ([0197] With reference to FIG. 23, an example environment 2310 for implementing various aspects of the aforementioned subject matter includes a computer 2312. The computer 2312 includes a processing unit 2314, a system memory 2316, and a system bus 2318. The system bus 2318 couples system components including, but not limited to, the system memory 2316 to the processing unit 2314. The processing unit 2314 can be any of various available processors. Multi-core microprocessors and other multiprocessor architectures also can be employed as the processing unit 2314.)
Regarding claim 10, Schmirler teaches The automation system of claim 9, further comprising:
one or more sensors coupled to at least one equipment, the one or more sensors configured to measure the one or more process parameters of the at least one equipment, the one or more process parameters indicating a current state of the production process of the plant and/or a current state or mode of operation of the equipment; and ([0045] Industrial devices 120 may include both input devices that provide data relating to the controlled industrial systems to the industrial controllers 118, and output devices that respond to control signals generated by the industrial controllers 118 to control aspects of the industrial systems. Example input devices can include telemetry devices (e.g., temperature sensors, flow meters, level sensors, pressure sensors, etc.), manual operator control devices (e.g., push buttons, selector switches, etc.), safety monitoring devices (e.g., safety mats, safety pull cords, light curtains, etc.), and other such devices. Output devices may include motor drives, pneumatic actuators, signaling devices, robot control inputs, valves, and the like.)
one or more actuators coupled to the at least one equipment, the one or more actuators configured to control the one or more process parameters or configured to be operated in accordance with one or more control parameters, wherein the distributed control system is coupled to the one or more sensors and the one or more actuators, wherein the processor is configured to execute the process control software for automatically controlling the production process based on measurement signals received from the one or more sensors and control signals sent to the one or more actuators. ([0194] This can include substantially any type of control, communications module, computer, Input/Output (I/O) device, sensor, actuator, instrumentation, and human machine interface (HMI) that communicate via the network, which includes control, automation, and/or public networks. The PLC or automation controller can also communicate to and control various other devices such as standard or safety-rated I/O modules including analog, digital, programmed/intelligent I/O modules, other programmable controllers, communications modules, sensors, actuators, output devices, and the like.)
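For illustration only, the sensor-to-actuator control loop recited in claims 9-10 can be sketched as a minimal proportional controller; the setpoint, gain, and I/O stubs below are hypothetical assumptions rather than the process control software of the record.
```python
# Illustrative sketch only: one step of a closed control loop of the kind
# the claimed distributed control system would run, reading a sensor and
# driving an actuator toward a stored setpoint.
def read_sensor() -> float:
    return 78.0  # stand-in for a measured process parameter, e.g., temperature

def write_actuator(output: float) -> None:
    print(f"[actuator] drive output set to {output:.2f}")

SETPOINT = 80.0   # process parameter stored in the DCS memory (hypothetical)
GAIN = 0.5        # proportional gain of the control software (hypothetical)

def control_step() -> None:
    error = SETPOINT - read_sensor()
    write_actuator(GAIN * error)  # control signal sent to the actuator

control_step()
```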
Regarding claim 11, Schmirler teaches The automation system of claim 1, further comprising:
a user database coupled to a distributed control system, the user database configured to store user information associated with the production process, wherein the user information comprises user IDs, user roles and/or user privileges. ([0154] Concurrently, device interface component 314 collects user data 1704 that can be used to confirm that the user is performing the steps recommended by the workflow delivered to the user's wearable appliance 206. User data 1704 can include, for example, the user's identity and location relative to the automation system or components thereof. The user's location can be used to confirm that the user is at the appropriate location to perform the workflow step currently awaiting completion (e.g., in front of the appropriate control panel, HMI, machine station, or mechanical/electrical component). User data 1704 can also include data indicating the user's interactions with devices associated with the automation system)
Regarding claim 12, Schmirler teaches The automation system of claim 11, wherein the executable program logic is configured to receive user credentials from the user and to authenticate the user ([0059] VR/AR presentation system 302 can include…an authentication component 306), wherein a user ID of the user wearing the augmented reality glasses is associated with one or more of the user roles and each user role of the one or more user roles includes one or more of the user privileges ([0061] Authentication component 306 may also determine a defined role associated with the user identification information, and grant a level of control privilege commensurate with the user's role.), and wherein the executable program logic is further configured to select and display content of the at least some of the received data for display based upon the one or more user roles and/or the one or more user privileges associated with the user ID. ([0062] rendering component 308 can generate presentations based on an identity of an industrial device, automation system, control cabinet, or machine received from the wearable appliance, such that available information about devices, machines, or control cabinets within the user's line of sight is displayed on the appliance. The rendering component 308 can also select the VR/AR presentation in accordance with the user's control privileges (determined by the authentication component 306). The selected presentation can then be sent to the wearable appliance via the client interface component 304.)
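For illustration only, role- and privilege-based selection of displayed content, as recited in claim 12, can be sketched as a simple filter; the roles, privileges, and user IDs below are hypothetical.
```python
# Illustrative sketch only: filtering received equipment data by the
# roles and privileges associated with the authenticated user ID.
USER_ROLES = {"alice": ["maintenance"], "bob": ["engineering"]}
ROLE_PRIVILEGES = {
    "maintenance": {"diagnostics", "work_orders"},
    "engineering": {"diagnostics", "schematics", "controller_logic"},
}

def visible_content(user_id: str, available: dict) -> dict:
    """Select the subset of received data the user may see."""
    privileges = set()
    for role in USER_ROLES.get(user_id, []):
        privileges |= ROLE_PRIVILEGES.get(role, set())
    return {k: v for k, v in available.items() if k in privileges}

data = {"diagnostics": "...", "schematics": "...", "work_orders": "..."}
print(visible_content("alice", data))  # maintenance view only
```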
Regarding claim 13, Schmirler teaches The automation system of claim 11, wherein the one or more user roles includes one or more of a maintenance role and an engineering role, and the one or more user privileges defines access to one or more specific equipment of the plurality of equipment. ([0065] Monitoring component 316 can be configured to monitor selected subsets of data collected by device interface component 314 according to defined monitoring rules, and to deliver notifications and/or workflow recommendations in response to detecting a maintenance or performance issue based on a result of the monitoring [0074] Example user roles that can determine how VR and AR data is presented to a user can include, but are not limited to, line operators, maintenance personnel, plant managers, plant engineers, or other roles.)
Regarding claim 14, Schmirler teaches The automation system of claim 1, wherein the data comprises one or more task instructions associated with the respective equipment. ([0071] The one or more processors 420 can perform one or more of the functions described herein with reference to the systems and/or methods disclosed. Memory 422 can be a computer-readable storage medium storing computer-executable instructions and/or information for performing the functions described herein with reference to the systems and/or methods disclosed.)
Regarding claim 15, Schmirler teaches The automation system of claim 1, wherein one or more equipment of the plurality of equipment are one or more machines. ([0045] Industrial devices 120 may include both input devices that provide data relating to the controlled industrial systems to the industrial controllers 118, and output devices that respond to control signals generated by the industrial controllers 118 to control aspects of the industrial systems. Example input devices can include telemetry devices (e.g., temperature sensors, flow meters, level sensors, pressure sensors, etc.), manual operator control devices (e.g., push buttons, selector switches, etc.), safety monitoring devices (e.g., safety mats, safety pull cords, light curtains, etc.), and other such devices. Output devices may include motor drives, pneumatic actuators, signaling devices, robot control inputs, valves, and the like)
Regarding claim 16, Schmirler teaches The automation system of claim 1, wherein the executable program logic is configured to perform a query of the database based upon the up-to-date spatial position for retrieving, from the database, the data of equipment that are located in the proximity of the up-to-date spatial position of the user. ([0105] In addition to presenting asset data to the user in response to determining that the user's location and orientation places the asset within the user's line of sight, some embodiments of VR/AR presentation system can also process natural language spoken queries requesting specified information about an industrial asset, regardless of whether the user is currently viewing the asset. For example, the user may speak a request for a current status of a particular asset (e.g., an industrial robot, a production line, a motor, a stamping press, etc.), which is received by the user's wearable appliance 402 and relayed to the VR/AR presentation system 302. The presentation system 302 can translate the spoken request into a query for the desired information about the specified asset, retrieve the relevant subset of plant data 610, and render the requested information as a VR/AR presentation on the user's wearable appliance 206.)
Regarding claim 17, Schmirler teaches The automation system of claim 1, wherein the database system is accessible via a subscription-based streaming service, and wherein the executable program logic is configured to receive from the database via the streaming service the data of equipment that are located in the proximity of the up-to-date spatial position of the user. ([0134] In response to receiving the request, rendering component 308 retrieves the stored video data corresponding to the identified video capture device 1414 and the indicated date and time, and can begin streaming the retrieved video data to the wearable appliance 206. Similar to the live video feeds, the user can interact with the historical video feed by moving his or her head to the left or right to change the perspective or line of sight.)
Regarding claim 18, Schmirler teaches The automation system of claim 17,
wherein the database system is configured to provide, in response to the query of the executable program logic, selectively the data of equipment that are located proximate to the up-to-date spatial position of the user wearing the augmented reality glasses, whereby a movement of the user relative to the equipment triggers the executable program logic to submit a new query to the database system comprising a new up-to-date spatial position of the user; ([0105] In addition to presenting asset data to the user in response to determining that the user's location and orientation places the asset within the user's line of sight, some embodiments of VR/AR presentation system can also process natural language spoken queries requesting specified information about an industrial asset, regardless of whether the user is currently viewing the asset. For example, the user may speak a request for a current status of a particular asset (e.g., an industrial robot, a production line, a motor, a stamping press, etc.), which is received by the user's wearable appliance 402 and relayed to the VR/AR presentation system 302. The presentation system 302 can translate the spoken request into a query for the desired information about the specified asset, retrieve the relevant subset of plant data 610, and render the requested information as a VR/AR presentation on the user's wearable appliance 206.) or
wherein the database system is configured to provide, via the subscription-based streaming service, selectively the data of equipment that are located proximate to the up-to-date spatial position of the user wearing the augmented reality glasses, whereby a movement of the user relative to the equipment triggers the executable program logic to update the subscription with new up-to-date spatial position of the user.
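For illustration only, the second alternative of claim 18, a subscription that follows the user's movement, can be sketched as follows; the streaming-service API shown is a hypothetical stand-in, not an interface disclosed by Schmirler.
```python
# Illustrative sketch only: a position-keyed subscription in which the
# user's movement updates the streamed query rather than issuing a new
# one-shot query each time.
class ProximitySubscription:
    def __init__(self, stream_service, radius: float):
        self.service = stream_service
        self.radius = radius
        self.position = None

    def on_position_update(self, x: float, y: float) -> None:
        # A movement of the user relative to the equipment triggers an
        # update of the subscription with the new up-to-date position.
        self.position = (x, y)
        self.service.update_subscription(center=self.position,
                                         radius=self.radius)

class FakeStreamService:
    # Hypothetical stand-in for the subscription-based streaming service.
    def update_subscription(self, center, radius):
        print(f"[stream] pushing equipment data within {radius} m of {center}")

sub = ProximitySubscription(FakeStreamService(), radius=15.0)
sub.on_position_update(10.0, 5.0)
sub.on_position_update(12.0, 7.0)  # user moved; the subscription follows
```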
Regarding claim 19, Schmirler teaches The automation system of claim 1, wherein the executable program logic is configured for:
receiving coordinates of visual objects graphically representing the data of the equipment proximate to the user wearing the augmented reality glasses; ([0055] In response to various conditions, such as the user's determined role, location, line of sight, or other information, the system can generate and deliver augmented or virtual reality presentations to the user's wearable appliance 206)
determining said user's line of sight in the coordinate system of the real world; ([0055] The VR/AR presentation system can customize the presentations 204 based on a user's current context, line of sight, type of client device being used by the user (e.g., wearable computer, handheld device, etc.), and/or other relevant information, such that customized augmented reality or virtual reality presentations can be generated based on relevant subsets of data available on the industrial network.)
receiving, from the database system, coordinates of real-world objects which are proximate to the up-to-date spatial position of the user wearing the augmented reality glasses; ([0090] the location and orientation component 410 of wearable appliance 206 can be configured to determine a current geographical location of the appliance 206. In some embodiments, location and orientation component 410 can leverage global positioning system (GPS) technology to determine the user's absolute location, or may be configured to exchange data with positioning sensors located within the plant facility in order to determine the user's relative location within the plant.)
determining if one or more of the virtual objects representing data of an equipment are positioned along the user's line of sight and are farther away from the user than at least one of the real world objects which is also positioned along the user's line of sight; ([0056] as a user is viewing an automation system, machine, or industrial device through a wearable computer (or as a substantially real-time video image rendered on the user's client device), the VR/AR presentation system can monitor the wearable computer to determine the user's location relative to the automation system, the user's current line of sight or field of view, and/or other contextual information indicative of the user's relationship to the automation system)
not displaying selectively the one or more virtual objects determined to be on the user's line of sight and be located farther away than at least one of the real-world objects. ([0103] In some embodiments, rendering component 308 can be configured to display a minimal amount of information about the cabinet 1102 (or other machine or industrial device) in response to determining that the cabinet 1102 is within the user's current line of sight, and display additional information about the cabinet in response to a user gesture or verbal command (e.g., a natural language spoken command) indicating a request for more detailed data)
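For illustration only, the occlusion test recited in claim 19, suppressing a virtual object that lies behind a real-world object along the user's line of sight, can be sketched with a depth comparison along a ray; all geometry below is hypothetical.
```python
# Illustrative sketch only: hide a virtual object when a nearer real-world
# object sits on the same line of sight.
import math

def distance_along_sight(user, sight_dir, obj, tol_rad=0.05):
    """Return the object's distance if it lies (within `tol_rad`) on the
    user's line of sight, else None. (No angle wrap-around handling;
    sufficient for this sketch.)"""
    dx, dy = obj[0] - user[0], obj[1] - user[1]
    angle = math.atan2(dy, dx)
    if abs(angle - sight_dir) > tol_rad:
        return None
    return math.hypot(dx, dy)

def should_display(user, sight_dir, virtual_obj, real_objs) -> bool:
    d_virtual = distance_along_sight(user, sight_dir, virtual_obj)
    if d_virtual is None:
        return False  # not on the line of sight at all
    for real in real_objs:
        d_real = distance_along_sight(user, sight_dir, real)
        if d_real is not None and d_real < d_virtual:
            return False  # occluded by a nearer real-world object
    return True

user, sight = (0.0, 0.0), 0.0
print(should_display(user, sight, (10.0, 0.0), [(5.0, 0.0)]))  # False: occluded
print(should_display(user, sight, (3.0, 0.0), [(5.0, 0.0)]))   # True
```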
Regarding claim 20, Schmirler teaches The automation system of claim 1, wherein the database system is a graph database system, the graph database system including a graph database having a plurality of interconnected nodes, wherein each node represents one or more objects of the industrial plant and includes spatial information associated with the one or more objects, and wherein the one or more objects are selected from the group consisting of a sensor, an actuator, raw material, and an equipment of the plurality of equipment, the equipment being e.g., a machine or a robot. ([0005] the augmented reality representation from a virtual view of the industrial facility to a video presentation; and streaming, by the system to the wearable device as the video presentation, a subset of the video data received from a video capture device, of the video capture devices, corresponding to the camera icon. [0203] Computer 2312 can operate in a networked environment using logical connections to one or more remote computers, such as remote computer(s) 2344. The remote computer(s) 2344 can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor based appliance, a peer device or other common network node and the like, and typically includes many or all of the elements described relative to computer 2312.)
Regarding claim 21, Schmirler teaches The automation system of claim 20, wherein one or more of the nodes comprises or is stored in association with at least one respective container, each container being an isolated runtime environment instance comprising software for monitoring and/or operating at least one of the one or more objects represented by said node, and wherein the software in each container comprises a position reporting module configured to monitor a position of at least one of the one or more objects of the respective node and to update the spatial information associated with the at least one object of the respective node in the database system. ([0005] the augmented reality representation from a virtual view of the industrial facility to a video presentation; and streaming, by the system to the wearable device as the video presentation, a subset of the video data received from a video capture device, of the video capture devices, corresponding to the camera icon. [0179] the perspective or angle of view of the scaled facility will change in coordination with the user's position and orientation to simulate walking around a physical scale model of the facility. [0203] Computer 2312 can operate in a networked environment using logical connections to one or more remote computers, such as remote computer(s) 2344. The remote computer(s) 2344 can be a personal computer, a server, a router, a network PC, a workstation, a microprocessor based appliance, a peer device or other common network node and the like, and typically includes many or all of the elements described relative to computer 2312.)
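For illustration only, a graph-database node carrying spatial information, together with a position-reporting routine of the kind claim 21 places inside a per-node container, can be sketched as follows; all structures and identifiers are hypothetical.
```python
# Illustrative sketch only: interconnected graph nodes with spatial
# information, plus a position-reporting module that updates the database.
class GraphNode:
    def __init__(self, obj_id: str, kind: str, xyz: tuple):
        self.obj_id = obj_id
        self.kind = kind          # e.g., "sensor", "actuator", "machine"
        self.xyz = xyz            # spatial information for the object
        self.edges: list["GraphNode"] = []

    def connect(self, other: "GraphNode") -> None:
        self.edges.append(other)
        other.edges.append(self)

class PositionReportingModule:
    """Would run inside the node's container; pushes position updates."""
    def __init__(self, node: GraphNode, database: dict):
        self.node = node
        self.database = database

    def report(self, new_xyz: tuple) -> None:
        self.node.xyz = new_xyz
        self.database[self.node.obj_id] = new_xyz  # update spatial info

db: dict = {}
robot = GraphNode("robot_1", "machine", (0.0, 0.0, 0.0))
sensor = GraphNode("temp_5", "sensor", (1.0, 2.0, 0.0))
robot.connect(sensor)
PositionReportingModule(robot, db).report((3.5, 1.0, 0.0))
print(db)  # {'robot_1': (3.5, 1.0, 0.0)}
```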
Regarding claim 22, Schmirler teaches The automation system of claim 21, wherein the spatial information associated with the at least one object of the respective node includes one or more spatial coordinates of the at least one object, the one or more spatial coordinates corresponding to one or more points on a bounding surface of the object and/or inside the bounding surface of the object. ([0138] One or more of the video capture devices 1414 may be a time-of-flight (TOF) optical scanner or sensor, which generates distance information (e.g., point cloud or depth map information) for objects and surfaces within the scanner's field of view. In such embodiments, monitoring component 316 can be configured to correlate object recognition results with the distance information, and generate a notification directed to a wearable appliance or an industrial controller in response to determining that a result of this correlation satisfies a defined criterion.)
Regarding claim 23, Schmirler teaches The automation system of claim 1, further comprising a network communicatively coupling the database system to the augmented reality glasses.([0072] One or both of office network 108 or plant network 116 may also have access to external networks 514 such as the Internet (e.g., via firewall device 516).)
Regarding claim 24, Schmirler teaches A method of using an automation system for controlling an automated production process of an industrial plant ([0002] The subject matter disclosed herein relates generally to industrial automation systems, and, more particularly, to visualization of industrial data), the industrial plant including a plurality of equipment for performing the production process(Fig. 1 industrial devices 120 [0045] Industrial devices 120 may include both input devices that provide data relating to the controlled industrial systems to the industrial controllers 118, and output devices that respond to control signals generated by the industrial controllers 118 to control aspects of the industrial systems.), the plurality of equipment being spatially distributed in the industrial plant, the method comprising: ([0044] FIG. 1 is a block diagram of an example industrial control environment 100. In this example, a number of industrial controllers 118 are deployed throughout an industrial plant environment to monitor and control respective industrial systems or processes relating to product manufacture, machining, motion control, batch processing, material handling, or other such industrial functions. Industrial controllers 118 typically execute respective control programs to facilitate monitoring and control of industrial devices 120 making up the controlled industrial systems)
providing the automation system, the automation system comprising an augmented reality system, the augmented reality system including: ([0054] FIG. 2 is a conceptual diagram illustrating presentation of augmented or virtual reality presentations 204 to a wearable appliance 206 or computing device worn by a user.)
a database system having stored data and spatial coordinates associated with respective equipment of the plurality of equipment; ([0081] This collected plant data 610 can be stored in memory associated with the VR/AR presentation system 302 (e.g., memory 322) and used by rendering component 308 to populate virtual and augmented reality presentations with live or historical data.)
augmented reality glasses configured to display the data; and ([0055] Data used to populate the presentations 204 can be obtained by the VR/AR presentation system from the relevant industrial devices and delivered as part of the VR/AR presentations 204. In some scenarios, wearable appliance 206 can also obtain at least a portion of the industrial data directly from the industrial devices via the industrial network by virtue of a communication stack that interfaces the wearable appliance 206 to the various devices on the network.)
executable program logic coupled to the augmented reality glasses, ([0110] the user can send a request to presentation system 302 (e.g., via a gesture or verbal command recognizable to the wearable appliance) for additional information about the control cabinet, including electrical schematics or line diagrams for the cabinet, ladder logic programming associated with an industrial controller mounted within the cabinet, diagnostic data for any of the devices, etc.)
receiving, by the executable program logic, an up-to-date spatial position of a user wearing the augmented reality glasses, from a positioning system; determining, by the database system, the data of equipment that are located in a proximity of the up-to-date spatial position of the user; receiving, by the executable program logic, the determined data of equipment from the database system; ([0090] the location and orientation component 410 of wearable appliance 206 can be configured to determine a current geographical location of the appliance 206. In some embodiments, location and orientation component 410 can leverage global positioning system (GPS) technology to determine the user's absolute location, or may be configured to exchange data with positioning sensors located within the plant facility in order to determine the user's relative location within the plant.)
controlling, by the executable program logic, the augmented reality glasses to display at least some of the received data, the displayed data enabling the user to control and/or maintain the automated production process. (Fig. 8-11)
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claim 6 is rejected under 35 U.S.C. 103 as being unpatentable over Schmirler (US 20200336706 A1) in view of Miller (US 20150302657 A1).
Regarding claim 6, Schmirler teaches The automation system of claim 5, wherein the augmented reality glasses comprise an acoustic output interface, wherein the augmented reality system is configured to provide a bidirectional acoustic communication channel between the user wearing the augmented reality glasses and the avatar, and wherein the program logic is configured to control the acoustic output interface within the coordinate system associated with the augmented reality glasses for the display of the data. (Fig. 8-10 [0090] Rendering component 308 can also render human icons 808 a and 808 b representing human operators present in the production area. Returning briefly to FIG. 7, in some embodiments the locations and orientations of the human icons 808 a and 808 b within the VR/AR presentation can be determined based on location and orientation data 606 received by VR/AR presentation system 302 from the wearable appliances 206 associated with each user [0158] Since the wearable appliances 206 support audio communication, the users can exchange verbal communication via the wearable appliances 206 while sharing views in order to facilitate coordination of activities between the users.)
Schmirler does not expressly disclose, but Miller discloses, such that the volume of the voice of the avatar output via the acoustic output interface to the user wearing the augmented reality glasses negatively correlates with a distance between the user wearing the glasses and the avatar ([0158] the one or more parameters pertains to an intensity of the sound. [0227] As illustrated in FIG. 3, the audio subsystem 106 may take a variety of forms. For instance, the audio subsystem 106 may take the form of a simple two speaker 2 channel stereo system, or a more complex multiple speaker system (5.1, 7.1, 12.1 channels). In some implementations, the audio subsystem 106 may be operable to produce a three-dimensional sound field. [0576] Voice may be passed through to appear to be emanating from the avatar.)
Therefore, it would have been obvious to a person having ordinary skill in the art before the effective filing date of the claimed invention to modify Schmirler with the teachings of Miller, with a reasonable expectation of success, in order to facilitate virtual and/or augmented reality interaction for one or more users as taught by Miller ([0007]).
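For illustration only, the distance-dependent voice attenuation for which Miller is relied upon can be sketched as follows; the inverse-distance gain law is an assumption chosen for the sketch, not a formula disclosed by Miller.
```python
# Illustrative sketch only: avatar voice volume negatively correlated
# with the distance between the user and the avatar.
import math

def avatar_voice_gain(user_xy, avatar_xy, ref_dist=1.0, min_gain=0.05):
    """Gain in [min_gain, 1.0] that falls off as distance grows."""
    d = math.hypot(avatar_xy[0] - user_xy[0], avatar_xy[1] - user_xy[1])
    return max(min_gain, min(1.0, ref_dist / max(d, ref_dist)))

print(avatar_voice_gain((0, 0), (2, 0)))    # 0.5  (farther -> quieter)
print(avatar_voice_gain((0, 0), (0.5, 0)))  # 1.0  (within reference distance)
```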
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SARAH TRAN whose telephone number is (313)446-6642. The examiner can normally be reached 8am-5pm M-F.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Khoi Tran can be reached at (571) 272-6919. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/S.A.T./Examiner, Art Unit 3656
/KHOI H TRAN/Supervisory Patent Examiner, Art Unit 3656