Prosecution Insights
Last updated: April 19, 2026
Application No. 19/002,434

OPERATION ASSISTANCE SYSTEM FOR A USER OF A CONTAINER TREATMENT PLANT

Office Action: Non-Final (§101, §103)
Filed: Dec 26, 2024
Examiner: DIVELBISS, MATTHEW H
Art Unit: 3624
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Krones AG
OA Round: 1 (Non-Final)

Grant Probability: 23% (At Risk)
Predicted OA Rounds: 1-2
Predicted Time to Grant: 4y 1m
Grant Probability With Interview: 46%

Examiner Intelligence

Career Allow Rate: 23% (83 granted / 367 resolved; -29.4% vs Tech Center average)
Interview Lift: +23.4% (with vs. without interview, across resolved cases with an interview)
Avg Prosecution: 4y 1m (typical timeline)
Currently Pending: 50
Total Applications: 417 (across all art units)
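The headline figures above are simple derived statistics. As a sanity check, they can be reproduced from the raw counts shown in the panels (83 allowances out of 367 resolved cases, and the 46% with-interview rate); the variable names below are illustrative only:

```python
# Reproduce the examiner-level headline statistics from the raw counts
# shown above. granted/resolved come from the "83 granted / 367 resolved"
# figure; the 0.46 with-interview rate is read off the dashboard panel.
granted = 83
resolved = 367

career_allow_rate = granted / resolved          # ~0.226, displayed as 23%
with_interview_rate = 0.46                      # from the "With Interview" panel
interview_lift = with_interview_rate - career_allow_rate  # ~+0.234

# The -29.4% delta implies a Tech Center average of roughly 52%.
implied_tc_average = career_allow_rate + 0.294

print(f"Career allow rate: {career_allow_rate:.1%}")   # 22.6%
print(f"Interview lift:    {interview_lift:+.1%}")     # +23.4%
print(f"Implied TC avg:    {implied_tc_average:.1%}")  # 52.0%
```

This confirms the panels are internally consistent: the quoted +23.4% interview lift is the unrounded 46% minus 22.6%, not 46% minus the rounded 23%.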

Statute-Specific Performance

§101: 37.0% (-3.0% vs TC avg)
§103: 43.5% (+3.5% vs TC avg)
§102: 10.2% (-29.8% vs TC avg)
§112: 6.9% (-33.1% vs TC avg)

Deltas are relative to a Tech Center average estimate. Based on career data from 367 resolved cases.

Office Action

Rejections: §101, §103
DETAILED ACTION

Claims 1-20 are pending in the present application and are under examination on the merits. This communication is the first action on the merits (FAOM).

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Information Disclosure Statement

Applicant filed Information Disclosure Statements (IDS) on 12/26/2024 and 7/22/2025. These filings are in compliance with 37 C.F.R. 1.97. As required by M.P.E.P. 609(C), the applicant's submission of the Information Disclosure Statement is acknowledged by the examiner and the cited references have been considered in the examination of the claims now pending. As required by M.P.E.P. 609(C), a copy of the PTOL-1449 form, initialed and dated by the examiner, is attached to the instant office action.

Drawings

The drawings filed on 12/26/2024 are acceptable as filed.

Claim Rejections - 35 U.S.C. § 101

35 U.S.C. 101 reads as follows: Whoever invents or discovers any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof, may obtain a patent therefor, subject to the conditions and requirements of this title.

Claims 1, 2, and 4-20 are rejected under 35 U.S.C. 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea) without significantly more. Here, under considerations of the broadest reasonable interpretation of the claimed invention, Examiner finds that the Applicant invented a method and system for assisting a user based on position and orientation to determine an operating state of an apparatus and provide recommendations based on the operating state. Examiner formulates an abstract idea analysis, following the framework described in the MPEP, as follows:

Step 1: The claims are directed to a statutory category, namely a "method" (claims 15-20) and "system" (claims 1-14).
Step 2A - Prong 1: The claims are found to recite limitations that set forth the abstract idea(s), namely, regarding claim 1: …detect a position and an orientation of the user in the container treatment plant; wherein the position detecting apparatus is configured, based on the position and the orientation of the user, to recognize a treatment apparatus in the container treatment plant that is operated by the user; … detect an operating state of the recognized treatment apparatus; … assist the user when operating the treatment apparatus using operating recommendations based on the operating state of the recognized treatment apparatus.

Independent claims 14 and 15 recite substantially similar claim language. Dependent claims 2, 4-13, and 16-20 recite the same or similar abstract idea(s) as independent claims 1, 14, and 15 with merely a further narrowing of the abstract idea(s) to particular data characterization and/or additional data analyses performed as part of the abstract idea. The limitations in claims 1, 2, and 4-20 above fall well within the groupings of subject matter identified by the courts as being abstract concepts; specifically, the claims are found to correspond to the category of "certain methods of organizing human activity: fundamental economic principles or practices (including hedging, insurance, mitigating risk); commercial or legal interactions (including agreements in the form of contracts; legal obligations; advertising, marketing or sales activities or behaviors; business relations); managing personal behavior or relationships or interactions between people (including social activities, teaching, and following rules or instructions)," as the limitations identified above are directed to assisting a user based on position and orientation to determine an operating state of an apparatus and provide recommendations based on the operating state, and thus recite a method of organizing human activity including at least commercial or business
interactions or relations and/or a management of user personal behavior.

Step 2A - Prong 2: Claims 1, 2, and 4-20 are found to clearly be directed to the abstract idea identified above because the claims, as a whole, fail to integrate the claimed judicial exception into a practical application; specifically, the claims recite the additional elements of: "further comprising a state simulation apparatus which is configured to simulate a digital image of the operating state of each treatment apparatus of a plurality of treatment apparatuses of the container treatment plant while in operation" (claims 5 and 17), "a display apparatus configured to display a direction of a shortest path to a further treatment apparatus by an arrow projected on the ground by the mobile operation assistance apparatus" (claims 8 and 20), "a display apparatus configured to display a direction of a shortest path to a further treatment apparatus" (claim 11); however, the aforementioned elements directed to the receiving of user input/selection of data to view via a dashboard and displaying corresponding data via the dashboard merely amount to generic GUI elements of a general purpose computer used to "apply" the abstract idea (MPEP 2106.05(f)) and/or are merely an attempt at limiting the abstract idea of assisting a user based on position and orientation to determine an operating state of an apparatus and provide recommendations based on the operating state to a particular field of use/technological environment of a GUI dashboard (MPEP 2106.05(h)), and therefore the GUI dashboard input and display of data fails to integrate the abstract idea into a practical application; "An operation assistance system for a user of a container treatment plant, comprising: a position detecting apparatus… a treatment apparatus … an operation detecting apparatus… a mobile operation assistance apparatus / A container treatment plant system of a container treatment plant, comprising: at least one treatment apparatus
configured to treat a container;" (claims 1, 14, and 15), “further comprising at least one data transmitter configured to exchange state information between the position detecting apparatus, the operation detecting apparatus, the mobile operation assistance apparatus, and a control station of the container treatment plant wherein the control station is an external server or a cloud system,” (claims 4 and 16), “the position detecting apparatus is configured to record and store paths of the user and the state information and the environmental boundary conditions of an operating environment of the operation assistance system, and the operation detecting apparatus is configured to record and store the environmental boundary conditions and the operating state of the treatment apparatus arranged in a path of the user” (claims 6 and 18), “including at least one of: the mobile operation assistance apparatus comprises a hearing/speaking unit which comprises speech recognition and is connected to a data transmitter of the operation assistance apparatus for exchanging speech information; the mobile operation assistance apparatus comprises ear protection for suppressing disturbing ambient noise using active noise cancellation; the mobile operation assistance apparatus comprises a camera for capturing objects in a field of vision of the user and for recording states occurring in the field of view of the user; the mobile operation assistance apparatus comprises a second camera which is configured to perform at least one of: a capture at least one eye of the user in order to identify the user; or a recognition of a viewing direction of the user; the mobile operation assistance apparatus comprises a gesture recognition unit for recognizing gestures of the user; or the mobile operation assistance apparatus is configured to display two different images to the user in order to show an actual state and a recommended state of the treatment apparatus,” (claims 10 and 12) however the 
aforementioned elements merely amount to generic components of a general purpose computer used to "apply" the abstract idea (MPEP 2106.05(f)) and thus fail to integrate the recited abstract idea into a practical application; furthermore, the high-level recitation of receiving data from a generic "operation assistance system" is at most an attempt to limit the abstract idea to a particular field of use (MPEP 2106.05(h), e.g.: "For instance, a data gathering step that is limited to a particular data source (such as the Internet) or a particular type of data (such as power grid data or XML tags) could be considered to be both insignificant extra-solution activity and a field of use limitation. See, e.g., Ultramercial, 772 F.3d at 716, 112 USPQ2d at 1755 (limiting use of abstract idea to the Internet); Electric Power, 830 F.3d at 1354, 119 USPQ2d at 1742 (limiting application of abstract idea to power grid data); Intellectual Ventures I LLC v. Erie Indem. Co., 850 F.3d 1315, 1328-29, 121 USPQ2d 1928, 1939 (Fed. Cir. 2017) (limiting use of abstract idea to use with XML tags).") and/or merely insignificant extra-solution activity (MPEP 2106.05(g)), and thus further fails to integrate the abstract idea into a practical application.

Step 2B: Claims 1, 2, and 4-20 do not include additional elements that are sufficient to amount to significantly more than the judicial exception because the additional elements as described above with respect to Step 2A Prong 2 merely amount to a general purpose computer that attempts to apply the abstract idea in a technological environment (MPEP 2106.05(f)), including merely limiting the abstract idea to a particular field of use of KPI analysis of an "operation assistance system" via a GUI "display apparatus", as explained above, and/or performs insignificant extra-solution activity, e.g.
data gathering or output (MPEP 2106.05(g)), as identified above, which is further found under Step 2B to be merely well-understood, routine, and conventional activities as evidenced by MPEP 2106.05(d)(II) (describing conventional activities that include transmitting and receiving data over a network, electronic recordkeeping, storing and retrieving information from memory, electronically scanning or extracting data from a physical document, and a web browser's back and forward button functionality). Therefore, similarly, the combination and arrangement of the above identified additional elements when analyzed under Step 2B also fails to necessitate a conclusion that the claims amount to significantly more than the abstract idea directed to assisting a user based on position and orientation to determine an operating state of an apparatus and provide recommendations based on the operating state. Claims 1, 2, and 4-20 are accordingly rejected under 35 U.S.C. § 101 because the claimed invention is directed to a judicial exception (i.e., a law of nature, a natural phenomenon, or an abstract idea(s)) without significantly more.

Note: The analysis above applies to all statutory categories of invention. As such, the presentment of any claim otherwise styled as a machine or manufacture, for example, would be subject to the same analysis.

Claim 3 is found to recite eligible subject matter in that the claim recites where an apparatus is augmented reality glasses, which are a physical component that goes beyond a routine and conventional computer component. As such, claim 3 is NOT rejected under 35 U.S.C. § 101. For further authority and guidance, see MPEP § 2106 and https://www.uspto.gov/patents/laws/examination-policy/subject-matter-eligibility

Claim Rejections - 35 U.S.C. § 103

The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action: A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-5, 7, 9, 10, 13-17, and 19 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Application Publication Number 2021/0341902 to Barre (hereafter referred to as Barre) in view of U.S. Patent Application Publication Number 2019/0286108 to Zetzsche et al. (hereafter referred to as Zetzsche).

As per claim 1, Barre teaches: An operation assistance system for a user of a container treatment plant, comprising: (Paragraph Number [0034] teaches showing a part of a production line 1, a manufacturing station for manufacturing containers, such as bottles made from plastic, comprises, from upstream to downstream, a preform-supplying module followed by a heating module supplying a blow molding or stretch blow molding module which turns out bottles through a conveyor module.
Paragraph Number [0035] teaches computer assistance in the management of said production line 1, more precisely computerized assistance aiding an operator in the management of one or more of the modules 2 of said production line 1). a position detecting apparatus configured to detect a position and an orientation of the user in the container treatment plant (Paragraph Number [0037] teaches such a terminal 3 is designed to be portable. The terminal 3 can be in the form of a computer terminal, in the form of a tablet or of a cell phone or of a connected watch, that is to say a terminal 3 equipped with a screen. Said terminal 3 can also comprise a holographic display surface, such as a holographic pair of glasses or headset. Paragraph Number [0021] teaches the position of said terminal can be determined by means of at least one three-dimensional position sensor connected to said terminal. Paragraph Number [0022] teaches the orientation of said terminal can be determined by way of at least a first and a second three-dimensional position sensor which are spaced apart from each other and are connected to said terminal. Paragraph Number [0047] teaches at least one three-dimensional position sensor 8 is connected to said terminal 3. It is then possible to determine the position of said terminal 3 with respect to said module 2. Paragraph Number [0048] teaches at least a second three-dimensional position sensor 8 is connected to said terminal 3. Thus, it is possible to determine the orientation of said terminal 3 with respect to the production line 1, but above all with respect to a module 2 positioned nearby. In other words, it is possible to know how the terminal 3 is positioned with respect to the module 2—opposite, turned toward the module 2 or in another direction). 
wherein the position detecting apparatus is configured, based on the position and the orientation of the user, to recognize a treatment apparatus in the container treatment plant that is operated by the user (Paragraph Number [0034] teaches showing a part of a production line 1, a manufacturing station for manufacturing containers, such as bottles made from plastic, comprises, from upstream to downstream, a preform-supplying module followed by a heating module supplying a blow molding or stretch blow molding module which turns out bottles through a conveyor module. Paragraph Number [0040] teaches automatically selecting the module 2 with respect to the position of the operator, that is to say to the position of the terminal which they are carrying or wearing. Paragraph Number [0052] teaches displaying, via said display means and in superposition, a virtual model 7 limited to said module 2 according to said determined position of said terminal 3. In short, once the position of the terminal 3 has been identified, above all its orientation, it is possible to limit the possible interactions of the user with the modules 2 positioned nearby. This choice allows a part of the virtual model 3 to be displayed on the terminal 3 to be targeted, instead of loading and executing the one or more virtual models of the whole of the production line 1). Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: an operation detecting apparatus configured to detect an operating state of the recognized treatment apparatus (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. 
It is thus made possible that current machine information such as, for example, operation and error states, maintenance needs, repair needs or the like can also be transferred to the operator via the data server and linked to further information and/or operational instructions. Paragraph Number [0037] teaches as can be seen in FIG. 1, this results overall in a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose. Paragraph Number [0039] teaches the data server 20 sends respective information or operational instructions to the mobile terminal device 10 such that these are displayed, for example, in the area of the lenses or in the area of the virtual representation 10a of the mobile terminal device 10). a mobile operation assistance apparatus configured to assist the user when operating the treatment apparatus using operating recommendations based on the operating state of the recognized treatment apparatus (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. It is thus made possible that current machine information such as, for example, operation and error states, maintenance needs, repair needs or the like can also be transferred to the operator via the data server and linked to further information and/or operational instructions. 
Paragraph Number [0039] teaches the complete system recognizes the current position of the operator or of the mobile terminal device 10 by way of the position detection means 11, and is moreover informed of the current state of the processing apparatus B1 as well as the future processing processes owing to the connection to the machine data interface of the processing apparatus B1. As soon as the operator has signaled his readiness to receive information and/or operational instructions by way of appropriate inputs or predefined behavior, the data server 20 sends respective information or operational instructions to the mobile terminal device 10 such that these are displayed, for example, in the area of the lenses or in the area of the virtual representation 10a of the mobile terminal device 10).

Both Barre and Zetzsche are directed to container treatment plant operation. Barre discloses detecting the position and orientation of a user in a container treatment plant. Zetzsche improves upon Barre by disclosing a mobile operation assistance apparatus which constitutes augmented reality glasses. One of ordinary skill in the art would be motivated to further include a mobile operation assistance apparatus which constitutes augmented reality glasses, to efficiently provide a person with a compact device that incorporates information on tasks to assist in the completion of workflows.
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method of detecting the position and orientation of a user in a container treatment plant in Barre to further include a mobile operation assistance apparatus which constitutes augmented reality glasses as disclosed in Zetzsche, since the claimed invention is merely a combination of old elements, and in combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.

As per claim 14, Barre teaches: A container treatment plant system of a container treatment plant, comprising: at least one treatment apparatus configured to treat a container (Paragraph Number [0034] teaches showing a part of a production line 1, a manufacturing station for manufacturing containers, such as bottles made from plastic, comprises, from upstream to downstream, a preform-supplying module followed by a heating module supplying a blow molding or stretch blow molding module which turns out bottles through a conveyor module. Paragraph Number [0035] teaches computer assistance in the management of said production line 1, more precisely computerized assistance aiding an operator in the management of one or more of the modules 2 of said production line 1). The remainder of the claim limitations are substantially similar to those found in regard to claim 1 and are rejected for the same reasons put forth in regard to claim 1.

As per claim 15, claim 15 recites a method that is substantially similar to the method performed by the system of claim 1 and is rejected for the same reasons put forth in regard to claim 1.

As per claim 2, the combination of Barre and Zetzsche teaches each of the limitations of claim 1.
Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: wherein the position detecting apparatus, the operation detecting apparatus, and the mobile operation assistance apparatus are combined in one apparatus (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. Paragraph Number [0037] teaches this results overall in a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose. The data server 20 in such a complete system can be located both locally, for example within a local network, as well as “remotely”. In the latter case, the data server 20 can be connected via the Internet, for example, with it being said in this case that the data server 20 is located in the so-called “cloud”). One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 1.

As per claim 3, the combination of Barre and Zetzsche teaches each of the limitations of claims 1 and 2.
Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: wherein the one apparatus is augmented reality glasses which show information in a field of view of the user (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. Paragraph Number [0037] teaches this results overall in a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose. The data server 20 in such a complete system can be located both locally, for example within a local network, as well as “remotely”. In the latter case, the data server 20 can be connected via the Internet, for example, with it being said in this case that the data server 20 is located in the so-called “cloud”. Paragraph Number [0038] teaches the operation of the system according to the invention proceeds as follows, for example. As shown in FIG. 2, an operator is in the area of a processing apparatus B1 in order to carry out a specific operational process at the processing apparatus B1, for example, to place a workpiece W in the processing apparatus B1 in a specific manner. For the assistance thereof, the operator is equipped with the mobile terminal device 10 in the form of smart glasses which he wears on his head by means of the temples 16. In addition to various displays in the area of the lenses, the smart glasses 10 are configured to also generate a virtual display 10a which appears in the operator's field of vision. 
This virtual representation 10a can also be used as input means via a gesture control). One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 1.

As per claims 4 and 16, the combination of Barre and Zetzsche teaches each of the limitations of claims 1 and 15 respectively. Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: further comprising at least one data transmitter configured to exchange state information between the position detecting apparatus, the operation detecting apparatus, the mobile operation assistance apparatus, and a control station of the container treatment plant wherein the control station is an external server or a cloud system (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. Paragraph Number [0037] teaches this results overall in a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose. The data server 20 in such a complete system can be located both locally, for example within a local network, as well as “remotely”. In the latter case, the data server 20 can be connected via the Internet, for example, with it being said in this case that the data server 20 is located in the so-called “cloud”). One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 1.
As per claims 5 and 17, the combination of Barre and Zetzsche teaches each of the limitations of claims 1 and 4, and 15 and 16 respectively. In addition, Barre teaches: further comprising a state simulation apparatus which is configured to simulate a digital image of the operating state of each treatment apparatus of a plurality of treatment apparatuses of the container treatment plant while in operation (Paragraph Number [0030] teaches the mobile terminal device 10 moreover comprises information output means 12 which are configured to output information and/or operational instructions to the operator. The information output means can be, for example, image output means, sound output means and touch output means, in particular vibration output means. The image output means can thereby include at least one virtual or projected representation 10a which is illustrated in FIG. 2 and will be described in more detail below. Paragraph Number [0031] teaches the information output means 12 serve to output information and/or operational instructions for the respective operator with reference to the processing apparatus B1, B2, B3. The operator can thus be assisted in various ways. For example, the operator can be trained for the respective processing apparatus or can generally be provided with assistance in the operation, error recognition, troubleshooting, repair or maintenance. The information and/or operational instructions are thereby advantageously output together and with reference to an actual or virtual representation of the processing apparatus B1, B2, B3). 
based on environmental boundary conditions, to simulate operating scenarios and, based on results thereof, to transmit the operating recommendations to the mobile operation assistance apparatus using the data transmitter (Paragraph Number [0023] teaches in order to effectively and comprehensibly support the operator in various tasks, it is further provided in accordance with a further development of the invention that the information and/or operational instructions output by way of the information output means include individual images, image sequences and/or videos. Depending on the case of application, a comprehensible and effective briefing and instruction of the operator can thereby take place, with the operator certainly being able to select within the scope of the invention the (available) form of representation he desires for the respective case of application, as needed. Paragraph Number [0031] teaches the information output means 12 serve to output information and/or operational instructions for the respective operator with reference to the processing apparatus B1, B2, B3. The operator can thus be assisted in various ways. For example, the operator can be trained for the respective processing apparatus or can generally be provided with assistance in the operation, error recognition, troubleshooting, repair or maintenance. The information and/or operational instructions are thereby advantageously output together and with reference to an actual or virtual representation of the processing apparatus B1, B2, B3). As per claims 7 and 19, the combination of Barre and Zetzsche teaches each of the limitations of claims 1, 4, and 5, and 15-17 respectively. 
Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: the operating recommendations are provided to the user by the mobile operation assistance apparatus and are at least one of: parameter setting specifications for manipulated variables of the treatment apparatus; virtually displaying the treatment apparatus being operated, preferably operated by a tool; measures for correcting errors in the treatment apparatus; or proposals for optimizing an arrangement of the treatment apparatus in the container treatment plant (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. It is thus made possible that current machine information such as, for example, operation and error states, maintenance needs, repair needs or the like can also be transferred to the operator via the data server and linked to further information and/or operational instructions. Paragraph Number [0037] teaches as can be seen in FIG. 1, this results overall in a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose. Paragraph Number [0039] teaches the data server 20 sends respective information or operational instructions to the mobile terminal device 10 such that these are displayed, for example, in the area of the lenses or in the area of the virtual representation 10a of the mobile terminal device 10 (Examiner asserts that this teaches at least the alternative of measures for correcting errors)). 
One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 1.

As per claim 9, the combination of Barre and Zetzsche teaches each of the limitations of claim 1. Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: the operating recommendations are provided to the user by the mobile operation assistance apparatus and are at least one of: parameter setting specifications for manipulated variables of the treatment apparatus; virtually displaying the treatment apparatus being operated, preferably operated by a tool; measures for correcting errors in the treatment apparatus; or proposals for optimizing an arrangement of the treatment apparatus in the container treatment plant (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. It is thus made possible that current machine information such as, for example, operation and error states, maintenance needs, repair needs or the like can also be transferred to the operator via the data server and linked to further information and/or operational instructions. Paragraph Number [0037] teaches as can be seen in FIG. 1, this results overall in a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose.
Paragraph Number [0039] teaches the data server 20 sends respective information or operational instructions to the mobile terminal device 10 such that these are displayed, for example, in the area of the lenses or in the area of the virtual representation 10a of the mobile terminal device 10 (Examiner asserts that this teaches at least the alternative of measures for correcting errors)). One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 1.

As per claim 10, the combination of Barre and Zetzsche teaches each of the limitations of claims 1 and 9. In addition, Barre teaches: including at least one of: the mobile operation assistance apparatus comprises a hearing/speaking unit which comprises speech recognition and is connected to a data transmitter of the operation assistance apparatus for exchanging speech information; the mobile operation assistance apparatus comprises ear protection for suppressing disturbing ambient noise using active noise cancellation; the mobile operation assistance apparatus comprises a camera for capturing objects in a field of vision of the user and for recording states occurring in the field of view of the user; the mobile operation assistance apparatus comprises a second camera which is configured to perform at least one of: a capture at least one eye of the user in order to identify the user; or a recognition of a viewing direction of the user; the mobile operation assistance apparatus comprises a gesture recognition unit for recognizing gestures of the user; or the mobile operation assistance apparatus is configured to display two different images to the user in order to show an actual state and a recommended state of the treatment apparatus (Paragraph Number [0014] teaches the information input means can be realized in various ways within the scope of the present invention. For example, they can be touch input means such as, particularly, a keyboard or a touchscreen.
However, sound input means can be used alternatively or additionally, such as voice recognition means, for example. Generally, these have the same advantages as the abovementioned voice output means. Further alternatively or additionally, motion detection means can also be used. In this regard, typical motion detection means are gesture detection means with which the operator can input information into the terminal device by merely moving a body part. Another type of motion detection means is a so-called virtual touchscreen in which a virtual image is generated in or projected into the field of vision of the operator, and a movement of the user is put in relation to the virtual or projected image in order to effect an information input. (Examiner asserts that this teaches at least the alternative of recognizing gestures of the user)).

As per claim 13, the combination of Barre and Zetzsche teaches each of the limitations of claim 1. Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: the position detecting apparatus is configured to record and store paths of the user and state information and environmental boundary conditions of an operating environment of the treatment apparatus; and the operation detecting apparatus is configured to record and store the environmental boundary conditions and the operating state of the treatment apparatus arranged in a path of the user (Paragraph Number [0029] teaches the position detection means 11 can also have, for example, a module for a global or local positioning system, with which the position of the mobile terminal device 10 can be inferred by means of the transmission data of various transmitters (satellites or local transmitters, for example).
In this manner, it becomes possible to constantly determine at which precise location the mobile terminal device 10 is in relation to the processing apparatus B1, B2, B3. Paragraph Number [0035] teaches the information respectively required for the information output is stored on the data server 20 and is transmitted in a customized manner to the information output means 12 of the mobile terminal device 10 via the data transmission interface 13. Paragraph Number [0037] teaches a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose). One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 1.

Claims 6, 8, 11, 12, 18, and 20 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Application Publication Number 2021/0341902 to Barre (hereafter referred to as Barre) in view of U.S. Patent Application Publication Number 2019/0286108 to Zetzsche et al. (hereafter referred to as Zetzsche) and in further view of U.S. Patent Application Publication Number 2019/0147655 to Galera et al. (hereafter referred to as Galera).

As per claims 6 and 18, the combination of Barre and Zetzsche teaches each of the limitations of claims 1, 4, and 5, and 15-17 respectively.
Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach determining a path that a user should take based on their position and orientation to complete a workflow which is taught by the following citations from Galera: the position detecting apparatus is configured to record and store paths of the user and the state information and the environmental boundary conditions of an operating environment of the operation assistance system (Paragraph Number [0150] teaches notifications can take the form of a superimposed message rendered on the user's wearable appliance 206 identifying the nature of the issue. In some embodiments, if the user is located on the plant floor at the time of the notification, rendering component 308 can render a VR/AR presentation that superimposes directional arrows over the user's natural view of his or her environment directing the user to the source of the issue. The directional arrows may first guide the user to the machine or area at which the issue was detected. The direction of the arrows, as well as the location of the arrow graphics on the display screen of the wearable appliance 206, are a function of the user's current location and orientation, as determined by the location and orientation data 606. Once at the location, further directional arrows can be generated that indicate the particular industrial device, machine, or machine component experiencing the issue. Again, the direction and display locations for these arrows are based on the current location and orientation data 606. As the user changes location and orientation, rendering component 308 will update the directions and/or display locations of the arrows and other graphical indicators in accordance with the updated location and orientation data 606 to ensure that the graphical indications continuously direct the user's attention in the correct direction or toward the correct devices or components. 
Paragraph Number [0157] teaches to facilitate generation of workflow presentations for assistance with detected issues, VR/AR presentation system 302 can store (e.g., on memory 322) workflow data 1608 defining actions to be taken to correct various issues, as well as VR/AR presentation instructions for rendering guidance in connection with performing these actions. In one or more embodiments, sets of workflow data 1608 can be stored in association with the event or machine to which the workflow relates). the operation detecting apparatus is configured to record and store the environmental boundary conditions and the operating state of the treatment apparatus arranged in a path of the user (Paragraph Number [0157] teaches to facilitate generation of workflow presentations for assistance with detected issues, VR/AR presentation system 302 can store (e.g., on memory 322) workflow data 1608 defining actions to be taken to correct various issues, as well as VR/AR presentation instructions for rendering guidance in connection with performing these actions. In one or more embodiments, sets of workflow data 1608 can be stored in association with the event or machine to which the workflow relates. Paragraph Number [0165] teaches based on a comparison of the user's interactions with the automation system with the steps of the preferred workflow, rendering component 308 can generate and deliver workflow feedback data 1712 to the user's wearable appliance 206 in response to determining that the user has deviated from the workflow. Such feedback may comprise, for example, corrective instructions intended to inform the user of the deviation and to guide the user to the correct sequence of operations dictated by the workflow. In some embodiments, monitoring component 316 can also calculate and record performance metrics for the user that rate the user's performance of the workflow, based on the user's measured degree of compliance with or deviation from the workflow). 
Both the combination of Barre and Zetzsche and Galera are directed to container treatment plant operation. The combination of Barre and Zetzsche discloses detecting the position and orientation of a user in a container treatment plant. Galera improves upon the combination of Barre and Zetzsche by disclosing determining a path that a user should take based on their position and orientation to complete a workflow. One of ordinary skill in the art would be motivated to further include determining a path that a user should take based on their position and orientation to complete a workflow, to efficiently direct a user of the system to perform certain actions and provide the information and tools needed to complete the actions. Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the system and method of detecting the position and orientation of a user in a container treatment plant in the combination of Barre and Zetzsche to further utilize determining a path that a user should take based on their position and orientation to complete a workflow as disclosed in Galera, since the claimed invention is merely a combination of old elements, and in combination each element merely would have performed the same function as it did separately, and one of ordinary skill in the art would have recognized that the results of the combination were predictable.

As per claims 8 and 20, the combination of Barre and Zetzsche teaches each of the limitations of claims 1, 4, and 5, and 15-17 respectively.
Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach a mobile operation assistance apparatus which constitutes augmented reality glasses which is taught by the following citations from Zetzsche: wherein the further treatment apparatus is a treatment apparatus on which an operating recommendation is to be carried out (Paragraph Number [0018] teaches at least one processing apparatus has a machine data transmission interface which communicates with the data server via a data connection, in particular also by using the Internet. It is thus made possible that current machine information such as, for example, operation and error states, maintenance needs, repair needs or the like can also be transferred to the operator via the data server and linked to further information and/or operational instructions. Paragraph Number [0037] teaches as can be seen in FIG. 1, this results overall in a complete system in which the data server 20 communicates via the data connections 15 with the mobile terminal device 10 as well as with the processing apparatus B1, B2, B3, with at least one processing apparatus B1, B2, B3 having a machine data transmission interface for this purpose. Paragraph Number [0039] teaches the data server 20 sends respective information or operational instructions to the mobile terminal device 10 such that these are displayed, for example, in the area of the lenses or in the area of the virtual representation 10a of the mobile terminal device 10). One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 1. 
Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach determining a path that a user should take based on their position and orientation to complete a workflow which is taught by the following citations from Galera: a display apparatus configured to display a direction of a shortest path to a further treatment apparatus by an arrow projected on the ground by the mobile operation assistance apparatus (Paragraph Number [0150] teaches notifications can take the form of a superimposed message rendered on the user's wearable appliance 206 identifying the nature of the issue. In some embodiments, if the user is located on the plant floor at the time of the notification, rendering component 308 can render a VR/AR presentation that superimposes directional arrows over the user's natural view of his or her environment directing the user to the source of the issue. The directional arrows may first guide the user to the machine or area at which the issue was detected. The direction of the arrows, as well as the location of the arrow graphics on the display screen of the wearable appliance 206, are a function of the user's current location and orientation, as determined by the location and orientation data 606. Once at the location, further directional arrows can be generated that indicate the particular industrial device, machine, or machine component experiencing the issue. Again, the direction and display locations for these arrows are based on the current location and orientation data 606. As the user changes location and orientation, rendering component 308 will update the directions and/or display locations of the arrows and other graphical indicators in accordance with the updated location and orientation data 606 to ensure that the graphical indications continuously direct the user's attention in the correct direction or toward the correct devices or components). 
One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 6.

As per claim 11, the combination of Barre and Zetzsche teaches each of the limitations of claim 1. In addition, Barre teaches: a display apparatus configured to display a direction of a shortest path to a further treatment apparatus (Paragraph Number [0055] teaches the virtual model 7 can comprise digital, preferably graphical, elements, such as three-dimensional animations or videos. It can form a virtual and/or augmented reality, preferably a mixed reality, assisting the operator in their supervision of the module 2 and in the tasks which they have to accomplish. The virtual model 7 can incorporate interactive virtual elements triggered by the operator, for example a tutorial or an educational program). wherein the further treatment apparatus is a treatment apparatus on which an operating recommendation is to be carried out (Paragraph Number [0011] teaches once an element has been identified, the HMI provides the operator with interactivity, allowing information relating to said element to be collected, notably by way of a centralized computer system, and to be displayed. It is then also possible to transmit virtual instructions, notably relating to steps for the maintenance of such an element. These additions consist of a virtual reality which, combined with the augmented reality, form a mixed reality, lending computer assistance to the operator in their supervision and their tasks to accomplish. Paragraph Number [0055] teaches the virtual model 7 can comprise digital, preferably graphical, elements, such as three-dimensional animations or videos. It can form a virtual and/or augmented reality, preferably a mixed reality, assisting the operator in their supervision of the module 2 and in the tasks which they have to accomplish.
The virtual model 7 can incorporate interactive virtual elements triggered by the operator, for example a tutorial or an educational program). Barre teaches detecting the position and orientation of a user in a container treatment plant but does not explicitly teach determining a path that a user should take based on their position and orientation to complete a workflow which is taught by the following citations from Galera: by an arrow projected on the ground by the mobile operation assistance apparatus (Paragraph Number [0150] teaches notifications can take the form of a superimposed message rendered on the user's wearable appliance 206 identifying the nature of the issue. In some embodiments, if the user is located on the plant floor at the time of the notification, rendering component 308 can render a VR/AR presentation that superimposes directional arrows over the user's natural view of his or her environment directing the user to the source of the issue. The directional arrows may first guide the user to the machine or area at which the issue was detected. The direction of the arrows, as well as the location of the arrow graphics on the display screen of the wearable appliance 206, are a function of the user's current location and orientation, as determined by the location and orientation data 606. Once at the location, further directional arrows can be generated that indicate the particular industrial device, machine, or machine component experiencing the issue. Again, the direction and display locations for these arrows are based on the current location and orientation data 606. As the user changes location and orientation, rendering component 308 will update the directions and/or display locations of the arrows and other graphical indicators in accordance with the updated location and orientation data 606 to ensure that the graphical indications continuously direct the user's attention in the correct direction or toward the correct devices or components). 
One of ordinary skill in the art would be motivated to combine these references as described in regard to claim 6.

As per claim 12, the combination of Barre, Zetzsche, and Galera teaches each of the limitations of claims 1 and 11. In addition, Barre teaches: including at least one of: the mobile operation assistance apparatus comprises a hearing/speaking unit which comprises speech recognition and is connected to a data transmitter of the operation assistance apparatus for exchanging speech information; the mobile operation assistance apparatus comprises ear protection for suppressing disturbing ambient noise using active noise cancellation; the mobile operation assistance apparatus comprises a camera for capturing objects in a field of vision of the user and for recording states occurring in the field of view of the user; the mobile operation assistance apparatus comprises a second camera which is configured to perform at least one of: a capture at least one eye of the user in order to identify the user; or a recognition of a viewing direction of the user; the mobile operation assistance apparatus comprises a gesture recognition unit for recognizing gestures of the user; or the mobile operation assistance apparatus is configured to display two different images to the user in order to show an actual state and a recommended state of the treatment apparatus. (Paragraph Number [0014] teaches the information input means can be realized in various ways within the scope of the present invention. For example, they can be touch input means such as, particularly, a keyboard or a touchscreen. However, sound input means can be used alternatively or additionally, such as voice recognition means, for example. Generally, these have the same advantages as the abovementioned voice output means. Further alternatively or additionally, motion detection means can also be used.
In this regard, typical motion detection means are gesture detection means with which the operator can input information into the terminal device by merely moving a body part. Another type of motion detection means is a so-called virtual touchscreen in which a virtual image is generated in or projected into the field of vision of the operator, and a movement of the user is put in relation to the virtual or projected image in order to effect an information input. (Examiner asserts that this teaches at least the alternative of recognizing gestures of the user)).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to MATTHEW H DIVELBISS, whose telephone number is (571) 270-0166. The examiner can normally be reached 7:30 am - 6:00 PM. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Jerry O'Connor, can be reached at (571) 272-6787. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about PAIR, see https://ppair-my.uspto.gov/pair/PrivatePair. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free).

/M. H. D./
Examiner, Art Unit 3624

/Jerry O'Connor/
Supervisory Patent Examiner, Group Art Unit 3624

Prosecution Timeline

Dec 26, 2024
Application Filed
Feb 11, 2026
Non-Final Rejection — §101, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12572889
Optimization of Large-scale Industrial Value Chains
2y 5m to grant Granted Mar 10, 2026
Patent 12503000
OPTIMIZATION PROCEDURE FOR THE ENERGY MANAGEMENT OF A SOLAR ENERGY INSTALLATION WITH STORAGE MEANS IN COMBINATION WITH THE CHARGING OF AN ELECTRIC VEHICLE AND SYSTEM
2y 5m to grant Granted Dec 23, 2025
Patent 12493860
WASTE MANAGEMENT SYSTEM AND METHOD
2y 5m to grant Granted Dec 09, 2025
Patent 12482011
FAMILIARITY DEGREE ESTIMATION APPARATUS, FAMILIARITY DEGREE ESTIMATION METHOD, AND RECORDING MEDIUM
2y 5m to grant Granted Nov 25, 2025
Patent 12450574
METHOD FOR WASTE MANAGEMENT UTILIZING ARTIFICAL NEURAL NETWORK SYSTEM
2y 5m to grant Granted Oct 21, 2025
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

1-2
Expected OA Rounds
23%
Grant Probability
46%
With Interview (+23.4%)
4y 1m
Median Time to Grant
Low
PTA Risk
Based on 367 resolved cases by this examiner. Grant probability derived from career allow rate.
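The projection figures above can be reproduced from the examiner's career data. The snippet below is an illustrative sketch only: the dashboard's actual model is not published, so treating the with-interview figure as the baseline allow rate plus the observed interview lift is an assumption.

```python
# Illustrative reconstruction of the projection arithmetic shown above.
# Assumption: with-interview probability = career allow rate + interview lift.

granted = 83            # examiner's career grants
resolved = 367          # examiner's career resolved cases
interview_lift = 0.234  # observed allowance lift with an interview

baseline = granted / resolved          # career allow rate (~23%)
with_interview = baseline + interview_lift

print(f"Grant probability: {baseline:.0%}")
print(f"With interview:    {with_interview:.0%}")
```

Note that the baseline is a simple career-wide rate; a per-art-unit or per-statute breakdown (as in the statute-specific table above) would shift these numbers for any particular application.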
