DETAILED ACTION
This action is in response to the submission filed on 2/6/2023. Claims 1-17 and 21-23 are presented for examination.

Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 4-8 and 14-17 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant), regards as the invention.
Claims 4 and 14 recite “an event sequence tunnel is specified by at entry space”. It is unclear if this should read “is specified by an entry space”. Claims 5-7 and 15-17 are rejected by virtue of their dependency. Claim 8 recites “the event sequence tunnel”, which lacks antecedent support. Appropriate correction is required.

Claim Rejections - 35 USC § 103
The following is a quotation of 35 U.S.C.
103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-2, 9-12, and 21-22 are rejected under 35 U.S.C. 103 as being unpatentable over US 20190129436 A1 (“Sun”) in view of US 20220204009 A1 (“Choi”).

Regarding claims 1, 11 and 21, Sun teaches: A system, comprising: at least one processor, and at least one non-transitory storage media storing instructions that, when executed by the at least one processor (Sun: Fig.
4) , cause the at least one processor to: observe at least one characteristic of a simulated agent at a first timestamp during a simulation of a scenario (Sun: para [0025], “performing a simulation or operational phase to generate a vicinal scenario for each simulated vehicle in an iteration of a simulation, the vicinal scenarios corresponding to different locations, traffic patterns, or environmental conditions being simulated, provide vehicle intention data corresponding to a data representation of various types of simulated vehicle or driver intentions, generate a trajectory corresponding to perception data and the vehicle intention data, execute at least one of the plurality of trained trajectory prediction models to generate a distribution of predicted vehicle trajectories for each of a plurality of simulated vehicles of the simulation based on the vicinal scenario and the vehicle intention data, select at least one vehicle trajectory from the distribution based on pre-defined criteria, and update a state and trajectory of each of the plurality of simulated vehicles based on the selected vehicle trajectory from the distribution (processing block 1040 )”) ; specify the simulated agent at subsequent timestamps according to the at least one characteristic in the scenario (Sun: para [0022], “in FIG. 2, the autonomous vehicle trajectory simulation system 202 can be configured to include or connect with a plurality of trained prediction models 180 used to simulate or predict the behavior and trajectories of simulated vehicles with simulated drivers in a simulation scenario. In particular, the plurality of trained prediction models 180 enable the generation of a distribution of predicted vehicle trajectories for each of a plurality of simulated vehicles of the simulation based on a particular vicinal scenario dataset configured by the vicinal scene data generator module 173 and a corresponding vehicle intention dataset 174 . 
For each of the vehicle trajectories in each of the generated trajectory distributions, a confidence value, likelihood rating, or probability value can also be computed by the prediction models 180 to specify a degree of likelihood or probability that the particular simulated vehicle will actually traverse the corresponding trajectory of the distribution. The trained prediction models 180 represent a simulated virtual world configured to be identical (as possible) to the real world where vehicles are operated by human drivers. The virtual world can be used to train and improve a control system for an autonomous vehicle. Thus, the simulation produced by the autonomous vehicle trajectory simulation system 202 can be indirectly useful for configuring the control systems in autonomous vehicles. The plurality of trained prediction models 180 can be parameterized models, which may be configured or trained using either (or both) real-world input provided by the data collection system 201 or randomized variables. In one example, the trained prediction models 180 may simulate the typical and atypical driver behaviors, such as steering or heading control, speed or throttle control, and stopping or brake control. In one example, the trained prediction models 180 may use, for example, sensory-motor transport delay, dynamic capabilities, and preferred driving behaviors. In some implementations, the trained prediction models 180 may include modeling of the transport time delay between a stimulus and the simulated driver's control response. In some implementations, this delay may represent the time necessary for the driver to sense a stimulus, process it, determine the best corrective action, and respond”) ; determine at least one expected parameter associated with the simulated agent at the subsequent timestamps, wherein the at least one expected parameter is modified responsive to other simulation features (Sun: para [0022], “in FIG. 
2, the autonomous vehicle trajectory simulation system 202 can be configured to include or connect with a plurality of trained prediction models 180 used to simulate or predict the behavior and trajectories of simulated vehicles with simulated drivers in a simulation scenario. In particular, the plurality of trained prediction models 180 enable the generation of a distribution of predicted vehicle trajectories for each of a plurality of simulated vehicles of the simulation based on a particular vicinal scenario dataset configured by the vicinal scene data generator module 173 and a corresponding vehicle intention dataset 174 . For each of the vehicle trajectories in each of the generated trajectory distributions, a confidence value, likelihood rating, or probability value can also be computed by the prediction models 180 to specify a degree of likelihood or probability that the particular simulated vehicle will actually traverse the corresponding trajectory of the distribution. The trained prediction models 180 represent a simulated virtual world configured to be identical (as possible) to the real world where vehicles are operated by human drivers. The virtual world can be used to train and improve a control system for an autonomous vehicle. Thus, the simulation produced by the autonomous vehicle trajectory simulation system 202 can be indirectly useful for configuring the control systems in autonomous vehicles. The plurality of trained prediction models 180 can be parameterized models, which may be configured or trained using either (or both) real-world input provided by the data collection system 201 or randomized variables. In one example, the trained prediction models 180 may simulate the typical and atypical driver behaviors, such as steering or heading control, speed or throttle control, and stopping or brake control. 
In one example, the trained prediction models 180 may use, for example, sensory-motor transport delay, dynamic capabilities, and preferred driving behaviors. In some implementations, the trained prediction models 180 may include modeling of the transport time delay between a stimulus and the simulated driver's control response. In some implementations, this delay may represent the time necessary for the driver to sense a stimulus, process it, determine the best corrective action, and respond. The trained prediction models 180 may also include a speed control model with an absolute maximum vehicle speed (e.g., the maximum speed of the vehicle, the speed a driver is not comfortable exceeding, etc.) and a cornering aggressiveness measure to reduce the speed based on the turning radius. In the example, this may replicate the tendency of drivers to slow down through a turn. In the example, once the turning radius drops below the cornering threshold in the scenario, the speed may be reduced in proportion to the tightness of the turn”) ; evaluate the simulated agent at the first timestamp and the subsequent timestamps, wherein associations among characteristics associated with the simulated agent are determined starting at the first timestamp through the subsequent timestamps, and wherein the associations evolve across iterative simulations of the scenario (Sun: para [0025], “Referring now to FIG. 3, a flow diagram illustrates an example embodiment of a system and method 1000 for providing trajectory simulation of autonomous vehicles. 
The example embodiment can be configured for: receiving training data from a real world data collection system (processing block 1010); obtaining ground truth data corresponding to the training data (processing block 1020); performing a training phase to train a plurality of trajectory prediction models (processing block 1030); and performing a simulation or operational phase to generate a vicinal scenario for each simulated vehicle in an iteration of a simulation, the vicinal scenarios corresponding to different locations, traffic patterns, or environmental conditions being simulated, provide vehicle intention data corresponding to a data representation of various types of simulated vehicle or driver intentions, generate a trajectory corresponding to perception data and the vehicle intention data, execute at least one of the plurality of trained trajectory prediction models to generate a distribution of predicted vehicle trajectories for each of a plurality of simulated vehicles of the simulation based on the vicinal scenario and the vehicle intention data”); and wherein the simulated agent is consistently simulated according to the at least one characteristic at the first timestamp and subsequent timestamps (Sun: para [0024], “for each iteration of the simulation in the simulation or operational phase, enable the generation of a distribution of vehicle trajectories with corresponding likelihood ratings or probability values for each of a plurality of simulated vehicles of the simulation based on a particular vicinal scenario dataset configured by the vicinal scene data generator module 173 and a corresponding vehicle intention dataset 174. For each iteration of the simulation and for each of the plurality of trained prediction models 180, one of the plurality of trajectory samplers 182 can select a particular trajectory from the distribution of vehicle trajectories generated for the particular iteration.
The particular trajectory can be selected based on a variety of pre-defined criteria including a maximal or minimal likelihood rating or probability value, conformity with pre-defined safety parameters, conformity with pre-defined economy parameters, conformity with pre-defined timing or distance parameters, and the like. The particular trajectory selected by the one of the plurality of trajectory samplers 182 can be stored in the memory 172 as vehicle trajectory data 176 . Once the particular trajectory is selected and the vehicle trajectory data 176 for the current iteration is stored, the state updater module 175 can update the states and predicted trajectories of all simulated vehicles in the simulation according to the selected trajectory. The predicted vehicle trajectories retained as vehicle trajectory data 176 can be saved and fed back into the state updated module 175 to improve the accuracy of the predicted trajectories. At this point, the current iteration of the simulation is complete and control is passed back to the vicinal scene data generator module 173 for the start of the next iteration of the simulation in the simulation or operational phase. As described above, the vicinal scene data generator module 173 generates a new vicinal scenario for each simulated vehicle in the simulation system for the next iteration. The new vicinal scenarios are passed to the plurality of trained prediction models 180 . A new set of vehicle intention data 174 is also selected or generated and passed to the plurality of trained prediction models 180 . The trained prediction models 180 use the new vicinal scenarios and the new set of vehicle intention data 174 to produce a new distribution of vehicle trajectories for each of a plurality of simulated vehicles with corresponding likelihood ratings or probability values for the next iteration as described above. 
As a result, the autonomous vehicle trajectory simulation system 202 of an example embodiment can produce an interactive, realistic traffic simulation”). Sun does not teach but Choi does teach: validate a response of an autonomous system in the iterative simulations of the scenario (Choi: para [0022], “Using the sensor simulation software validated in the manner described herein may result in a more realistic future simulation of autonomous vehicle navigation. More log data may be simulated rather than collected over many runs and many hours of driving on a roadway. More accurate tests of autonomous vehicle software may be performed in simulation, which may be more efficient and safer than running tests on a roadway. The autonomous vehicle software may be continually improved using the simulation technology”; para [0044], “In some examples, client computing device 440 may be an operations workstation used by an administrator or operator to review simulation outcomes, handover times, and validation information”). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Sun (directed to simulation of autonomous agents) and Choi (directed to validating responses of autonomous agents) and arrived at simulation and validation of autonomous agents. One of ordinary skill in the art would have been motivated to make such a combination because “It is critical that the autonomous control software used by these vehicles to operate in the autonomous mode is tested and validated before such software is actually used to control the vehicles in areas where the vehicles are interacting with other objects” (Choi: para [0001]).
Regarding claims 2, 12 and 22, Sun and Choi teach: The system of claim 1, wherein the at least one characteristic is a location of the simulated agent and the at least one expected parameter is a next location of the simulated agent (Sun: Abstract, “performing a simulation or operational phase to generate a vicinal scenario for each simulated vehicle in an iteration of a simulation, the vicinal scenarios corresponding to different locations, traffic patterns,”). Regarding claim 9, Sun and Choi teach: The system of claim 1, wherein the at least one characteristic is an agent identification, an agent type (Sun: para [0021], “The vehicle intention data 174 corresponds to a data representation of various types of simulated vehicle and/or driver intentions.”), an agent velocity, an agent dimension, an agent shape, an agent color, or any combinations thereof. Regarding claim 10, Sun does not teach but Choi does teach: The system of claim 1, wherein validating the response of the autonomous system to the simulation of the scenario comprises executing a tracking algorithm of the autonomous system on time series data of the scenario and determining that the tracking algorithm identified a threshold amount of consistent characteristics across frames of the scenario (Choi: para [0062], “Based on the one or more metrics 910, the server computing devices 410 or other one or more processors may perform an evaluation 920 of how the simulation performed. The evaluation may be for the simulated sensor or for the constructed environment. For example, the evaluation may be for realism, or how well the simulation matches what occurred in the perception system 172 of the vehicle 100. Additionally or alternatively, the evaluation may be for how well the constructed environment matches the ground truths in the environment.
The one or more metrics may be tracked over multiple simulations of a same scenario or different scenarios to determine whether the simulated sensor data matches or nearly matches the logged sensor data”). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Sun (directed to simulation of autonomous agents) and Choi (directed to validating responses of autonomous agents) and arrived at simulation and validation of autonomous agents. One of ordinary skill in the art would have been motivated to make such a combination because “It is critical that the autonomous control software used by these vehicles to operate in the autonomous mode is tested and validated before such software is actually used to control the vehicles in areas where the vehicles are interacting with other objects” (Choi: para [0001]).

Claims 3, 13 and 23 are rejected under 35 U.S.C. 103 as being unpatentable over US 20190129436 A1 (“Sun”) in view of US 20220204009 A1 (“Choi”), further in view of US 20200250363 A1 (“Partridge”).

Regarding claims 3, 13 and 23, Sun and Choi do not teach but Partridge teaches: The system of claim 1, wherein the at least one characteristic is an object color of the simulated agent and the at least one expected parameter is a subsequent color of the simulated agent (Partridge: para [0026], “Through the Designer 102, the user may define one or more parameterized features of the scenario or the vehicle.
For example, one or more of the following features may be parameterized… (iii) features of the vehicle itself (e.g., dimensions, weight, engine parameters, gearbox parameters, tires, acceleration and turning performance, color, etc.)”; para [0074], “Each of the buildings may include one or more simulated materials that may have a simulated reflectivity to one or more types of sensors that may be used by an ego vehicle (e.g., radar, laser, infrared, different colors of visible light).”). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Sun and Choi (directed to simulation of autonomous agents) and Partridge (directed to color of autonomous agents) and arrived at simulation of colored autonomous agents. One of ordinary skill in the art would have been motivated to make such a combination for “creating a static environment and/or reproducible events that permit testing of specific aspects of the vehicle system while controlling environmental variables” (Partridge: para [0003]).

Claims 4-8 and 14-17 are rejected under 35 U.S.C. 103 as being unpatentable over US 20190129436 A1 (“Sun”) in view of US 20220204009 A1 (“Choi”), further in view of “Impact of the connected vehicle environment on tunnel entrance zone” (“Li”).

Regarding claims 4 and 14, Sun and Choi do not teach but Li teaches: The system of claim 1, wherein the at least one characteristic is an object identification and an event sequence tunnel is specified by at entry space at the first timestamp and an exit space at a subsequent timestamp (Li: Fig. 1. Driving simulation platform; page 6, “within the first 300 m after entering the tunnel entrance”; Fig. 3, “Tunnel entrance zone”, “Tunnel zone” (3600m); page 6, “real-time and accurate acquisition of behavioral data in the simulation driving process”).
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Sun and Choi (directed to autonomous agent simulation) and Li (directed to tunnel entry and exit) and arrived at autonomous vehicle simulation of tunnel entry and exit. One of ordinary skill in the art would have been motivated to make such a combination because “drastic changes of the space environment at the tunnel entrance can lead to frequent accidents with higher levels. The connected vehicle environment provides drivers with surrounding traffic information and improve their driving behavior by helping them make safe decisions efficiently. As such, this study is to examine the effects of the connected vehicle environment on driving behavior and safety at the tunnel entrance zone” and for “specific tunnel entrance scenarios, which provides a reference for realizing active protection of vehicles at the tunnel entrance” (Li: Abstract). Regarding claims 5 and 15, Sun and Choi do not teach but Li teaches: The system of claim 4, wherein the at least one expected parameter is one or more dimensions of the event sequence tunnel, wherein at least one factor is applied to dimensions of the event sequence tunnel at the entry space and propagated through the event sequence tunnel (Li: page 4, “Shixia Tunnel is 6.4 km in length and is placed in the second segment of the road scene. The tunnel scene design (one-way) is shown in Fig. 3”; Fig. 3, “Tunnel zone (3600m)”). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Sun and Choi (directed to autonomous agent simulation) and Li (directed to tunnel entry and exit) and arrived at autonomous vehicle simulation of tunnel entry and exit.
One of ordinary skill in the art would have been motivated to make such a combination because “drastic changes of the space environment at the tunnel entrance can lead to frequent accidents with higher levels. The connected vehicle environment provides drivers with surrounding traffic information and improve their driving behavior by helping them make safe decisions efficiently. As such, this study is to examine the effects of the connected vehicle environment on driving behavior and safety at the tunnel entrance zone” and for “specific tunnel entrance scenarios, which provides a reference for realizing active protection of vehicles at the tunnel entrance” (Li: Abstract) . Regarding claims 6 and 16, Sun, Choi and Li teach: The system of claim 4, wherein evaluating the simulated agent at the first timestamp and the subsequent timestamps comprises associating a consistent identification of the simulated agent in the iterative simulations of the scenario (Sun: para [0024], “for each iteration of the simulation in the simulation or operational phase, enable the generation of a distribution of vehicle trajectories with corresponding likelihood ratings or probability values for each of a plurality of simulated vehicles of the simulation based on a particular vicinal scenario dataset configured by the vicinal scene data generator module 173 and a corresponding vehicle intention dataset 174 . For each iteration of the simulation and for each of the plurality of trained prediction models 180 , one of the plurality of trajectory samplers 182 can select a particular trajectory from the distribution of vehicle trajectories generated for the particular iteration. 
The particular trajectory can be selected based on a variety of pre-defined criteria including a maximal or minimal likelihood rating or probability value, conformity with pre-defined safety parameters, conformity with pre-defined economy parameters, conformity with pre-defined timing or distance parameters, and the like. The particular trajectory selected by the one of the plurality of trajectory samplers 182 can be stored in the memory 172 as vehicle trajectory data 176 . Once the particular trajectory is selected and the vehicle trajectory data 176 for the current iteration is stored, the state updater module 175 can update the states and predicted trajectories of all simulated vehicles in the simulation according to the selected trajectory. The predicted vehicle trajectories retained as vehicle trajectory data 176 can be saved and fed back into the state updated module 175 to improve the accuracy of the predicted trajectories. At this point, the current iteration of the simulation is complete and control is passed back to the vicinal scene data generator module 173 for the start of the next iteration of the simulation in the simulation or operational phase. As described above, the vicinal scene data generator module 173 generates a new vicinal scenario for each simulated vehicle in the simulation system for the next iteration”) . Regarding claims 7 and 17, Sun and Choi do not teach but Li teaches: The system of claim 4, wherein at least one waypoint or at least one intermediate space is injected into in the event sequence tunnel identified for the simulated agent, and the simulated agent is evaluated at the entry space, then at least at one waypoint or at one intermediate space, and the exit space (Li: Fig. 1. Driving simulation platform; page 4, “Shixia Tunnel is 6.4 km in length and is placed in the second segment of the road scene. The tunnel scene design (one-way) is shown in Fig. 3”; Fig. 3, “Tunnel zone (3600m)”) . 
Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Sun and Choi (directed to autonomous agent simulation) and Li (directed to tunnel entry and exit) and arrived at autonomous vehicle simulation of tunnel entry and exit. One of ordinary skill in the art would have been motivated to make such a combination because “drastic changes of the space environment at the tunnel entrance can lead to frequent accidents with higher levels. The connected vehicle environment provides drivers with surrounding traffic information and improve their driving behavior by helping them make safe decisions efficiently. As such, this study is to examine the effects of the connected vehicle environment on driving behavior and safety at the tunnel entrance zone” and for “specific tunnel entrance scenarios, which provides a reference for realizing active protection of vehicles at the tunnel entrance” (Li: Abstract). Regarding claim 8, Sun and Choi do not teach but Li teaches: The system of claim 1, wherein the at least one characteristic is a value associated with an agent captured at an entry space corresponding to the first timestamp, until an exit space corresponding to a last timestamp, through at least one waypoint or at least one intermediate space of the event sequence tunnel (Li: Fig. 1. Driving simulation platform; page 4, “Shixia Tunnel is 6.4 km in length and is placed in the second segment of the road scene. The tunnel scene design (one-way) is shown in Fig. 3”; Fig. 3, “Tunnel zone (3600m)”). Accordingly, it would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have combined Sun and Choi (directed to autonomous agent simulation) and Li (directed to tunnel entry and exit) and arrived at autonomous vehicle simulation of tunnel entry and exit.
One of ordinary skill in the art would have been motivated to make such a combination because “drastic changes of the space environment at the tunnel entrance can lead to frequent accidents with higher levels. The connected vehicle environment provides drivers with surrounding traffic information and improve their driving behavior by helping them make safe decisions efficiently. As such, this study is to examine the effects of the connected vehicle environment on driving behavior and safety at the tunnel entrance zone” and for “specific tunnel entrance scenarios, which provides a reference for realizing active protection of vehicles at the tunnel entrance” (Li: Abstract).

Additional References Cited
The prior art made of record and not relied upon is considered pertinent to applicant's disclosure and is cited in the attached PTOL-892.

Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to NITHYA J. MOLL, whose telephone number is (571) 270-1003. The examiner can normally be reached Monday-Friday, 10am-6pm EST. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Rehana Perveen, can be reached at 571-272-3676. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov.
Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/NITHYA J. MOLL/
Primary Examiner, Art Unit 2189

Application/Control Number: 18/106,122
Art Unit: 2189