DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/05/2025 has been entered.
Response to Arguments
Applicant's arguments filed on 11/05/2025 have been fully considered but they are not persuasive.
Generally, Examiner notes that the present claims are directed to a computer vision system. The computer vision algorithm and camera structures do not change based on how objects in the environment are named; the computer requires programs and data structures either way. Examiner suggests amending the claims with specific modifications to the computer vision system that improve its operation in the intended environment. The Specification provides several examples.
Applicant argues: “Applicant has further amended claim 1 by adding …”
Examiner notes that the newly amended language is addressed by the updated reasons for rejection below.
Examiner particularly notes that the claimed optical tracking system appears to be generic. The claims indicate a preferred location to use the system, a preferred object to be tracked by the system, and preferred names for regions used by the system, but there appears to be no modification to the structure or the function of the optical tracking system itself. For example, the newly amended limitation “a hazard region in the chamber around the machines, the hazard region within or wholly or partially beyond the maximum gripping range” appears to indicate that even the definition of the regions is separate from the optical tracking system and is instead tied to the properties of the environment.
Examiner suggests claiming particular ways in which the optical tracking system is physically modified or reprogrammed to track objects (gloves) in Applicant’s aseptic work chamber which would not be performed in another type of chamber. Is there a user interface or a data structure storing data that is particular to Applicant’s intended application?
Claim Construction
Note that, for purposes of compact prosecution, multiple reasons for rejection may be provided for a claim or a part of the claim. The rejection reasons are cumulative, and Applicant should review all the stated reasons as guides to improving the claim language and advancing the prosecution toward an allowance.
Claim scope is not limited by claim language that suggests or makes optional but does not require steps to be performed by a method claim, or by claim language that does not limit an apparatus claim to a particular structure. However, examples of claim language, although not exhaustive, that may raise a question as to the limiting effect of the language in a claim are: (A) “adapted to” or “adapted for” clauses; (B) “wherein” clauses; and (C) “whereby” clauses. M.P.E.P. 2111.04. Other examples are where the claim passively indicates that a function is performed or a structure is used without requiring that the function or structure is a limitation on the claim itself. The clause may be given some weight to the extent it provides “meaning and purpose” to the claimed invention but not when “it simply expresses the intended result” of the invention. Hoffer v. Microsoft Corp., 405 F.3d 1326, 1329, 74 USPQ2d 1481, 1483 (Fed. Cir. 2005). Further, during prosecution, claim language that may or may not be limiting should be considered non-limiting under the standard of the broadest reasonable interpretation. See M.P.E.P. 904.01(a); In re Morris, 127 F.3d 1048, 44 USPQ2d 1023 (Fed. Cir. 1997).
A claim containing a “recitation with respect to the manner in which a claimed apparatus is intended to be employed does not differentiate the claimed apparatus from a prior art apparatus” if the prior art apparatus teaches all the structural limitations of the claim. Ex parte Masham, 2 USPQ2d 1647 (Bd. Pat. App. & Inter. 1987).
Component arrangements or rearrangements which do not modify operation of the device cannot be relied upon to patentably distinguish the claimed invention from the prior art. In re Seid, 161 F.2d 229, 73 USPQ 431 (CCPA 1947); In re Japikse, 181 F.2d 1019, 86 USPQ 70 (CCPA 1950) (shifting the position of the starting switch was not patentable because it would not have modified the operation of the device).
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1-2, 6-7, 10, 14-18, 21-27 are rejected under 35 U.S.C. 103 as being unpatentable over US 20130076898 to Philippe (“Philippe”) in view of US 20160376044 to Procyshyn (“Procyshyn”), also cited in an IDS.
Regarding Claim 1: “An arrangement for monitoring a state and movement sequence in an aseptic work chamber of a containment located in an installation room, comprising: (Note that a preamble is generally not accorded any patentable weight where it merely recites the purpose of a process or the intended use of a structure, and where the body of the claim does not depend on the preamble for completeness but, instead, the process steps or structural limitations are able to stand alone. See In re Hirao, 535 F.2d 67, 190 USPQ 15 (CCPA 1976) and Kropa v. Robie, 187 F.2d 150, 152, 88 USPQ 478, 481 (CCPA 1951). Also note that arrangements or rearrangements which do not modify operation of the device cannot be relied upon to patentably distinguish the claimed invention from the prior art. In re Seid; In re Japikse. Also see treatment of prior art capability below.)
a work chamber; (For example, the chamber can be a storage compartment, the storage depot, or the room in front of the storage depot as in Philippe, Paragraphs 30 and 92. Cumulatively, “In one example, the chamber 20 includes one or more glove holes” Procyshyn, Paragraph 25. See statement of motivation below.)
machines installed in the work chamber; (Under the broadest reasonable interpretation consistent with the specification and ordinary skill in the art, machines are names of objects that can be recognized by the computer vision. See Specification, Page 5, lines 10-18. Prior art provides examples of such objects: “A non-exhaustive list of exemplary medical products could include medications, intravenous solutions, catheters, tubes, implants, pacemakers, gloves, needles, syringes, and so on.” Philippe, Paragraph 27. Cumulatively, “The glove holes 21 can be used to manually manipulate objects within the chamber 20” where some objects are machines: “In one example, a filling arm 40 is disposed within the chamber 20. The filling arm 40, in one example, is a robotic arm. The filling arm 40 includes filling tubing 42 extending from a pump unit 44 to a point at an end of the filling arm 40.” See Procyshyn, Paragraphs 25-26. See statement of motivation below.)
the work chamber having at least one work glove which projects into the work chamber, for extension in the work chamber as far as a maximum gripping range in the three spatial axes; (“A non-exhaustive list of exemplary medical products could include … gloves” that can be worn and used within any portion of the work chamber within reach. Philippe, Paragraph 27. Cumulatively, the object can be a hand wearing a work glove where “The movement and/or shape of the hand B1 can then be tracked” wherever the glove extends within view of the cameras as in Philippe, Paragraphs 62, 95 and similarly in Procyshyn, Paragraph 25. See detailed treatment of this embodiment below.)
a tracking system (“a medical product tracking system … Each imaging unit includes one or more cameras adapted to visually inspect storage compartments” Philippe, Paragraphs 30, 36.)
within the work chamber (“For example, with reference to the screenshot 200 shown in FIG. 7, the imaging unit 20 may observe an image 201 of the room in front of the storage depot 12”; in this case the room can embody the working chamber and the storage compartments embody the inspection areas. See Philippe, Paragraphs 92, 98.)
including a plurality of cameras for recording continuous three-dimensional localization of the at least one work [glove] within the maximum gripping range along x, y and z axes by the tracking system (“In some embodiments, two cameras (e.g. the cameras 22, 30 or other cameras) may be oriented in the same or a similar direction, which could assist with determining spatial depth [3D] by comparing the images between the two cameras. This may be particularly useful for determining hand signals as described below.” Philippe, Paragraphs 63, 98. Thus, cameras capture 3D for localization of the user’s hands wherever they reach within the room, and this computer vision process is substantively identical to tracking the user’s hands wearing gloves. See treatment of user’s hands that use gloves below.)
a computer unit for storing the output of cameras; and (“elements may be implemented in computer programs executing on programmable computers which may include at least one processor, at least one data storage device” Philippe, Paragraphs 25, 78.)
a hazard region in the chamber around the machines … a prohibited region around the hazard region;” (First, naming regions to be observed does not differentiate the claimed apparatus structures from a prior art apparatus if the prior art apparatus teaches all the structural limitations of the claim. Since the prior art defines an arrangement of a tracking system above and storage of the coordinates of the monitoring region below, it performs the claimed function.
Cumulatively, it is ordinary to define detection zones and regions around objects and locations of interest in order to monitor events and activities in those regions: “During use the imaging unit 20 can define one or more detection zones 56, 58 around one or more of the medical products 52 (e.g. the detection zone 56 is around the syringe in storage region 40a) and/or within one or more regions 40a, 40b, 42a, 42b (e.g. the detection zone 58 is around storage region 42a). The detection zones 56, 58 allow the imaging unit 20 to determine whether there is an object in the corresponding region 40a, 40b, 42a, 42b.” Philippe, Paragraph 71. Although Philippe does not name the regions in the same manner as the claim (i.e. prohibited, hazard), before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to designate various regions in the area under optical inspection for detecting particular objects, motions, and events (objects + motions) that trigger computer vision detection and actions. See Philippe, Paragraphs 69-70 and 40-42.
Finally, it is well understood that some chemicals and equipment in the clean room may be hazardous; they may be designated so by markings or barriers and understood to be subject to limited human contact. Procyshyn, Paragraph 2. See statement of motivation below. Note that specific applications with respect to named regions and their designated properties are addressed in the dependent claims below.)
the hazard region within or wholly or partially beyond the maximum gripping range; (This element appears to indicate that there is no required relationship between the definition of the hazard region and the maximum gripping range. As noted above, the prior art tracks hand movements wherever hands may reach within the tracked region, as in Philippe, Paragraphs 63, 98. Also as noted above, various regions can be defined and named for tracking objects including gloves and hands in areas of interest or concern, as in Philippe, Paragraphs 69-71 and 40-42. So both elements naturally exist in the prior art and can be covered by a camera tracking system.)
if the at least one work [glove] penetrates the prohibited region.” (“The detection zones 56, 58 allow the imaging unit 20 to determine whether there is an object in the corresponding region 40a, 40b, 42a, 42b.” Philippe, Paragraph 71. The detected object can be a medical product such as a glove in Philippe, Paragraph 27, or a person or a person’s hand, see Philippe, Paragraph 62 and Claim 1. It is implicit or obvious that the detected hands can be wearing gloves in the sterile environment of Philippe and as noted below.)
alarmed region coinciding with the prohibited region to issue a signal (“In some embodiments, other techniques may be used to indicate states or communicate information to a user. For example, a speaker could be used to generate audible alerts (e.g. beeps) or express a recorded voice. … the wrong medical products have been detected in a particular storage compartment 14).” Philippe, Paragraphs 60, 69. As noted above, a state can be a presence of a person or a hand in a region, and a wrong medical product in the region can be a glove. As described in Philippe, the computer vision tracking system and method do not change by selecting a region using the region selection feature, by selecting an object (glove) to track out of known objects that can be tracked, or by placing the tracking system in a different chamber or room where the same process can be performed. Such substitutions appear to be obvious.)
Philippe does not teach the claim elements below. Philippe is relevant for teaching a tracking system that monitors compartments in a room, which operates in the same manner as the claimed tracking system. However, Philippe does not provide an example of a room with the claimed objects.
Procyshyn teaches using optical monitoring in the context of a substantively similar room and equipment:
arrangement for monitoring … in an aseptic work chamber of a containment located in an installation room, comprising: (First, as noted above, a recitation with respect to the manner in which a claimed apparatus is intended to be employed or arranged does not differentiate the claimed apparatus from a prior art apparatus where the prior art apparatus teaches all the structural limitations of the claim.
Cumulatively, under the broadest reasonable interpretation consistent with the specification and ordinary skill in the art, an aseptic work chamber can be a clean room that is cleaned or sterilized. See Specification, Page 5, lines 17-18. The cleanroom in Procyshyn is similarly cleaned and sterilized: “Articulated cleanroom robots have been employed which utilize internal negative pressure with an exhaust to generate cleanroom capability. With the chemical sterilization and handling of potent drugs within the isolator [chamber]” Procyshyn, Paragraph 4. Thus, the camera tracking system as in the prior art is known to be employed in many varieties of locations including locations that are similar in function and purpose to the intended location of the claims.)
at least one work glove [which projects into the work chamber, wherein the respective work glove can be extended in the work chamber as far as a maximum gripping range in the three spatial axes, and wherein:] (“In one example, the chamber 20 includes one or more glove holes 21 disposed in the walls of the chamber 20. The glove holes 21 can be used to manually manipulate objects within the chamber 20 without opening the chamber 20 or otherwise compromising the environmental conditions within the chamber 20.” Procyshyn, Paragraph 25. It is understood that such a work glove naturally has a maximum gripping range within the chamber such that it operates “without opening the chamber 20 or otherwise compromising the environmental conditions within the chamber.”)
Finally, Procyshyn also provides the context of optical monitoring: “the sensor 12 is an optical sensor, a camera system, or a laser system. The sensor 12, in one example, is mounted at a top surface of the chamber 20 and is positioned to sense an area within the chamber 20, as portrayed by a sensing cone 14. For instance, the optical sensor 12 can be configured to locate containers 90 within the sensing cone 14 and target centers of the openings of the containers 90.” Procyshyn, Paragraph 27.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to use an optical monitoring and computer vision tracking system that tracks objects in the clean hospital setting of Philippe to perform optical tracking and computer vision in a containment room such as a cleanroom with chambers accessible by a glove, as taught in Procyshyn, in order to locate and monitor objects of interest in the medication filling environment of Procyshyn as well as in the medication storage environment of Philippe. See Procyshyn, Paragraph 27.
Finally, in reviewing the present application, there does not seem to be objective evidence that the claim limitations are particularly directed to: addressing a particular problem which was recognized but unsolved in the art, producing unexpected results at the level of the ordinary skill in the art, or any other objective indicators of non-obviousness.
Regarding Claim 2: “The arrangement as claimed in claim 1, characterized in that wherein:
a) the at least one work glove comprises two work gloves and each work glove may respectively hold an object; and (Prior art provides for multiple glove holes: “glove holes 21 can be used to manually manipulate objects within the chamber 20 without opening the chamber 20 or otherwise compromising the environmental conditions within the chamber 20.” See Procyshyn, Paragraph 25, Fig. 1, and statement of motivation in Claim 1.)
b) recordings of the tracking system are used for continuous three-dimensional localization both work gloves.” (Note that this claim is directed to a physical arrangement of components; it is not a method claim and does not affirmatively perform a step of “performing three-dimensional localization.” To the extent this claim element recites an intended use of the arrangement of objects in a room for the purposes of localizing the objects based on a visual recording, Philippe indicates that the tracking system locates the objects (such as gloves) in the room by their 3D location in the room and with respect to other objects in the room in Paragraphs 27-32. Therefore, this is a known use of a tracking system in a room of objects.)
Regarding Claim 6: “The arrangement as claimed in claim 1, wherein the tracking system comprises: … a) a first camera, which has a first acquisition angle; and … b) a second camera, which has a second acquisition angle; wherein: … ca) the first camera monitors and records two spatial axes and the second camera monitors and records the third spatial axis; … cb) the first acquisition angle and the second acquisition angle overlap so that all regions in the work chamber are monitored and recorded; and … cc) the recorded states and movement sequences are stored as 3D data in the computer unit.” (“In some embodiments, two cameras (e.g. the cameras 22, 30 or other cameras) may be oriented in the same or a similar direction, which could assist with determining spatial depth by comparing the images between the two cameras.” Thus, the first camera captures 2D and the addition of the second camera enables capturing 3D. See Philippe, Paragraph 63 and imaging the entire compartment in Fig. 16. Cumulatively, “In such embodiments the imaging units 20 may send raw image data 112 (e.g. a raw video stream) to an image processing server 110” See Philippe, Paragraph 84.)
Regarding Claim 7: “The arrangement as claimed in claim 6, further comprising
a third camera with a third acquisition angle wherein: (“Each imaging unit includes one or more cameras adapted to visually inspect storage compartments (e.g. drawers in a cabinet, bins on open shelves, etc.) and determine the presence or absence of medical products therein … one or more of the imaging units 20 may communicate with the server 102” where more cameras may be naturally required for larger compartments, and more imaging units may be required for a larger number of compartments. See Philippe, Paragraphs 30, 77.)
a) the first camera with its first acquisition angle monitors and records the first spatial axis, the second camera with its second acquisition angle monitors and records the second spatial axis and the third camera with its third acquisition angle monitors and records the third spatial axis; … b) the acquisition angles overlap so that all regions in the work chamber are recorded; and … c) the recorded states and movement sequences are stored as 3D data in the computer unit.” (As noted in Claim 6: “In some embodiments, two cameras (e.g. the cameras 22, 30 or other cameras) may be oriented in the same or a similar direction, which could assist with determining spatial depth by comparing the images between the two cameras.” Thus, the first camera captures 2D and the addition of the second camera enables capturing 3D. See Philippe, Paragraph 63 and imaging the entire compartment in Fig. 16. Cumulatively, “In such embodiments the imaging units 20 may send raw image data 112 (e.g. a raw video stream) to an image processing server 110” See Philippe, Paragraph 84. See addition of further cameras with further fields of view above.)
Regarding Claim 10: “The arrangement as claimed in claim 9, characterized in that wherein the tracking system further comprises biometric detection apparatus to detect the biometric features of an operator positioned at the work chamber.” (“For example, at least one of the first and second cameras 22, 30 may be used to capture facial features [biometric features] and/or gestures (such as facial gestures or hand signals) to communicate additional information to the imaging unit 20. This may be particularly useful for associating patient information with particular medical products, which can be important for accurate patient charging, for facial recognition and/or for access control.” Philippe, Paragraph 87.)
Regarding Claim 14: “The arrangement of claim 1, wherein the prohibited region comprises a three-dimensional region.” (For example, the 3D monitored region (designated for some objects in the video and prohibited to others) can be a storage compartment. See Philippe, Paragraphs 62, 69-71.)
Regarding Claim 15: “The arrangement of claim 1 wherein the prohibited region comprises a two-dimensional region.” (For example, the 2D monitored region (designated for some objects in the video and prohibited to others) can be a designated storage region in a particular storage compartment, for example: “the various storage regions 40a, 40b, 42a, 42b in a drawer 14a can be defined by one or more dividers 44, 46.” See Philippe, Paragraphs 66, 69-71, and Fig. 3.)
Regarding Claim 16: “The arrangement of claim 1 wherein a warning region adjoins the prohibited region.” (For example, regions designated for certain objects can border regions prohibited for those objects: “the various storage regions 40a, 40b, 42a, 42b in a drawer 14a can be defined by one or more dividers 44, 46. … a first type of medical product 52 (e.g. a syringe) is located within each of the storage regions 40a, 40b, a second type of medical product 54 (e.g. a box of needles) is located in the secondary storage region 42b,” See Philippe, Paragraphs 66, 70, and Fig. 3.)
Claim 17 is rejected for reasons stated for Claim 16 in view of the Claim 14 rejection.
Claim 18 is rejected for reasons stated for Claim 16 in view of the Claim 15 rejection.
Regarding Claim 21: “The arrangement of claim 16 wherein the signal unit alerts of unallowed penetration into a warning region.” (“For example, a speaker could be used to generate audible alerts (e.g. beeps) or express a recorded voice,” in response to the wrong object being present in the wrong region or in the wrong part of the monitored [designated] region. Philippe, Paragraphs 60, 69-72. Note that while Philippe does not name its regions prohibited or warning, it does designate regions by number and by content of intended/unintended objects. See Philippe, Paragraphs 66-72.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to issue an alert of a prohibited event (such as an unallowed penetration of a particular object into a particular region in the room) when such an event is detected.)
Regarding Claim 22: “The arrangement of claim 1 wherein the signal unit alerts with an acoustic signal.” (“For example, a speaker could be used to generate audible alerts” Philippe, Paragraph 60.)
Regarding Claim 23: “The arrangement of claim 1 wherein the signal unit alerts with a vibration signal.” (“For example, a speaker could be used to generate audible alerts” which is a vibration signal. Philippe, Paragraph 60.)
Regarding Claim 24: “The arrangement of claim 1, wherein individual surface sections are a prohibited region.” (Note that objects can be within defined detection zones (such as a prohibited region) that can be within defined regions (such as a hazard region): “During use the imaging unit 20 can define one or more detection zones 56, 58 around one or more of the medical products 52 (e.g. the detection zone 56 is around the syringe in storage region 40a) and/or within one or more regions 40a, 40b, 42a, 42b (e.g. the detection zone 58 is around storage region 42a). The detection zones 56, 58 allow the imaging unit 20 to determine whether there is an object in the corresponding region 40a, 40b, 42a, 42b.” See Philippe, Paragraph 71. See obviousness of naming regions in Claim 1.)
Regarding Claim 25: “The arrangement of claim 24, wherein the coordinates of the prohibited region are stored in a computer unit.” (“In some embodiments, the imaging unit 20 may [be] adapted to learn the particular layout of each particular storage compartment 14 (e.g. the size and shape of the regions 40a, 40b, 42a, 42b may be observed and stored in a database” Philippe, Paragraph 68.)
Regarding Claim 26: “The arrangement of claim 1, wherein the entire bottom is a hazard.” (Note that objects can be within defined detection zones around one or more products (such as a prohibited region) that can be within defined regions (such as a hazard region): “During use the imaging unit 20 can define one or more detection zones 56, 58 around one or more of the medical products 52 (e.g. the detection zone 56 is around the syringe in storage region 40a) and/or within one or more regions 40a, 40b, 42a, 42b (e.g. the detection zone 58 is around storage region 42a). The detection zones 56, 58 allow the imaging unit 20 to determine whether there is an object in the corresponding region 40a, 40b, 42a, 42b.” See Philippe, Paragraph 71. See obviousness of naming regions in Claim 1.)
Regarding Claim 27: “The arrangement of claim 26, wherein coordinates of the hazard are stored in a computer unit.” (“It will be appreciated that the detection region 24 may vary in size and shape, particularly to accommodate different storage compartments 14. … In some embodiments, the imaging unit 20 may [be] adapted to learn the particular layout of each particular storage compartment 14 (e.g. the size and shape of the regions 40a, 40b, 42a, 42b may be observed and stored in a database” Philippe, Paragraphs 47, 68.)
Claims 3-5, 12-13, 19-20 are rejected under 35 U.S.C. 103 as being unpatentable over US 20130076898 to Philippe (“Philippe”) in view of US 20160376044 to Procyshyn (“Procyshyn”), also cited in an IDS, and in view of US 20200368616 to Delamont (“Delamont”).
Regarding Claim 3: “The arrangement as claimed in at least one of claims 1 and 2, characterized in that claim 2, wherein:
a) the spatial positions of work gloves and objects within the xyz coordinate system formed by the three spatial axes are recorded with [a timestamp] in the computer unit; and (“In some embodiments, two cameras (e.g. the cameras 22, 30 or other cameras) may be oriented in the same or a similar direction, which could assist with determining spatial depth by comparing the images between the two cameras. This may be particularly useful for determining hand signals as described below.” Philippe, Paragraph 63. Thus, cameras capture 3D for localization of the user’s hands. See treatment of user’s hands that use gloves in Claim 1.)
b) a real-time comparison with the prohibited regions is established in the computer unit.” (“The detection zones 56, 58 allow the imaging unit 20 to determine [comparison] whether there is an object in the corresponding region 40a, 40b, 42a, 42b.” Philippe, Paragraph 71. “For example, if the user A removes a particular medical product, this information can be sent to the server 102 (in some embodiments in real time or substantially real time, … The image processing server 110 can then analyze the raw image data 112” Philippe, Paragraphs 82, 84. Thus, the regions can be designated for certain objects and prohibited for others.)
Philippe and Procyshyn do not teach “recorded with a timestamp”; however, video is commonly recorded using timestamps in order to track playback and join different streams of video:
Delamont teaches the above claim feature in the context of tracking user movements and synchronizing captured, tracked, and virtual images: “allows adjustments to be made in real-time in the 3D projection mapping of augmented images over the real-world AI Character 94. Here the game server 88 or host 89 may know in advance of the move and be provided with timestamps for the coordinates, where movements are a result of a rag doll simulation for example, in which in precise time sequence with the movements of the AI Characters 94 body adjustments may be made” Delamont, Paragraph 1065.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to supplement the teachings of Philippe and Procyshyn to provide video with timestamps as taught in Delamont, in order to provide a “precise time sequence with the movements” that are being tracked. See Delamont, Paragraph 1065.
Finally, in reviewing the present application, there does not seem to be objective evidence that the claim limitations are particularly directed to: addressing a particular problem which was recognized but unsolved in the art, producing unexpected results at the level of the ordinary skill in the art, or any other objective indicators of non-obviousness.
Regarding Claim 4: “The arrangement as claimed in claim 1, wherein:
a) containers are temporarily located in the work chamber; and (“recognizing and properly interpreting user activities, such as a user approaching a storage compartment, opening a drawer, removing a medical product from the storage compartment, putting an item in the storage compartment (e.g. restocking a medical product), or changing the position of a medical product” Phillipe, Paragraph 39. See similarly in Procyshyn, Paragraph 7 and statement of motivation in Claim 1.)
b) the spatial positions of the containers are recorded in three dimensions [with a timestamp] in the computer unit.” (“The detection zones 56, 58 allow the imaging unit 20 to determine whether there is an object in the corresponding region 40a, 40b, 42a, 42b.” Phillipe, Paragraph 71. “A non-exhaustive list of exemplary medical products could include medications, intravenous solutions, catheters, tubes, implants, pacemakers, gloves, needles, syringes, and so on,” which include examples of containers. Phillipe, Paragraph 27. See recording of 3D video with determination of spatial depth in Claim 1.)
“recorded with a timestamp” (Delamont teaches the above claim feature in the context of tracking user movements and synchronizing captured, tracked, and virtual images: “allows adjustments to be made in real-time in the 3D projection mapping of augmented images over the real-world AI Character 94. Here the game server 88 or host 89 may know in advance of the move and be provided with timestamps for the coordinates, where movements are a result of a rag doll simulation for example, in which in precise time sequence with the movements of the AI Characters 94 body adjustments may be made” Delamont, Paragraph 1065. See statement of motivation in Claim 3.)
Regarding Claim 5: “The arrangement as claimed in claim 1, further comprising [virtual reality glasses] a) to display the prohibited regions and optional warning regions defined in the computer unit as graphics elements; and b) to display a surface to be cleaned according to the cleaning and disinfection program saved in the computer unit.” (Phillipe uses a display to guide the user to proper zones (and thus away from improper zones): “In some embodiments, the imaging unit 20 may inform the user of the proper storage compartment in other ways, which may be audible (e.g. a wired or wireless speaker), visual (e.g. using a display such as an LCD display,” for example: “medical products from the corresponding secondary storage region 42b [prohibited or warning regions] can be rotated or moved into the primary storage region 42a.” Phillipe, Paragraphs 103-104.)
However, Phillipe and Procyshyn do not teach to display using: “virtual reality glasses.”
Delamont teaches the above claim feature in the context of tracking user movements and synchronizing captured, tracked, and virtual images: “Alternatively, virtual retinal display (VRD) may be used. For example, low cost alternative liquid crystal display (LCD) shutter 3D glasses or wearable display could be used to display augmented scenes” Delamont, Paragraph 419. See statement of motivation in Claim 3.)
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to supplement the teachings of Phillipe and Procyshyn to use virtual reality glasses as taught in Delamont, “for presenting images that create the illusion of an image having three dimensional form.” Delamont, Paragraph 420.
Finally, in reviewing the present application, there does not seem to be objective evidence that the claim limitations are particularly directed to: addressing a particular problem which was recognized but unsolved in the art, producing unexpected results at the level of the ordinary skill in the art, or any other objective indicators of non-obviousness.
Regarding Claim 12: “The arrangement as claimed in at least one of claims 1 to 11, characterized in that claim 1, wherein:
a) the at least one work glove of the work chamber includes systematically distributed marking points; and (“the real-world 3D mesh is formed of multiple surface display panel faces 57L, 57R, 57 overlaid on the physical IR Laser Gun Apparatus 47 device and real-world game object in which the object/device is formed of real world vertices/points, edges, faces, polygons and surfaces [in other words, marking points] in three-dimensional space of the real-world” Delamont, Paragraphs 568, 574. See statement of motivation in Claim 5.)
b) the volume body formed by the work glove is recorded as a CAD wire model by the tracking system.” (“Other usages of the devices cameras 50 include the generating of 3D mesh data, wireframes and 3D models containing geometric, depth and volumetric data of the real-world objects, surfaces and surrounds using spatial mapping techniques, which is used in the display of augmented images via the users Augmented Reality ("AR") display” Delamont, Paragraph 513. See statement of motivation in Claim 5.)
Claim 13 is rejected for reasons stated for Claim 12 in view of the Claim 2 rejection.
Regarding Claim 19: “The arrangement of claim 6 wherein the 3D data comprises video streams.” (Philippe teaches generating stereoscopic 3D but does not discuss streaming this information. See Philippe, Paragraph 84.
Delamont teaches “data stream sources of differing 2D stereoscopic images [3D data] or video to be displayed in front of each of the user's eye's respectively to achieve the display of a virtual image which appears to the user as a 3D formed image” Delamont, Paragraph 1065. See statement of motivation in Claim 3.)
Claim 20 is rejected for reasons stated for Claim 19 in view of the Claim 7 rejection.
Claim 11 is rejected under 35 U.S.C. 103 as being unpatentable over US 20130076898 to Philippe (“Philippe”) in view of US 20160376044 to Procyshyn (“Procyshyn”), also cited in an IDS, and in view of US 20200368616 to Delamont (“Delamont”) and further in view of US 20150077528 to Awdeh (“Awdeh”).
Regarding Claim 11: “The arrangement as claimed in claim 5, characterized in that wherein the virtual reality glasses display,
a prescribed intensity for the surface cleaning.”
As noted in Claim 5, Phillipe uses a display to guide the user to perform a desired action such as rearranging, ordering, and orienting stored objects, which is a form of cleaning. Phillipe, Paragraphs 103-104.
However, Philippe, Procyshyn, and Delamont are not concerned with cleaning surfaces at prescribed intensities, and so do not provide guidance for these actions specifically.
Awdeh teaches the above claim feature (cleaning surfaces at specified intensities) in the context of “medical visualization systems” similar to visual systems in Philippe, Procyshyn: “Although not shown, one skilled in the art will also appreciate that system 10 may be adopted to assist the surgeon to perform various other procedures known in the art. For example, after the sculpt procedure, eyewear device 7 may display any one or more of the following messages: … [first intensity] a text message "QUADRANT REMOVAL" prompting the surgeon to remove large chunks of the cataract using the vacuum instrument; … [second intensity] a text message "EPINUCLEUS" prompting the surgeon to remove remaining epinucleus layer of the lens; … [third intensity] a text message "IRRIGATION & ASPIRATION" prompting the surgeon to clean and remove any leftover material more thoroughly using smaller instruments with lower vacuum pressure; … [fourth intensity] a text message "POLISH" prompting the surgeon to polish the lens capsule;” Awdeh, Paragraph 85.
Therefore, before the effective filing date of the claimed invention, it would have been obvious to one of ordinary skill in the art to supplement the teachings of Philippe, Procyshyn, and Delamont to clean surfaces at a prescribed intensity as taught in Awdeh, so “that system 10 may be adopted to assist the surgeon to perform various other procedures known in the art.” Awdeh, Paragraph 85.
Finally, in reviewing the present application, there does not seem to be objective evidence that the claim limitations are particularly directed to: addressing a particular problem which was recognized but unsolved in the art, producing unexpected results at the level of the ordinary skill in the art, or any other objective indicators of non-obviousness.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MIKHAIL ITSKOVICH whose telephone number is (571)270-7940. The examiner can normally be reached Mon. - Thu. 9am - 8pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Joseph Ustaris can be reached at (571)272-7383. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MIKHAIL ITSKOVICH/Primary Examiner, Art Unit 2483