DETAILED ACTION
Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Continued Examination Under 37 CFR 1.114
A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 12/01/2025 has been entered.
Priority
Acknowledgment is made of applicant’s claim for foreign priority under 35 U.S.C. 119(a)-(d). The certified copy has been filed in parent Application No. JP2021045514, filed on 03/19/2021.
Response to Amendment
The amendment filed on 12/01/2025, in response to the Final Office Action dated 08/29/2025, has been received and made of record. Claims 1 and 9 are amended, and Claims 1 and 5-9 are pending in the current application.
Response to Arguments
Applicant’s arguments filed on 12/01/2025 have been fully considered.
In the Arguments/Remarks:
Re: Rejection of the Claims Under 35 U.S.C. 103
Applicant’s arguments regarding claims 1 and 5-9, rejected under 35 U.S.C. 103, have been fully considered but are directed toward the newly amended limitations not previously examined by the examiner. Applicant’s arguments are moot in view of the new grounds of rejection (see below) necessitated by applicant’s amendments.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.
Claims 1, 5-7 and 9 are rejected under 35 U.S.C. 103 as being unpatentable over Peh (US 2018/0323095 A1) in view of Goto (US 2019/0344446 A1) and further in view of Krupyshev (US 2019/0237351 A1).
Regarding claim 1, Peh teaches a horizontally articulated substrate transfer robot that transfers a plurality of substrates comprising: an arm; [(see at least Fig.2, paragraph 28) “Alternatively, or in combination with the actuator 118, the motion assembly 106 may include a robotic arm. FIG. 2 depicts an exemplary robotic arm 210.”] a hand attached to the arm that supports and transfers each substrate; [(see at least Fig.2, paragraph 28) “The robotic arm 210 includes a rear end 214 and a blade 216. The rear end 214 includes a mounting surface 218. A mounting fixture 220 supports the high resolution camera 102 atop the mounting surface 218. The mounting fixture 220 may be any suitable holder such as a bracket, or a clamp, or the like. The blade 216 can have an edge gripper 222 or other suitable mechanism to secure substrates thereto during transfer (e.g. during transfer of substrate 208 into and out of substrate storage cassette 202, discussed below).”] a single camera attached to the hand that captures images of at least some of the plurality of substrates placed at a take-out position from a plurality of viewpoints to acquire images of the at least some of the plurality of substrates; [(see at least Fig.2, paragraph 28) “A mounting fixture 220 supports the high resolution camera 102 atop the mounting surface 218. The mounting fixture 220 may be any suitable holder such as a bracket, or a clamp, or the like.”]
Peh teaches wherein the substrates are accommodated in an accommodating body [(see at least Fig.2, paragraph 26) “The substrate storage cassette 202 (e.g., FOUP) has a bottom 203, a front opening 204, and a top 205. The substrate storage cassette 202 further comprises an array of spaced apart slots 206. Each slot 206 is configured to receive and support a substrate 208. As illustrated in the exemplary embodiment of FIG. 2, a total of N vertically arranged slots 206 may be labeled as slots 206-1 to 206-N, from the bottom 203 to the top 205,”]
Peh does not explicitly teach a calculator that calculates three-dimensional information of each substrate based on the images acquired by the camera; and a motion controller that moves the hand to take out the substrate based on the three-dimensional information of the substrate calculated by the calculator; wherein the motion controller moves the hand to a first shooting position to obtain a first image by the camera, the first shooting position being located at a first height relative to the accommodating body, and then moves the hand to a second shooting position to obtain a second image by the camera, the second shooting position being located at the same first height, and the calculator calculates three-dimensional information of all the substrates accommodated in the accommodating body based on first and second images obtained by the camera.
However, Goto teaches a calculator that calculates three-dimensional information of each substrate based on the images acquired by the camera; [(see at least paragraphs 7,66) As in 7 “a hand attached to a tip end of the robot arm; a camera fixed and attached to a portion of the hand other than a substrate placing portion of the hand so as to take an image of the substrate placing portion, the substrate being placed at the substrate placing portion; an image data acquirer configured to acquire image data taken by the camera, the image data including a teaching substrate and the substrate placing portion of the hand, the teaching substrate being arranged as a teaching target at the substrate target position” As in 66 “a plurality of cameras 3 may be arranged, or a three-dimensional camera may be used. The three-dimensional camera simultaneously takes images of a target object from different directions to generate parallaxes, thereby acquiring the distance information to the target object. The three-dimensional camera can generate coordinate data of a surface of the target object in a predetermined coordinate system. The three-dimensional camera is a type of stereo camera. 
The three-dimensional camera includes a pair of cameras arranged away from each other by a predetermined distance, and these cameras include respective image pickup elements.”] and a motion controller that moves the hand to take out each substrate based on the three-dimensional information of each substrate calculated by the calculator [(see at least paragraphs 54-62) As in 61 “only by taking the image of the space S including the teaching substrate W at the target position T and the blade 2b of the hand 2 by the camera 3, the distance information from the blade 2b to the teaching substrate W can be calculated based on the camera image, and the operation of the robot 1 can be controlled based on the distance information such that the virtual substrate VW virtually arranged at the blade 2b coincide with the teaching substrate W. By storing the position of the hand 2 at this time as the teaching data, the position of the hand 2 corresponding to the target position T of the substrate W can be taught to the robot 1. With this, the correct position can be accurately taught to the robot 1 without depending on the skill of the operator.”]
Goto teaches the calculator calculates three-dimensional information of all the substrates accommodated in the accommodating body based on the first and second images obtained by the camera. [(see at least paragraph 66) “a plurality of cameras 3 may be arranged, or a three-dimensional camera may be used. The three-dimensional camera simultaneously takes images of a target object from different directions to generate parallaxes, thereby acquiring the distance information to the target object. The three-dimensional camera can generate coordinate data of a surface of the target object in a predetermined coordinate system. The three-dimensional camera is a type of stereo camera. The three-dimensional camera includes a pair of cameras arranged away from each other by a predetermined distance, and these cameras include respective image pickup elements. The three-dimensional camera may acquire not only the distance information but also color information (RGB, etc.) of the target object. Further, the three-dimensional camera may emit a laser and acquire the distance information to a reflection point from a reflection position and a reflection time.”]
Krupyshev teaches wherein the motion controller moves the hand to a first shooting position to obtain a first image by the camera, the first shooting position being located at a first height relative to the accommodating body, and then moves the hand to a second shooting position to obtain a second image by the camera, the second shooting position being located at the same first height [(see at least Figs.10-12, paragraph 60) “while the first image of the part 580 of the robot arm 210A is shown in dashed lines). For example, the calibration image 590 includes an end effector 1000 in the predetermined repeatable position 650, 600 before placement into the substrate station modules 130. As may be realized and shown in FIG. 7, a predetermined repeatable position (or more than one) 650′ may be located further offset in the direction of extension (R, θ) from the predetermined repeatable retracted position 650, 600 so as to provide a series, or at least a pair (650, 600) of predetermined repeatable positions (650′, 600). As may be realized, the first image 570 may be generated with the arm a position 600/650. A second image from the series is generated with the arm at position 650′/650 and so on. The first, second, and each other image in the series generated with the arm in a different predetermined repeatable position is compared to, a corresponding calibrated image with the arm in the predetermined position. During operation of the at least one robot arm 210A, 211A in the transport chamber 125B′, and as the processing temperature of the substrate processing equipment changes, radial transitions of the robot arm 210A, 211A may drift (e.g., the position of the imaged end effector and thus point 1010 as well as the center point 1000WC that has a fixed relation to point 1010 will vary from the position in calibration and as defined in the calibration image registered by the controller 110). 
As such it is possible to measure the resultant thermal effects and/or the other variabilities by comparing the position data of the calibration image 590 to their relative values in at least the first image 570 for the series of predetermined repeatable positions 650, 650′, 600.”] Examiner notes that Krupyshev does not disclose any relative height information with regard to the first and second images; Krupyshev appears unconcerned with the relative height of the first and second images. Examiner further notes that the first and second shooting positions being at the first height does not appear to be advantageous or essential to applicant’s claimed invention. Applicant’s specification, paragraph 32, discloses: “In the present embodiment, the distance from the container 20 to the first shooting position and the distance from the container 20 to the second shooting position are the same, but may be different. The height of the second shooting position is the height at which the camera 15 is located below the midpoint of the container 20 in a height direction, as shown in FIG. 5. In the present embodiment, the height of the second shooting position is the same as the height of the first embodiment, but may be different.” The height being the same for the first and second shooting positions does not appear to solve any stated problem or serve any particular purpose, and it appears the invention would perform equally well with the heights being different. See MPEP 2143.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Peh to incorporate the teachings of Goto of a calculator that calculates three-dimensional information of each substrate based on the images acquired by the camera, in order to calculate distance information from the substrate placing portion to the teaching substrate based on the image data of the camera [(Goto 10)]; of a motion controller that moves the hand to take out each substrate based on the three-dimensional information of each substrate calculated by the calculator, in order for the correct position of the robotic arm to be accurately taught/sent to the robot without depending on the skill of the operator [(Goto 61)]; and of the calculator calculating three-dimensional information of all the substrates accommodated in the accommodating body based on the first and second images obtained by the camera, in order to acquire the distance information to a reflection point from a reflection position and a reflection time [(Goto 66)]; and to incorporate the teachings of Krupyshev of the motion controller moving the hand to a first shooting position to obtain a first image by the camera, the first shooting position being located at a first height relative to the accommodating body, and then moving the hand to a second shooting position to obtain a second image by the camera, the second shooting position being located at the same first height, in order to use the images for comparison to the calibration to identify positional variance. [(Krupyshev 64)]
Regarding claim 5, in view of the above combination of references, Peh further teaches wherein the camera is disposed on a top surface of the hand, and the first height is a position lower than the center of the accommodating body in a height direction. [(see at least paragraph 30) “In operation in accordance with the embodiment depicted in FIG. 2, the prior processing the substrates 208, the high resolution camera 102 is, for example, lowered to the bottom of the substrate storage cassette 202. Starting with the lowermost slot 206 (e.g. slot 206-1), the high resolution camera 102 performs a continuous substrate presence scan from the bottommost slot 206 (e.g. slot 206-1) to the topmost slot (e.g. slot 206-25). When the topmost slot (e.g. slot 206-25) has been scanned, the camera comes to a stop. The substrate presence scan is performed to verify if a substrate 208 is present in each slot 206. The high resolution camera 102 relays the findings of the substrate presence scan to the DAI 108. The substrate presence scan can also be performed in any other suitable order, such as top to bottom, or the like.”]
Regarding claim 6, Modified Peh has all of the elements of claim 1 as discussed above.
Peh does not explicitly teach wherein the motion controller moves the hand to align a reference position of each substrate with a reference position of the hand to take out each substrate.
However, Goto teaches wherein the motion controller moves the hand to align a reference position of each substrate with a reference position of the hand to take out each substrate. [(see at least paragraph 63) “ the storage unit 72 stores in advance as the operation program the rough route to the thirteenth shelf of the FOUP 101 at which a teaching substrate W13 at the target position T is placed, and the control command generator 76 corrects the position command values of the servo motors of the driving portions 15, 25, 35, and 19. However, the position command values of the servo motors of the driving portions 15, 25, 35, and 19 may be generated based on the distance information from the blade 2b of the hand 2 to the teaching substrate W such that the virtual substrate VW coincides with the teaching substrate W.”]
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of modified Peh to incorporate the teachings of Goto of the motion controller moving the hand to align a reference position of each substrate with a reference position of the hand to take out each substrate, in order for the correct position of the robotic arm to be accurately taught/sent to the robot without depending on the skill of the operator. [(Goto 2)]
Regarding claim 7, Modified Peh has all of the elements of claim 1 as discussed above.
Peh does not explicitly teach wherein the motion controller moves the hand to take out each substrate based on a three-dimensional position and a three-dimensional shape of each substrate calculated by the calculator.
However, Goto teaches wherein the motion controller moves the hand to take out each substrate based on a three-dimensional position and a three-dimensional shape of each substrate calculated by the calculator. [(see at least Fig.7, paragraphs 58-60) As in 58 “As shown in FIG. 7, at the initial position, the virtual substrate VW is virtually arranged at the blade 2b. At this time, based on the camera image taken by the camera 3, the distance information calculator 74 calculates the distance information from the blade 2b (substrate placing portion) of the hand 2 to the teaching substrate W. In the present embodiment, the distance information calculator 74 calculates the distance information from the blade 2b to the teaching substrate W by pattern matching between the image of the virtual substrate VW and the image of the teaching substrate W.” As in 59 “Next, based on the distance information from the blade 2b to the teaching substrate W, the control unit 70 controls the operation of the robot arm 4 such that the virtual substrate VW coincides with the teaching substrate W (Step S4 in FIG. 5). It should be noted that as the operation program, the storage unit 72 stores in advance a rough route to the thirteenth shelf of the FOUP 101. The control command generator 76 generates the position command values of the servo motors of the driving portions 15, 25, 35, and 19 in accordance with the operation program. To be specific, since the control unit 70 can recognize the distance information from the blade 2b to the teaching substrate W in a robot coordinate system, the control unit 70 corrects the position command values of the servo motors of the driving portions 15, 25, 35, and 19 such that the coordinates of the surfaces of these substrates coincide with each other. FIG. 8 is a third example of the image taken by the camera 3. The servo control unit 71 controls the driving portions 15, 25, 35, and 19 based on the supplied control commands to operate the robot arm 4. 
With this, as shown in FIG. 8, the teaching substrate W13 and the virtual substrate VW can coincide with each other”]
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of modified Peh to incorporate the teachings of Goto of the motion controller moving the hand to take out each substrate based on a three-dimensional position and a three-dimensional shape of each substrate calculated by the calculator, in order for the correct position of the robotic arm to be accurately taught/sent to the robot without depending on the skill of the operator. [(Goto 2)]
Regarding claim 9, Peh teaches a substrate take-out method of taking out a plurality of substrates placed at a take-out position using a horizontally articulated robot comprising: a photographing process in which a single camera attached to a hand included in the robot is used to capture images of at least some of the plurality of substrates placed at the take-out position from a plurality of viewpoints to acquire images of the at least some of the plurality of substrates [(see at least Fig.2, paragraphs 28-33) As in 31 “Subsequently and optionally, for example starting with a first slot—such as the topmost slot 206 (e.g. slot 206-25)—the high resolution camera 102 begins capturing images of the planar principal surfaces of the substrates 208 (e.g., x-y planes of substrates 208-25 to 208-1). The images may be obtained at an angle to the principal planar surface due to their location in the substrate carrier. The high resolution camera 102 comes to a stop after capturing the x-y plane image of the last substrate—such as the bottommost substrate (e.g. substrate 208-1). Images of the planar principal surfaces of the substrates 208 (e.g., x-y planes of substrates 208-25 to 208-1) are sent to the image processor 124 for substrate mapping.” As in 32 “Following mapping of the planar principal surfaces of each substrate 208 contained in the substrate storage cassette 202, the height of the high resolution camera 102 is readjusted so that the viewfinder 112 is level and pointed to the center of the front facing edge of a first substrate”]
Peh teaches wherein the substrates are accommodated in an accommodating body [(see at least Fig.2, paragraph 26) “The substrate storage cassette 202 (e.g., FOUP) has a bottom 203, a front opening 204, and a top 205. The substrate storage cassette 202 further comprises an array of spaced apart slots 206. Each slot 206 is configured to receive and support a substrate 208. As illustrated in the exemplary embodiment of FIG. 2, a total of N vertically arranged slots 206 may be labeled as slots 206-1 to 206-N, from the bottom 203 to the top 205,”]
Peh does not explicitly teach a calculation process in which three-dimensional information of the substrate is calculated based on the images acquired in the photographing process; and a take-out process in which the hand is moved to take out the substrate based on the three-dimensional information of the substrate calculated in the calculation process; the hand is moved to a first shooting position to obtain a first image by the camera, the first shooting position being located at a first height relative to the accommodating body, and then the hand is moved to a second shooting position to obtain a second image by the camera, the second shooting position being located at the same first height; and the calculation process calculates three-dimensional information of all the substrates accommodated in the accommodating body based on the first and second images obtained by the camera.
However, Goto teaches a calculation process in which three-dimensional information of each substrate is calculated based on the images acquired in the photographing process; [(see at least paragraphs 7,66) As in 7 “a hand attached to a tip end of the robot arm; a camera fixed and attached to a portion of the hand other than a substrate placing portion of the hand so as to take an image of the substrate placing portion, the substrate being placed at the substrate placing portion; an image data acquirer configured to acquire image data taken by the camera, the image data including a teaching substrate and the substrate placing portion of the hand, the teaching substrate being arranged as a teaching target at the substrate target position” As in 66 “a plurality of cameras 3 may be arranged, or a three-dimensional camera may be used. The three-dimensional camera simultaneously takes images of a target object from different directions to generate parallaxes, thereby acquiring the distance information to the target object. The three-dimensional camera can generate coordinate data of a surface of the target object in a predetermined coordinate system. The three-dimensional camera is a type of stereo camera. 
The three-dimensional camera includes a pair of cameras arranged away from each other by a predetermined distance, and these cameras include respective image pickup elements.”] and a take-out process in which the hand is moved to take out each substrate based on the three-dimensional information of each substrate calculated in the calculation process [(see at least paragraphs 54-62) As in 61 “only by taking the image of the space S including the teaching substrate W at the target position T and the blade 2b of the hand 2 by the camera 3, the distance information from the blade 2b to the teaching substrate W can be calculated based on the camera image, and the operation of the robot 1 can be controlled based on the distance information such that the virtual substrate VW virtually arranged at the blade 2b coincide with the teaching substrate W. By storing the position of the hand 2 at this time as the teaching data, the position of the hand 2 corresponding to the target position T of the substrate W can be taught to the robot 1. With this, the correct position can be accurately taught to the robot 1 without depending on the skill of the operator.”]
Goto teaches the calculation process calculates three-dimensional information of all the substrates accommodated in the accommodating body based on first and second images obtained by the camera. [(see at least paragraph 66) “a plurality of cameras 3 may be arranged, or a three-dimensional camera may be used. The three-dimensional camera simultaneously takes images of a target object from different directions to generate parallaxes, thereby acquiring the distance information to the target object. The three-dimensional camera can generate coordinate data of a surface of the target object in a predetermined coordinate system. The three-dimensional camera is a type of stereo camera. The three-dimensional camera includes a pair of cameras arranged away from each other by a predetermined distance, and these cameras include respective image pickup elements. The three-dimensional camera may acquire not only the distance information but also color information (RGB, etc.) of the target object. Further, the three-dimensional camera may emit a laser and acquire the distance information to a reflection point from a reflection position and a reflection time.”]
Krupyshev teaches the hand is moved to a first shooting position to obtain a first image by the camera, the first shooting position being located at a first height relative to the accommodating body, and then the hand is moved to a second shooting position to obtain a second image by the camera, the second shooting position being located at the same first height [(see at least Figs.10-12, paragraph 60) “while the first image of the part 580 of the robot arm 210A is shown in dashed lines). For example, the calibration image 590 includes an end effector 1000 in the predetermined repeatable position 650, 600 before placement into the substrate station modules 130. As may be realized and shown in FIG. 7, a predetermined repeatable position (or more than one) 650′ may be located further offset in the direction of extension (R, θ) from the predetermined repeatable retracted position 650, 600 so as to provide a series, or at least a pair (650, 600) of predetermined repeatable positions (650′, 600). As may be realized, the first image 570 may be generated with the arm a position 600/650. A second image from the series is generated with the arm at position 650′/650 and so on. The first, second, and each other image in the series generated with the arm in a different predetermined repeatable position is compared to, a corresponding calibrated image with the arm in the predetermined position. During operation of the at least one robot arm 210A, 211A in the transport chamber 125B′, and as the processing temperature of the substrate processing equipment changes, radial transitions of the robot arm 210A, 211A may drift (e.g., the position of the imaged end effector and thus point 1010 as well as the center point 1000WC that has a fixed relation to point 1010 will vary from the position in calibration and as defined in the calibration image registered by the controller 110). 
As such it is possible to measure the resultant thermal effects and/or the other variabilities by comparing the position data of the calibration image 590 to their relative values in at least the first image 570 for the series of predetermined repeatable positions 650, 650′, 600.”] Examiner notes that Krupyshev does not disclose any relative height information with regard to the first and second images; Krupyshev appears unconcerned with the relative height of the first and second images. Examiner further notes that the first and second shooting positions being at the first height does not appear to be advantageous or essential to applicant’s claimed invention. Applicant’s specification, paragraph 32, discloses: “In the present embodiment, the distance from the container 20 to the first shooting position and the distance from the container 20 to the second shooting position are the same, but may be different. The height of the second shooting position is the height at which the camera 15 is located below the midpoint of the container 20 in a height direction, as shown in FIG. 5. In the present embodiment, the height of the second shooting position is the same as the height of the first embodiment, but may be different.” The height being the same for the first and second shooting positions does not appear to solve any stated problem or serve any particular purpose, and it appears the invention would perform equally well with the heights being different. See MPEP 2143.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of Peh to incorporate the teachings of Goto of a calculation process in which three-dimensional information of each substrate is calculated based on the images acquired in the photographing process, in order to calculate distance information from the substrate placing portion to the teaching substrate based on the image data of the camera [(Goto 10)]; of a take-out process in which the hand is moved to take out each substrate based on the three-dimensional information of each substrate calculated in the calculation process, in order for the correct position of the robotic arm to be accurately taught/sent to the robot without depending on the skill of the operator [(Goto 61)]; and of the calculation process calculating three-dimensional information of all the substrates accommodated in the accommodating body based on the first and second images obtained by the camera, in order to acquire the distance information to a reflection point from a reflection position and a reflection time [(Goto 66)]; and to incorporate the teachings of Krupyshev of the hand being moved to a first shooting position to obtain a first image by the camera, the first shooting position being located at a first height relative to the accommodating body, and then being moved to a second shooting position to obtain a second image by the camera, the second shooting position being located at the same first height, in order to use the images for comparison to the calibration to identify positional variance. [(Krupyshev 64)]
Claim 8 is rejected under 35 U.S.C. 103 as being unpatentable over Peh in view of Goto and Krupyshev and further in view of Mariyama (US 2020/0030970 A1).
Regarding claim 8, in view of the above combination of references, Peh further teaches the camera is disposed on the hand [(see at least Fig. 2, paragraphs 29-33) As in 31 “the high resolution camera 102 begins capturing images of the planar principal surfaces of the substrates 208 (e.g., x-y planes of substrates 208-25 to 208-1). The images may be obtained at an angle to the principal planar surface due to their location in the substrate carrier. The high resolution camera 102 comes to a stop after capturing the x-y plane image of the last substrate—such as the bottommost substrate (e.g. substrate 208-1).”]
Peh does not explicitly teach wherein the camera is a monocular camera with a single imager.
However, Mariyama teaches wherein the camera is a monocular camera with a single imager. [(see at least paragraph 24) “The robot arm 100 is provided with a gripping unit 101 to grip a male connector 110, and a monocular camera 102 is attached to the robot arm 100 at a position where the monocular camera 102 can view the gripping unit. When the gripping unit 101 at the end of the robot arm 100 is gripping the male connector 110, the monocular camera 102 is positioned so as to be able to view the end of the gripped male connector 110 and the female connector 120 into which it will be inserted.”]
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to have modified the teachings of modified Peh to incorporate the teachings of Mariyama because Peh generally discloses high-definition cameras, and Mariyama teaches a monocular camera with a single imager for capturing high-definition images ([0150]). There are a finite number of types of high-definition cameras on the market. Therefore, because Peh generally discloses a high-definition camera used in the device, and Mariyama specifically teaches a monocular camera with a single imager for capturing high-definition images in similar art, it would have been obvious to try the camera of Mariyama with the system of Peh with a reasonable expectation of capturing high-definition images. [(see MPEP 2143)]
The Examiner has cited particular paragraphs or columns and line numbers in the references applied to the claims above for the convenience of the Applicant. Although the specified citations are representative of the teachings of the art and are applied to specific limitations within the individual claim, other passages and figures may apply as well. It is respectfully requested that the Applicant, in preparing responses, fully consider the references in their entirety as potentially teaching all or part of the claimed invention, as well as the context of the passage as taught by the prior art or disclosed by the Examiner. See MPEP 2141.02 [R-07.2015] VI. A prior art reference must be considered in its entirety, i.e., as a whole, including portions that would lead away from the claimed invention. W.L. Gore & Associates, Inc. v. Garlock, Inc., 721 F.2d 1540, 220 USPQ 303 (Fed. Cir. 1983), cert. denied, 469 U.S. 851 (1984). See also MPEP § 2123.
Conclusion
Any inquiry concerning this communication or earlier communications from the examiner should be directed to MOHAMMED YOUSEF ABUELHAWA whose telephone number is (571)272-3219. The examiner can normally be reached Monday-Friday 8:30-5:00 with flex.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Wade Miles can be reached at 571-270-7777. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/MOHAMMED YOUSEF ABUELHAWA/Examiner, Art Unit 3656
/WADE MILES/Supervisory Patent Examiner, Art Unit 3656