DETAILED ACTION
Notice of Pre-AIA or AIA Status
1. The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Priority
2. Acknowledgment is made of applicant's claim for foreign priority based on an application filed in Europe on February 25, 2022. It is noted, however, that applicant has not filed a certified copy of the EP22158997.1 application as required by 37 CFR 1.55.
Specification
3. The amendments to the Specification and Abstract were received on October 3, 2025 and entered.
Response to Amendment
4. The amendment filed October 3, 2025 has been entered. Claims 1-4, 6-7, and 9-10 remain pending in the application. Applicant’s amendments to the Specification, Abstract, and Claims have overcome each and every objection. The Applicant’s amendments to the claims have also overcome the 35 U.S.C. 112(b) and 35 U.S.C. 101 rejections previously set forth in the Non-Final Office Action mailed May 2, 2025.
Response to Arguments
5. Applicant's arguments filed October 3, 2025 have been fully considered but they are not persuasive.
6. Applicant argues that Furukawa et al. ("Towards Internet-Scale Multi-View Stereo") -- cited in IDS, hereinafter referred to as Furukawa, and Park et al. ("3D Modeling of Optically Challenging Objects") -- cited in IDS, hereinafter referred to as Park, fail to disclose the amended claim 1 limitation of “wherein a conflicting point is a point of the set that cannot validly coexist with the respective point based on assumption that the points of the set are connected by coherent surface with closest neighboring surface points and which coherent surface would block light.” The Applicant asserts on Page 11 of the Remarks that Park does not use an assumption that the “assumed coherent surface of the path blocks light.”
Examiner replies that Park discloses in Section 5.1, Paragraphs 1-2, that the point data for the surface is obtained using range measurements from a range sensor. Range sensors are known to detect points by detecting reflected light. This demonstrates that the coherent surface detected in Park's Section 5.1 does block some light, since the surface points must reflect light in order to be detected for the surface test. The claim limitation also does not require the entire coherent surface to block light. Thus, if the range sensor is able to detect the data used for this surface test, then parts of the surface necessarily block light.
Additional support can also be found in Section 7, Paragraph 4, which teaches that the method disclosed in Park does not work on transparent or translucent surfaces. Thus, if Park's method detects a coherent surface, then the surface is not transparent or translucent, meaning the surface would block light. Further support can be found in Section 6, which teaches that the method is run on specular, Lambertian, and highly absorptive surfaces, all of which block light. Therefore, Park's method operates on the assumption that the coherent surface blocks light.
7. Applicant argues that even if coherent surfaces block light in Park, Park's local surface test in Section 5.1 does not use this property to identify conflicting points. The Applicant asserts that neither Furukawa nor Park's local surface test discloses that conflicting points are identified based on an assumption that the patches block light.
Examiner replies that claim 1 does not recite that the conflicting points are identified using a test that detects coherent surfaces. The claim only requires that the conflicting points be identified, and that a conflicting point cannot validly coexist with the respective point according to predefined one or more criteria. Claim 1's wherein clause on line 4 only requires that the points lie on a coherent surface that blocks light; it does not require that the conflicting point be identified through a method that detects coherent surfaces that block light. That the points lie on a coherent surface that blocks light is taught by Park's local surface test in Section 5.1, as explained above.
However, even if the claim language did require that the conflicting points be identified using a method that checks that the surface is a coherent surface that blocks light, Park teaches in Section 5.1, Paragraph 4, that "All range measurements that do not satisfy either constraint are eliminated." Thus, Park also teaches using the local surface test, which checks for coherent surfaces that block light as explained above, to identify conflicting points, which are then removed.
8. Conclusion: The rejections set forth in the previous Office Action are shown to have been proper, and the claims are rejected below. New citations and parenthetical remarks can be considered new grounds of rejection, and such new grounds of rejection are necessitated by the Applicant's amendments to the claims. Therefore, the present Office Action is made final.
Claim Objections
9. Claim 10 is objected to because of the following informalities: on line 16, "for each time conflicting point in the set of points for the respective point" should apparently read "for each conflicting point in the set of points for the respective point," consistent with the corresponding limitation of claim 1. Appropriate correction is required.
Claim Rejections - 35 USC § 112
10. The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of 35 U.S.C. 112 (pre-AIA), second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
11. Claims 1-4, 6-7, and 9-10 are rejected under 35 U.S.C. 112(b) or 35 U.S.C. 112 (pre-AIA), second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or, for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claims 1 and 10 on lines 19 and 18 respectively recite “or a combination thereof.” It is unclear to the Examiner what the “combination thereof” entails. The claim only recites one way of the respective point being involved in a conflict which is when there is a conflicting point in the set of points for the respective point in lines 16-19 of claim 1 and lines 15-18 of claim 10. The Examiner is unclear as to what other combinations there are for wherein the respective point is considered involved in a conflict. Thus, the metes and bounds of claims 1 and 10 are unclear and unknown.
Claims 2-4, 6-7, and 9 are also rejected by dependency on claim 1.
Claim Rejections - 35 USC § 103
12. In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
13. The text of those sections of Title 35, U.S. Code not included in this action can be found in a prior Office action.
14. Claim(s) 1-4 and 9-10 is/are rejected under 35 U.S.C. 103 as being unpatentable over Furukawa et al. ("Towards Internet-Scale Multi-View Stereo") -- cited in IDS, hereinafter referred to as Furukawa, in view of Park et al. ("3D Modeling of Optically Challenging Objects") -- cited in IDS, hereinafter referred to as Park.
15. Regarding claim 1, Furukawa teaches a method for removing erroneous points from a set of points of a three dimensional (3D) virtual object (Section 1, Paragraph 6 mentions filtering out reconstruction errors in a reconstruction of 3D points) provided by 3D imaging of a corresponding real world object by means of a camera with image sensor (331) (Section 1, Paragraph 6 mentions reconstructing objects in a photo with 3D points; Section 2, Paragraph 1 mentions processing input images from a camera. Image sensors are inherent to digital cameras), wherein the method comprises:
obtaining said set of points (Section 2, Paragraph 1 mentions processing input images and obtaining a set of points from them);
identifying, for a respective point of the set of points, conflicting points in the set of points, wherein a conflicting point is a point of the set of points that cannot validly coexist with the respective point according to predefined one or more criteria (Section 3.2, Paragraph 1 mentions a visibility filter that checks for conflicts between a point and points in reconstructions from other clusters; Figure 5 shows in the visibility filter where a red point is identified to conflict with the green point. The visibility filter is one criteria, the red point can be considered the conflicting point, and the green point can be considered the respective point)
and removing, from the set of points, based on said identification for the respective point of the set of points, one or more points of the set of points that have been involved in conflicts a greater number of times than other points of the set of points involved in conflicts (Section 3.2, Paragraph 2 mentions calculating a conflict count for each point and removing the point if the conflict count is greater than a threshold. This removes points involved in more conflicts than other points, since the removed points meet the threshold while other points may not), wherein the respective point is considered involved in a conflict for each conflicting point in the set of points for the respective point, each time the respective point itself is identified as a conflicting point to another point of the set of points, or a combination thereof (Section 3.2, Paragraph 1 mentions checking for conflicts between a point and points in reconstructions from other clusters, and mentions incrementing a count each time it conflicts with another reconstruction).
However, Furukawa fails to teach wherein a conflicting point is a point of the set that cannot validly coexist with the respective point based on assumption that the points of the set of points are connected by coherent surface with closest neighboring surface points and which coherent surface would block light.
Park teaches wherein a conflicting point is a point of the set that cannot validly coexist with the respective point based on assumption that the points of the set of points are connected by coherent surface with closest neighboring surface points and which coherent surface would block light (Section 5.1 mentions assuming there is a planar patch for a surface. The planar patch indicates a continuous surface which requires all points within a specific distance, defined by equation 1, to have similar normals. Equation 3 shows the fitting error allowed for variance between normals of each point in the plane. Conflicting points would be identified if the error goes beyond the fitting error. Equation 1 selects the closest neighboring surface points and Equation 3 would prove a coherent surface. A coherent surface as disclosed in Park, identified by equations 1 and 3, can be understood to reflect, scatter, or block light).
Park discloses in Section 5.1 Paragraph 1-2 that the data of the points for the surface is obtained using a range measurement from a range sensor. Range sensors are known to detect points by detecting reflecting light. This proves that the coherent surface detected in Park Section 5.1 does block some light in order for the surface points to be detected for the surface test. The claim limitation also does not require the entire coherent surface to block light. Thus, if the range sensor is able to detect data used for this surface test, then it proves that parts of the surface does block light.
Furukawa and Park are considered analogous to the claimed invention because both are in the same field of 3D modeling an object and eliminating false measurements. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of removing erroneous points taught by Furukawa with the identification of a conflicting point based on coherent surface assumptions taught by Park in order to eliminate false measurements generated by optically challenging surfaces (Park, Introduction, Paragraph 4).
16. Regarding claim 2, Furukawa in view of Park teaches the limitations of claim 1. Furukawa further teaches wherein points involved in conflicts multiple times that exceed a predefined threshold are removed from the set of points (Section 3.2, Paragraphs 1-2 mention calculating a conflict count for each point and removing the point if the conflict count is greater than a threshold. They also mention an example of removing points with conflict counts that are three and above, which is a predefined threshold).
17. Regarding claim 3, Furukawa in view of Park teaches the limitations of claim 1. Furukawa further teaches the method wherein said one or more predefined criteria at least are based on what the camera virtually can view from its corresponding position in a coordinate system of said set of points (Section 2, Paragraph 1 mentions obtaining the camera poses from each point through the SFM algorithm. Figure 5 shows the green and red cameras viewing and generating the respective green and red points in the visibility filter section, which is one predefined criteria. There is a conflict between the red point ‘P’ and the green point based on the green camera’s position and viewpoint).
18. Regarding claim 4, Furukawa in view of Park teaches the limitations of claim 1. Furukawa further teaches the method wherein the method further comprises;
obtaining, for the respective point of the set of points, a respective camera direction corresponding to direction of light emission from a corresponding point of the real world object towards the camera, which light emission was sensed by the image sensor during said 3D imaging (Section 2, Paragraph 1 mentions obtaining the camera poses from each point through the SFM algorithm. Figure 5 shows the camera direction, or direction of light emission, from the green point on the object to the green camera which is used in the visibility filtering process. Sensing light emission by a sensor is inherent to a digital camera);
wherein the conflicting point cannot validly coexist with the respective point based on at least a camera direction (Figure 5 shows the camera direction from the green point on the object to the green camera. Also shows a red point ‘P’ that conflicts with the green point and cannot validly coexist based on the camera direction of the green camera. The red point ‘P’ is the conflicting point and the green point is the respective point).
19. Regarding claim 9, Furukawa teaches a non-transitory computer readable storage medium comprising instructions that, when executed by one or more processors, causes one or more devices (Section 4, Paragraph 1 teaches running the algorithm on a PC with processors. PCs are known to have memories which is a non-transitory computer readable storage medium) to perform the method according to claim 1 (See rejection for claim 1 above).
20. Regarding claim 10, Furukawa teaches a device for removing erroneous points from a set of points of a three dimensional (3D) virtual object provided by 3D imaging of a corresponding real world object (Section 1, Paragraph 6 mentions filtering out reconstruction errors in a reconstruction of 3D points) by means of a camera with image sensor (Section 1, Paragraph 6 mentions reconstructing objects in a photo with 3D points; Section 2, Paragraph 1 mentions processing input images from a camera. Image sensors are inherent to cameras), wherein said one or more devices are configured to:
obtain said set of points (Section 2, Paragraph 1 mentions processing input images and obtaining a set of points from them);
identify, for a respective point, conflicting points in the set of points, wherein a conflicting point is a point of the set of points that cannot validly coexist with the respective point according to predefined one or more criteria (Section 3.2, Paragraph 1 mentions a visibility filter that checks for conflicts between a point and points in reconstructions from other clusters; Figure 5 shows in the visibility filter where a red point is identified to conflict with the green point. The visibility filter is one criteria, the red point can be considered the conflicting point, and the green point can be considered the respective point)
and remove, from the set of points, based on said identification for the respective point of the set of points, one or more points of the set of points that have been involved in conflicts a greater number of times than other points of the set of points involved in conflicts (Section 3.2, Paragraph 2 mentions calculating a conflict count for each point and removing the point if the conflict count is greater than a threshold. This removes points involved in more conflicts than other points, since the removed points meet the threshold while other points may not), wherein the respective point of the set of points is considered involved in a conflict for each conflicting point in the set of points for the respective point, each time the respective point itself is identified as the conflicting point to another point of the set of points, or a combination thereof (Section 3.2, Paragraph 1 mentions checking for conflicts between a point and points in reconstructions from other clusters, and mentions incrementing a count each time it conflicts with another reconstruction).
However, Furukawa fails to teach wherein a conflicting point is a point of the set that cannot validly coexist with the respective point based on assumption that the points of the set of points are connected by coherent surface with closest neighboring surface points and which coherent surface would block light.
Park teaches wherein a conflicting point is a point of the set that cannot validly coexist with the respective point based on assumption that the points of the set of points are connected by coherent surface with closest neighboring surface points and which coherent surface would block light (Section 5.1 mentions assuming there is a planar patch for a surface. The planar patch indicates a continuous surface which requires all points within a specific distance, defined by equation 1, to have similar normals. Equation 3 shows the fitting error allowed for variance between normals of each point in the plane. Conflicting points would be identified if the error goes beyond the fitting error. Equation 1 selects the closest neighboring surface points and Equation 3 would prove a coherent surface. A coherent surface disclosed by Park, identified by equations 1 and 3, can be understood to reflect, scatter, or block light).
Park discloses in Section 5.1 Paragraph 1-2 that the data of the points for the surface is obtained using a range measurement from a range sensor. Range sensors are known to detect points by detecting reflecting light. This proves that the coherent surface detected in Park Section 5.1 does block some light in order for the surface points to be detected for the surface test. The claim limitation also does not require the entire coherent surface to block light. Thus, if the range sensor is able to detect data used for this surface test, then it proves that parts of the surface does block light.
Furukawa and Park are considered analogous to the claimed invention because both are in the same field of 3D modeling an object and eliminating false measurements. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the device for removing erroneous points taught by Furukawa with the identification of a conflicting point based on coherent surface assumptions taught by Park in order to eliminate false measurements generated by optically challenging surfaces (Park, Introduction, Paragraph 4).
21. Claim(s) 6-7 is/are rejected under 35 U.S.C. 103 as being unpatentable over Furukawa et al. ("Towards Internet-Scale Multi-View Stereo") -- cited in IDS, hereinafter referred to as Furukawa, in view of Park et al. ("3D Modeling of Optically Challenging Objects") -- cited in IDS, hereinafter referred to as Park, as applied to claim 1 and 4 above, and further in view of Homma (U.S. Patent Application Publication No. 2020/0340800 A1).
22. Regarding claim 6, Furukawa in view of Park teaches the limitations of claim 1. However, Furukawa and Park fail to teach the method wherein the identification of conflicting points for the respective point is limited to points of the set that are present within a certain distance from the respective point.
Homma teaches the method wherein the identification of conflicting points for the respective point is limited to points of the set of points that are present within a certain distance from the respective point (Paragraph 36 and Figure 8 mention detecting conflicting points that exist in a blind spot area with respect to the respective point, Ap. Only points in the blind spot area are identified as erroneous or conflicting. This can be considered a certain distance from the respective point because only points in the blind spot region bounded by sides with lengths Sd1, Sd2, and Hy are considered).
Furukawa, Park, and Homma are considered analogous to the claimed invention because all three are in the same field of eliminating erroneous points. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of deleting erroneous points taught by Furukawa in view of Park with the limiting of the identification of conflicting points to a certain distance taught by Homma in order to identify false measurements in blind spot areas (Homma, Paragraph 5).
23. Regarding claim 7, Furukawa in view of Park teaches the limitations of claim 4. However, Furukawa and Park fail to teach the method wherein the 3D imaging is based on light triangulation comprising illumination of said real world object by a light source, wherein said light emission is reflected light from a surface of said real world object and resulting from said illumination.
Homma teaches the method wherein the 3D imaging is based on light triangulation comprising illumination of said real world object by a light source, wherein said light emission is reflected light from a surface of said real world object and resulting from said illumination (Paragraph 4 teaches projecting light and using triangulation to get data indicating a 3D shape of the object; Paragraph 26 and Figure 2 teach a diagram depicting light triangulation using a light source 6 to project light, and an image sensor 13 to receive the reflected light. The reflected light can be seen through marker L2 which results from the illumination from the light source 6).
Furukawa, Park, and Homma are considered analogous to the claimed invention because all three are in the same field of creating a three-dimensional model of an object and eliminating erroneous points. Thus, it would have been obvious to a person having ordinary skill in the art before the effective filing date to modify the method of deleting erroneous points and the use of a camera taught by Furukawa in view of Park with the light triangulation method taught by Homma in order to identify false measurements in blind spot areas (Homma, Paragraph 5).
Conclusion
24. Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.
25. Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTINE Y AHN whose telephone number is (571)272-0672. The examiner can normally be reached Monday through Friday, 8am-5pm.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Alicia Harrington can be reached at (571)272-2330. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/CHRISTINE YERA AHN/Examiner, Art Unit 2615
/ALICIA M HARRINGTON/Supervisory Patent Examiner, Art Unit 2615