Prosecution Insights
Last updated: April 19, 2026
Application No. 19/020,167

ROBOT FOR PROJECTING IMAGE AND METHOD FOR PROJECTING IMAGE THEREOF

Status: Non-Final OA (§102)
Filed: Jan 14, 2025
Examiner: TRAN, TRANG U
Art Unit: 2422
Tech Center: 2400 (Computer Networks)
Assignee: Samsung Electronics Co., Ltd.
OA Round: 1 (Non-Final)
Grant Probability: 79% (Favorable)
Expected OA Rounds: 1-2
Time to Grant: 2y 10m
With Interview: 94%

Examiner Intelligence

Career Allow Rate: 79% (above average; 719 granted / 915 resolved; +20.6% vs TC avg)
Interview Lift: +15.9% among resolved cases with interview (strong)
Typical Timeline: 2y 10m average prosecution; 20 applications currently pending
Career History: 935 total applications across all art units
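As a sanity check on how the headline figures above could relate to the raw counts, here is a minimal sketch. The counts and percentages come from the report; the formulas (allow rate as granted/resolved, interview lift as the difference between the with-interview and without-interview rates) are assumptions about how the dashboard defines them.

```python
# Figures from the report; derivations are assumptions.
granted = 719
resolved = 915

# Career allow rate: share of resolved cases that granted.
allow_rate = granted / resolved * 100
print(round(allow_rate, 1))  # 78.6 (displayed as 79%)

# If the reported lift is (rate with interview) - (rate without),
# the implied without-interview rate follows directly.
with_interview = 94.0
lift = 15.9
without_interview = with_interview - lift
print(round(without_interview, 1))  # 78.1
```

The implied without-interview rate (~78.1%) sits close to the overall career allow rate (~78.6%), which is consistent with most resolved cases not involving an interview.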

Statute-Specific Performance

§101: 6.2% (-33.8% vs TC avg)
§103: 45.9% (+5.9% vs TC avg)
§102: 35.2% (-4.8% vs TC avg)
§112: 2.7% (-37.3% vs TC avg)
Black line = Tech Center average estimate • Based on career data from 915 resolved cases
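The per-statute deltas can be back-computed against the Tech Center average mentioned in the chart note. The rates and deltas are from the report; the back-calculation (TC average = rate minus delta) is an assumption about how the delta is defined.

```python
# statute: (examiner rate %, delta vs TC avg %) -- figures from the report
stats = {
    "101": (6.2, -33.8),
    "103": (45.9, +5.9),
    "102": (35.2, -4.8),
    "112": (2.7, -37.3),
}
for statute, (rate, delta) in stats.items():
    tc_avg = rate - delta  # implied Tech Center average for this statute
    print(f"§{statute}: examiner {rate}%, implied TC avg {round(tc_avg, 1)}%")
```

Notably, every row back-computes to the same implied TC average (40.0%), which suggests the chart's "black line" is a single Tech Center estimate applied across all four statutes rather than a per-statute benchmark.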

Office Action

§102
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Claim Rejections - 35 U.S.C. § 102

The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:

A person shall be entitled to a patent unless – (a)(1) the claimed invention was patented, described in a printed publication, or in public use, on sale, or otherwise available to the public before the effective filing date of the claimed invention.

Claims 1-6, 9, 11-16 and 19-20 are rejected under 35 U.S.C. 102(a)(1) as being anticipated by Choi et al. (US Patent No. 10,893,245 B1).

In considering claim 1, Choi et al. discloses all the claimed subject matter. Note:
1) the claimed "a projector" is met by the projector 110 (Figs. 1-2, col. 3, line 50 to col. 4, line 46);
2) the claimed "a sensor" is met by the sensing unit 118 (Figs. 1-2, col. 3, line 50 to col. 4, line 46);
3) the claimed "memory storing instructions" is met by the memory 120 (Figs. 1-2, col. 3, line 50 to col. 4, line 46);
4) the claimed "one or more processors" is met by the processor 122 (Figs. 1-2, col. 3, line 50 to col. 4, line 46);
5) the claimed "wherein the instructions that, when collectively or individually executed by the one or more processors, cause the robot to: identify a plurality of candidate projection areas based on first information obtained by sensing surroundings of a user via the sensor" is met by: in step S333, the controller 124 searches for one or more projectable areas adjacent to the first user 510 based on the position information for the first user 510 and the position information for the one or more objects (Figs. 4-5, col. 8, line 30 to col. 9, line 8);
6) the claimed "identify a priority order of the plurality of candidate projection areas" is met by: the controller 124 assigns a priority to each of one or more projectable areas based on the number of users and the physical features of the one or more projectable areas (Fig. 10, col. 12, line 42 to col. 13, line 21);
7) the claimed "control the projector to: project a first image at an area comprising the plurality of candidate projection areas, based on a plurality of positions of the plurality of candidate projection areas and the priority order" is met by: if the highest-priority projectable area is present in the first subspace, the controller 124 selects the highest-priority projectable area present in the first subspace as the projection area in step S3433 (Fig. 10, col. 12, line 42 to col. 13, line 21 and col. 14, lines 4-51);
8) the claimed "display second information on the priority order at the plurality of candidate projection areas" is met by: the robot 100 may efficiently provide the second type of image to the users via projection of the test images and the priorities of one or more projectable areas (Fig. 10, col. 14, lines 4-51);
9) the claimed "identify a candidate projection area selected based on a user input from among the plurality of candidate projection areas as a projection area" is met by: the controller 124 may select the projectable area to which the first type of image is projected most clearly as the projection area based on the physical features of the projectable areas, which may be previously input from the user or may be obtained via the sensing unit 118 (Figs. 4 and 6, col. 9, line 39 to col. 10, line 47); and
10) the claimed "project image content at the projection area via the projector" is met by: the controller 124 controls the projector 110 to project the first type of image to the projection area (Figs. 4 and 6, col. 9, line 39 to col. 10, line 47).

In considering claim 2, Choi et al. discloses all the claimed subject matter. Note:
1) the claimed "wherein the one or more processors are configured to execute the instructions to cause the robot to: identify, based on the plurality of positions, a plurality of areas of a second image to be projected in the area, the plurality of areas corresponding to the plurality of candidate projection areas" is met by: in step S333, the controller 124 searches for one or more projectable areas adjacent to the first user 510 based on the position information for the first user 510 and the position information for the one or more objects (Figs. 4-5, col. 8, line 30 to col. 9, line 8);
2) the claimed "control the projector to project the second image, and wherein the second image comprises a plurality of sub images corresponding to the plurality of areas" is met by: the projectable area to which an image may be projected (Figs. 4-5, col. 8, line 30 to col. 9, line 8); and
3) the claimed "the plurality of sub images comprise a plurality of indicators corresponding to the priority order" is met by: the controller 124 assigns a priority to each of one or more projectable areas based on the number of users and the physical features of the one or more projectable areas (Fig. 10, col. 12, line 42 to col. 13, line 21).

In considering claim 3, the claimed "wherein the plurality of indicators comprise a plurality of numbers indicating the priority order" is met by: the controller 124 assigns a priority to each of one or more projectable areas based on the number of users and the physical features of the one or more projectable areas (Fig. 10, col. 12, line 42 to col. 13, line 21).

In considering claim 4, Choi et al. discloses all the claimed subject matter. Note:
1) the claimed "wherein the one or more processors are configured to execute the instructions to cause the robot to: identify, based on the plurality of positions, a plurality of areas corresponding to the plurality of candidate projection areas" is met by: in step S333, the controller 124 searches for one or more projectable areas adjacent to the first user 510 based on the position information for the first user 510 and the position information for the one or more objects (Figs. 4-5, col. 8, line 30 to col. 9, line 8);
2) the claimed "control the projector to consecutively project a plurality of sub images corresponding to the plurality of areas" is met by: the projectable area to which an image may be projected (Figs. 4-5, col. 8, line 30 to col. 9, line 8); and
3) the claimed "wherein the plurality of sub images comprise a plurality of indicators corresponding to the priority order" is met by: the controller 124 assigns a priority to each of one or more projectable areas based on the number of users and the physical features of the one or more projectable areas (Fig. 10, col. 12, line 42 to col. 13, line 21).

Claim 5 is rejected for the same reason as discussed in claim 3 above.

In considering claim 6, Choi et al. discloses all the claimed subject matter. Note:
1) the claimed "wherein the one or more processors are configured to execute the instructions to cause the robot to identify the priority order based on: a plurality of sizes of the plurality of candidate projection areas" is met by: the controller 124 assigns a priority to each of one or more projectable areas based on the number of users and the physical features of the one or more projectable areas, and the physical features may include at least one of the size (e.g., horizontal and vertical lengths and thickness or thicknesses), material, color, pattern, and irregularity of the projectable areas (Figs. 9-10, col. 12, line 26 to col. 13, line 21); and
2) the claimed "a plurality of distances between the user and the plurality of candidate projection areas" is met by: the controller 124 selects a projectable area in a space present within a preset second distance of a first user who is a recipient of a video call (i.e., positioned adjacent to the first user) as a projection area (Fig. 3, col. 6, line 31 to col. 7, line 55).

In considering claim 9, Choi et al. discloses all the claimed subject matter. Note:
1) the claimed "wherein the one or more processors are configured to execute the instructions to cause the robot to: obtain a third image via the sensor, identify a position of the user in the third image" is met by: the image sensor (e.g., the camera) included in the sensing unit 118 may obtain an image including the first user (Figs. 4-5, col. 8, line 30 to col. 9, line 8); and
2) the claimed "identify a rotation angle range of the sensor based on the position of the user and a field of view of the sensor, and obtain the first information via the sensor while the sensor rotates within the rotation angle range" is met by: the controller 124 may obtain the face direction information for the first user 510 using a depth image obtained via the IR sensor (Figs. 4-5, col. 8, line 30 to col. 9, line 8).

Claims 11-16 are rejected for the same reasons as discussed in claims 1-6, respectively. Claim 19 is rejected for the same reason as discussed in claim 9 above. Claim 20 is rejected for the same reason as discussed in claim 1 above.

Allowable Subject Matter

Claims 7-8, 10 and 17-18 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims.

Conclusion

The prior art made of record and not relied upon is considered pertinent to applicant's disclosure. Orlov et al. (US Patent No. 12,248,325 B2) disclose a mobile robot and a method for controlling the mobile robot. Suzuki et al. (US Patent No. 11,417,135 B2) disclose an information processing apparatus, information processing method, and program. Rico et al. (US Patent No. 11,219,837 B2) disclose a robot utility and interface device.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to TRANG U TRAN, whose telephone number is (571) 272-7358. The examiner can normally be reached M-F 10:00 AM - 6:00 PM.

Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.

If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, JOHN W. MILLER, can be reached at 571-272-7353. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

January 24, 2026
/TRANG U TRAN/
Primary Examiner, Art Unit 2422
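The claim-6 limitation at issue (priority order based on candidate-area sizes and user distances) can be illustrated with a short sketch. The class name, weights, scoring formula, and sample values below are illustrative assumptions, not taken from the application or from Choi; the point is only the ranking mechanic the claim recites.

```python
# Hypothetical sketch of ranking candidate projection areas by size and
# by distance to the user, as the claim-6 limitation describes.
from dataclasses import dataclass


@dataclass
class CandidateArea:
    name: str
    size_m2: float     # projectable surface size (assumed units)
    distance_m: float  # distance from the user (assumed units)


def priority_order(areas, size_weight=1.0, distance_weight=1.0):
    # Larger areas score higher; nearer areas score higher.
    def score(a):
        return size_weight * a.size_m2 - distance_weight * a.distance_m
    return sorted(areas, key=score, reverse=True)


areas = [
    CandidateArea("wall", 2.0, 3.0),
    CandidateArea("table", 0.5, 1.0),
    CandidateArea("floor", 1.5, 1.2),
]
ranked = priority_order(areas)
print([a.name for a in ranked])  # ['floor', 'table', 'wall']
```

Any linear trade-off between size and distance would serve here; the weights simply control which factor dominates when the two pull in opposite directions.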

Prosecution Timeline

Jan 14, 2025: Application Filed
Jan 24, 2026: Non-Final Rejection, §102 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12603986: INFORMATION PROCESSING DEVICE, INFORMATION PROCESSING METHOD, AND INFORMATION PROCESSING SYSTEM (2y 5m to grant; granted Apr 14, 2026)
Patent 12598288: METHOD AND DEVICE FOR DETECTING POWER STABILITY OF IMAGE SENSOR (2y 5m to grant; granted Apr 07, 2026)
Patent 12596077: Passive Camera Lens Smudge Detection (2y 5m to grant; granted Apr 07, 2026)
Patent 12591995: METHOD AND APPARATUS FOR DEFORMATION MEASUREMENT, ELECTRONIC DEVICE, AND STORAGE MEDIUM (2y 5m to grant; granted Mar 31, 2026)
Patent 12576717: DRIVING ASSISTANCE APPARATUS (2y 5m to grant; granted Mar 17, 2026)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 79%
With Interview: 94% (+15.9%)
Median Time to Grant: 2y 10m
PTA Risk: Low
Based on 915 resolved cases by this examiner. Grant probability derived from career allow rate.
