Prosecution Insights
Last updated: April 19, 2026
Application No. 18/355,640

ASSIST FOR ORIENTING A CAMERA AT DIFFERENT ZOOM LEVELS

Status: Final Rejection (§103)
Filed: Jul 20, 2023
Examiner: AGGARWAL, YOGESH K
Art Unit: 2637
Tech Center: 2600 — Communications
Assignee: Qualcomm Incorporated
OA Round: 4 (Final)
Grant Probability: 90% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 2y 7m
Grant Probability With Interview: 96%

Examiner Intelligence

Career Allow Rate: 90% (998 granted / 1113 resolved; +27.7% vs TC avg; above average)
Interview Lift: +6.8% (moderate), measured over resolved cases with interview
Typical Timeline: 2y 7m average prosecution; 32 applications currently pending
Career History: 1145 total applications across all art units

Statute-Specific Performance

§101: 5.3% (-34.7% vs TC avg)
§103: 49.8% (+9.8% vs TC avg)
§102: 36.4% (-3.6% vs TC avg)
§112: 5.1% (-34.9% vs TC avg)
Based on career data from 1113 resolved cases; Tech Center averages are estimates.

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

Applicant’s arguments with respect to claims 28-30, 32-39 and 41-45 have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 28-30, 32-39 and 41-45 are rejected under 35 U.S.C. 103 as being unpatentable over Lee et al. (US PGPUB 20160050351) in view of Shikata (US PGPUB 20160212358).

[Claim 28] Lee teaches a device having a multicamera system (Paragraph 91, fig. 1A-1D, cameras 120-1 and 120-2), the device configured to process and display image data (Paragraph 100, processing operations such as live view generation, image resolution adjustment, scaling, etc.), the device comprising: a first camera (120), having a first field of view (FOV), configured to capture a first image stream associated with a first preview of a scene (Paragraph 91, the cameras 120-1 and 120-2 may have different specifications such as types, photographing magnifications, focal distances, view angles, etc.
For example, the camera 120-1 may include a wide-angle lens having a relatively short focal distance, and the camera 120-2 may include a telephoto lens having a relatively long focal distance); a second camera (220), having a second FOV that is different than the first FOV, configured to capture a second image stream associated with a second preview of a first portion of the scene, where the second image stream is different from the first image stream (Paragraphs 161 and 162, figs. 8, 9a-9c, A method of displaying a live view through the display 140 will now be described with reference to FIGS. 9A through 9C. The image processor 130 of the image photographing apparatus 100 that is combined with the image photographing apparatus 200 may generate a live view by using a first image 910 captured by the camera 120 of the image photographing apparatus 100 and a second image 920 captured by the camera 220 of the image photographing apparatus 200 and Paragraph 163, If the lens 121 (see FIG. 8) of the camera 120 of the image photographing apparatus 100 has a short focal distance, and a lens of the camera 220 of the image photographing apparatus 200 has a long focal distance, the image photographing apparatus 100 may overlap the first and second images 910 and 920 to include the second image 920 in a local area of the first image 910 and display the overlapped first and second images 910 and 920 as shown in FIG. 9A), and at least one processor (130) in communication with the first camera and the second camera (Paragraph 162, The image processor 130 of the image photographing apparatus 100 that is combined with the image photographing apparatus 200), the at least one processor being configured to output, for concurrent preview, the first image stream and the second image stream (fig. 9a and Paragraphs 162 and 163); and a touch-sensitive display (Paragraph 108 for touch display) configured to: concurrently preview the first image stream associated with the first FOV and the second image stream associated with the second FOV (fig. 9a and Paragraphs 162 and 163); receive a first user input indicative of recording only the second image stream, wherein the at least one processor is further configured to, in response to receipt of the first user input, process, for recording, image data of the second image stream (Paragraph 173, In detail, as shown in FIG. 10A, if the part inside the border line 930 is touched, the image photographing apparatus 100 may capture the second image 920 to display a captured still image on the display 140 or store the captured image).

Lee fails to teach display, during the concurrent preview of the first image stream and the second image stream, a visual indication generated as a layer on the first image stream indicative of the first portion of the scene, as captured by the first camera, associated with the second image stream.

However, Shikata teaches that in FIG. 4, “b” shows a display configuration example when the frame display mode is set. The frame display mode is a display mode of overlapping (layer) and displaying the frame (the additional image) representing the image capturing range of the image generated using the second image capturing unit 112, on the image generated using the first image capturing unit 111. For example, the image generated using the first image capturing unit 111 is displayed in the first image display area 301 in the display unit 172, and the frame 305 representing the image capturing range of the image generated using the second image capturing unit 112 is overlapped and displayed on the image (Paragraph 115).
Therefore, taking the combined teachings of Lee and Shikata, it would have been obvious to one skilled in the art before the effective filing date of the invention to have displayed, during the concurrent preview of the first image stream and the second image stream, a visual indication generated as a layer on the first image stream indicative of the first portion of the scene, as captured by the first camera, associated with the second image stream, in order to easily and rapidly recognize the image capturing ranges of the image capturing units (the first image capturing unit and the second image capturing unit).

[Claim 29] Lee teaches wherein the touch-sensitive display is further configured to receive the first user input for recording only the second image stream during 1) the concurrent preview of the first image stream and the second image stream and 2) the display of the visual indication (Paragraph 173, fig. 10a).

[Claim 30] Lee teaches wherein the touch-sensitive display is further configured to preview the second image stream as a picture-in-picture (PiP) view relative to a preview of the first image stream (fig. 10a shows picture 920 in picture 910 as a PiP).

[Claim 32] Lee teaches wherein the at least one processor is further configured to: generate the first image stream (fig. 9a); generate, during generation of the first image stream, the visual indication and output the first image stream and the visual indication (Paragraph 164, figs. 9a, 10a).
[Claim 33] Lee teaches wherein the touch-sensitive display is further configured to receive a second user input indicative of an adjustment to the portion of the scene, and the at least one processor is further configured to adjust, based on the second user input, the second image stream being output for concurrent preview with the first image stream, wherein the touch-sensitive display is further configured to display a second visual indication based on the adjustment to the second image stream (Paragraphs 187 and 188, Also, as shown in FIG. 11B, a photographing magnification may be adjusted by a touch command of the user for adjusting a location and/or size of the border line 930. In detail, if a user command for touching the border line 930 to stretch the border line 930 to an outer line 1010 is input, the image photographing apparatus 100 may increase the photographing magnification through the camera 120. Also, the image photographing apparatus 100 may transmit a command for increasing the photographing magnification to the image photographing apparatus 200 through the interface 115. The image photographing apparatus 200 may capture an image according to the received photographing magnification).

[Claim 34] Lee teaches wherein the image data of the second image stream, processed for recording, includes a single still image (Paragraph 173).

[Claim 35] Lee teaches wherein the image data of the second image stream, processed for recording, includes a succession of still images (Paragraph 277, In operation S2140, the photographing apparatus 100 displays the generated captured image. In particular, the image photographing apparatus 100 may display the generated captured image as a live view image. Here, if an image capturing command is input through an inputter of the image photographing apparatus 100 or the image photographing apparatus 200, the image photographing apparatus 100 may capture a live view image to display or store the live view image. Live view is a succession of images).

[Claim 36] Lee teaches wherein the touch-sensitive display is further configured to continue to preview the first image stream after receipt of the first user input (Paragraph 277, In operation S2140, the photographing apparatus 100 displays the generated captured image. In particular, the image photographing apparatus 100 may display the generated captured image as a live view image. Here, if an image capturing command is input through an inputter of the image photographing apparatus 100 or the image photographing apparatus 200, the image photographing apparatus 100 may capture a live view image to display or store the live view image).

[Claim 37] Lee teaches a device having a multicamera system (Paragraph 91, fig. 1A-1D, cameras 120-1 and 120-2), the device configured to process and display image data (Paragraph 100, processing operations such as live view generation, image resolution adjustment, scaling, etc.), the device comprising a first camera (120), having a first field of view (FOV), configured to capture a first image stream associated with a first preview of a scene (Paragraph 91, the cameras 120-1 and 120-2 may have different specifications such as types, photographing magnifications, focal distances, view angles, etc. For example, the camera 120-1 may include a wide-angle lens having a relatively short focal distance, and the camera 120-2 may include a telephoto lens having a relatively long focal distance); a second camera (220), having a second FOV that is different than the first FOV, configured to capture a second image stream associated with a second preview of a first portion of the scene, where the second image stream is different from the first image stream (Paragraphs 161 and 162, figs. 8, 9a-9c, A method of displaying a live view through the display 140 will now be described with reference to FIGS. 9A through 9C.
The image processor 130 of the image photographing apparatus 100 that is combined with the image photographing apparatus 200 may generate a live view by using a first image 910 captured by the camera 120 of the image photographing apparatus 100 and a second image 920 captured by the camera 220 of the image photographing apparatus 200 and Paragraph 163, If the lens 121 (see FIG. 8) of the camera 120 of the image photographing apparatus 100 has a short focal distance, and a lens of the camera 220 of the image photographing apparatus 200 has a long focal distance, the image photographing apparatus 100 may overlap the first and second images 910 and 920 to include the second image 920 in a local area of the first image 910 and display the overlapped first and second images 910 and 920 as shown in FIG. 9A), and at least one processor (130) in communication with the first camera and the second camera (Paragraph 162, The image processor 130 of the image photographing apparatus 100 that is combined with the image photographing apparatus 200), the at least one processor being configured to: output, for concurrent preview, the first image stream and the second image stream (fig. 9a and Paragraphs 162 and 163); process a first user input indicative of recording only the second image stream (Paragraph 173); in response to the first user input, process, for recording, image data of the second image stream (Paragraph 173, In detail, as shown in FIG. 10A, if the part inside the border line 930 is touched, the image photographing apparatus 100 may capture the second image 920 to display a captured still image on the display 140 or store the captured image), and a display configured to: concurrently preview the first image stream associated with the first FOV and the second image stream associated with the second FOV (fig. 9a); and display, during the concurrent preview of the first image stream and the second image stream, a visual indication (border line 930, fig. 9a) indicative of the first portion of the scene associated with the second image stream (Paragraph 164, Also, the display 140 may display a border line 930 to display which one of the plurality of cameras 120 and 220 captures the image. The border line 930 is a graphic line indicating an outer border of a local area. Thus, in FIG. 9A, the border line 930 indicates the image taken by the camera 220 of the image photographing apparatus 200).

Lee fails to teach display, during the concurrent preview of the first image stream and the second image stream, a visual indication generated as a layer on the first image stream indicative of the first portion of the scene, as captured by the first camera, associated with the second image stream.

However, Shikata teaches that in FIG. 4, “b” shows a display configuration example when the frame display mode is set. The frame display mode is a display mode of overlapping (layer) and displaying the frame (the additional image) representing the image capturing range of the image generated using the second image capturing unit 112, on the image generated using the first image capturing unit 111. For example, the image generated using the first image capturing unit 111 is displayed in the first image display area 301 in the display unit 172, and the frame 305 representing the image capturing range of the image generated using the second image capturing unit 112 is overlapped and displayed on the image (Paragraph 115).
Therefore, taking the combined teachings of Lee and Shikata, it would have been obvious to one skilled in the art before the effective filing date of the invention to have displayed, during the concurrent preview of the first image stream and the second image stream, a visual indication generated as a layer on the first image stream indicative of the first portion of the scene, as captured by the first camera, associated with the second image stream, in order to easily and rapidly recognize the image capturing ranges of the image capturing units (the first image capturing unit and the second image capturing unit).

[Claims 38, 39, 41-44] These claims are similar to claims 29, 30, 32-35 and are therefore analyzed and rejected based upon claims 29, 30, 32-35, respectively.

[Claim 45] Lee teaches wherein the display is further configured to continue to preview the first image stream after the at least one processor has processed the first user input indicative of the selection for recording only the second image stream (Paragraph 277, In operation S2140, the photographing apparatus 100 displays the generated captured image. In particular, the image photographing apparatus 100 may display the generated captured image as a live view image. Here, if an image capturing command is input through an inputter of the image photographing apparatus 100 or the image photographing apparatus 200, the image photographing apparatus 100 may capture a live view image to display or store the live view image).

Allowable Subject Matter

Claims 46-52 are allowed.
The prior art fails to teach or suggest “a touch-sensitive display configured to concurrently preview the first image stream associated with the first FOV and the second image stream associated with the second FOV; display, during the concurrent preview of the first image stream and the second image stream, a first visual indication indicative of the first portion of the scene associated with the second image stream; receive a second first user input to change a zoom level; and switch to output for preview, based on the second first user input to change the zoom level, the first image stream without the second image stream” in combination with other claimed elements.

Conclusion

Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).

A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any nonprovisional extension fee (37 CFR 1.17(a)) pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the mailing date of this final action.

Any inquiry concerning this communication or earlier communications from the examiner should be directed to YOGESH K AGGARWAL whose telephone number is (571) 272-7360. The examiner can normally be reached Monday - Friday 9:30-6. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool.
To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, Sinh Tran, can be reached at (571) 272-7564. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/YOGESH K AGGARWAL/
Primary Examiner, Art Unit 2637

Prosecution Timeline

Jul 20, 2023
Application Filed
Feb 08, 2024
Non-Final Rejection — §103
May 14, 2024
Response Filed
Sep 10, 2024
Request for Continued Examination
Sep 11, 2024
Response after Non-Final Action
Dec 17, 2024
Request for Continued Examination
Dec 18, 2024
Response after Non-Final Action
Jan 24, 2025
Non-Final Rejection — §103
Apr 11, 2025
Interview Requested
Apr 18, 2025
Applicant Interview (Telephonic)
Apr 19, 2025
Examiner Interview Summary
Apr 25, 2025
Response Filed
Aug 29, 2025
Non-Final Rejection — §103
Dec 03, 2025
Response Filed
Mar 12, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12604079
INFORMATION PROCESSING SYSTEM AND INFORMATION PROCESSING METHOD
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12604100
IMAGE PROCESSING METHOD AND ELECTRONIC DEVICE
Granted Apr 14, 2026 (2y 5m to grant)
Patent 12598265
COOPERATIVE PHOTOGRAPHING METHOD AND APPARATUS, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
Granted Apr 07, 2026 (2y 5m to grant)
Patent 12587735
IMAGING APPARATUS, METHOD FOR CONTROLLING THE SAME, AND STORAGE MEDIUM
Granted Mar 24, 2026 (2y 5m to grant)
Patent 12579842
METHOD FOR ADAPTING THE QUALITY AND/OR FRAME RATE OF A LIVE VIDEO STREAM BASED UPON POSE
Granted Mar 17, 2026 (2y 5m to grant)
Study what changed to get past this examiner. Based on 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 90%
With Interview: 96% (+6.8%)
Median Time to Grant: 2y 7m
PTA Risk: High
Based on 1113 resolved cases by this examiner. Grant probability is derived from the career allow rate.
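The headline figures above follow directly from the raw counts in the Examiner Intelligence section. A minimal sketch in Python (variable names are illustrative; treating the +6.8% interview lift as a relative multiplier on the base allow rate is an assumption, though it does reproduce the displayed 96%):

```python
# Reconstruct the dashboard's headline numbers from the raw career counts.
granted = 998      # career grants ("998 granted / 1113 resolved")
resolved = 1113    # career resolved cases

# Career allow rate, displayed rounded as "90% Grant Probability".
allow_rate = granted / resolved

# Assumption: the +6.8% interview lift scales the base rate multiplicatively,
# matching the displayed "96% With Interview".
interview_lift = 0.068
with_interview = allow_rate * (1 + interview_lift)

print(f"Career allow rate: {allow_rate:.1%}")      # 89.7%
print(f"With interview:    {with_interview:.1%}")  # 95.8%
```

An additive reading (89.7% + 6.8 points = 96.5%) would also round to roughly 96%, so the tool's exact formula cannot be pinned down from the displayed values alone.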
