Prosecution Insights
Last updated: April 19, 2026
Application No. 18/505,001

METHOD OF DISPLAYING PROFILE VIEW ON INSTANT MESSAGING SERVICE

Status: Non-Final OA (§103)
Filed: Nov 08, 2023
Examiner: CADORNA, CHRISTOPHER PALACA
Art Unit: 2444
Tech Center: 2400 — Computer Networks
Assignee: Kakao Corp.
OA Round: 5 (Non-Final)
Grant Probability: 68% (Favorable)
Expected OA Rounds: 5-6
Time to Grant: 3y 3m
Grant Probability with Interview: 89%

Examiner Intelligence

Career Allow Rate: 68% (150 granted / 222 resolved; +9.6% vs TC avg — above average)
Interview Lift: +21.3% among resolved cases with interview
Avg Prosecution: 3y 3m typical timeline; 38 applications currently pending
Career History: 260 total applications across all art units

Statute-Specific Performance

§101: 9.0% (-31.0% vs TC avg)
§102: 16.1% (-23.9% vs TC avg)
§103: 51.7% (+11.7% vs TC avg)
§112: 21.3% (-18.7% vs TC avg)
TC averages are estimates • Based on career data from 222 resolved cases

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Response to Arguments

1. Applicant's arguments have been considered but are moot because the new ground of rejection does not rely on any reference applied in the prior rejection of record for any teaching or matter specifically challenged in the argument; the new ground of rejection relies on Ezzeddine et al. (US 20020165732 A1).

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

2. Claims 1-2, 4, 6-12, and 15-20 are rejected under 35 U.S.C. 103 as being unpatentable over Ham et al. (US 20210118013 A1) in view of Ezzeddine et al. (US 20020165732 A1) and Moon et al. (US 20230064599 A1).

Claim 1

Ham teaches a method of operating a terminal of a first account to interact with a second account through a profile view of the first account, the method comprising: determining a profile item applicable to the profile view for the first account based on an input received by the terminal of the first account (Ham, FIG. 2, steps 210 and 220, ¶0064-¶0066, determining a profile item applicable to the profile view based on user selection; ¶0049, wherein user A selects the set item; ¶0041, via input received from User A's device); wherein the profile item comprises at least one of a first item for at least one touch input-based interaction (Ham, FIG. 6, item 675, ¶0098-¶0099, wherein the profile item comprises a link item; ¶0111, wherein the profile item interfaces are capable of touch-based interaction; however, Examiner notes that "for at least one touch input-based interaction" is an intended use statement that does not have patentable weight) or a second item for slider input-based interaction, and a coordinate indicating a position where the profile item is provided on the profile view (Ham, FIG. 4, ¶0049, wherein the profile item comprises a position where the profile item is displayed on the profile view; Examiner notes that the indicated position would be functionally identical to a coordinate indicating the position); wherein the profile view of the first account includes a profile image and profile information corresponding to the first account (Ham, FIG. 4, ¶0049, wherein the profile view comprises images, i.e. profile images, and information, i.e. profile information); and causing the profile item to be displayed based on the determined profile item and the determined coordinate (Ham, FIG. 6, 670 profile view screen, ¶0098, displaying the profile item on the second user's terminal screen).

However, Ham does not explicitly teach receiving, via a server, a series of inputs interacting with the profile item from a terminal of the second account; identifying a number of inputs in the series of inputs; identifying a visual effect to display based on the number of inputs in the series of inputs; and causing the visual effect corresponding to the number of inputs in the series of inputs to be displayed on a screen of the terminal of the first account and a screen of the terminal of the second account by adding the visual effect to the profile view of the first account.

From a related technology, Ezzeddine teaches receiving, via a server, a series of inputs interacting with the profile item (Ezzeddine, FIG. 30, step 1512, ¶0210, receiving a series of inputs interacting with a profile item in the form of a physician name 71, FIG. 43) from a terminal of the second account (Ezzeddine, FIGs. 38-40, ¶0136, wherein the server receives the inputs from a user account, wherein the user comprises the second account and a physician comprises a first account); identifying a number of inputs in the series of inputs (Ezzeddine, ¶0210, wherein the series of inputs is identified as a double-click, i.e. a number of inputs, specifically two inputs); and identifying a visual effect to display based on the number of inputs in the series of inputs (Ezzeddine, FIG. 30, step 1512, ¶0210, in response to the inputs, displaying a visual effect comprising a "Physician Profile" screen, FIGs. 75-77).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Ham to further collect interactions from secondary profiles in order to better connect and provide information across users as desired.
However, Ham in view of Ezzeddine does not explicitly teach identifying a visual effect to display based on the number of inputs in the series of inputs; and causing the visual effect corresponding to the number of inputs in the series of inputs to be displayed.

From a related technology, Moon teaches identifying a visual effect to display based on the number of inputs in the series of inputs (Moon, FIG. 13, step 1321, ¶0134, identifying a representation based on inputs); and causing the visual effect corresponding to the number of inputs in the series of inputs to be displayed (Moon, FIG. 13, step 1323, ¶0134, providing an image preview corresponding to the series of inputs to be displayed on the screen of the terminal of the first account or the screen of the terminal of the second account).

It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Ham in view of Ezzeddine to incorporate the well-known techniques used by Moon to account for various desired inputs and displays for profile items in order to more effectively utilize network resources according to user desire.

Claim 2

Ham in view of Ezzeddine and Moon teaches Claim 1, and further teaches wherein the profile item further comprises at least one of a third item indicating a number of times the first item is touched by another account or a fourth item indicating a number of views of a profile (Ham, ¶0062, wherein the item comprises the number of views or the number of clicks, i.e. touches, of the profile).

Claim 4

Ham in view of Ezzeddine and Moon teaches Claim 1, and further teaches wherein the profile item includes the first item (Moon, ¶0039, touch-based interaction items) and the series of inputs is a series of consecutive touch inputs (Moon, FIG. 13, step 1311, ¶0134, receiving a series of input(s) related to a profile item); receiving the series of consecutive touch inputs at the coordinate at which the first item is provided (Moon, FIG. 13, step 1311, ¶0134, receiving a series of input(s) related to a profile item, wherein the inputs are received at the profile item, i.e. the coordinates); determining that the series of consecutive touch inputs are received at a regular time interval (Moon, FIG. 13, step 1311, ¶0134, determining the interval of the inputs); and increasing a number of times the first item is touched and displaying the visual effect corresponding to the plurality of consecutive touch inputs on the screen (Ham, ¶0062, wherein the item comprises the number of views or the number of clicks, i.e. touch inputs).

Claim 6

Ham in view of Ezzeddine and Moon teaches Claim 1, and further teaches displaying an emoticon in the first item on at least one coordinate of the profile view (Moon, FIG. 13, step 1323, ¶0134, displaying an emoticon at the profile item, i.e. the coordinate of the profile item).

Claim 7

Ham in view of Ezzeddine and Moon teaches Claim 6, and further teaches wherein a size of the emoticon displayed on the at least one coordinate is determined based on a selected first reference (Moon, ¶0134, wherein the size is determined based on a first reference, being the emotion representation contents), wherein the at least one coordinate is determined based on a selected second reference (Ham, ¶0049, wherein the at least one coordinate is determined based on a second reference, being the user input).

Claim 11

Ham in view of Ezzeddine and Moon teaches Claim 1, wherein the second item comprises at least one of text, a slider bar, a slider pointer capable of receiving a drag input, an image displayed on the slider pointer, or an emoticon displayed on the slider pointer (Ham, FIG. 6, profile item 675, ¶0098-¶0099, wherein the profile item comprises at least text).

Claim 18

Ham in view of Ezzeddine and Moon teaches Claim 1, and further teaches based on a plurality of profiles corresponding to the first account existing: creating profile views corresponding to each of the plurality of profiles (Ham, FIG. 2, steps 210 and 220, ¶0064-¶0066, determining a profile item applicable to the profile view based on user selection for a plurality of profile views); and determining at least one profile item to be displayed for each of the profile views based on an input signal received from the terminal of the first account (Ham, FIG. 6, 670 profile view screen, ¶0098, displaying the profile item on the second user's terminal screen).

Claim 19 is taught by Ham in view of Ezzeddine and Moon as described for Claim 1. Claim 21 is taught by Ham in view of Ezzeddine and Moon as described for Claim 1.

3. Claims 3, 5, and 13-14 are rejected under 35 U.S.C. 103 as being unpatentable over Ham et al. (US 20210118013 A1) in view of Ezzeddine et al. (US 20020165732 A1) and Moon et al. (US 20220094679 A1), and in further view of Baker et al. (US 20210067476 A1).

Claim 3

Ham in view of Ezzeddine and Moon teaches Claim 1, but does not explicitly teach wherein the first item comprises an emoticon indicating emotion. From a related technology, Baker teaches a first item comprising an emoticon indicating emotion (Baker, FIG. 2D, emoticon 270, ¶0037, wherein the item comprises an emoticon). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Ham in view of Ezzeddine and Moon to incorporate well-known visual elements such as emoticons as applied to profiles in the teachings of Baker in order to more effectively facilitate desired user profiles and displays in an efficient manner.
Claim 5

Ham in view of Ezzeddine and Moon teaches Claim 4, but does not explicitly teach wherein the visual effect comprises an effect in which at least one emoticon in the first item is displayed. From a related technology, Baker teaches a visual effect comprising an effect in which at least one emoticon in the first item is displayed (Baker, FIG. 2D, emoticon 270, ¶0037, wherein the item comprises an emoticon). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Ham in view of Ezzeddine and Moon to incorporate well-known visual elements such as emoticons as applied to profiles in the teachings of Baker in order to more effectively facilitate desired user profiles and displays in an efficient manner.

Claim 13

Ham in view of Ezzeddine and Moon teaches Claim 12, but does not explicitly teach wherein the visual effect comprises an effect in which an emoticon is displayed at a selected coordinate on at least one of the screen of the terminal of the first account or the screen of the terminal of the second account. From a related technology, Baker teaches a visual effect comprising an effect in which an emoticon is displayed at a selected coordinate on the screen of the terminal of the first account or the screen of the terminal of the second account (Baker, FIG. 2D, emoticon 270, ¶0037, wherein the item comprises an emoticon, wherein the coordinate is inherently selected along with the emoticon). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Ham in view of Ezzeddine and Moon to incorporate well-known visual elements such as emoticons as applied to profiles in the teachings of Baker in order to more effectively facilitate desired user profiles and displays in an efficient manner.
Claim 14

Ham in view of Ezzeddine, Moon and Baker teaches Claim 13, and further teaches wherein the selected coordinate is a coordinate determined to be displayed in a middle of an x-axis of the screen and a middle of a y-axis of the screen on which the profile view is displayed (Baker, FIG. 2D, emoticon 270, ¶0037, wherein the item comprises an emoticon at the center of the screen, i.e. the middle of an x-axis and a y-axis).

4. Claims 8-10 are rejected under 35 U.S.C. 103 as being unpatentable over Ham et al. (US 20210118013 A1) in view of Ezzeddine et al. (US 20020165732 A1) and Moon et al. (US 20220094679 A1), and in further view of Lee et al. (US 20160259526 A1).

Claim 8

Ham in view of Ezzeddine and Moon teaches Claim 1, but does not explicitly teach moving an emoticon from a first coordinate to a second coordinate that is different from the first coordinate of the profile view based on the series of inputs related to the profile item. From a related technology, Lee teaches moving an emoticon from a first coordinate to a second coordinate that is different from the first coordinate of the profile view based on the series of inputs related to the profile item (Lee, FIG. 8, ¶0030 and ¶0088, moving an emoticon to a second position based on user interactions related to the profile). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Ham in view of Ezzeddine and Moon to incorporate the emoticon handling techniques described in Lee in order to provide a more interactive user experience via an efficient usage of network resources.

Claim 9

Ham in view of Ezzeddine, Moon and Lee teaches Claim 8, and further teaches wherein the first coordinate is positioned above the second coordinate in a vertical direction, x-axis coordinates of the first coordinate and the second coordinate are the same, and the second coordinate is comprised in an area where a profile image is displayed on the profile view (Lee, ¶0088, wherein the movement can comprise a downward movement, i.e. wherein the first coordinate would be above the second coordinate).

Claim 10

Ham in view of Ezzeddine, Moon and Lee teaches Claim 8, and further teaches wherein the moving of the emoticon from the first coordinate to the second coordinate that is different from the first coordinate comprises moving the emoticon from the first coordinate to the second coordinate while changing the emoticon (Lee, ¶0088, wherein the emoticon can be both moved and changed).

5. Claims 12 and 15-17 are rejected under 35 U.S.C. 103 as being unpatentable over Ham et al. (US 20210118013 A1) in view of Ezzeddine et al. (US 20020165732 A1) and Moon et al. (US 20220094679 A1), and in further view of Kumar et al. (US 20210233287 A1).

Claim 12

Ham in view of Ezzeddine and Moon teaches Claim 1, but does not explicitly teach receiving a slider input from the coordinate at which the second item is provided; and causing the visual effect corresponding to the slider input to be displayed on at least one of the screen of the terminal of the first account or the screen of the terminal of the second account. From a related technology, Kumar teaches receiving a slider input from the coordinate at which the second item is provided (Kumar, ¶0062, a slider input); and causing the visual effect corresponding to the slider input to be displayed on at least one of the screen of the terminal of the first account or the screen of the terminal of the second account (Kumar, ¶0062, creating a visual effect based on the slider input). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Ham in view of Ezzeddine and Moon to further incorporate the teachings of Kumar to utilize a well-known interface option, such as a slider, to provide greater user input controls to efficiently utilize network resources.
Claim 15

Ham in view of Ezzeddine, Moon and Kumar teaches Claim 12, and further teaches, based on the slider input including an input moving in a first direction (Kumar, ¶0062, wherein the slider input is received in at least a first direction): displaying an emoticon corresponding to the first direction at a selected coordinate on at least one of the screen of the first account or the screen of the second account (Kumar, ¶0062, displaying the emoticon at the corresponding coordinate); and maintaining a horizontal width of a slider bar and increasing a vertical width of the slider bar as a slider pointer moves in the first direction (Kumar, ¶0062, altering the size of the image based on the slider input).

Claim 16

Ham in view of Ezzeddine, Moon and Kumar teaches Claim 12, and further teaches determining a size of an emoticon displayed on the screen based on a degree of movement of a slider pointer (Kumar, ¶0062, altering the size of the image based on the slider input).

Claim 17

Ham in view of Ezzeddine, Moon and Kumar teaches Claim 12, and further teaches displaying profile images of the other account on a slider pointer (Kumar, ¶0062, displaying images related to the account based on the slider).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to CHRISTOPHER PALACA CADORNA, whose telephone number is (571) 270-0584. The examiner can normally be reached M-F 10:00-7:00. Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, John Follansbee, can be reached at (571) 272-3964. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.

/CHRISTOPHER P CADORNA/
Examiner, Art Unit 2444

/JOHN A FOLLANSBEE/
Supervisory Patent Examiner, Art Unit 2444

Prosecution Timeline

Nov 08, 2023
Application Filed
Aug 16, 2024
Non-Final Rejection — §103
Nov 08, 2024
Interview Requested
Nov 14, 2024
Examiner Interview Summary
Nov 14, 2024
Applicant Interview (Telephonic)
Nov 26, 2024
Response Filed
Dec 28, 2024
Final Rejection — §103
Mar 03, 2025
Interview Requested
Mar 11, 2025
Examiner Interview Summary
Mar 11, 2025
Applicant Interview (Telephonic)
Mar 31, 2025
Request for Continued Examination
Apr 07, 2025
Response after Non-Final Action
Apr 10, 2025
Non-Final Rejection — §103
Jul 10, 2025
Examiner Interview Summary
Jul 10, 2025
Applicant Interview (Telephonic)
Jul 15, 2025
Response Filed
Oct 18, 2025
Final Rejection — §103
Dec 22, 2025
Response after Non-Final Action
Jan 22, 2026
Request for Continued Examination
Jan 29, 2026
Response after Non-Final Action
Feb 21, 2026
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12563123
METHOD, APPARATUS, AND COMPUTER PROGRAM PRODUCT FOR ENLARGING USAGE OF USER CATEGORY WITHIN A CORE NETWORK
2y 5m to grant • Granted Feb 24, 2026
Patent 12541244
OBTAINING LOCATION METADATA FOR NETWORK DEVICES USING AUGMENTED REALITY
2y 5m to grant • Granted Feb 03, 2026
Patent 12537878
NEEDS-MATCHING NAVIGATOR SYSTEM
2y 5m to grant • Granted Jan 27, 2026
Patent 12531762
Smart Energy Hub
2y 5m to grant • Granted Jan 20, 2026
Patent 12513109
IPV6 ADDRESS CONFIGURATION METHOD AND ROUTING DEVICE
2y 5m to grant • Granted Dec 30, 2025
Study what changed to get past this examiner, based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 5-6
Grant Probability: 68%
With Interview: 89% (+21.3%)
Median Time to Grant: 3y 3m
PTA Risk: High
Based on 222 resolved cases by this examiner. Grant probability derived from career allow rate.
