Prosecution Insights
Last updated: April 19, 2026
Application No. 19/105,580

BLOCK-BASED STRUCTURE FOR HAPTIC DATA

Non-Final OA §103
Filed: Feb 21, 2025
Examiner: SARMA, ABHISHEK
Art Unit: 2621
Tech Center: 2600 — Communications
Assignee: InterDigital CE Patent Holdings SAS
OA Round: 1 (Non-Final)
Grant Probability: 84% (Favorable)
OA Rounds: 1-2
To Grant: 2y 0m
With Interview: 85%

Examiner Intelligence

Career Allow Rate: 84% (above average; 478 granted / 572 resolved; +21.6% vs TC avg)
Interview Lift: +1.6% (minimal), over resolved cases with interview
Avg Prosecution: 2y 0m (fast prosecutor); 18 applications currently pending
Total Applications: 590 (career history, across all art units)

Statute-Specific Performance

§101: 4.4% (-35.6% vs TC avg)
§102: 11.0% (-29.0% vs TC avg)
§103: 73.0% (+33.0% vs TC avg)
§112: 4.8% (-35.2% vs TC avg)
Comparisons are against the Tech Center average estimate. Based on career data from 572 resolved cases.

Office Action

§103
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application is being examined under the pre-AIA first to invent provisions.

In the response to this Office Action, the Examiner respectfully requests that support be shown for language added to any original claims on amendment and any new claims. That is, indicate support for newly added claim language by specifically pointing to page(s) and line numbers in the specification and/or drawing figure(s). This will assist the Examiner in prosecuting this application.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

Claims 1-16 are rejected under 35 U.S.C. 103 as being unpatentable over U.S. Patent Application Publication 2021/0397260 A1 to Birnbaum et al. (hereinafter "Birnbaum") in view of "Text for Working Draft of ISO/IEC 23090-31: Haptics Coding" (hereinafter "Haptics Coding").

Regarding Claims 1 and 5, Birnbaum teaches a method and a device for encoding haptic data of a haptic sequence (Claim 1; Figs. 1A-3; Para. 31-40 of Birnbaum; computing device 1100 may be configured to facilitate the providing of a haptic effect for experiencing the 3D environment by generating a drive signal 1400 for a haptic output device 1210), the device comprising a processor configured for: obtaining binary haptic data representative of haptic effects and metadata describing the haptic effects (Claim 1; Figs. 1A-3; Para. 31-40, 113 of Birnbaum; computing device 1100 may include one or more processors 1110 that are configured to receive media data 1300 that describes aspects of the 3D environment, and may generate the drive signal 1400 based on the media data 1300. The media data 1300 may have, e.g., an omnidirectional media format (OMAF) for allowing the 3D environment to be viewed in multiple directions, or may have some other format… media data may be generated according to the MPEG-I standard… haptic effect may be defined in metadata at a source and rendered at a device on the client side, such as the computing device 1100 or user peripheral device 1200); encoding tracks and the metadata in access units (Claim 1; Figs. 1A-3; Para. 31-40, 113 of Birnbaum; media data may have been created by encoding various sources of data, such as video data, audio data, and haptic data… media data may be generated according to the MPEG-I standard… a device (e.g., computing device 1500) may encapsulate and/or store the media data, which may include the encoded haptic data, audio data, video data, and/or image data, in a file format such as the International Standards Organization Base Media File Format (ISOBMFF)… haptic effect may be defined in metadata at a source and rendered at a device on the client side, such as the computing device 1100 or user peripheral device 1200).

Birnbaum does not explicitly disclose decomposing the haptic effects in temporal events and frequency bands and grouping them in tracks; and encoding the tracks and the metadata, the metadata comprising perception data pointing to a sub-set of tracks and experience data pointing to a sub-set of perception data. However, Haptics Coding teaches decomposing the haptic effects in temporal events and frequency bands and grouping them in tracks (Figures 1-3; Section 3.1 to 3.2, 4.1 to 4.7 of Haptics Coding; haptic data of a track is contained in a set of haptic bands defined by their frequency range); and encoding the tracks and the metadata, the metadata comprising perception data pointing to a sub-set of tracks and experience data pointing to a sub-set of perception data (Figures 1-3, 8-9; Section 3.1 to 3.2, 4.1 to 4.8, 5.1, 6.1 to 6.2 of Haptics Coding; In addition to specific metadata, a perception contains a list of tracks where the data is decomposed in frequency bands. Each band defines part of the signal in a given frequency range. The bands are described with a list of Haptic effects each containing a list of keyframes. The haptic signal in a track can then be reconstructed by combining the data in the different bands as illustrated in Figure 3… haptic signals can be encoded on multiple tracks… a haptic track defines a signal to be rendered at a specific body location. Metadata stored at the track level includes information such as the gain associated to the track, the mixing weight, the desired body location of the haptic feedback and optionally the reference device and/or a direction. Additional information such as the desired sampling frequency or sample count can also be provided).

Therefore, at the time when the invention was filed, it would have been obvious to a person of ordinary skill in the art to include decomposing the haptic effects in temporal events and frequency bands and grouping them in tracks; and encoding the tracks and the metadata, the metadata comprising perception data pointing to a sub-set of tracks and experience data pointing to a sub-set of perception data using the teachings of Haptics Coding in order to modify the device taught by Birnbaum. The motivation to combine these analogous arts would have been to encode both descriptive and quantized data in a human readable JSON format used for exchange purposes and a compressed bitstream version, optimized for memory usage, for distribution purposes (Section 1 of Haptics Coding).

Regarding Claims 9 and 13, Birnbaum teaches a method and a device for decoding haptic data of a haptic sequence (Claim 1; Figs. 1A-4; Para. 31-45 of Birnbaum; computing device 1100 may be configured to facilitate the providing of a haptic effect for experiencing the 3D environment by generating a drive signal 1400 for a haptic output device 1210… decoding and/or rendering of media data may be performed by the computing device 1100 of FIGS. 1A-1C. FIG. 4 illustrates a computing device 4100, which may be an embodiment of the computing device 1100. In this embodiment, the computing device 4100 may include at least one processor 4110 and a memory 4120… modules 4120 a, 4120 b may include instructions which may be executed by the processor 4110 to execute a haptic decoding operation), the device comprising a processor configured for: obtaining a set of tracks and metadata encoded in access units (Claim 1; Figs. 1A-3; Para. 31-42, 113 of Birnbaum; media data may have been created by encoding various sources of data, such as video data, audio data, and haptic data… media data may be generated according to the MPEG-I standard… a device (e.g., computing device 1500) may encapsulate and/or store the media data, which may include the encoded haptic data, audio data, video data, and/or image data, in a file format such as the International Standards Organization Base Media File Format (ISOBMFF)… computing device 1100 may receive and process (e.g., decapsulate) the file F/Fs, such as the ISOBMFF file, which may include file data that encodes the media data. More particularly, the processed file data may include encoded haptic data E′h, encoded audio data E′a, and/or encoded video data E′v. In some implementations, the OMAF player may be configured to decode various encoded data using a codec… haptic effect may be defined in metadata at a source and rendered at a device on the client side, such as the computing device 1100 or user peripheral device 1200).

Birnbaum does not explicitly disclose the metadata comprising perception data pointing to a sub-set of tracks and experience data pointing to a sub-set of perception data; and accessing the experience data in tracks pointed by perception data associated with the experience data. However, Haptics Coding teaches metadata comprising perception data pointing to a sub-set of tracks and experience data pointing to a sub-set of perception data; and accessing the experience data in tracks pointed by perception data associated with the experience data (Figures 1-3, 8-9; Section 3.1 to 3.2, 4.1 to 4.8, 5.1, 6.1 to 6.2, 7.1 to 7.2 of Haptics Coding; haptic data of a track is contained in a set of haptic bands defined by their frequency range… In addition to specific metadata, a perception contains a list of tracks where the data is decomposed in frequency bands. Each band defines part of the signal in a given frequency range. The bands are described with a list of Haptic effects each containing a list of keyframes. The haptic signal in a track can then be reconstructed by combining the data in the different bands as illustrated in Figure 3… haptic signals can be encoded on multiple tracks… a haptic track defines a signal to be rendered at a specific body location. Metadata stored at the track level includes information such as the gain associated to the track, the mixing weight, the desired body location of the haptic feedback and optionally the reference device and/or a direction. Additional information such as the desired sampling frequency or sample count can also be provided).

Therefore, at the time when the invention was filed, it would have been obvious to a person of ordinary skill in the art to include the metadata comprising perception data pointing to a sub-set of tracks and experience data pointing to a sub-set of perception data; and accessing the experience data in tracks pointed by perception data associated with the experience data using the teachings of Haptics Coding in order to modify the device taught by Birnbaum. The motivation to combine these analogous arts would have been to encode both descriptive and quantized data in a human readable JSON format used for exchange purposes and a compressed bitstream version, optimized for memory usage, for distribution purposes (Section 1 of Haptics Coding).

Regarding Claims 6, 10, 14, and 18, the combination of Birnbaum and Haptics Coding teaches that the access units are Network Abstraction Layer units (Figs. 1A-3; Para. 31-40 of Birnbaum; media data may be generated according to the MPEG-I standard).

Regarding Claims 7, 11, 15, and 19, the combination of Birnbaum and Haptics Coding teaches that the access units are structured depending on whether they comprise metadata or band-data (Figs. 1A-3; Para. 31-40 of Birnbaum; media data may be generated according to the MPEG-I standard… Section 6.1 to 6.2 of Haptics Coding; [embedded image media_image1.png omitted]).

Regarding Claims 8, 12, 16, and 20, the combination of Birnbaum and Haptics Coding teaches that the metadata comprises information describing a haptic effect library, avatars or devices (Section 3.2 to 4.6, 6.1 of Haptics Coding; For each haptic perception, metadata information is provided on the modality, the corresponding avatar representation, and technical characteristics of compatible haptic devices… Metadata stored at the track level includes information such as the gain associated to the track, the mixing weight, the desired body location of the haptic feedback and optionally the reference device and/or a direction).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to ABHISHEK SARMA whose telephone number is (571)272-9887. The examiner can normally be reached on Mon - Fri 8:00-5:00. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Amr Awad, can be reached on 571-272-7764. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.

Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/ABHISHEK SARMA/ Primary Examiner, Art Unit 2621
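The rejection turns on the data hierarchy described in the Haptics Coding reference: experience data points to a sub-set of perceptions, a perception points to a sub-set of tracks, and each track's signal is decomposed into frequency bands built from effects and keyframes. A minimal sketch of that hierarchy follows; every class and field name here is an assumption chosen for illustration, not the actual ISO/IEC 23090-31 schema.

```python
from dataclasses import dataclass
from typing import List

# Illustrative model of the experience -> perception -> track -> band
# hierarchy cited in the rejection. All names are assumptions for this
# sketch, not the ISO/IEC 23090-31 ("Haptics Coding") schema.

@dataclass
class Keyframe:
    position: int        # sample index within the effect
    amplitude: float     # normalized amplitude at that position

@dataclass
class Effect:
    keyframes: List[Keyframe]   # each effect carries a list of keyframes

@dataclass
class Band:
    lower_freq_hz: float        # each band covers a given frequency range
    upper_freq_hz: float
    effects: List[Effect]       # bands are described with a list of effects

@dataclass
class Track:
    body_location: str          # desired body location of the feedback
    gain: float                 # track-level metadata per Haptics Coding
    mixing_weight: float
    bands: List[Band]           # the track's signal, decomposed by frequency

@dataclass
class Perception:
    modality: str               # e.g. "vibrotactile"
    tracks: List[Track]         # perception data points to a sub-set of tracks

@dataclass
class Experience:
    perceptions: List[Perception]  # experience data points to perceptions

# Build a one-track, one-band experience and walk the pointers down.
exp = Experience(perceptions=[
    Perception(modality="vibrotactile", tracks=[
        Track(body_location="right palm", gain=1.0, mixing_weight=1.0,
              bands=[Band(lower_freq_hz=0.0, upper_freq_hz=72.5,
                          effects=[Effect(keyframes=[Keyframe(0, 0.5)])])])
    ])
])
print(len(exp.perceptions[0].tracks[0].bands))  # prints 1
```

The point of the sketch is the pointer chain: recovering a playable signal means following experience to perception to its tracks, then summing the per-band contributions, which is the "accessing the experience data in tracks pointed by perception data" step the claims recite.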

Prosecution Timeline

Feb 21, 2025: Application Filed
Jan 24, 2026: Non-Final Rejection, §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602122: INFORMATION HANDLING SYSTEM TOUCH DETECTION DEVICE GROUNDING AND SELF-TEST (granted Apr 14, 2026; 2y 5m to grant)
Patent 12597288: DISPLAY DEVICE (granted Apr 07, 2026; 2y 5m to grant)
Patent 12586256: DATA PROCESSING METHOD AND DATA PROCESSING SYSTEM (granted Mar 24, 2026; 2y 5m to grant)
Patent 12586519: DISPLAY APPARATUS AND METHOD OF MANUFACTURING THE SAME (granted Mar 24, 2026; 2y 5m to grant)
Patent 12579398: FINGERPRINT SENSOR PACKAGE AND SMART CARD INCLUDING THE SAME (granted Mar 17, 2026; 2y 5m to grant)

Study what changed to get past this examiner. Based on the 5 most recent grants.


Prosecution Projections

Expected OA Rounds: 1-2
Grant Probability: 84%
With Interview: 85% (+1.6%)
Median Time to Grant: 2y 0m
PTA Risk: Low

Based on 572 resolved cases by this examiner. Grant probability derived from career allow rate.
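The headline figures are internally consistent. A quick check of the arithmetic, on the assumption that the displayed percentages are simple rounded ratios of the stated career counts:

```python
# Career allow rate: 478 granted out of 572 resolved cases.
granted, resolved = 478, 572
allow_rate = granted / resolved * 100
print(round(allow_rate, 1))        # prints 83.6, displayed as 84%

# With-interview figure: the allow rate plus the stated +1.6% lift.
print(round(allow_rate + 1.6, 1))  # prints 85.2, displayed as 85%
```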
