Prosecution Insights
Last updated: April 19, 2026
Application No. 18/569,340

SHARING OF EXPERIENCE WITHOUT COMMUNICATION OF DATA OR KNOWLEDGE

Non-Final OA (§103)
Filed: Dec 12, 2023
Examiner: CHEN, GEORGE YUNG CHIEH
Art Unit: 3628
Tech Center: 3600 (Transportation & Electronic Commerce)
Assignee: Smiths Detection France S.A.S.
OA Round: 3 (Non-Final)
Grant Probability: 48% (Moderate)
Expected OA Rounds: 3-4
Time to Grant: 4y 4m
Grant Probability With Interview: 83%

Examiner Intelligence

Career Allow Rate: 48% (208 granted / 435 resolved; -4.2% vs TC avg)
Interview Lift: +35.1% (strong) for resolved cases with an interview
Avg Prosecution: 4y 4m (33 applications currently pending)
Total Applications: 468 across all art units

Statute-Specific Performance

§101: 30.8% (-9.2% vs TC avg)
§103: 40.8% (+0.8% vs TC avg)
§102: 10.5% (-29.5% vs TC avg)
§112: 13.1% (-26.9% vs TC avg)
Tech Center average is an estimate • Based on career data from 435 resolved cases
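The per-statute deltas above are each "allow rate minus Tech Center average." As a quick consistency check (an editorial sketch, not output of the analytics tool), recovering the implied TC average from each rate/delta pair shows the chart used a single baseline:

```python
# Each delta on this page is (examiner allow rate) - (TC average).
# Back out the implied TC average per statute; the values are taken
# verbatim from the table above.

rates  = {"101": 30.8, "103": 40.8, "102": 10.5, "112": 13.1}
deltas = {"101": -9.2, "103": +0.8, "102": -29.5, "112": -26.9}

implied_tc_avg = {s: round(rates[s] - deltas[s], 1) for s in rates}
print(implied_tc_avg)  # every statute implies the same 40.0% baseline
```

All four statutes back out to the same 40.0%, confirming the "Tech Center average" shown is one estimated baseline rather than a per-statute figure.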

Office Action

§103
Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

DETAILED ACTION

This communication is a non-final action in response to the RCE filed on 11/12/2025. Claims 1-19 are pending.

A request for continued examination under 37 CFR 1.114, including the fee set forth in 37 CFR 1.17(e), was filed in this application after final rejection. Since this application is eligible for continued examination under 37 CFR 1.114, and the fee set forth in 37 CFR 1.17(e) has been timely paid, the finality of the previous Office action has been withdrawn pursuant to 37 CFR 1.114. Applicant's submission filed on 11/12/2025 has been entered.

Response to Arguments

Applicant's arguments directed to the 112(b) rejection are persuasive. The 112(b) rejection has been withdrawn.

Applicant's arguments directed to the 103 rejection have been fully considered but are not persuasive. Applicant argues on page 8 that Kushnir's description of a filter in 0061 would mean some data is transferred to the global learner, as opposed to the claimed feature that all data transferred to the global learner undergoes a transformation to ensure that none of the underlying cargo data is transferred. Examiner agrees that 0061's description shows some data being transferred to the global learner. However, Fig. 9 demonstrates a different embodiment that teaches the limitation. In Fig. 9, it is worth noting that data subsets are only transmitted between the sensor and the local monitor; the local monitor sends only the local model to the global monitor. This is explained in further detail in 0078, where data received from the sensors is stored in database 620. The local learner also uses the data to modify local model 614. Subsequently, in 0079, the transmission to the global monitor is of local model 614, without reference to the data stored in 620. Furthermore, 0080 describes the global monitor receiving local models 614 and then combining these local models.
Therefore, in this embodiment, sensor data is not transmitted to the global learner or global monitor. This is also shown in Fig. 6, where database 620 (used to store sensor data) is within local monitor 530; the global monitor (shown in Fig. 2) has its own database 220, which is separate. Figs. 9 and 5 also show the distinction between the local monitor and the global monitor. Converting data into models would be encoding, as data is transferred from one form to another. Therefore, Applicant's argument is not persuasive because Kushnir teaches the limitation in 0078-0080 and Fig. 9; additionally, Figs. 2 and 5-6 are useful for understanding the context. Examiner notes that Applicant's specification has examples of encoding, such as neural weights or gradients, which could potentially be recited by amendment to overcome Kushnir. Applicant's additional arguments are based on similarity and dependency; please refer to the discussion above.

Claim Rejections - 35 USC § 103

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis (i.e., changing from AIA to pre-AIA) for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains.
Patentability shall not be negated by the manner in which the invention was made.

The factual inquiries for establishing a background for determining obviousness under 35 U.S.C. 103 are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

Claims 1-4, 6-12, and 18-19 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Kushnir (US 20190182122) in view of Suresh (US 20200050893).

As per claim 1, Kushnir discloses a computer-implemented method for sharing, between a plurality of entities managing cargo, experience of cargo data processing by the plurality of entities, each entity being separate from each other in the plurality of entities, the method comprising: a local learner of each entity building a local model of the cargo data processing (see at least Kushnir, 0066, "A local monitor 530 … uses active learning to build a local model for a data analytics service based on the data"); a global learner, separate from the plurality of entities, obtaining at least a first relevant part of the respective local models built by each of the respective local learners, wherein the at least a first relevant part of the local model is encoded prior to being obtained by the global learner to ensure that no cargo data based on which the cargo data processing is performed is transferred to the global learner (see at least Kushnir, 0067, "Local monitor 530 … will report a local model to global monitor 130. Global monitor 130 may then combine the local models to generate the global model for the data analytics service." See also 0078-0080 and Fig. 9: the local learner receives data from the sensor, stores the data in database 620, updates the local model, and transmits the local model to the global learner. Additionally, see Figs. 2 and 5-6 for context regarding the relative locations of the local monitor, the global monitor, and database 620, which stores the sensor data); the global learner building a global model of the cargo data processing; the local learner of each entity obtaining at least a second relevant part of the global model built by the global learner (see at least Kushnir, 0088, "Global monitor 130 then sends the global policy to local monitors 530."); and the local learner of each entity outputting data about the cargo data processing such that experience of the cargo data processing is shared (Examiner notes that "such that" introduces intended-use language, and Kushnir's model may send data to the global monitor directly without the need to send data to other local monitors).

Kushnir discloses generally using sensor data but does not explicitly disclose that the sensor data is related to cargo. Suresh teaches using sensors on cargo ships (see at least Suresh, 0070, "objects may be … cargo ship …"; see also 0112, "The sensor system may be disposed on the maritime vessel and may be used to collect data used to navigate the maritime vessel."). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Suresh's system of placing sensors on cargo ships with Kushnir's global/local learning model for the purpose of implementing Kushnir's system on a cargo ship.

As per claim 2, Kushnir further discloses the method of claim 1, repeated periodically (see at least Kushnir, 0063, "steps 304-312 … may be repeated"; see also 0079, "local learner 612 may send local model 614 to global monitor 130 periodically").

As per claim 3, Kushnir further discloses the method of claim 1, wherein the at least a first relevant part of the respective local model comprises the whole of the respective local model (see at least Kushnir, 0079, "local learner 612 may send local model 614 to global monitor 130 periodically").
As per claim 4, Kushnir further discloses the method of claim 1, wherein building the global model of the cargo data processing comprises fusing the respective first relevant parts of the respective local models (see at least Kushnir, 0080, "Global learner 212 then combines local models 614 to build or adjust a global model …").

As per claim 6, Kushnir further discloses the method of claim 1, wherein the local learner of each entity obtaining the at least a second relevant part of the global model further comprises updating the local model based on the at least a second relevant part of the global model (see at least Kushnir, 0078, "data is targeted … according to the local policy… local learner 612 … adjust local model 614 based on data targeted …"; see also 0088, cited above, showing that the local policy is based on the global policy and how the global policy relates to the claimed second relevant part).

As per claim 7, Kushnir further discloses the method of claim 1, wherein each local learner comprises a machine learning algorithm, such as a neural network, running on a computer comprising a memory and a processor (see at least Kushnir, 0074, "…local learner 612 in each local monitor 530 has performed a training phase, testing phase, etc., to build local model 614 for the data analytics service". Examiner notes that the "such as" clause does not limit the scope of the claim).

As per claim 8, Kushnir does not, but Suresh teaches, the method of claim 7, wherein the machine learning algorithm comprises at least one of: deep learning, KMeans, or Federated Forests (see at least Suresh, 0112, "… a sensor system and a processor for deep learning …"). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Suresh's deep learning algorithm with Kushnir's global/local learning model for the purpose of implementing Kushnir's system on a cargo ship.
As per claim 9, Kushnir further discloses the method of claim 1, wherein the global learner comprises a machine learning algorithm, such as a neural network, running on a computer comprising a memory and a processor (see at least Kushnir, 0080, "Global learner 212 may use ensemble learning techniques …").

As per claim 10, Kushnir does not, but Suresh teaches, the method of claim 9, wherein the machine learning algorithm comprises at least one of: deep learning, KMeans, or Federated Forests (see at least Suresh, 0112, "… a sensor system and a processor for deep learning …"). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Suresh's deep learning algorithm with Kushnir's global/local learning model for the purpose of implementing Kushnir's system on a cargo ship.

As per claim 11, Kushnir further discloses the method of claim 1, wherein the at least first relevant part of the local model is at least one of: encoded before being transferred to the global learner as parameters, such as neural weights, of the local model (see at least Kushnir, 0061, filtering/blocking data that is not authorized for transmission. Examiner notes that "neural weights" is exemplary language that does not limit the scope); and/or encoded before being transferred to the global learner as gradients of the parameters of the local model.

As per claim 12, Kushnir further discloses the method of claim 1, wherein the at least a second relevant part of the global model is processed before being transferred to the respective local learners to be adapted to the needs of the respective local learners (see at least Kushnir, 0087, "… global learner X assesses uncertainty in a global ensemble model (F_1, . . . , F_k), and sends a global policy P to the local learners"; see also 0084, "Global learner 212 combines the different local models F_i to form a single global model F". Examiner notes that assessing is a form of processing).
As per claims 18-19, they contain limitations substantially similar to claim 1 except for the computer components, which are taught by Kushnir (see Figs. 2 and 6 for the computer hardware/software arrangement). Therefore, they are rejected under a similar rationale as above.

Claim 5 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Kushnir (US 20190182122) in view of Suresh (US 20200050893), further in view of Chen (CN 110264116).

As per claim 5, Kushnir/Suresh discloses the method of claim 4, but does not explicitly disclose wherein fusing the respective first relevant parts of the respective local models comprises juxtaposing the respective first relevant parts of the respective local models. Chen teaches juxtaposing different models to create a more efficient training model (see at least Chen, page 6, paragraph 5, "… combining integrated learning and constructing a series of juxtaposed RT, to form integrated learning framework … using the feature selection of efficient sample set to train and update the integrated model"). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Chen's juxtaposing method with Kushnir's global/local learning model for the purpose of producing an efficient training set (Chen: page 6, paragraph 5).

Claims 13-16 are rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Kushnir (US 20190182122) in view of Suresh (US 20200050893), further in view of Saydag (US 20200193196).
As per claim 13, Kushnir/Suresh does not, but Saydag teaches, the method of claim 1, wherein the processing of the cargo data comprises assessing risks associated with the cargo managed by each entity in the plurality of entities (see at least Saydag, 0053, "… if a malfunction in the system 10 is detected or suspected, information from the secondary shortrange sensor 24 and captured images from the optical sensor 22 can be compared to ascertain what may be causing the system malfunction, such as a mis-mounted or failed sensor."; see also 0051, "…it also gathers associated images …for use by the machine learning system 62". Examiner notes that a malfunction is an associated risk). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Saydag's risk management application with Kushnir's global/local learning model for the purpose of improving profitability (Saydag: 0003).

As per claim 14, Kushnir/Suresh does not, but Saydag further teaches, the method of claim 13, wherein assessing the risks triggers: selecting pieces of cargo in a flux of cargo for further examination, such as x-ray scanning or manual inspection of the selected cargo (see at least Saydag, 0053, "… if a malfunction in the system 10 is detected or suspected, information from the secondary shortrange sensor 24 and captured images from the optical sensor 22 can be compared to ascertain what may be causing the system malfunction, such as a mis-mounted or failed sensor." Examiner notes that the "such as" clause does not limit the scope of the claim). The rationale to combine carries over from claim 13.
As per claim 15, Kushnir/Suresh does not, but Saydag teaches, the method of claim 1, wherein the processing of the cargo data comprises: automatically detecting objects of interest in images of the cargo inspected by each entity in the plurality of entities, the objects of interest comprising objects such as threats or smuggled goods (see at least Saydag, 0053, "… if a malfunction in the system 10 is detected or suspected, information from the secondary shortrange sensor 24 and captured images from the optical sensor 22 can be compared to ascertain what may be causing the system malfunction, such as a mis-mounted or failed sensor." Examiner notes that the "such as" clause does not limit the scope of the claim); optionally further comprising: selecting pieces of cargo based on the automatic detection (Examiner notes that an optional limitation does not need to be part of the claimed scope). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Saydag's risk management application with Kushnir's global/local learning model for the purpose of improving profitability (Saydag: 0003).

As per claim 16, Kushnir/Suresh does not, but Saydag teaches, the method of claim 15, wherein the selecting of the cargo further comprises an operator in the entity creating an annotation associated with the selected cargo (see at least Saydag, 0042, "For example, if the customer determines that the trailer has been damaged within a certain period of time, the customer can command the cargo sensor module 20 to upload images that were captured at about the same time that they believe the trailer was damaged, and collect information from the tracker unit 40 indicating who was in control of the trailer when it was damaged, and potentially indicating how it was damaged.").
Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Saydag's customer review/documentation method with Kushnir's global/local learning model for the purpose of documenting information relevant to the trailer (Saydag: 0042).

Claim 17 is rejected under pre-AIA 35 U.S.C. 103(a) as being unpatentable over Kushnir (US 20190182122) in view of Suresh (US 20200050893), further in view of Hamm (US 20170206498).

As per claim 17, Kushnir/Suresh does not, but Hamm teaches, the method of claim 1, wherein at least one entity managing cargo comprises a customs organization (see at least Hamm, 0104, "if other shipments have sensor devices 102a-102n that monitor the package through customs, historical monitoring data may be used to estimate the duration of this period."). Therefore, it would have been obvious to one of ordinary skill in the art before the effective filing date of the present invention to combine Hamm's customs organization with Kushnir's global/local learning model for the purpose of estimating the duration to clear customs (Hamm: 0104).

Conclusion

Any inquiry concerning this communication or earlier communications from the examiner should be directed to GEORGE CHEN, whose telephone number is (571) 270-5499. The examiner can normally be reached Monday-Friday, 8:30 AM - 5:00 PM Eastern. Examiner interviews are available via telephone, in person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Resha Desai, can be reached at 571-270-7792. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (in USA or Canada) or 571-272-1000.

/GEORGE CHEN/
Primary Examiner, Art Unit 3628
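The data flow the examiner reads onto Kushnir's Fig. 9 embodiment (each local learner trains on its own sensor/cargo data, only the local model travels to the global learner, and the global learner combines the local models) is the basic federated-learning pattern, with simple averaging as the most common combiner. The sketch below is an editorial illustration of that pattern under toy assumptions; the function names, "weights," and data are invented here, not drawn from Kushnir or the claims:

```python
# Editorial sketch (not from the record) of the federated exchange the
# rejection attributes to Kushnir Fig. 9: raw data stays with each local
# learner; only model parameters ("weights") are reported upward, and the
# global learner combines them by averaging.

def train_local_model(local_data):
    # Stand-in for local training: reduce private records to two toy
    # "weights" (mean of the readings, and the sample count).
    n = len(local_data)
    return [sum(local_data) / n, float(n)]

def global_aggregate(local_models):
    # The global learner sees only local models, never the raw data.
    k = len(local_models)
    dim = len(local_models[0])
    return [sum(m[i] for m in local_models) / k for i in range(dim)]

# Three separate entities, each holding private sensor readings.
entity_data = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0]]
local_models = [train_local_model(d) for d in entity_data]  # data stays local
global_model = global_aggregate(local_models)               # only weights sent
```

This is also why the examiner's amendment hint matters: reciting encoding specifically as neural weights or gradients, as the Office Action suggests the specification supports, would tie the claim to exactly this parameter-only transfer.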

Prosecution Timeline

Dec 12, 2023
Application Filed
Dec 27, 2024
Non-Final Rejection — §103
Jun 02, 2025
Response Filed
Aug 08, 2025
Final Rejection — §103
Nov 12, 2025
Request for Continued Examination
Nov 18, 2025
Response after Non-Final Action
Nov 21, 2025
Non-Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12586013
SYSTEMS AND METHODS FOR JOINT OPTIMIZATION OF ASSIGNMENTS
2y 5m to grant • Granted Mar 24, 2026
Patent 12579503
NEURAL NETWORKS TO GENERATE RELIABILITY SCORES
2y 5m to grant • Granted Mar 17, 2026
Patent 12561699
SUSTAINABLE ENTERPRISE ENERGY BALANCING
2y 5m to grant • Granted Feb 24, 2026
Patent 12547135
OPTIMIZATION METHOD AND SYSTEM FOR TURBINES OF THERMAL POWER UNIT BASED ON SPARSE BIG DATA MINING
2y 5m to grant • Granted Feb 10, 2026
Patent 12518328
SYSTEMS AND METHODS FOR GENERATING IMAGE-BASED PROPERTY RECOMMENDATIONS AND IMPROVED GRAPHICAL USER INTERFACES
2y 5m to grant • Granted Jan 06, 2026
Based on this examiner's 5 most recent grants; study what changed in those cases to get past this examiner.


Prosecution Projections

Expected OA Rounds: 3-4
Grant Probability: 48%
With Interview: 83% (+35.1%)
Median Time to Grant: 4y 4m
PTA Risk: High
Based on 435 resolved cases by this examiner. Grant probability is derived from the career allow rate.
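The footnote above says the grant probability is derived from the career allow rate. The headline figures are reproducible with one line of arithmetic each; this is a sketch of the apparent derivation from the page's own numbers, not the tool's documented formula:

```python
# Reproduce the page's headline projections from its raw counts
# (208 granted of 435 resolved, +35.1 point interview lift).
# This mirrors the page's stated derivation; it is an editorial check,
# not the analytics vendor's published methodology.

granted, resolved = 208, 435
career_allow_rate = granted / resolved               # ~0.478, shown as 48%
interview_lift = 0.351                               # +35.1 percentage points
with_interview = career_allow_rate + interview_lift  # ~0.829, shown as 83%

print(round(career_allow_rate * 100))  # 48
print(round(with_interview * 100))     # 83
```

Note the lift is applied as a flat additive percentage-point shift, which matches 48% + 35.1 points rounding to the 83% shown.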
