Prosecution Insights
Last updated: April 19, 2026
Application No. 18/495,108

IDENTIFICATION BASED ON A LOCATION OF A MOBILE COMMUNICATIONS DEVICE

Final Rejection (§103)

Filed: Oct 26, 2023
Examiner: DANG, PHILIP
Art Unit: 2488
Tech Center: 2400 — Computer Networks
Assignee: Cellxion Ltd.
OA Round: 2 (Final)

Grant Probability: 77% (Favorable)
Predicted OA Rounds: 3-4
Predicted Time to Grant: 2y 10m
Grant Probability With Interview: 99%

Examiner Intelligence

Career Allow Rate: 77% (363 granted / 470 resolved; +19.2% vs TC avg), above average
Interview Lift: +33.2% on resolved cases with interview (strong)
Typical Timeline: 2y 10m avg prosecution; 49 currently pending
Career History: 519 total applications across all art units
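The lift figure reads as a percentage-point difference. A minimal sketch of that computation, assuming (this is an assumption, not stated on this page) that "interview lift" is the allowance rate of resolved cases with an examiner interview minus the rate without:

```python
# Sketch of an interview-lift computation. The 290/180 with/without split
# below is invented; only the 470-resolved / 363-granted totals appear on
# this page, so the sketch reproduces +33.2% only approximately.
def interview_lift(allowed_with: int, resolved_with: int,
                   allowed_without: int, resolved_without: int) -> float:
    rate_with = allowed_with / resolved_with           # 261/290 = 90.0%
    rate_without = allowed_without / resolved_without  # 102/180 = 56.7%
    return 100.0 * (rate_with - rate_without)          # difference in points

print(f"{interview_lift(261, 290, 102, 180):+.1f} points")  # -> +33.3 points
```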

Statute-Specific Performance

§101: 4.5% (-35.5% vs TC avg)
§103: 48.6% (+8.6% vs TC avg)
§102: 11.1% (-28.9% vs TC avg)
§112: 25.5% (-14.5% vs TC avg)

TC averages are estimates, based on career data from 470 resolved cases.
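As a quick consistency check (pure arithmetic on the figures above; nothing here comes from the underlying analytics source), each row's Tech Center baseline can be backed out from the rate and its delta:

```python
# Back out the implied Tech Center baseline for each statute from the
# (rate, delta vs TC avg) pairs shown in the table above.
rows = {"101": (4.5, -35.5), "103": (48.6, +8.6),
        "102": (11.1, -28.9), "112": (25.5, -14.5)}
for statute, (rate, delta) in rows.items():
    baseline = rate - delta  # TC average estimate = examiner rate - delta
    print(f"S{statute}: {rate:5.1f}% vs TC avg ~{baseline:.1f}%")
# Note: all four rows imply the same ~40.0% baseline estimate.
```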

Office Action (Final Rejection under §103)
DETAILED ACTION

Notice of Pre-AIA or AIA Status

The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.

Applicant Response to Official Action

The response filed on 1/15/2026 has been entered and made of record. Claims 1, 9, 11-12, 14, and 16-19, amended on 1/15/2026, are acknowledged by the examiner.

Response to Arguments

Applicant's arguments with respect to claims 1, 14, and their dependent claims have been considered but are moot in view of the new grounds of rejection necessitated by amendments initiated by the applicant. The examiner addresses the Applicant's main arguments as follows.

Regarding the drawing objection, the amendment filed on 1/15/2026 addresses the issue. As a result, the drawing objection is withdrawn.

Regarding the 35 U.S.C. 112(a) rejection, the amendment filed on 1/15/2026 addresses the issue. As a result, the 35 U.S.C. 112(a) rejection is withdrawn.

Regarding the 35 U.S.C. 112(b) rejection, the amendment filed on 1/15/2026 addresses the issue. As a result, the 35 U.S.C. 112(b) rejection is withdrawn.

Claim Rejections - 35 USC § 103

The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:

A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.

In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.

The factual inquiries set forth in Graham v. John Deere Co., 383 U.S. 1, 148 USPQ 459 (1966), that are applied for establishing a background for determining obviousness under pre-AIA 35 U.S.C. 103(a) are summarized as follows:
1. Determining the scope and contents of the prior art.
2. Ascertaining the differences between the prior art and the claims at issue.
3. Resolving the level of ordinary skill in the pertinent art.
4. Considering objective evidence present in the application indicating obviousness or nonobviousness.

This application currently names joint inventors. In considering patentability of the claims under pre-AIA 35 U.S.C. 103(a), the examiner presumes that the subject matter of the various claims was commonly owned at the time any inventions covered therein were made absent any evidence to the contrary. Applicant is advised of the obligation under 37 CFR 1.56 to point out the inventor and invention dates of each claim that was not commonly owned at the time a later invention was made in order for the examiner to consider the applicability of pre-AIA 35 U.S.C. 103(c) and potential pre-AIA 35 U.S.C. 102(e), (f) or (g) prior art under pre-AIA 35 U.S.C. 103(a).

Claims 1-2, 4-8, 10-14, 16, and 18-20 are rejected under 35 U.S.C. 103 as being unpatentable over Bergdale et al.
(US Patent 11,803,784 B2) ("Bergdale") in view of Lamb et al. (US Patent 9,699,431 B2) ("Lamb").

Regarding claim 1, Bergdale meets the claim limitations as follows: A system comprising (i.e. a system) [Bergdale: Abstract]: a transmitter (i.e. a transmitter) [Bergdale: col. 50, line 7] arranged to transmit a beacon signal (i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20] for reception by a mobile communications device (i.e. the mobile device 17 may receive a Bluetooth beacon signal) [Bergdale: col. 9, line 18]; a timing entity (i.e. a positioning engine may timestamp the 2D location data) [Bergdale: col. 23, line 54-55] communicable with the mobile communications device via a base station apparatus (i.e. Referring now to FIG. 11, the system components in the operating configuration 136 may primarily include a Bluetooth gateway (such as a BLE gateway) 138; a positioning unit comprising a first set of device locators 140, a second set of device locators 142, and a positioning engine 144; at least one camera such as a three-dimensional (3D) camera 146; a gateless entry controller 148; and a router such as an Ethernet router 150. In particular embodiments, the operating configuration 136 also may optionally include at least one BLE (or other type of Bluetooth) wake-up beacon 152 and a database 154. In one embodiment, the router 150 may be connected to or operatively/communicatively coupled to the system components 138, 140, 142, 144, 146, and 148 via respective wired connections, as shown by unbroken, bidirectional arrows in FIG. 11. In particular embodiments, some or all of these wired connections may be Ethernet connections) [Bergdale: col. 23, line 50-66; Fig. 11], the timing entity being arranged to: receive a plurality of measurement reports from the mobile communications device (i.e. Referring now to FIGS. 7-8, the "Controller Messages" are the messages sent between the user app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12. The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28] via the base station apparatus (i.e. Referring now to FIG. 11, the system components in the operating configuration 136 may primarily include a Bluetooth gateway (such as a BLE gateway) 138; a positioning unit comprising a first set of device locators 140, a second set of device locators 142, and a positioning engine 144; at least one camera such as a three-dimensional (3D) camera 146; a gateless entry controller 148; and a router such as an Ethernet router 150. In particular embodiments, the operating configuration 136 also may optionally include at least one BLE (or other type of Bluetooth) wake-up beacon 152 and a database 154. In one embodiment, the router 150 may be connected to or operatively/communicatively coupled to the system components 138, 140, 142, 144, 146, and 148 via respective wired connections, as shown by unbroken, bidirectional arrows in FIG. 11. In particular embodiments, some or all of these wired connections may be Ethernet connections) [Bergdale: col. 23, line 50-66; Fig. 11]; and determine a capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] in dependence on the plurality of measurement reports (i.e. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12] indicating a relative peak measurement (i.e. the time period, for example, peak time) [Bergdale: col. 19, line 41], among the plurality of measurement reports (i.e. Referring now to FIGS. 7-8, the "Controller Messages" are the messages sent between the user app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12. The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28], of the beacon signal (i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20] by the mobile communications device (i.e. Referring now to FIGS. 7-8, the "Controller Messages" are the messages sent between the user app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12. The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28]; and a camera (i.e. a camera) [Bergdale: col. 51, line 3] arranged to capture an image (i.e. camera images) [Bergdale: col. 50, line 67] at the determined capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35], for use in identifying a person or vehicle carrying the mobile communications device (i.e. In particular embodiments, potential database fields may include some or all of the following: (i) a data field for unique transit vehicle/station identifier, (ii) a data field for transit vehicle stop/route information including station name and Global Positioning System (GPS) location, (iii) a data field for configuration information for each transit vehicle-based BLE gateway, (iv) a data field for the controller driver 14 configuration information, and (v) a data field for 3D camera configuration information. These database fields are exemplary only. In other embodiments, depending on the implementation of the gateless entry/exit system, the data fields may be more than, less than, or different from those listed above.) [Bergdale: col. 26, line 22-34; Figs. 11-12], wherein the camera is positioned relative to the transmitter such that the image captured at the determined capture time contains the person or vehicle (i.e. Referring again to FIGS. 11-12, a brief description of the exemplary hardware features of the system components shown therein is provided. In particular embodiments, the BLE wake-up beacon 152 may be functionally similar to the wake-up beacon 62 in FIG. 5 and, hence, additional discussion of the hardware features of the wake-up beacon 152 is not provided. Briefly, the wake-up beacon 152 may be a connectionless (wireless) BLE beacon that advertises data to indicate to a mobile device with the user app 12, such as the mobile device 17, that the user 163 of the mobile device is approaching a hands-free ticketing platform that has automatic fare validation and gateless entry/exit. Similarly, the 3D camera(s) 146 may be functionally substantially similar to one or more of the "people counting devices" 67-68 and, hence, the hardware features of the 3D camera(s) 146 are not discussed in further details here. In some embodiments, however, the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 17-35; Figs. 5-6, 11-12]. Bergdale does not explicitly disclose the following claim limitation (Emphasis added): wherein the camera is positioned relative to the transmitter. However, in the same field of endeavor, Lamb discloses the deficient claim limitation, as follows: wherein the camera is positioned relative to the transmitter (the position of an IR transmitter (i.e., IR beacon) within the field of view of the camera, etc.) [Lamb: col. 25, line 22-23]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bergdale with Lamb to program the system to implement Lamb's method. Therefore, the combination of Bergdale with Lamb will enable the system to substantially support real-time collection of audio-visual data, thereby reducing the time and special equipment needed for post-processing of such disparate streams of data into a single viewable audio-visual file [Lamb: col. 2, line 41-47].

Regarding claim 2, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follows: an image processor arranged to process the captured image to generate identification data for identifying the person or vehicle (i.e. In another embodiment, the present disclosure is directed to a control unit associated with a transit system, discussed in more detail later below. The control unit comprises: (i) an interface unit; (ii) a memory for storing program instructions; and (iii) a processor coupled to the interface unit and to the memory. In the control unit, the interface unit is operable to receive sensor data from a plurality of sensors in the transit system, wherein the control unit is communicatively coupled with the sensors, and wherein each sensor-specific portion of the sensor data includes at least one of the following: (a) a sensor-specific passenger data defining one or more attributes of a user availing a transit service in the transit system, (b) a sensor-specific vehicle data defining one or more attributes of a transit vehicle associated with the transit service, and (c) a sensor-specific station data defining one or more attributes of a transit station associated with the transit service. In the control unit, the memory also stores the sensor data received by the interface unit. The processor in the control unit is operable to execute the program instructions, which, when executed by the processor, cause the control unit to: (i) combine received sensor-specific passenger data to generate a system-specific passenger data, received sensor-specific vehicle data to generate a system-specific vehicle data, and received sensor-specific station data to generate a system-specific station data; (ii) analyze the system-specific passenger data, the system-specific vehicle data, and the system-specific station data; and (iii) perform at least one of the following based on the analysis of the system-specific passenger data, the system-specific vehicle data, and the system-specific station data: (a) facilitate management of passenger-handling capacity of at least one of the transit station and the transit vehicle, (b) dynamically plan a trip for the user availing the transit service, (c) facilitate detection of fraud for the transit service, and (d) dynamically plan a route for the transit vehicle) [Bergdale: col. 3, line 21-55]. In the same field of endeavor, Lamb further discloses the claim limitations as follows: an image processor arranged to process the captured image to generate identification data for identifying the person or vehicle (The processor can then handle transmission of the input type and timestamp back to the base device and/or mobile computing device, wherein Block S140 responds to the input type accordingly. In this example, Block S150 can also implement the input type and corresponding timestamp to trigger insertion of content (e.g., a static image of a slide from a slide deck) into the multimedia stream. In another example, the processor can similarly correlate an input selection with a marker type from a set of available marker types, assign a timestamp to the marker type, and transmit the marker type and timestamp back to the base device and/or mobile computing device, wherein Block S142 receives the marker type as a time-stamped event. In this example, Block S150 can implement the time-stamped event as a searchable marker within the multimedia stream. In the foregoing examples, the processor can select an input type from a list of instructions or markers including any of: <begin presentation>, <end presentation>, <next page/slide>, <previous page/slide>, <begin timer>, <end timer>, <toggle to secondary camera>, etc.) [Lamb: col. 5, line 25-45]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bergdale with Lamb to program the system to implement Lamb's method. Therefore, the combination of Bergdale with Lamb will enable the system to substantially support real-time collection of audio-visual data, thereby reducing the time and special equipment needed for post-processing of such disparate streams of data into a single viewable audio-visual file [Lamb: col. 2, line 41-47].

Regarding claim 4, Bergdale meets the claim limitations as set forth in claim 2. Bergdale further meets the claim limitations as follows: arranged to identify a person (i.e. a face recognition program to be executed by UE 17, etc.) [Bergdale: col. 7, line 65-66], wherein the image processor is arranged to perform (i.e. In another embodiment, the present disclosure is directed to a control unit associated with a transit system, discussed in more detail later below. The control unit comprises: (i) an interface unit; (ii) a memory for storing program instructions; and (iii) a processor coupled to the interface unit and to the memory. In the control unit, the interface unit is operable to receive sensor data from a plurality of sensors in the transit system, wherein the control unit is communicatively coupled with the sensors, and wherein each sensor-specific portion of the sensor data includes at least one of the following: (a) a sensor-specific passenger data defining one or more attributes of a user availing a transit service in the transit system, (b) a sensor-specific vehicle data defining one or more attributes of a transit vehicle associated with the transit service, and (c) a sensor-specific station data defining one or more attributes of a transit station associated with the transit service. In the control unit, the memory also stores the sensor data received by the interface unit. The processor in the control unit is operable to execute the program instructions, which, when executed by the processor, cause the control unit to: (i) combine received sensor-specific passenger data to generate a system-specific passenger data, received sensor-specific vehicle data to generate a system-specific vehicle data, and received sensor-specific station data to generate a system-specific station data; (ii) analyze the system-specific passenger data, the system-specific vehicle data, and the system-specific station data; and (iii) perform at least one of the following based on the analysis of the system-specific passenger data, the system-specific vehicle data, and the system-specific station data: (a) facilitate management of passenger-handling capacity of at least one of the transit station and the transit vehicle, (b) dynamically plan a trip for the user availing the transit service, (c) facilitate detection of fraud for the transit service, and (d) dynamically plan a route for the transit vehicle) [Bergdale: col. 3, line 21-55] facial recognition to identify the target (i.e. a face recognition program to be executed by UE 17, etc.) [Bergdale: col. 7, line 65-66]. In the same field of endeavor, Lamb further discloses the claim limitations as follows: perform facial recognition to identify the target (i.e. Block S122 can implement facial detection to detect the user's face and then generate the position signal to maintain the user's face within the center of the field of view of the first camera) [Lamb: col. 11, line 4-7]. It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bergdale with Lamb to program the system to implement Lamb's method. Therefore, the combination of Bergdale with Lamb will enable the system to substantially support real-time collection of audio-visual data, thereby reducing the time and special equipment needed for post-processing of such disparate streams of data into a single viewable audio-visual file [Lamb: col. 2, line 41-47].

Regarding claim 5, Bergdale meets the claim limitations as set forth in claim 2. Bergdale further meets the claim limitations as follows: said identification data is first identification data (i.e. In another embodiment, the present disclosure is directed to a control unit associated with a transit system, discussed in more detail later below. The control unit comprises: (i) an interface unit; (ii) a memory for storing program instructions; and (iii) a processor coupled to the interface unit and to the memory. In the control unit, the interface unit is operable to receive sensor data from a plurality of sensors in the transit system, wherein the control unit is communicatively coupled with the sensors, and wherein each sensor-specific portion of the sensor data includes at least one of the following: (a) a sensor-specific passenger data defining one or more attributes of a user availing a transit service in the transit system, (b) a sensor-specific vehicle data defining one or more attributes of a transit vehicle associated with the transit service, and (c) a sensor-specific station data defining one or more attributes of a transit station associated with the transit service. In the control unit, the memory also stores the sensor data received by the interface unit. The processor in the control unit is operable to execute the program instructions, which, when executed by the processor, cause the control unit to: (i) combine received sensor-specific passenger data to generate a system-specific passenger data, received sensor-specific vehicle data to generate a system-specific vehicle data, and received sensor-specific station data to generate a system-specific station data; (ii) analyze the system-specific passenger data, the system-specific vehicle data, and the system-specific station data; and (iii) perform at least one of the following based on the analysis of the system-specific passenger data, the system-specific vehicle data, and the system-specific station data: (a) facilitate management of passenger-handling capacity of at least one of the transit station and the transit vehicle, (b) dynamically plan a trip for the user availing the transit service, (c) facilitate detection of fraud for the transit service, and (d) dynamically plan a route for the transit vehicle) [Bergdale: col. 3, line 21-55]; the camera is arranged to capture a stream of images each having a respective time stamp (i.e. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12]; the image processor is arranged to (i.e. In another embodiment, the present disclosure is directed to a control unit associated with a transit system, discussed in more detail later below. The control unit comprises: (i) an interface unit; (ii) a memory for storing program instructions; and (iii) a processor coupled to the interface unit and to the memory.
In the control unit, the interface unit is operable to receive sensor data from a plurality of sensors in the transit system, wherein the control unit is communicatively coupled with the sensors, and wherein each sensor-specific portion of the sensor data includes at least one of the following: (a) a sensor-specific passenger data defining one or more attributes of a user availing a transit service in the transit system, (b) a sensor specific vehicle data defining one or more attributes of a transit vehicle associated with the transit service, and (c) a sensor-specific station data defining one or more attributes of a transit station associated with the transit service. In the control unit, the memory also stores the sensor data received by the interface unit. The processor in the control unit is operable to execute the program instructions, which, when executed by the processor, cause the control unit to: (i) combine received sensor-specific passenger data to generate a system-specific passenger data, received sensor-specific vehicle data to generate a system-specific vehicle data, and received sensor-specific station data to generate a system-specific station data; (ii) analyze the system-specific passenger data, the system-specific vehicle data, and the system- specific station data; and (iii) perform at least one of the following based on the analysis of the system-specific passenger data, the system-specific vehicle data, and the system-specific station data: (a) facilitate management of passenger-handling capacity of at least one of the transit station and the transit vehicle, (b) dynamically plan a trip for the user availing the transit service, ( c) facilitate detection of fraud for the transit service, and ( d) dynamically plan a route for the transit vehicle) [Bergdale: col. 3, line 21-55]: generate respective identification data for at least some of the images in the stream of images (i.e. 11. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12]; and store the respective identification data for said at least some of the images in association with the respective time stamps of those image (i.e. 
In the control unit, the interface unit is operable to receive sensor data from a plurality of sensors in the transit system, wherein the control unit is communicatively coupled with the sensors, and wherein each sensor-specific portion of the sensor data includes at least one of the following: (a) a sensor-specific passenger data defining one or more attributes of a user availing a transit service in the transit system, (b) a sensor specific vehicle data defining one or more attributes of a transit vehicle associated with the transit service, and (c) a sensor-specific station data defining one or more attributes of a transit station associated with the transit service. In the control unit, the memory also stores the sensor data received by the interface unit. The processor in the control unit is operable to execute the program instructions, which, when executed by the processor, cause the control unit to: (i) combine received sensor-specific passenger data to generate a system-specific passenger data, received sensor-specific vehicle data to generate a system-specific vehicle data, and received sensor-specific station data to generate a system-specific station data; (ii) analyze the system-specific passenger data, the system-specific vehicle data, and the system- specific station data; and (iii) perform at least one of the following based on the analysis of the system-specific passenger data, the system-specific vehicle data, and the system-specific station data: (a) facilitate management of passenger-handling capacity of at least one of the transit station and the transit vehicle, (b) dynamically plan a trip for the user availing the transit service, (c) facilitate detection of fraud for the transit service, and (d) dynamically plan a route for the transit vehicle) [Bergdale: col. 3, line 21-55]; (i.e. the memory also stores the sensor data received by the interface unit) [Bergdale: col. 3, line #]); and the system is arranged to identify the first identification data from the stored identification data (The processor can then handle transmission of the input type and timestamp back to the base device and/or mobile computing device, wherein Block S140 responds to the input type accordingly. In this example, Block S150 can also implement the input type and corresponding timestamp to trigger insertion of content (e.g., a static image of a slide from a slide deck) into the multimedia stream. In another example, the processor can similarly correlate an input selection with a marker type from a set of available marker types, assign a timestamp to the marker type, and transmit the marker type and timestamp back to the base device and/or mobile computing device, wherein Block S142 receives the marker type as a time-stamped event. In this example, Block S150 can implement the time-stamped event as a searchable marker within the multimedia stream. In the foregoing examples, the processor can select an input type from a list of instructions or markers including any of: <begin presentation>, <end presentation>, <next page/ slide>, <previous page/slide>, <begin timer>, <end timer>, <toggle to secondary camera>, etc.) [Lamb: col. 5, line 25-45] by matching the determined capture time with one of the stored time stamps (i.e. comparison of device-specific timestamped location data (for each mobile device) from two different sources may allow the controller unit 148 to determine which user is attempting ingress into the paid area) [Bergdale: col. 31, line 28-32]. 
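The matching step the examiner maps here (take the time stamp of the relative peak among the beacon measurement reports, then select the stored image whose time stamp is nearest) is compact enough to sketch. A minimal illustration in Python; the data shapes and names below are hypothetical, not drawn from the claims or from either reference:

```python
from dataclasses import dataclass

@dataclass
class Report:
    t: float     # time stamp of the measurement report (seconds)
    rssi: float  # reported strength of the beacon signal (dBm)

def capture_time(reports: list[Report]) -> float:
    """Capture time = time stamp of the relative peak measurement
    among the plurality of measurement reports."""
    return max(reports, key=lambda r: r.rssi).t

def match_frame(frames: dict[float, bytes], t_capture: float) -> bytes:
    """Identify the first image from the captured stream by matching the
    determined capture time with the nearest stored time stamp."""
    t_best = min(frames, key=lambda t: abs(t - t_capture))
    return frames[t_best]

# Usage: the device is presumably closest to the transmitter at peak RSSI,
# so the frame nearest that instant should contain the carrier.
reports = [Report(0.0, -82.0), Report(0.5, -61.0), Report(1.0, -74.0)]
frames = {0.0: b"f0", 0.5: b"f1", 1.0: b"f2"}
assert match_frame(frames, capture_time(reports)) == b"f1"
```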
Regarding claim 6, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follows: the image is a first image ((i.e. camera images) [Bergdale: col. 50, line 67]; (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35]); the camera is arranged to capture a stream of images each having a respective time stamp (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35]; and the timing entity is arranged (i.e. a positioning engine may timestamp the 2D location data) [Bergdale: col. 23, line 54-55] to identify the first image from the captured stream of images (The processor can then handle transmission of the input type and timestamp back to the base device and/or mobile computing device, wherein Block S140 responds to the input type accordingly. In this example, Block S150 can also implement the input type and corresponding timestamp to trigger insertion of content (e.g., a static image of a slide from a slide deck) into the multimedia stream. In another example, the processor can similarly correlate an input selection with a marker type from a set of available marker types, assign a timestamp to the marker type, and transmit the marker type and timestamp back to the base device and/or mobile computing device, wherein Block S142 receives the marker type as a time-stamped event. In this example, Block S150 can implement the time-stamped event as a searchable marker within the multimedia stream. In the foregoing examples, the processor can select an input type from a list of instructions or markers including any of: <begin presentation>, <end presentation>, <next page/slide>, <previous page/slide>, <begin timer>, <end timer>, <toggle to secondary camera>, etc.) [Lamb: col. 5, line 25-45] by matching the determined capture time with a time stamp of one of the images within the captured stream of images (i.e. comparison of device-specific timestamped location data (for each mobile device) from two different sources may allow the controller unit 148 to determine which user is attempting ingress into the paid area) [Bergdale: col. 31, line 28-32].

Regarding claim 7, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follows: wherein the timing entity is arranged (i.e. a positioning engine may timestamp the 2D location data) [Bergdale: col. 23, line 54-55] to trigger the camera to capture the image (i.e. In this example, Block S150 can also implement the input type and corresponding timestamp to trigger insertion of content (e.g., a static image of a slide from a slide deck) into the multimedia stream) [Lamb: col. 5, line 29-32] in dependence on the measurement report (i.e. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12].

Regarding claim 8, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follows: wherein the transmitter is a simplex device (i.e. one or more Wake-Up beacon transmitters 62) [Bergdale: col. 10, line 60]; (i.e. beacon transmitters) [Bergdale: col. 10, line 58].

Regarding claim 10, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follows: the transmitter (i.e. one or more Wake-Up beacon transmitters 62) [Bergdale: col. 10, line 60]; (i.e. beacon transmitters) [Bergdale: col. 10, line 58] comprises a directional antenna arranged to transmit (i.e. The user app and controller driver components may be in bi-directional communication (preferably wireless, as discussed below with reference to FIG. 2) with each other) [Bergdale: col. 6, line 41-43] said beacon signal as a beacon signal beam across a roadway (i.e. Referring again to FIGS. 11-12, a brief description of the exemplary hardware features of the system components shown therein is provided. In particular embodiments, the BLE wake-up beacon 152 may be functionally similar to the wake-up beacon 62 in FIG. 5 and, hence, additional discussion of the hardware features of the wake-up beacon 152 is not provided. Briefly, the wake-up beacon 152 may be a connectionless (wireless) BLE beacon that advertises data to indicate to a mobile device with the user app 12, such as the mobile device 17, that the user 163 of the mobile device is approaching a hands-free ticketing platform that has automatic fare validation and gateless entry/exit. Similarly, the 3D camera(s) 146 may be functionally substantially similar to one or more of the "people counting devices" 67-68 and, hence, the hardware features of the 3D camera(s) 146 are not discussed in further details here. In some embodiments, however, the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 17-35; Figs. 5-6, 11-12]; and the camera is arranged to capture images (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] of persons and/or vehicles travelling on the roadway (i.e. Referring again to FIGS. 11-12, a brief description of the exemplary hardware features of the system components shown therein is provided. In particular embodiments, the BLE wake-up beacon 152 may be functionally similar to the wake-up beacon 62 in FIG. 5 and, hence, additional discussion of the hardware features of the wake-up beacon 152 is not provided. Briefly, the wake-up beacon 152 may be a connectionless (wireless) BLE beacon that advertises data to indicate to a mobile device with the user app 12, such as the mobile device 17, that the user 163 of the mobile device is approaching a hands-free ticketing platform that has automatic fare validation and gateless entry/exit.
Similarly, the 3D camera(s) 146 may be functionally substantially similar to one or more of the "people counting devices" 67-68 and, hence, the hardware features of the 3D camera(s) 146 are not discussed in further details here. In some embodiments, however, the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 17-35; Figs. 5-6, 11-12]. Regarding claim 11, Bergdale meets the claim limitations as set forth in claim 10. Bergdale further meets the claim limitations as follow. wherein the transmitter is a first transmitter (i.e. one or more Wake-Up beacon transmitters 62) [Bergdale: col. 10, line 60] in a first location comprising a first directional antenna arranged to (i.e. The user app and controller driver components may be in bi-directional communication (preferably wireless, as discussed below with reference to FIG. 2) with each other) [Bergdale: col. 6, line 41-43] transmit a first beacon signal beam in a first direction (i.e. Referring now to FIGS. 7-8, the "Controller Messages" are the messages sent between the use app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12. The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28], the system further comprising (i.e. Referring again to FIGS. 11-12, a brief description of the exemplary hardware features of the system components shown therein is provided) [Bergdale: col. 25, line 17-19; Figs. 5-6, 11-12] a second transmitter (i.e. one or more Wake-Up beacon transmitters 62) [Bergdale: col. 10, line 60]; (i.e. beacon transmitters) [Bergdale: col. 10, line 58] at a second location different from the first location and comprising a second directional antenna arranged to transmit a second beacon signal beam in a second direction different from the first direction (i.e. Referring again to FIGS. 11-12, a brief description of the exemplary hardware features of the system components shown therein is provided. In particular embodiments, the BLE wake-up beacon 152 may be functionally similar to the wake-up beacon 62 in FIG. 5 and, hence, additional discussion of the hardware features of the wake-up beacon 152 is not provided. 
Briefly, the wake-up beacon 152 may be a connectionless (wireless) BLE beacon that advertises data to indicate to a mobile device with the user app 12, such as the mobile device 17, that the user 163 of the mobile device is approaching a hands-free ticketing platform that has automatic fare validation and gateless entry/exit. Similarly, the 3D camera(s) 146 may be functionally substantially similar to one or more of the "people counting devices" 67-68 and, hence, the hardware features of the 3D camera(s) 146 are not discussed in further details here. In some embodiments, however, the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 17-35; Figs. 5-6, 11-12], thereby to generate an overlap of the first and second beacon signal beams (i.e. A pre-defined region 170 is also shown in FIG. 11 referring to an area covered by the 3D camera's field of view 160 and located directly in the path for the user 163 to enter the paid area 168. In the gateless entry configuration 174 of FIG. 12, the proximity area 176 itself may be the unpaid area-as indicated by the usage of the same reference numeral "176" for both, whereas the paid area 178 may be a portion of ( or subset of) the proximity area 176, as shown. As mentioned before, the gateless entry configuration 174 of FIG. 12 may be primarily implemented inside a transit vehicle such that the boundaries of the paid area 178 may be the perimeter of the transit vehicle whereas the unpaid area 176 may be the area surrounding the entry and exit points of the transit vehicle. Therefore, there may be an overlap between the paid area 178 and the unpaid area 176, as shown) [Bergdale: col. 27, line 44-59], wherein the timing entity is arranged (i.e. a positioning engine may timestamp the 2D location data) [Bergdale: col. 23, line 54-55] to determine the capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] in dependence on the plurality of measurement report received from the mobile communications device (i.e. 11. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12] indicating peak measurements (i.e. the time period for example, peak time) [Bergdale: col. 19, line 41] of the first beacon signal beam and the second beacon signal beam (i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20] by the mobile communications device (i.e. Referring now to FIGS. 
7-8, the "Controller Messages" are the messages sent between the use app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12. The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28]. Regarding claim 12, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follow. comprising one or more further transmitters (i.e. one or more Wake-Up beacon transmitters 62) [Bergdale: col. 10, line 60]; (i.e. beacon transmitters) [Bergdale: col. 10, line 58] arranged to transmit respective further beacon signals ((i.e. Referring again to FIGS. 11-12, a brief description of the exemplary hardware features of the system components shown therein is provided. In particular embodiments, the BLE wake-up beacon 152 may be functionally similar to the wake-up beacon 62 in FIG. 5 and, hence, additional discussion of the hardware features of the wake-up beacon 152 is not provided. Briefly, the wake-up beacon 152 may be a connectionless (wireless) BLE beacon that advertises data to indicate to a mobile device with the user app 12, such as the mobile device 17, that the user 163 of the mobile device is approaching a hands-free ticketing platform that has automatic fare validation and gateless entry/exit. Similarly, the 3D camera(s) 146 may be functionally substantially similar to one or more of the "people counting devices" 67-68 and, hence, the hardware features of the 3D camera(s) 146 are not discussed in further details here. In some embodiments, however, the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 17-35; Figs. 5-6, 11-12], wherein the timing entity is arranged to (i.e. a positioning engine may timestamp the 2D location data) [Bergdale: col. 23, line 54-55] determine the capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] further in dependence on whether measurement reports are received from the mobile communications device prior to, or after, the receiving of the plurality of first measurement reports (i.e. 11. When the user 163 enters the pre-determined region 170 (FIG. 
11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12], indicating measurements of further beacon signals transmitted (i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20] by one or more further transmitters ((i.e. one or more Wake-Up beacon transmitters 62) [Bergdale: col. 10, line 60]; (i.e. The Beacon ID may provide transmitter-specific identification information that the mobile operating system 24 may use to recognize the Bluetooth Beacon. For iBeacons, for example, the Beacon ID is the UUID along with the major and minor numbers. It is observed here that the Bluetooth LE (also referred to as "Bluetooth Smart") is a wireless communication protocol that permits short range (up to 30 meters) communications. Bluetooth LE functionality is found on many smartphones and tablets) [Bergdale: col. 11, line 29-38]). Regarding claim 13, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follow. arranged to store device identification information relating to one or more devices of interest (i.e. a valid electronic ticket (which may be stored in the passenger's mobile device)) [Bergdale: col. 2, line 27-28; Figs. 3, 9, 10], wherein the timing entity is arranged (i.e. a positioning engine may timestamp the 2D location data) [Bergdale: col. 23, line 54-55] to:determine, using the stored device identification information, that the mobile communications device a device of interest (i.e. A controller as per teachings of the present disclosure may also detect when a passenger, with a mobile ticket previously activated, exits from the Paid Area. Furthermore, in some embodiments, the system may detect, and provide external visual and audio alerts, when a passenger enters into the Paid Area without a valid permit for travel. The system may also detect, and provide external visual and audio alerts, when a passenger attempts to exit from the Paid Area without a valid permit for travel.) [Bergdale: col. 2, line 28-37]; and determine the capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] further in dependence on said determination that the mobile communications device is a device of interest (i.e. 11. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. 
The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12]. Regarding claim 14, Bergdale meets the claim limitations as follow. A method comprising (i.e. a method) [Bergdale: col. 2, line 61]: transmitting (i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20], by a transmitter (i.e. a transmitter) [Bergdale: col. 50, line 7], a beacon signal (i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20] for reception by a mobile communications device (i.e. the mobile device 17 may receive a Bluetooth beacon signal) [Bergdale: col. 9, line 18]; receiving a plurality of measurement report from the mobile communications device (i.e. Referring now to FIGS. 7-8, the "Controller Messages" are the messages sent between the use app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12. The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28]; determining a capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] in dependence on the plurality of measurement reports (i.e. 11. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). 
The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12]) indicating a relative peak measurement (i.e. the time period for example, peak time) [Bergdale: col. 19, line 41], among the plurality of measurement reports (i.e. Referring now to FIGS. 7-8, the "Controller Messages" are the messages sent between the user app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12. The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28], of the beacon signal by the mobile communications device ((i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20]; (i.e. Referring now to FIGS. 7-8, the "Controller Messages" are the messages sent between the user app 12 and the controller driver 14. These messages may typically contain commands or data which will inform the controller driver 14 how close the mobile device 17 is to the fare gate 70. The "Controller Responses" are responses sent by the controller driver 14 to the user app 12. The "Gate Beacon Advertising Packets" in FIG. 7 refer to information sent from the gate beacon(s) 64-65. This information may be used to detect the proximity of the mobile device 17 with the fare gate 70. On the other hand, the "Wake-Up Beacon Advertising Packets" in FIG. 7 refer to information sent from the wake-up beacon(s) 62. This information may be used to get the user app 12 into a ready state for entering through a fare gate-such as the fare gate 70-that is enabled for handsfree fare validation as per teachings of the present disclosure. In FIG. 7, the term "User Data In" refers to the data that a user 97 running the FV user app 12 (on the user's mobile device 17) enters through a user interface provided by the user app 12. On the other hand, the term "User Data Out" refers to the data that is displayed via the user interface to the user 97 running the FV user app 12.
The term "User Control" refers to the control information sent from the mobile device 17 running the FV user app 12) [Bergdale: col. 20, line 5-28]); and capturing an image (i.e. camera images) [Bergdale: col. 50, line 67] at the determined capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35], for use in identifying a person or vehicle carrying the mobile communications device (i.e. In particular embodiments, potential database fields may include some or all of the following: (i) a data field for unique transit vehicle/station identifier, (ii) a data field for transit vehicle stop/route information including station name and Global Positioning System (GPS) location, (iii) a data field for configuration information for each transit vehicle-based BLE gateway, (iv) a data field for the controller driver 14 configuration information, and (v) a data field for 3D camera configuration information. These database fields are exemplary only. In other embodiments, depending on the implementation of the gateless entry/exit system, the data fields may be more than, less than, or different from those listed above.) [Bergdale: col. 26, line 22-34; Figs. 11-12], wherein the camera is positioned relative to the transmitter such that the image captured at the determined capture time contains the person or vehicle (i.e. Referring again to FIGS. 11-12, a brief description of the exemplary hardware features of the system components shown therein is provided. In particular embodiments, the BLE wake-up beacon 152 may be functionally similar to the wake-up beacon 62 in FIG. 5 and, hence, additional discussion of the hardware features of the wake-up beacon 152 is not provided. Briefly, the wake-up beacon 152 may be a connectionless (wireless) BLE beacon that advertises data to indicate to a mobile device with the user app 12, such as the mobile device 17, that the user 163 of the mobile device is approaching a hands-free ticketing platform that has automatic fare validation and gateless entry/exit. Similarly, the 3D camera(s) 146 may be functionally substantially similar to one or more of the "people counting devices" 67-68 and, hence, the hardware features of the 3D camera(s) 146 are not discussed in further details here. In some embodiments, however, the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 17-35; Figs. 5-6, 11-12]. Bergdale does not explicitly disclose the following claim limitations (Emphasis added). wherein the camera is positioned relative to the transmitter. However, in the same field of endeavor Lamb further discloses the claim limitations and the deficient claim limitations, as follows: wherein the camera is positioned relative to the transmitter (the position of an IR transmitter (i.e., IR beacon) within the field of view of the camera, etc.) [Lamb: col. 25, line 22-23]. It would have been obvious to one with an ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bergdale with Lamb to program the system to implement of Lamb’s method. 
Regarding claim 16, Bergdale meets the claim limitations as set forth in claim 14. Lamb further meets the claim limitations as follows: for identifying a person, the method further comprising performing facial recognition to identify the target (i.e. Block S122 can implement facial detection to detect the user's face and then generate the position signal to maintain the user's face within the center of the field of view of the first camera) [Lamb: col. 11, line 4-7].
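Lamb's Block S122, as quoted, detects the user's face and generates a position signal that keeps the face centered in the camera's field of view. A minimal sketch of that control idea follows; it is illustrative only, with OpenCV's stock Haar cascade standing in for whatever detector Lamb actually employs and the proportional gain chosen arbitrarily.

```python
# Illustrative sketch only: keep a detected face centered in the frame by
# emitting a pan/tilt "position signal", in the spirit of Lamb's Block S122.
# OpenCV's stock Haar cascade stands in for Lamb's (unspecified) detector.
import cv2
import numpy as np

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def position_signal(frame, gain=0.1):
    """Return (pan, tilt) corrections that move the largest detected face
    toward the center of the field of view; (0, 0) if no face is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return 0.0, 0.0
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # track the largest face
    frame_h, frame_w = gray.shape
    dx = (x + w / 2) - frame_w / 2  # horizontal offset from center, pixels
    dy = (y + h / 2) - frame_h / 2  # vertical offset from center, pixels
    return -gain * dx, -gain * dy   # proportional steering correction

# With a blank test frame no face is found, so no correction is emitted.
print(position_signal(np.zeros((480, 640, 3), dtype=np.uint8)))  # (0.0, 0.0)
```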
Regarding claim 18, Bergdale meets the claim limitations as set forth in claim 14. Bergdale further meets the claim limitations as follows: maintaining a dedicated channel with the mobile communications device (i.e. The Beacon ID may provide transmitter-specific identification information that the mobile operating system 24 may use to recognize the Bluetooth Beacon. For iBeacons, for example, the Beacon ID is the UUID along with the major and minor numbers. It is observed here that the Bluetooth LE (also referred to as "Bluetooth Smart") is a wireless communication protocol that permits short range (up to 30 meters) communications. Bluetooth LE functionality is found on many smartphones and tablets) [Bergdale: col. 11, line 29-38], whereby to enable reception of the plurality of measurement reports from the mobile communications device ((i.e. The transit station 60 may have a number of "people counting" devices 67-68 to determine when a person has entered the fare validation zone. In one embodiment, the "people counting" devices may include stereoscopic digital Infrared (IR) cameras. In some embodiments, the cameras 67-68 may be wirelessly connected to the controller unit 18 to notify the controller 18 when a person has entered the fare validation zone. In other embodiments, there may be an Ethernet-based connectivity between the controller unit 18 and the "people counting" devices 67-68) [Bergdale: col. 12, line 9-18]; (i.e. The Beacon ID may provide transmitter-specific identification information that the mobile operating system 24 may use to recognize the Bluetooth Beacon. For iBeacons, for example, the Beacon ID is the UUID along with the major and minor numbers. It is observed here that the Bluetooth LE (also referred to as "Bluetooth Smart") is a wireless communication protocol that permits short range (up to 30 meters) communications. Bluetooth LE functionality is found on many smartphones and tablets) [Bergdale: col. 11, line 29-38]).
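The Beacon ID passage quoted above (for iBeacons, a UUID plus major and minor numbers) follows Apple's published iBeacon advertisement layout. The sketch below parses that layout from raw manufacturer-specific data; the sample bytes are fabricated for the example.

```python
# Illustrative sketch only: parsing an iBeacon Beacon ID (UUID, major,
# minor) from the manufacturer-specific data of a BLE advertisement.
# Sample bytes are fabricated; the layout itself is Apple's iBeacon format.
import struct
import uuid

def parse_ibeacon(mfg_data: bytes):
    """Return (uuid, major, minor, tx_power) or None if not an iBeacon."""
    # Apple company ID 0x004C (little-endian), then type 0x02, length 0x15.
    if len(mfg_data) < 25 or mfg_data[:4] != b"\x4c\x00\x02\x15":
        return None
    beacon_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor = struct.unpack(">HH", mfg_data[20:24])  # big-endian
    tx_power = struct.unpack("b", mfg_data[24:25])[0]     # signed dBm at 1 m
    return beacon_uuid, major, minor, tx_power

sample = (b"\x4c\x00\x02\x15"
          + uuid.UUID("e2c56db5-dffb-48d2-b060-d0f5a71096e0").bytes
          + b"\x00\x01"   # major = 1
          + b"\x00\x2a"   # minor = 42
          + b"\xc5")      # tx power = -59 dBm
print(parse_ibeacon(sample))
```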
Regarding claim 19, Bergdale meets the claim limitations as set forth in claim 14. Bergdale further meets the claim limitations as follows: comprising (i.e. a method) [Bergdale: col. 2, line 61] determining the capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] further in dependence on whether measurement reports are received from the mobile communications device prior to, or after, the receiving of the plurality of measurement reports (i.e. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12], indicating measurements of further beacon signals transmitted (i.e. Bluetooth beacon signals may be transmitted) [Bergdale: col. 9, line 19-20] by one or more further transmitters ((i.e. one or more Wake-Up beacon transmitters 62) [Bergdale: col. 10, line 60]; (i.e. The Beacon ID may provide transmitter-specific identification information that the mobile operating system 24 may use to recognize the Bluetooth Beacon. For iBeacons, for example, the Beacon ID is the UUID along with the major and minor numbers. It is observed here that the Bluetooth LE (also referred to as "Bluetooth Smart") is a wireless communication protocol that permits short range (up to 30 meters) communications. Bluetooth LE functionality is found on many smartphones and tablets) [Bergdale: col. 11, line 29-38]).

Regarding claim 20, Bergdale meets the claim limitations as set forth in claim 14. Bergdale further meets the claim limitations as follows: storing device identification information relating to one or more devices of interest (i.e. a valid electronic ticket (which may be stored in the passenger's mobile device)) [Bergdale: col. 2, line 27-28; Figs. 3, 9, 10]; determining, using the stored device identification information, that the mobile communications device is a device of interest (i.e. A controller as per teachings of the present disclosure may also detect when a passenger, with a mobile ticket previously activated, exits from the Paid Area. Furthermore, in some embodiments, the system may detect, and provide external visual and audio alerts, when a passenger enters into the Paid Area without a valid permit for travel. The system may also detect, and provide external visual and audio alerts, when a passenger attempts to exit from the Paid Area without a valid permit for travel.) [Bergdale: col. 2, line 28-37]; and determining the capture time (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] further in dependence on said determination that the mobile communications device is a device of interest (i.e. When the user 163 enters the pre-determined region 170 (FIG. 11) or a similar coverage location within the proximity area 176 (FIG. 12) with the mobile phone in possession, the 3D time-of-flight camera 146 may detect the person 163 as an object in the camera's field of view 160. The camera 146 may generate a 2D version of the location of this "object" indicating the x-y position of the user 163 within the pre-defined region 170 (or a similar coverage location within the proximity area 176). The camera 146 may then communicate the presence and position of this "object" to the controller driver 14 by sending a timestamped version of the 2D "object" location data to the entry controller 148. The controller driver 14 may collect this location data as an x-y position with a timestamp and may optionally smooth the received data using techniques such as Kalman filtering and cubic splines.) [Bergdale: col. 30, line 63 – col. 31, line 12].

Claims 3 and 15 are rejected under 35 U.S.C. 103 as being unpatentable over Bergdale et al. (US Patent 11,803,784 B2), ("Bergdale"), in view of Lamb et al. (US Patent 9,699,431 B2), ("Lamb"), in view of Goldfarb et al. (US Patent 11,337,054 B2), ("Goldfarb").

Regarding claim 3, Bergdale meets the claim limitations as set forth in claim 1. Bergdale further meets the claim limitations as follows: arranged to identify a vehicle (i.e. sensor-specific vehicle data defining one or more attributes of a transit vehicle) [Bergdale: col. 3, line 1-3], wherein the identification data is indicative of (i.e. identification information) [Bergdale: col. 11, line 30] a registration number of the vehicle. Bergdale and Lamb do not explicitly disclose the following claim limitations (Emphasis added): wherein the identification data is indicative of a registration number of the vehicle. However, in the same field of endeavor Goldfarb further discloses the deficient claim limitations as follows: wherein the identification data is indicative of a registration number of the vehicle (the system described herein may be installed at a vehicular border crossing having multiple control checkpoints, such that, as vehicles approach the control checkpoints, the mobile communication terminals of the occupants of the vehicles reassociate with the local interrogation devices. Each mobile communication terminal identifier may then be associated with an appropriate passport scan, and/or with an appropriate license plate number, obtained, for example, by automatic license plate recognition techniques) [Goldfarb: col. 5, line 40-49]. It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bergdale and Lamb with Goldfarb to program the system to implement Goldfarb's method. Therefore, the combination of Bergdale and Lamb with Goldfarb will improve the reliability of the system [Goldfarb: col. 2, line 1-2].
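Goldfarb's checkpoint scenario, as quoted, associates each mobile communication terminal identifier with a license plate number obtained by automatic license plate recognition. The following toy sketch shows one plausible association step, pairing plate reads with terminal sightings by nearest timestamp; every identifier, plate, and threshold here is invented for illustration.

```python
# Illustrative sketch only: associating mobile terminal identifiers with
# license plate reads by nearest timestamp, in the spirit of Goldfarb's
# checkpoint scenario. All identifiers, plates, and times are fabricated.
from dataclasses import dataclass

@dataclass
class PlateRead:
    timestamp: float
    plate: str        # from automatic license plate recognition

@dataclass
class TerminalSighting:
    timestamp: float
    terminal_id: str  # e.g. a terminal identifier seen by an interrogator

def associate(plates, sightings, max_gap=5.0):
    """Pair each plate read with the terminal sighting closest in time,
    provided the two fall within max_gap seconds of each other."""
    pairs = []
    for p in plates:
        best = min(sightings, key=lambda s: abs(s.timestamp - p.timestamp),
                   default=None)
        if best and abs(best.timestamp - p.timestamp) <= max_gap:
            pairs.append((p.plate, best.terminal_id))
    return pairs

plates = [PlateRead(100.0, "AB12 CDE")]
sightings = [TerminalSighting(98.5, "terminal-001"),
             TerminalSighting(180.0, "terminal-002")]
print(associate(plates, sightings))  # -> [('AB12 CDE', 'terminal-001')]
```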
Regarding claim 15, Bergdale meets the claim limitations as set forth in claim 14. Bergdale further meets the claim limitations as follows: for identifying a vehicle (i.e. sensor-specific vehicle data defining one or more attributes of a transit vehicle) [Bergdale: col. 3, line 1-3], the method further comprising processing (i.e. a method) [Bergdale: col. 2, line 61] the captured image (i.e. the 3D camera 146 may be an infrared camera that uses time-of-flight (TOF) technology to detect and track objects in the camera's field of view 160) [Bergdale: col. 25, line 33-35] to generate identification data for identifying the person or vehicle (i.e. sensor-specific vehicle data defining one or more attributes of a transit vehicle) [Bergdale: col. 3, line 1-3], wherein the identification data is indicative of (i.e. identification information) [Bergdale: col. 11, line 30] a registration number of the vehicle. Bergdale and Lamb do not explicitly disclose the following claim limitations (Emphasis added): wherein the identification data is indicative of a registration number of the vehicle. However, in the same field of endeavor Goldfarb further discloses the deficient claim limitations as follows: wherein the identification data is indicative of a registration number of the vehicle (the system described herein may be installed at a vehicular border crossing having multiple control checkpoints, such that, as vehicles approach the control checkpoints, the mobile communication terminals of the occupants of the vehicles reassociate with the local interrogation devices. Each mobile communication terminal identifier may then be associated with an appropriate passport scan, and/or with an appropriate license plate number, obtained, for example, by automatic license plate recognition techniques) [Goldfarb: col. 5, line 40-49]. It would have been obvious to one with ordinary skill in the art before the effective filing date of the claimed invention to modify the teachings of Bergdale and Lamb with Goldfarb to program the system to implement Goldfarb's method. Therefore, the combination of Bergdale and Lamb with Goldfarb will improve the reliability of the system [Goldfarb: col. 2, line 1-2].

Allowable Subject Matter
12. Claims 9 and 17 are objected to as being dependent upon a rejected base claim, but would be allowable if rewritten in independent form including all of the limitations of the base claim and any intervening claims. This objection is conditioned on all other objections and rejections of the related claims being addressed.
13. The above-identified claims recite special data that contains a unique parameter indicating a specific measurement. The prior art fails to teach or render obvious this set of steps and features.

Reference Notice
Additional prior art, included in the Notice of References Cited, made of record and not relied upon, is considered pertinent to applicant's disclosure.

Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a). A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.

Contact Information
Any inquiry concerning this communication or earlier communications from the examiner should be directed to Philip Dang, whose telephone number is (408) 918-7529. The examiner can normally be reached Monday-Thursday between 8:30 am and 5:00 pm (PST). Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO-supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice. If attempts to reach the examiner by telephone are unsuccessful, the examiner's supervisor, Sath Perungavoor, can be reached at 571-272-7455.
The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300. Information regarding the status of an application may be obtained from the Patent Application Information Retrieval (PAIR) system. Status information for published applications may be obtained from either Private PAIR or Public PAIR. Status information for unpublished applications is available through Private PAIR only. For more information about the PAIR system, see http://pair-direct.uspto.gov. Should you have questions on access to the Private PAIR system, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative or access to the automated information system, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000. /Philip P. Dang/Primary Examiner, Art Unit 2488

Prosecution Timeline

Oct 26, 2023
Application Filed
Sep 17, 2025
Non-Final Rejection — §103
Jan 15, 2026
Response Filed
Mar 24, 2026
Examiner Interview Summary
Mar 24, 2026
Examiner Interview (Telephonic)
Apr 01, 2026
Final Rejection — §103 (current)

Precedent Cases

Applications granted by this same examiner with similar technology

Patent 12602837
ON SUB-DIVISION OF MESH SEQUENCES
2y 5m to grant · Granted Apr 14, 2026
Patent 12593116
IMAGING MEASUREMENT DEVICE USING GAS ABSORPTION IN THE MID-INFRARED BAND AND OPERATING METHOD OF IMAGING MEASUREMENT DEVICE
2y 5m to grant · Granted Mar 31, 2026
Patent 12581069
METHOD FOR ENCODING/DECODING VIDEO SIGNAL, AND APPARATUS THEREFOR
2y 5m to grant · Granted Mar 17, 2026
Patent 12581106
IMAGE DECODING METHOD AND DEVICE THEREFOR
2y 5m to grant · Granted Mar 17, 2026
Patent 12574557
SCALABLE VIDEO CODING USING BASE-LAYER HINTS FOR ENHANCEMENT LAYER MOTION PARAMETERS
2y 5m to grant · Granted Mar 10, 2026
Based on 5 most recent grants.


Prosecution Projections

3-4
Expected OA Rounds
77%
Grant Probability
99%
With Interview (+33.2%)
2y 10m
Median Time to Grant
Moderate
PTA Risk
Based on 470 resolved cases by this examiner. Grant probability derived from career allow rate.
