Notice of Pre-AIA or AIA Status
The present application, filed on or after March 16, 2013, is being examined under the first inventor to file provisions of the AIA.
Response to Arguments
Applicant’s arguments with respect to claims 1-5, 9-12, 14-16, 21, 23, 25-27, 30, 32, and 33 have been fully considered but they are not persuasive.
In the remarks, Applicant argued in substance that
(a) Applicant argues that Banerjee’s cited passages relate to fog-layer nodes (not to the claimed “edge device” doing on-device classification), so Banerjee does not disclose an “edge processing means” that produces “edge processed data comprising at least one value representative for a class … for classifying the event” before transmitting to a fog device.
(b) Applicant argues that Banerjee teaches that training and classification occur at fog layers, and therefore the Office’s reliance on edge-level classification is incorrect.
(c) Applicant argues that Banerjee’s hierarchy (e.g., paragraph [0051]) shows lower-level devices sending raw data upward to fog nodes for processing, not that the lower-level devices pre-process or classify data.
Examiner respectfully disagrees with Applicant’s remarks:
As to point (a), Banerjee expressly teaches pushing analytics/processing toward the edge and on-device/local processing. See Banerjee’s teaching that “Analytics in fog network needs communication, processing and intelligence to be pushed directly into edge nodes” (paragraph [0006]) and that federated/distributed learning “may be understood to move to the edge of the network, so that data may never leave the device” (Banerjee paragraph [0011]). These passages explicitly contemplate edge-resident analytics/processing and on-device model use/updates. The claim element requires only that an edge device have an “edge processing means” that produces edge processed data which includes at least one class-representative value. Banerjee’s teaching that analytics/intelligence may be located at the edge (paragraph [0006]) and that learning/processing can occur at the device (paragraph [0011]) supports the reasonable interpretation that edge devices can perform local inference/classification and output a compact class/value prior to forwarding to fog. Therefore, the Office’s view that Banerjee discloses edge devices capable of producing edge processed outputs is supported by Banerjee’s explicit statements about placing processing/analytics at edge nodes.
As to point (b), Banerjee’s discussions of decentralized/fog-layer training and aggregation (paragraphs [0231], [0235], [0246], [0254]) describe where model training/aggregation may occur in the hierarchy, but they do not exclude edge-level inference and reporting of compact results. Training location (fog) and inference location (edge) are consistent and complementary in Banerjee’s architecture: fog layers can coordinate model updates while edge nodes may run inference and emit class/value outputs. The claim requires edge-side production of at least one class value; Banerjee’s combined teachings (edge analytics, paragraph [0006]; on-device/distributed learning, paragraph [0011]; and hierarchical flows, paragraph [0051]) support that on-device/edge inference is contemplated, while fog layers may perform aggregation/training. Thus, Banerjee supports both the claimed edge processing and output and the fog aggregation steps.
As to point (c), Banerjee describes the hierarchical flow of information from sensors and lower-level devices to fog nodes and then upward to cloud (paragraph [0051]). That description does not compel an interpretation that only raw, unprocessed streams travel upward. Indeed, Banerjee’s overall disclosure, including the explicit statement about pushing analytics to edge nodes (paragraph [0006]), is consistent with lower-tier devices producing processed/condensed outputs (e.g., class labels or compact event values) and forwarding those to fog for further aggregation. The hierarchical diagram simply shows the direction of information flow; it does not mandate the format (raw vs. condensed) of that upward information.
Claim Rejections - 35 USC § 112
The following is a quotation of 35 U.S.C. 112(b):
(b) CONCLUSION.—The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the inventor or a joint inventor regards as the invention.
The following is a quotation of pre-AIA 35 U.S.C. 112, second paragraph:
The specification shall conclude with one or more claims particularly pointing out and distinctly claiming the subject matter which the applicant regards as his invention.
Claims 11, 15, 16, 21, 25, 30, and 32 are rejected under 35 U.S.C. 112(b) or pre-AIA 35 U.S.C. 112, second paragraph, as being indefinite for failing to particularly point out and distinctly claim the subject matter which the inventor or a joint inventor (or for applications subject to pre-AIA 35 U.S.C. 112, the applicant) regards as the invention.
Claim 11 discloses “the fog processing means is configured to use data, preferably weather related data, from a database …”. The transitional/qualifying term “preferably” renders it unclear whether the feature following “preferably” is an essential claim limitation (required for infringement) or merely a preferred (non-limiting) example. Because it is unclear whether the claim requires the use of weather-related data or merely permits it as a preference, the metes and bounds of the claimed invention are not reasonably ascertainable by one of ordinary skill in the art.
Claim 15 discloses “wherein the at least one data source comprises at least one sensor, preferably at least two sensors.” The use of the term “preferably” renders the claim ambiguous as to whether the recited “at least two sensors” limitation is required for infringement or is merely a non-limiting preferred embodiment. The word “preferably” is a word of preference and does not clearly define whether the limitation following it is mandatory.
Claim 16 discloses “wherein the at least one data source comprises at least one of, preferably at least two of: ….” The word “preferably” renders it unclear whether “at least two” is a required limitation of the claim or merely a preferred embodiment. Words of preference in claim language create ambiguity about whether the feature is mandatory, and therefore may render the claim indefinite. In addition, the phrase “at least one of, preferably at least two of” is internally inconsistent and ambiguous. “At least one of” sets a minimum of one; “preferably at least two” suggests a preference for two or more. It is unclear whether infringement requires one, two, or more of the listed items. A person of ordinary skill cannot determine with reasonable certainty the required number of elements.
Claim 21 discloses “wherein preferably the edge communication means.” The word “preferably” creates ambiguity about whether the recited communication capability is required or merely a preferred embodiment. Words of preference in claim language can render the claim indefinite because they fail to inform, with reasonable certainty, those skilled in the art of the scope of the invention.
Claim 25 discloses “wherein each subset of edge devices comprises at least ten edge devices, preferably at least fifty edge devices.” The word “preferably” renders it unclear whether the recited “at least fifty edge devices” limitation is required for infringement or is merely a preferred embodiment. Words of preference in claim language create ambiguity about whether the limitation is mandatory and therefore may render the claim indefinite.
Claim 30 discloses “An edge module, preferably for use in a system according to claim 1.” The use of the word “preferably” renders the claim ambiguous as to whether the recited “use in a system according to claim 1” is a required limitation of the claim or merely a preferred (non-limiting) embodiment. The term “preferably” introduces uncertainty as to the metes and bounds of the claimed invention because it does not clearly indicate whether the recited relationship to claim 1 is mandatory for infringement.
Claim 32 discloses “wherein preferably the edge processing means is configured to use the first class to select the second class.” The word “preferably” creates uncertainty whether the limitation “the edge processing means is configured to use the first class to select the second class” is required for infringement or merely a preferred embodiment. Words of preference in claim language can render a claim indefinite because they fail to inform, with reasonable certainty, those skilled in the art of the claim’s scope.
Claim Rejections - 35 USC § 102
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of the appropriate paragraphs of 35 U.S.C. 102 that form the basis for the rejections under this section made in this Office action:
A person shall be entitled to a patent unless -
(a)(2) the claimed invention was described in a patent issued under section 151, or in an application for patent published or deemed published under section 122(b), in which the patent or application, as the case may be, names another inventor and was effectively filed before the effective filing date of the claimed invention.
Claims 1, 3, 4, 9-12, 14-16, 21, 23, 26, 27, and 30 are rejected under 35 U.S.C. 102(a)(2) as being anticipated by Banerjee et al. (US 2023/0169356, hereinafter Banerjee).
Regarding claim 1, Banerjee discloses
A network system comprising (Fig. 3):
a plurality of edge devices arranged at a plurality of locations, an edge device thereof comprising (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16):
at least one data source configured to obtain environmental data, said environmental data being related to an event in the edge device or vicinity of the edge device (paragraph [0051]: As may be appreciated in FIG. 1, at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light, road side sensors 12 capable of measuring speed, volume, weather e.g., ice, snow, water, etc. . . . , on-board devices 13 that may connect via an Access Point (AP) in the vehicle for internet, phone, or infotainment services, in vehicle sensors 14, road side traffic fog devices 15, e.g., cameras. Information from these devices may then be provided upward, to successive levels of nodes, based on the hierarchy of the architecture. Different vertical application use cases may use a fog hierarchy differently),
an edge processing means configured to produce edge processed data based on said environmental data (paragraph [0082]: The sensors 328 may be full IoT devices 302, for example, capable of both collecting data and processing the data), said edge processed data comprising at least one value representative for a class selected from a plurality of predetermined classes for classifying the event (paragraph [0006]: Analytics in fog network needs communication, processing and intelligence to be pushed directly into edge nodes; paragraph [0011]: In FL, algorithm training may be understood to move to the edge of the network, so that data may never leave the device; paragraph [0084]: A service request may be for example, a traffic condition service or a weather condition service; paragraph [0231]: The first level of the fog hierarchy may be the capturing module consisting of F-UEs. At this level, fog nodes may receive captured images from different sources. To uphold the ultra-low latency application, captured images, that is, data, may not be sent back to the cloud for training. Instead, a decentralized machine learning model may learn from it collaboratively at local fog layers, where leader nodes may coordinate the jobs to handle performance-cost trade off), and
an edge communication means for communicating edge processed data (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20);
a plurality of fog devices, a fog device thereof being associated with a subset of said plurality of edge devices (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan), said fog device comprising
a fog communication means configured to receive edge processed data from said subset and to transmit fog processed data (paragraph [0246]: The learnt result, that is, the global belief, and data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and regional fog nodes—for further analysis and distribution), and
a fog processing means configured to process the edge processed data received from said subset and to produce fog processed data based thereon, said fog processed data comprising information about an event in the vicinity of said subset or an event in at least one edge device of said subset (paragraph [0235]: each fog layer in the hierarchy may provide additional computing, storage, and network capabilities in service of the vertical application at their level of the hierarchy; paragraph [0246]: The learnt result, that is, the global belief, and data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and regional fog nodes—for further analysis and distribution; paragraph [0254]: The learnt result, that is, the global belief, and the data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and zonal fog nodes—for further analysis and distribution. For example: in an Airport Surveillance system—parking lot, check-in area, lounge area etc. may have different fog clusters. Decentralized models at this layer may learn to make decisions on anomaly detection at those locations. In a second layer, data and prediction from these different locations may be aggregated for making more complex decisions); and
a central control system in communication with said plurality of fog devices and configured to receive fog processed data from said plurality of fog devices (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan).
Regarding claim 3, Banerjee discloses
wherein the edge processed data comprises at least one value representative for an attribute associated to the event, said attribute characterizing a property of the event (paragraph [0051]: As may be appreciated in FIG. 1, at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light, road side sensors 12 capable of measuring speed, volume, weather e.g., ice, snow, water, etc. . . . , on-board devices 13 that may connect via an Access Point (AP) in the vehicle for internet, phone, or infotainment services, in vehicle sensors 14, road side traffic fog devices 15, e.g., cameras. Information from these devices may then be provided upward, to successive levels of nodes, based on the hierarchy of the architecture).
Regarding claim 4, Banerjee discloses
wherein the event comprises one of an event related to an object in the edge device or its vicinity, an event related to a state of an object in the edge device or the vicinity of the edge device, an event related to an area in the vicinity of the edge device, an event related to a state of a component of the edge device (paragraph [0006]: Analytics in fog network needs communication, processing and intelligence to be pushed directly into edge nodes; paragraph [0011]: In FL, algorithm training may be understood to move to the edge of the network, so that data may never leave the device; paragraph [0081]: each of the first node 111, the second node 112, the third node 113 and the another node 114 may collect data towards a common problem, e.g., monitoring traffic. In such a context, the prediction of the event may be to predict a location where a car driving over the speed limit may be found).
Regarding claim 9, Banerjee discloses
wherein the edge processed data comprises at least one value representative for an attribute associated to the event, said attribute characterizing a property of the event, wherein the at least one value for an attribute associated to the event is based on the first and/or the second sensed environmental data (paragraph [0051]: at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light … the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20; paragraph [0085]: The service requests may be of different types, for example weather and traffic, or of the same type, for example, two different measures of pollution).
Regarding claim 10, Banerjee discloses
wherein the fog processing means is configured to use data from a database to process the edge processed data received from said subset; and/or wherein the edge processing means is configured to use data from a database to process the obtained environmental data (paragraph [0065]: Any of the first node 111, the second node 112, the third node 113 and the another node 114, may, in some embodiments, have the capability to determine, e.g., derive or calculate, one or more respective machine-learning models 116, respectively, which may be stored, in a respective database or memory 117).
Regarding claim 11, Banerjee discloses
wherein, when the first class is different from the second class, the fog processing means is configured to use data, such as weather related data, from a database to determine whether to attribute the first or second class to the event, and to generate processed fog data including the determined class for the event (paragraph [0065]: Any of the first node 111, the second node 112, the third node 113 and the another node 114, may, in some embodiments, have the capability to determine, e.g., derive or calculate, one or more respective machine-learning models 116, respectively, which may be stored, in a respective database or memory 117; paragraph [0085]: The service requests may be of different types, for example weather and traffic, or of the same type, for example, two different measures of pollution).
Regarding claim 12, Banerjee discloses
wherein the fog processing means is configured to augment the fog processed data using data from the database; and/or wherein the data in the database includes any one or more of the following: weather related information, traffic information, geo-coordinates, news and internet information, public transportation information, events schedules, timing information, public safety information, sanitary reports, security reports, road condition reports and cellphone data of cellphones (paragraph [0051]: As may be appreciated in FIG. 1, at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light, road side sensors 12 capable of measuring speed, volume, weather e.g., ice, snow, water, etc. . . . , on-board devices 13 that may connect via an Access Point (AP) in the vehicle for internet, phone, or infotainment services, in vehicle sensors 14, road side traffic fog devices 15, e.g., cameras. Information from these devices may then be provided upward, to successive levels of nodes, based on the hierarchy of the architecture; paragraph [0065]: Any of the first node 111, the second node 112, the third node 113 and the another node 114, may, in some embodiments, have the capability to determine, e.g., derive or calculate, one or more respective machine-learning models 116, respectively, which may be stored, in a respective database or memory 117; paragraph [0085]: The service requests may be of different types, for example weather and traffic, or of the same type, for example, two different measures of pollution).
Regarding claim 14, Banerjee discloses
wherein each fog device is configured to take decisions on the processing and transmitting of data, said decisions including one or more of the following: - whether or not received edge processed data is to be processed by the fog device or to be transmitted to the central control system; - whether or not fog processed data is to be transmitted to the central control system (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20; paragraph [0231]: To uphold the ultra-low latency application, captured images, that is, data, may not be sent back to the cloud for training. Instead, a decentralized machine learning model may learn from it collaboratively at local fog layers, where leader nodes may coordinate the jobs to handle performance-cost trade off).
Regarding claim 15, Banerjee discloses
wherein the at least one data source comprises at least one sensor, preferably at least two sensors; and/or wherein said at least one data source comprises at least two sensors connected to a common interface board of the edge device (Fig. 1, paragraph [0051]: at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light … the lower level devices provide information to respective roadside traffic fog devices 16).
Regarding claim 16, Banerjee discloses
wherein the at least one data source comprises at least one of, preferably at least two of: an optical sensor, a photodetector or an image sensor, a sound sensor, a radar, a Doppler effect radar, a LIDAR, a humidity sensor, a pollution sensor, a temperature sensor, a motion sensor, an antenna, an RF sensor, a vibration sensor, a metering device, a malfunctioning sensor, a measurement device for measuring a maintenance related parameter of a component of the edge device, an alarm device (paragraph [0051]: As may be appreciated in FIG. 1, at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light, road side sensors 12 capable of measuring speed, volume, weather e.g., ice, snow, water, etc. . . . , on-board devices 13 that may connect via an Access Point (AP) in the vehicle for internet, phone, or infotainment services, in vehicle sensors 14, road side traffic fog devices 15, e.g., cameras. Information from these devices may then be provided upward, to successive levels of nodes, based on the hierarchy of the architecture); and/or
wherein the at least one data source comprises an image sensor configured to sense raw image data of the event, wherein the edge processing means is configured to process the sensed raw image data to select a class from a plurality of classes relating to the type of object involved in the event, to generate an image attribute associated with the event, and to include the selected class and said image attribute in the edge processed data; and/or
wherein the at least one data source comprises a sound sensor configured to sense sound data of the event, wherein the edge processing means is configured to select a class from a plurality of classes according to the type of object involved in the event, and to include the selected class in the edge processed data; and/or
wherein the at least one data source comprises a radar sensor configured to sense radar data, wherein the edge processing means is configured to process the sensed radar image data to select a class from a plurality of classes relating to the type of object involved in the event, to generate a speed attribute associated with said object, and to include the selected class and said speed attribute in the edge processed data.
Regarding claim 21, Banerjee discloses
wherein each fog device and associated subset of edge devices are arranged in a mesh network (paragraph [0045]: This fully decentralized peer-to-peer learning with information aggregation from the one-hop neighbours is not fully possible in a layered fog architecture, such as e.g., the OpenFog reference architecture, and when context aware clustering of fog nodes is taken into consideration), wherein preferably the edge communication means and the fog communication means are configured to communicate, at least, through an IEEE 802.15.4 protocol (Note: the word “preferably” will ordinarily prevent that phrase from having patentable weight. “Preferably” signals a preferred embodiment and is usually read as non-limiting, so the IEEE 802.15.4 feature described with “preferably” would not reliably serve as a required claim element to distinguish prior art or support infringement); and/or
wherein the edge device is configured to control the at least one data source and the edge processing means using a machine learning model; and/or wherein the fog device is configured to control the fog processing means using a machine learning model (paragraph [0012]: fog nodes may benefit from obtaining a well-trained machine learning model without sending their privacy-sensitive personal data to the cloud; [0246]: The learnt result, that is, the global belief, and data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and regional fog nodes—for further analysis and distribution; paragraph [0254]: The learnt result, that is, the global belief, and the data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and zonal fog nodes—for further analysis and distribution. For example: in an Airport Surveillance system—parking lot, check-in area, lounge area etc. may have different fog clusters. Decentralized models at this layer may learn to make decisions on anomaly detection at those locations. In a second layer, data and prediction from these different locations may be aggregated for making more complex decisions).
Regarding claim 23, Banerjee discloses
wherein the edge device is configured to control the edge processing means to update the predetermined classes for classifying; and/or wherein the fog device is configured to control the edge processing means to update the predetermined classes for classifying (paragraph [0065]: Any of the first node 111, the second node 112, the third node 113 and the another node 114, may, in some embodiments, have the capability to determine, e.g., derive or calculate, one or more respective machine-learning models 116, respectively, which may be stored, in a respective database or memory 117; paragraph [0084]: A service request may be for example, a traffic condition service or a weather condition service; paragraph [0231]: The first level of the fog hierarchy may be the capturing module consisting of F-UEs. At this level, fog nodes may receive captured images from different sources. To uphold the ultra-low latency application, captured images, that is, data, may not be sent back to the cloud for training. Instead, a decentralized machine learning model may learn from it collaboratively at local fog layers, where leader nodes may coordinate the jobs to handle performance-cost trade off; paragraph [0124]: processor, as used herein, may be understood to be a hardware component; paragraph [0161]: the third node 113 updates a machine-learning model of the event).
Regarding claim 26, Banerjee discloses
wherein the plurality of edge devices comprises any one or more of the following: a luminaire, a bin, a sensor device, a street furniture, a charging station, a payment terminal, a parking terminal, a street sign, a traffic light, a telecommunication cabinet, a traffic surveillance terminal, a safety surveillance terminal, a water management terminal, a weather station, an energy metering terminal, an access lid in a pavement (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16; paragraph [0254]: The learnt result, that is, the global belief, and the data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and zonal fog nodes—for further analysis and distribution. For example: in an Airport Surveillance system—parking lot, check-in area, lounge area etc. may have different fog clusters. Decentralized models at this layer may learn to make decisions on anomaly detection at those locations).
Regarding claim 27, Banerjee discloses
wherein the fog device is configured to transmit control data based on fog processed data to an edge device of the plurality of edge devices, and wherein the edge device comprises a controller configured to control a component thereof in function of the received control data; and/or wherein the fog processing means comprises at least three RF front end modules to communicate with at least two edge devices and the central control means (paragraph [0246]: The learnt result, that is, the global belief, and data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and regional fog nodes—for further analysis and distribution; paragraph [0276]: The embodiments herein in the first node 111 may be implemented through one or more processors, such as a processor 1104 in the first node 111 depicted in FIG. 11a, together with computer program code for performing the functions and actions of the embodiments herein; paragraph [0278]: the receiving port 1106 may be, for example, connected to one or more antennas in first node 111).
Regarding claim 30, Banerjee discloses
An edge module, preferably for use in a system according to claim 1 (Note: the word “preferably” will ordinarily prevent that phrase from having patentable weight) comprising at least two sensors and further comprising a common interface board for interconnecting said at least two sensors to the edge processing means of an edge device, wherein the common interface board comprises signal level translation means for translating data signal levels between the at least two sensors and the edge processing means (Fig. 1, paragraph [0051]: at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light … the lower level devices provide information to respective roadside traffic fog devices 16) and/or power conversion and management means for receiving power from a power source and converting said power for powering the at least two sensors.
Claim Rejections - 35 USC § 103
In the event the determination of the status of the application as subject to AIA 35 U.S.C. 102 and 103 (or as subject to pre-AIA 35 U.S.C. 102 and 103) is incorrect, any correction of the statutory basis for the rejection will not be considered a new ground of rejection if the prior art relied upon, and the rationale supporting the rejection, would be the same under either status.
The following is a quotation of 35 U.S.C. 103 which forms the basis for all obviousness rejections set forth in this Office action:
A patent for a claimed invention may not be obtained, notwithstanding that the claimed invention is not identically disclosed as set forth in section 102 of this title, if the differences between the claimed invention and the prior art are such that the claimed invention as a whole would have been obvious before the effective filing date of the claimed invention to a person having ordinary skill in the art to which the claimed invention pertains. Patentability shall not be negated by the manner in which the invention was made.
Claims 2, 5, 32, and 33 are rejected under 35 U.S.C. 103 as being unpatentable over Banerjee in view of Matsuo et al. (US 2023/0254365, hereinafter Matsuo).
Regarding claim 2, Banerjee discloses
wherein the fog communication means is configured to receive first edge processed data about an event from a first edge device and to receive second edge processed data about an event from a second edge device, and wherein the fog processing means is configured to process said first and second edge processed data to determine whether or not the first and second edge processed data relate to the … event, and to transmit fog processed data to the central control system in accordance with the determined result (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20; paragraph [0165]: Fog nodes may update their local belief by aggregating information from their local observational data with the model of their same cluster neighbours to collectively learn a model that best fits the observations for the local cluster; paragraph [0187]: the first measure of similarity in the distribution pattern of data collected by the nodes in the first plurality of nodes 110 over the first period of time, about the service requests received of a same type).
Banerjee does not disclose that the first and second edge processed data relate to the same event, although Banerjee teaches receiving edge data at fog nodes and performing aggregation/analysis across multiple edge sources. Matsuo discloses the first and second edge processed data relate to the same event (paragraph [0019]: A sensor device 4 is accommodated in a neighboring edge server and configured to perform communication via the edge server; paragraph [0048]: In a first example, there are a vehicle X configured to periodically transmit metadata and a vehicle Y configured to periodically transmit metadata and also transmit sensor data (a motion image of the on-board camera) in a non-periodic manner. It is assumed that a priority for the sensor data of the vehicle Y is calculated based on the metadata of a vehicle group relevant to the vehicle X. In other words, even among the sensor devices 4 of a same type, there are a sensor device 4 configured to transmit the metadata and also a sensor device 4 configured to transmit the sensor data instead of the metadata. Alternatively, there may be a sensor device 4 configured to transmit both the metadata and the sensor data. In the present invention, since the device has the sensor data collection unit 12 and the metadata collection unit 11 as components, the sensor devices 4 of the same type can be dealt with at the same time without distinguishing whether there is metadata to be transmitted or whether there is the sensor data to be transmitted). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Banerjee’s fog node processing to incorporate Matsuo’s cross-device relevance/priority techniques. The motivation would have been to more efficiently use a resource of a server while requirements on a sensor device are satisfied (Matsuo paragraph [0008]), which aligns with Banerjee’s goals of managing local processing and communication in a fog hierarchy.
A POSITA would have had a reasonable expectation that applying Matsuo’s metadata-based relevance/prioritization at Banerjee’s fog nodes would enable determination of whether two edge outputs relate to the same event and conditional transmission of fog processed data accordingly.
Regarding claim 5, Banerjee discloses
wherein the at least one data source comprises at least a first sensor configured to obtain first sensed environmental data and a second sensor configured to obtain second sensed environmental data for the same event, and … (paragraph [0051]: at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light … the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20; paragraph [0085]: The service requests may be of different types, for example weather and traffic, or of the same type, for example, two different measures of pollution).
Banerjee does not disclose wherein the edge processing means is configured to select a class for the event based on at least the first and the second sensed environmental data. While Banerjee teaches that service requests may be of the same or different types and that multiple sensors may exist, it does not explicitly teach fusing the outputs from two sensors for the same event to derive a single classification.
Matsuo discloses wherein the edge processing means is configured to select a class for the event based on at least the first and the second sensed environmental data (paragraph [0019]: A sensor device 4 is accommodated in a neighboring edge server and configured to perform communication via the edge server; paragraph [0048]: In a first example, there are a vehicle X configured to periodically transmit metadata and a vehicle Y configured to periodically transmit metadata and also transmit sensor data (a motion image of the on-board camera) in a non-periodic manner. It is assumed that a priority for the sensor data of the vehicle Y is calculated based on the metadata of a vehicle group relevant to the vehicle X. In other words, even among the sensor devices 4 of a same type, there are a sensor device 4 configured to transmit the metadata and also a sensor device 4 configured to transmit the sensor data instead of the metadata. Alternatively, there may be a sensor device 4 configured to transmit both the metadata and the sensor data. In the present invention, since the device has the sensor data collection unit 12 and the metadata collection unit 11 as components, the sensor devices 4 of the same type can be dealt with at the same time without distinguishing whether there is metadata to be transmitted or whether there is the sensor data to be transmitted). In particular, Matsuo further teaches treating devices of the same type uniformly and using multiple inputs to inform decision logic, which is analogous to combining multiple sensor data streams to make a unified determination.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Banerjee’s edge processing to incorporate Matsuo’s cross-input relevance/priority determination techniques so that the edge processor selects a class for an event based on both the first and second sensed environmental data. The motivation would have been to more efficiently use a resource of a server while requirements on a sensor device are satisfied (Matsuo paragraph [0008]), which aligns with Banerjee’s goals of optimizing local processing and communication in a fog hierarchy. A POSITA would have recognized that fusing multiple sensor inputs for the same event to produce a single classification would improve accuracy, reduce redundant transmissions, and optimize resource usage.
Regarding claim 32, Banerjee discloses
A network system comprising (Fig. 3):
a plurality of edge devices arranged at a plurality of locations, an edge device thereof comprising (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16):
at least one data source configured to obtain environmental data, said environmental data being related to an event in the edge device or the vicinity of the edge device (paragraph [0051]: As may be appreciated in FIG. 1, at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light, road side sensors 12 capable of measuring speed, volume, weather e.g., ice, snow, water, etc. . . . , on-board devices 13 that may connect via an Access Point (AP) in the vehicle for internet, phone, or infotainment services, in vehicle sensors 14, road side traffic fog devices 15, e.g., cameras. Information from these devices may then be provided upward, to successive levels of nodes, based on the hierarchy of the architecture. Different vertical application use cases may use a fog hierarchy differently),
an edge processing means configured to produce edge processed data based on said environmental data (paragraph [0082]: The sensors 328 may be full IoT devices 302, for example, capable of both collecting data and processing the data), said edge processed data comprising at least one value representative for a class selected from a plurality of predetermined classes for classifying the event (paragraph [0006]: Analytics in fog network needs communication, processing and intelligence to be pushed directly into edge nodes; paragraph [0011]: In FL, algorithm training may be understood to move to the edge of the network, so that data may never leave the device; paragraph [0084]: A service request may be for example, a traffic condition service or a weather condition service; paragraph [0231]: The first level of the fog hierarchy may be the capturing module consisting of F-UEs. At this level, fog nodes may receive captured images from different sources. To uphold the ultra-low latency application, captured images, that is, data, may not be sent back to the cloud for training. Instead, a decentralized machine learning model may learn from it collaboratively at local fog layers, where leader nodes may coordinate the jobs to handle performance-cost trade off), and
an edge communication means for communicating edge processed data (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20);
a plurality of fog devices, a fog device thereof being associated with a subset of said plurality of edge devices (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20), said fog device comprising:
a fog communication means configured to receive edge processed data from said subset and to transmit fog processed data (paragraph [0246]: The learnt result, that is, the global belief, and data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and regional fog nodes—for further analysis and distribution), and
a fog processing means configured to process the edge processed data received from said subset and to produce fog processed data based thereon, said fog processed data comprising information about an event in the vicinity of said subset or an event in at least one edge device of said subset (paragraph [0235]: each fog layer in the hierarchy may provide additional computing, storage, and network capabilities in service of the vertical application at their level of the hierarchy; paragraph [0246]: The learnt result, that is, the global belief, and data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and regional fog nodes—for further analysis and distribution; paragraph [0254]: The learnt result, that is, the global belief, and the data from the first layer may be aggregated and sent up the fog hierarchy to the second and third layers—neighbourhood and zonal fog nodes—for further analysis and distribution. For example: in an Airport Surveillance system—parking lot, check-in area, lounge area etc. may have different fog clusters. Decentralized models at this layer may learn to make decisions on anomaly detection at those locations. In a second layer, data and prediction from these different locations may be aggregated for making more complex decisions); and
a central control system in communication with said plurality of fog devices and configured to receive fog processed data from said plurality of fog devices (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20);
wherein the at least one data source comprises at least a first sensor configured to obtain first sensed environmental data and a second sensor configured to obtain second sensed environmental data, and … (paragraph [0051]: As may be appreciated in FIG. 1, at the bottom of the hierarchy, one may find devices, e.g., sensors, such as vehicle and pedestrian aware sensors 11 in e.g., a smart traffic light, road side sensors 12 capable of measuring speed, volume, weather e.g., ice, snow, water, etc. . . . , on-board devices 13 that may connect via an Access Point (AP) in the vehicle for internet, phone, or infotainment services, in vehicle sensors 14, road side traffic fog devices 15, e.g., cameras. Information from these devices may then be provided upward, to successive levels of nodes, based on the hierarchy of the architecture).
Banerjee does not disclose wherein the edge processing means is configured to select a first class for the event based on the first sensed environmental data and to select a second class for the same event based on the second sensed environmental data, wherein preferably the edge processing means is configured to use the first class to select the second class. While Banerjee generically teaches multi-sensor inputs, aggregation, and local classification, the specific dependency “use the first class to select the second class” is not explicitly described.
Matsuo discloses wherein the edge processing means is configured to select a first class for the event based on the first sensed environmental data and to select a second class for the same event based on the second sensed environmental data, wherein preferably the edge processing means is configured to use the first class to select the second class (paragraph [0019]: A sensor device 4 is accommodated in a neighboring edge server and configured to perform communication via the edge server; paragraph [0048]: In a first example, there are a vehicle X configured to periodically transmit metadata and a vehicle Y configured to periodically transmit metadata and also transmit sensor data (a motion image of the on-board camera) in a non-periodic manner. It is assumed that a priority for the sensor data of the vehicle Y is calculated based on the metadata of a vehicle group relevant to the vehicle X. In other words, even among the sensor devices 4 of a same type, there are a sensor device 4 configured to transmit the metadata and also a sensor device 4 configured to transmit the sensor data instead of the metadata. Alternatively, there may be a sensor device 4 configured to transmit both the metadata and the sensor data. In the present invention, since the device has the sensor data collection unit 12 and the metadata collection unit 11 as components, the sensor devices 4 of the same type can be dealt with at the same time without distinguishing whether there is metadata to be transmitted or whether there is the sensor data to be transmitted). In particular, Matsuo further teaches treating devices of the same type uniformly and using multiple inputs to inform decision logic, which is analogous to combining multiple sensor data streams to make a unified determination.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Banerjee’s edge processing to incorporate Matsuo’s cross-input relevance/priority techniques. The combination is reasonably expected to yield predictable improvements: applying metadata-based relevance or group-based inference (Matsuo) at Banerjee’s edge/fog nodes would guide how multiple sensor outputs for the same event are combined or prioritized, and a POSITA would have had a reasonable expectation that using one sensor’s classification to influence or select another sensor’s classification (or the fusion strategy) would improve classification accuracy, reduce redundant transmissions, and optimize resource usage in the fog/edge hierarchy. Matsuo provides the teaching and motivation to perform such cross-input logic at edge/fog nodes in Banerjee’s architecture.
Regarding claim 33, Banerjee discloses
wherein the edge processing means
is configured to control the communication means to transmit: … (paragraph [0051]: the lower level devices provide information to respective roadside traffic fog devices 16, which may in turn feed information to the next level fog nodes, neighbourhood traffic fog devices 17, which may then provide information to the next level fog nodes, regional traffic fog devices 18, which may in turn provide information to an Enterprise Mobility Suite (EMS) cloud 19 or a Metropolitan Traffic Services Cloud 20; paragraph [0165]: Fog nodes may update their local belief by aggregating information from their local observational data with the model of their same cluster neighbours to collectively learn a model that best fits the observations for the local cluster; paragraph [0187]: the first measure of similarity in the distribution pattern of data collected by the nodes in the first plurality of nodes 110 over the first period of time, about the service requests received of a same type).
Banerjee does not disclose a single value if the selected first and second classes are the same, or first and second values representative for the selected first and second classes if the first and second classes are not the same, although Banerjee teaches receiving edge data at fog nodes and performing aggregation/analysis across multiple edge sources. Matsuo discloses a single value if the selected first and second classes are the same, or first and second values representative for the selected first and second classes if the first and second classes are not the same (paragraph [0019]: A sensor device 4 is accommodated in a neighboring edge server and configured to perform communication via the edge server; paragraph [0048]: In a first example, there are a vehicle X configured to periodically transmit metadata and a vehicle Y configured to periodically transmit metadata and also transmit sensor data (a motion image of the on-board camera) in a non-periodic manner. It is assumed that a priority for the sensor data of the vehicle Y is calculated based on the metadata of a vehicle group relevant to the vehicle X. In other words, even among the sensor devices 4 of a same type, there are a sensor device 4 configured to transmit the metadata and also a sensor device 4 configured to transmit the sensor data instead of the metadata. Alternatively, there may be a sensor device 4 configured to transmit both the metadata and the sensor data. In the present invention, since the device has the sensor data collection unit 12 and the metadata collection unit 11 as components, the sensor devices 4 of the same type can be dealt with at the same time without distinguishing whether there is metadata to be transmitted or whether there is the sensor data to be transmitted). Matsuo further discloses treating devices of the same type uniformly and handling both metadata and sensor data without distinguishing between them for processing purposes.
This teaching inherently supports logic where, if two data inputs (e.g., metadata and sensor data) are determined to correspond to the same classification or context, a single representative output can be transmitted; if they differ, multiple outputs can be sent to preserve distinct information.
It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify Banerjee’s fog node processing to incorporate Matsuo’s cross-device relevance/priority techniques, thereby enabling the fog node or edge processor to determine whether two classification outputs (from different data sources for the same event) are the same or different and to control transmission accordingly: sending a single value when they match, or both values when they differ. The motivation would have been to more efficiently use a resource of a server while requirements on a sensor device are satisfied (Matsuo paragraph [0008]). This motivation aligns with Banerjee’s goals of managing local processing and communication efficiency in a fog hierarchy. A POSITA would have recognized that such conditional transmission reduces redundant data transfer when classifications agree, while preserving necessary detail when they do not, thereby optimizing bandwidth and processing resource usage.
Claim 25 is rejected under 35 U.S.C. 103 as being unpatentable over Banerjee in view of Stringfellow et al. (US 2018/0270121, hereinafter Stringfellow).
Regarding claim 25, Banerjee discloses
wherein each subset of edge devices comprises … preferably at least fifty edge devices (Fig. 1, roadside traffic fog devices; Note: the word “preferably” will ordinarily prevent that phrase from having patentable weight, as it signals a non-limiting preferred embodiment; “at least fifty edge devices” described with “preferably” would therefore not reliably serve as a required claim element to distinguish prior art or support infringement).
Banerjee does not disclose wherein each subset of edge devices comprises at least ten edge devices, preferably at least fifty edge devices. Stringfellow discloses wherein each subset of edge devices comprises at least ten edge devices, preferably at least fifty edge devices (paragraph [0091]: The use of such hierarchies enables the present invention to scale to arbitrarily large and widespread optimization problems, that may encompass thousands of network fog nodes (or more)). It would have been obvious to one of ordinary skill in the art before the effective filing date of the claimed invention to modify the teaching of Banerjee by incorporating Stringfellow’s hierarchical organization encompassing large numbers of fog nodes. The motivation would have been to enable the system to scale to arbitrarily large and widespread optimization problems that may encompass thousands of network fog nodes or more (Stringfellow paragraph [0091]).
Conclusion
Applicant's amendment necessitated the new ground(s) of rejection presented in this Office action. Accordingly, THIS ACTION IS MADE FINAL. See MPEP § 706.07(a). Applicant is reminded of the extension of time policy as set forth in 37 CFR 1.136(a).
A shortened statutory period for reply to this final action is set to expire THREE MONTHS from the mailing date of this action. In the event a first reply is filed within TWO MONTHS of the mailing date of this final action and the advisory action is not mailed until after the end of the THREE-MONTH shortened statutory period, then the shortened statutory period will expire on the date the advisory action is mailed, and any extension fee pursuant to 37 CFR 1.136(a) will be calculated from the mailing date of the advisory action. In no event, however, will the statutory period for reply expire later than SIX MONTHS from the date of this final action.
Any inquiry concerning this communication or earlier communications from the examiner should be directed to SISLEY N. KIM whose telephone number is (571)270-7832. The examiner can normally be reached M-F, 11:30 AM - 7:30 PM.
Examiner interviews are available via telephone, in-person, and video conferencing using a USPTO supplied web-based collaboration tool. To schedule an interview, applicant is encouraged to use the USPTO Automated Interview Request (AIR) at http://www.uspto.gov/interviewpractice.
If attempts to reach the examiner by telephone are unsuccessful, the examiner’s supervisor, April Y. Blair can be reached on (571)270-1014. The fax phone number for the organization where this application or proceeding is assigned is 571-273-8300.
Information regarding the status of published or unpublished applications may be obtained from Patent Center. Unpublished application information in Patent Center is available to registered users. To file and manage patent submissions in Patent Center, visit: https://patentcenter.uspto.gov. Visit https://www.uspto.gov/patents/apply/patent-center for more information about Patent Center and https://www.uspto.gov/patents/docx for information about filing in DOCX format. For additional questions, contact the Electronic Business Center (EBC) at 866-217-9197 (toll-free). If you would like assistance from a USPTO Customer Service Representative, call 800-786-9199 (IN USA OR CANADA) or 571-272-1000.
/SISLEY N KIM/Primary Examiner, Art Unit 2196 3/20/2026