Hussein Fawaz, Omran Ayoub, Davide Andreoletti, Silvia Giordano
SUPSI, Switzerland
IEEE WONS 2026
Other version: [IEEE](https://www.researchgate.net/profile/Hussein-Fawaz-8/publication/403692932_Energy_Cost_of_Enhancing_Reliability_of_Machine_Learning_Models_for_Edge_IoT_Security/links/69d8eb5d5970dd1b05f78f8c/Energy-Cost-of-Enhancing-Reliability-of-Machine-Learning-Models-for-Edge-IoT-Security.pdf) [Code]
Objective
Enhancing the reliability of machine learning models for IoT security improves detection quality, but it also increases energy consumption. This paper quantifies that trade-off in edge network intrusion detection systems.
Reliability vs. energy at the edge
Edge IoT devices operate under strict energy constraints, yet they are increasingly expected to run reliable ML-based security mechanisms such as network intrusion detection systems (NIDS). Techniques that improve reliability—such as uncertainty-aware models or calibration—often increase computational cost.
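One common way to quantify the reliability of such models is Expected Calibration Error (ECE), which measures how well predicted confidences match empirical accuracy. Below is a minimal, illustrative sketch of ECE in plain NumPy; the function name and binning choices are ours, not from the paper.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Expected Calibration Error (ECE), a standard reliability metric.

    Bins predictions by confidence, then averages the gap between the
    mean confidence and the empirical accuracy in each bin, weighted
    by the fraction of samples falling in that bin.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            gap = abs(confidences[mask].mean() - correct[mask].mean())
            ece += mask.mean() * gap  # bin weight x calibration gap
    return ece
```

A perfectly calibrated detector (e.g. 80% confidence on predictions that are right 80% of the time) yields an ECE of zero; overconfident NIDS models, common in practice, yield larger values.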
This raises a key question: how much energy does reliability actually cost?
Methodology
We evaluate multiple ML-based intrusion detection pipelines deployed at the edge and systematically enhance their reliability using techniques such as calibration-aware learning and uncertainty modeling. We then measure:
- Detection performance
- Reliability metrics
- Energy consumption under realistic edge workloads
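The energy side of such a measurement can be sketched as follows. This is a simplified, hypothetical harness: it times repeated inference calls and converts latency to energy under an assumed constant average power draw (a placeholder; real edge measurements would use on-board power sensors or RAPL-style counters).

```python
import time

def estimate_energy_joules(infer_fn, batch, avg_power_watts=5.0, runs=10):
    """Rough per-inference energy estimate for a model callable.

    `avg_power_watts` is a hypothetical constant power draw (e.g. ~5 W
    for a small edge board); energy = power x time. Real deployments
    should replace this with hardware power telemetry.
    """
    start = time.perf_counter()
    for _ in range(runs):
        infer_fn(batch)
    latency = (time.perf_counter() - start) / runs  # seconds per call
    return avg_power_watts * latency  # joules per inference
```

Comparing this estimate for a baseline detector against its reliability-enhanced variant (e.g. with Monte Carlo dropout, which multiplies the number of forward passes) makes the energy cost of reliability directly visible.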
Key findings
- Improving reliability consistently increases energy usage, but not always proportionally
- Some reliability techniques offer better energy–reliability trade-offs
- Feature and model choices strongly influence energy efficiency
Implications
Our results highlight the importance of energy-aware trustworthy AI for edge security deployments, where reliability must be balanced against limited resources.
Future work
We plan to release an open-source benchmarking framework for evaluating reliability–energy trade-offs in edge NIDS.