Driving the evolution of NILM technology


The "Driving the evolution of NILM technology" project was led by Professor Knottenbelt and Jack Kelly from the Department of Computing, and was overseen by Dr Dominique Bertin of EDF Energy R&D UK Centre. The project's aim was to suggest ways of measuring the performance, and accelerating the development, of technology that uses smart meter data to estimate the energy consumed by individual appliances.

The project was part of the EDF FlexiFund, a collaboration between Imperial College London and EDF Energy R&D UK Centre. This initiative enabled short-term exploratory projects in areas of common interest to both organisations, with the expectation that these precursor projects could lead to longer-term research.


The project's focus was on non-intrusive load monitoring (NILM) algorithms that use smart meter data to estimate the energy consumed by individual appliances without monitoring each device individually.
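To make the setting concrete, here is a toy sketch (all appliance names and wattages are illustrative, not from the project): a smart meter reports only the aggregate power signal, which is the sum of the individual appliance loads, and a NILM algorithm's task is to invert that sum.

```python
# Illustrative per-appliance power readings (watts) at fixed intervals.
# These values are made up for demonstration purposes.
appliances = {
    "fridge": [90, 90, 95, 90],
    "kettle": [0, 2000, 2000, 0],
    "lights": [60, 60, 60, 60],
}

# The only signal a smart meter provides: the element-wise sum of all loads.
aggregate = [sum(readings) for readings in zip(*appliances.values())]
print(aggregate)  # [150, 2150, 2155, 150]
```

A NILM algorithm sees only `aggregate` and must estimate the hidden per-appliance series, which is what makes the problem hard: many different appliance combinations can produce the same aggregate signal.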

This disaggregated energy data has the potential to give users and utility companies unprecedented insight into appliance usage. For example, disaggregated data might help users to reduce their energy consumption. It could also help utilities attract users to their websites, reducing call-centre costs. Furthermore, disaggregated data may help to enable technologies such as demand-side response.

The project set out to help solve the problem of comparing the performance of different NILM methods. This is a complex issue, as each researcher and NILM company uses a different set of metrics and datasets.


The project proposed two broad solutions to the issue:

  • A NILM algorithm competition
    A web platform that provides datasets for competitors to disaggregate. Their estimates would be uploaded and compared against secret ground-truth values, and the platform would then compute multiple performance metrics.
  • A NILM "gym"
    An open source software tool that would simulate large amounts of usage data for a wide range of different scenarios. NILM developers would then train and test their algorithms on this synthetic data, and the tool would score the resulting estimates against a range of performance metrics. To create the gym, a high-fidelity simulator of disaggregated energy data would need to be built.
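The scoring step shared by both proposals can be sketched as follows. This is a minimal, hypothetical illustration of comparing a competitor's per-appliance estimates against held-back ground truth using two common metrics; the metric choices, function names, and numbers are assumptions for demonstration, not the project's actual specification.

```python
def mean_absolute_error(estimated, truth):
    """Average absolute difference per reading (watts)."""
    return sum(abs(e - t) for e, t in zip(estimated, truth)) / len(truth)

def total_energy_error(estimated, truth):
    """Relative error in the total estimated energy."""
    return abs(sum(estimated) - sum(truth)) / sum(truth)

# Illustrative ground truth (held secret by the platform) and a
# competitor's uploaded estimates, in watts per reading.
truth = {"kettle": [0, 2000, 2000, 0], "fridge": [90, 90, 95, 90]}
estimates = {"kettle": [0, 1900, 2100, 0], "fridge": [85, 95, 95, 85]}

for appliance in truth:
    mae = mean_absolute_error(estimates[appliance], truth[appliance])
    tee = total_energy_error(estimates[appliance], truth[appliance])
    print(f"{appliance}: MAE={mae:.1f} W, total-energy error={tee:.2%}")
```

Reporting several metrics side by side matters here: an algorithm can score well on total energy (the errors cancel) while still mistiming individual appliance events, which is exactly the ambiguity the project's standardised evaluation aimed to resolve.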