Neural NILM

Deep Neural Networks Applied to
Energy Disaggregation

Jack Kelly & William Knottenbelt
Imperial College London

Energy Disaggregation

Aim: Itemised Energy Bills

Outline

  1. Why use deep neural nets (DNNs) for NILM?
  2. How DNNs work
  3. Three DNN architectures for NILM
  4. Data augmentation
  5. Results
  6. Summary

Name the Appliance?

Face Recognition

Manual Feature Extraction

georgehart.com/research/hartbiog.html

scgp.stonybrook.edu/archives/8516

Deep Neural Nets

Automatic Feature Learning

ImageNet Large Scale Visual Recognition Challenge (ILSVRC)

From: Krizhevsky, Sutskever & Hinton. ImageNet Classification with Deep Convolutional Neural Networks. NIPS (2012)

Image from devblogs.nvidia.com

Krizhevsky et al.'s DNN Results on ImageNet 2012

Krizhevsky, Sutskever & Hinton. ImageNet Classification with Deep Convolutional Neural Networks. NIPS (2012)

Outline

  1. Why use deep neural nets (DNNs) for NILM?
  2. How DNNs work
  3. Three DNN architectures for NILM
  4. Data augmentation
  5. Results
  6. Summary

The Artificial Neuron

Image adapted from WikiMedia Commons image by Chrislb
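
In code, an artificial neuron is just a weighted sum of its inputs passed through a nonlinearity. A minimal NumPy sketch (the weights, bias and sigmoid choice are illustrative):

    import numpy as np

    def sigmoid(z):
        # squash the weighted sum into the range (0, 1)
        return 1.0 / (1.0 + np.exp(-z))

    def neuron(x, w, b):
        # weighted sum of inputs plus a bias, then the nonlinearity
        return sigmoid(np.dot(w, x) + b)

    x = np.array([0.5, -1.2, 3.0])   # example inputs
    w = np.array([0.1, 0.4, -0.2])   # example weights
    print(neuron(x, w, b=0.05))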

Feed Forward Nets

Krizhevsky et al.'s Architecture for ImageNet 2012

Krizhevsky, Sutskever & Hinton. ImageNet Classification with Deep Convolutional Neural Networks. NIPS (2012)

Training
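
Training adjusts the weights to reduce a loss, typically by stochastic gradient descent. A hedged sketch of one update step for the neuron above, using squared error (the learning rate and loss choice are illustrative):

    # continues the neuron sketch above (sigmoid and neuron defined there)
    def sgd_step(x, y, w, b, lr=0.1):
        y_hat = neuron(x, w, b)                          # forward pass
        grad_z = 2 * (y_hat - y) * y_hat * (1 - y_hat)   # d(error)/d(pre-activation)
        return w - lr * grad_z * x, b - lr * grad_z      # step downhill on the loss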

Autoencoders

Autoencoder Examples

Hinton & Salakhutdinov. Reducing the dimensionality of data with neural networks. Science (2006)

Denoising Autoencoders

Image from Marc'Aurelio Ranzato

Vincent et al. Extracting and composing robust features with denoising autoencoders. ICML (2008)
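
The denoising trick: corrupt the input, keep the clean signal as the training target. A minimal sketch of building one training pair, assuming Gaussian corruption (for NILM, the 'noise' is instead the other appliances in the aggregate signal):

    import numpy as np

    rng = np.random.default_rng(42)

    def make_denoising_pair(clean, noise_std=0.1):
        # the network sees the corrupted series...
        noisy = clean + rng.normal(0.0, noise_std, size=clean.shape)
        # ...but is trained to reconstruct the clean series
        return noisy, clean

    noisy_input, target = make_denoising_pair(np.sin(np.linspace(0, 10, 100)))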

Recurrent Neural Nets

Long Short-Term Memory (LSTM) Cells

Image from blog.otoro.net

Hochreiter & Schmidhuber. Long short-term memory. Neural Computation (1997)

Recurrent Neural Nets

Playing Volleyball :)

By hardmaru / ōtoro / 大トロ

Outline

  1. Why use deep neural nets (DNNs) for NILM?
  2. How DNNs work
  3. Three DNN architectures for NILM
    1. Recurrent Neural Nets (LSTM)
    2. Denoising Autoencoder
    3. 'Bounding rectangle' around the target
  4. Data augmentation
  5. Results
  6. Summary

Recurrent Neural Nets

Denoising Autoencoders

Bounding Rectangle
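
This architecture regresses a rectangle: the start time, end time and average power of the target appliance's activation within the input window. A hedged sketch of turning such a prediction back into a power series (function and parameter names are illustrative):

    import numpy as np

    def rectangle_to_power(start, end, avg_power, window_len):
        # paint the predicted average power between the predicted start and end
        power = np.zeros(window_len)
        lo, hi = max(int(round(start)), 0), min(int(round(end)), window_len)
        power[lo:hi] = avg_power
        return power

    # e.g. a kettle predicted on from sample 120 to 180 at ~2.4 kW
    series = rectangle_to_power(120, 180, 2400.0, window_len=600)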

Outline

  1. Why use deep neural nets (DNNs) for NILM?
  2. How DNNs work
  3. Three DNN architectures for NILM
  4. Data augmentation
  5. Results
  6. Summary

DNNs need lots of data!

Data Augmentation for Images of Plankton

Raw

Augmented

From the ≋ Deep Sea ≋ team (Dieleman et al.), Kaggle National Data Science Bowl plankton competition

Data Augmentation for NILM

  • Extract individual appliance activations from real data
  • For each synthetic example:
    • Randomly pick which appliances to include
    • Randomly pick individual activations for each chosen appliance
    • Randomly position each activation in time (see the sketch below)
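
A minimal sketch of this recipe, assuming activations are stored as a dict of 1-D power arrays per appliance (the real pipeline lives in the neuralnilm repo):

    import numpy as np

    rng = np.random.default_rng(0)

    def synthesize_aggregate(activations_per_appliance, window_len=512):
        aggregate = np.zeros(window_len)
        for name, activations in activations_per_appliance.items():
            if rng.random() < 0.5:      # skip this appliance half the time
                continue
            act = activations[rng.integers(len(activations))][:window_len]  # pick one
            start = rng.integers(0, window_len - len(act) + 1)  # random alignment
            aggregate[start:start + len(act)] += act
        return aggregate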

Outline

  1. Why use deep neural nets (DNNs) for NILM?
  2. How DNNs work
  3. Three DNN architectures for NILM
  4. Data augmentation
  5. Results
  6. Summary

Example Output

LSTM

Autoencoder

Rectangles

Metrics
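
F1 score on the appliance's inferred on/off state is one common NILM metric; a minimal sketch (the 10 W on-threshold is an illustrative assumption, not necessarily the one used in the talk):

    import numpy as np

    def f1_on_off(pred_power, true_power, on_threshold=10.0):
        # threshold both power series into binary on/off states
        pred_on = pred_power > on_threshold
        true_on = true_power > on_threshold
        tp = np.sum(pred_on & true_on)
        fp = np.sum(pred_on & ~true_on)
        fn = np.sum(~pred_on & true_on)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        return 2 * precision * recall / (precision + recall) if precision + recall else 0.0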

Metrics on Seen Appliances

Metrics on Unseen Appliances

Outline

  1. Why use deep neural nets (DNNs) for NILM?
  2. How DNNs work
  3. Three DNN architectures for NILM
  4. Data augmentation
  5. Results
  6. Summary

Summary

  1. Developed three deep neural net architectures for NILM
  2. They perform better than NILMTK's combinatorial optimisation (CO) and factorial hidden Markov model (FHMM) benchmarks (on UK-DALE)
  3. Code and data available:
    1. www.doc.ic.ac.uk/~dk3810/neuralnilm
    2. github.com/JackKelly/neuralnilm
  4. We've only scratched the surface!