
Guest blog post by Ajit Jaokar



In this series of exploratory blog posts, we explore the relationship between recurrent neural networks (RNNs) and IoT data. The article is written by Ajit Jaokar, Dr Paul Katsande and Dr Vinay Mehendiratta as part of the Data Science for Internet of Things practitioners course. Please contact [email protected] for more details.


RNNs are already used for Time series analysis. Because IoT problems can often be modelled as a Time series, RNNs could apply to IoT data. In this multi-part blog, we first discuss Time series applications, then discuss how RNNs apply to them, and finally discuss their applicability to IoT.


In this article (Part One), we present the overall thought process behind the use of recurrent neural networks for Time series applications, especially a type of RNN called Long Short-Term Memory networks (LSTMs).

Time series applications

The process of Prediction involves making claims about the future state of something based on its past and current values. Many IoT applications (such as temperature values, smart meter readings etc.) have a time dimension. Classical pattern recognition problems are concerned either with dependencies between variables (Regression) or with the classification of input vectors into categories (Classification). By including changes over time, we add a temporal dimension, giving us Spatio-temporal data. Even a continuous phenomenon can be converted to a Time series by the process of sampling; thus phenomena such as speech and ECGs can be modelled as time series when sampled.


Hence, we have a variable x changing in time, x(t) (t = 1, 2, ...), and we would like to predict the value of x at time t+h. Time series forecasting is a problem of function approximation, and a forecast is evaluated by computing an error measure over the series. Given a time series model, we can also solve many related problems, for example: forecast the value of the variable at a time t, x(t); classify the state of the series at a future time (will prices go up or down?); or model one time series in terms of another (for example, Oil prices in terms of Interest rates).
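As a minimal illustration of evaluating a forecast with an error measure, the sketch below scores a naive "persistence" forecast (predict each value from the previous one) with mean squared error. The function name and the tiny sample series are illustrative, not from the original article.

```python
import numpy as np

def forecast_error(actual, predicted):
    # mean squared error over the forecast horizon
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.mean((actual - predicted) ** 2)

# naive persistence forecast: x(t+1) = x(t)
series = np.array([1.0, 1.2, 1.1, 1.4, 1.3])
preds = series[:-1]              # predict each value from its predecessor
error = forecast_error(series[1:], preds)
print(error)
```

Any model that cannot beat this persistence baseline on the chosen error measure is adding no predictive value.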

Neural networks for Time series Prediction

However, while many scenarios (such as weather prediction, foreign exchange fluctuations, energy consumption etc.) can be expressed as a time series, formulating and solving the equations is hard in these cases for reasons such as:

a) there are too many factors influencing the outcome;

b) there are hidden/unknown factors influencing the outcome.

In many such scenarios, the focus is not on finding the precise solution to the problem but rather on finding a possible steady state to which the system will converge. Neural networks often apply in such scenarios because they can learn from examples alone and can capture hidden, strongly non-linear dependencies.

Neural networks are trained on historical data with the objective that the network will discover hidden dependencies and be able to use them for predicting the future. Thus, neural networks are not represented by an explicitly given model and can spot nonlinear dependencies in spatio-temporal patterns. They also ease the problem of feature engineering, i.e. finding the best representation of the sample data from which to learn a solution to your problem.

Neural networks for time series processing – incorporating the Time domain

When used for Time series forecasting, the obvious first problem is: how do we model Time in a neural network? For traditional applications of neural networks (such as pattern recognition or classification), we do not need to model the Time dimension. Time is difficult to model in a neural network because it constantly moves forward. However, by including a set of delays, we can retain successive values in the time series, so that each past value is treated as an additional spatial dimension. This process of converting the time dimension into a (in principle infinite-dimensional) spatial vector is called embedding. Because for practical purposes we need a limited set of values, we consider a history of the previous d samples (the embedding dimension), as shown in the figure below.



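The delay-embedding described above can be sketched in a few lines of code. This is an illustrative helper (the name `embed` and the sample series are ours, not from the article): it slides a window of the previous d samples over the series to build input rows, each paired with the value h steps ahead.

```python
import numpy as np

def embed(series, d, h=1):
    """Convert a univariate series into (X, y) pairs:
    each row of X holds d successive past values, and
    y is the value h steps ahead of that window."""
    series = np.asarray(series)
    X, y = [], []
    for t in range(d, len(series) - h + 1):
        X.append(series[t - d:t])    # the d most recent samples
        y.append(series[t + h - 1])  # the target, h steps ahead
    return np.array(X), np.array(y)

series = [10, 11, 12, 13, 14, 15]
X, y = embed(series, d=3)
# X: [[10 11 12], [11 12 13], [12 13 14]]
# y: [13 14 15]
```

Once embedded this way, the temporal problem becomes an ordinary regression problem that any feed-forward network (or other regressor) can be trained on.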
For an evolution of neural networks, see the previous post Evolution of Deep learning models. Recurrent neural networks are often used for modelling Time series; an example is Using Recurrent Neural Networks to Forecast Forex (pdf).


A recurrent neural network (RNN) is a class of artificial neural network in which connections between units form a directed cycle. This creates an internal state that allows the network to exhibit dynamic temporal behaviour. Unlike feedforward neural networks, RNNs can use their internal memory to process arbitrary sequences of inputs, which makes them applicable to tasks such as unsegmented connected handwriting recognition, where they have achieved the best known results (Wikipedia). The fundamental feature of an RNN is that the network contains at least one feedback connection, so activations can flow around a loop. This enables the network to do temporal processing and learn sequences, e.g. perform sequence recognition/reproduction or temporal association/prediction. Using the same time-delay idea as above, a recurrent neural network can be converted into a traditional feed-forward neural network by unfolding it over time, as shown below.





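The recurrence behind this unfolding can be sketched as a simple Elman-style forward pass. This is a minimal illustration (the weight shapes and random inputs are assumptions for the example, not from the article): the hidden state h computed at one step is fed back in at the next, and unrolling this loop over the sequence yields the equivalent feed-forward network.

```python
import numpy as np

def rnn_forward(xs, W_xh, W_hh, b_h):
    """Run a simple recurrent layer over a sequence.
    The hidden state h carries information forward in time;
    unfolding this loop step by step gives a feed-forward net."""
    h = np.zeros(W_hh.shape[0])
    states = []
    for x in xs:                              # one iteration per time step
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)  # new state from input + old state
        states.append(h)
    return np.array(states)

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(4, 1))                # input-to-hidden weights
W_hh = rng.normal(size=(4, 4))                # hidden-to-hidden (feedback) weights
b_h = np.zeros(4)
xs = rng.normal(size=(5, 1))                  # a sequence of 5 scalar inputs
states = rnn_forward(xs, W_xh, W_hh, b_h)
print(states.shape)                           # (5, 4): one hidden state per step
```

The feedback term `W_hh @ h` is what distinguishes this from a plain feed-forward layer: without it, each time step would be processed in isolation.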
RNNs are used to model Time series because the feedback mechanism creates a 'memory', i.e. an ability to process the Time dimension. Memory is important because many Time series problems (such as Traffic modelling) need long-term/historical modelling of Time values. Long Short-Term Memory networks (LSTMs) are a special kind of RNN, capable of learning long-term dependencies because they can remember information over long time frames. The figure below summarises feed-forward neural networks and Recurrent neural networks.
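The long-term memory of an LSTM comes from its gated cell state. The sketch below is a bare single-step LSTM cell (the stacked-gate parameter layout and random initialisation are assumptions made for illustration): the forget and input gates decide what to erase from and write into the cell state c, and the output gate decides what to expose as the hidden state h.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step. W, U, b stack the parameters of the four
    gates (input, forget, cell candidate, output), n rows each.
    The additive update of c is what lets the cell retain
    information over long time frames."""
    n = h.shape[0]
    z = W @ x + U @ h + b
    i = sigmoid(z[0:n])          # input gate: how much new info to write
    f = sigmoid(z[n:2*n])        # forget gate: how much old memory to keep
    g = np.tanh(z[2*n:3*n])      # candidate cell values
    o = sigmoid(z[3*n:4*n])      # output gate: how much memory to expose
    c = f * c + i * g            # long-term cell state
    h = o * np.tanh(c)           # short-term hidden output
    return h, c

rng = np.random.default_rng(1)
n, m = 3, 1                      # hidden size, input size
W = rng.normal(size=(4 * n, m))
U = rng.normal(size=(4 * n, n))
b = np.zeros(4 * n)
h, c = np.zeros(n), np.zeros(n)
for x in rng.normal(size=(6, m)):  # run the cell over a 6-step sequence
    h, c = lstm_step(x, h, c, W, U, b)
```

Because c is updated additively (f * c + i * g) rather than being squashed through a nonlinearity at every step, gradients flowing back through c decay far more slowly than in a plain RNN, which is why LSTMs handle long-term dependencies better.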


Implications for IoT datasets

In subsequent articles, we will explore LSTMs in greater detail and implications for IoT data.

This article covers topics we teach in the Data Science for Internet of Things practitioners course. Please contact [email protected] for more details.
