Issue 32

IoT Flavor: Sensoriada

Andrei Crăciun
Senior Software Architect @ Bosch Cluj



PROGRAMMING

The world of software development has many flavours, and software developers prefer different ones. In the world of the Internet of Things (IoT), one of the most important flavours is SENSING. Adding a MOBILE flavour to any IoT product makes the solution a very pleasant one. Yet even though sensing and mobile together would delight general users, it is hard to find developers who enjoy both flavours. On one side we have volts, amperes, sensors and actuators; on the other, views, icons, lifecycles and UX. Sensoriada is a framework that defines a context bridging the gap between these flavours, with the sole purpose of creating better applications while allowing each developer to keep working with their favourite flavour.

"Internet of Things" (IoT) is not something new; it is rather a new name for what used to be called, over the last decade at least, smart home, home automation, smart monitoring and much more. Throughout all these years one characteristic has remained the most important, the one that defines all IoT applications: they are very present in our lives (we interact with them directly, and they affect our lives directly).

In this quest we started from a simple problem, defined as follows: display on a mobile device, in a live manner, the temperature inside and outside the house. The restrictions were that we were not allowed to drill holes and that there was no fixed place where the inside temperature should be monitored. Because of these restrictions, the only option was a wireless solution. The live part was less restrictive, and not because it was easy, but because it could have been much harder: in our opinion there are at least three ways (with respect to frequency) to get data from a sensor: live, real time and hard real time.

A few of the terms used in the previous paragraphs deserve more detail.

First, sensors and actuators. Sensors are the input of an IoT project; they provide the data that is stored, processed and analyzed by the system. Actuators are the tools of reaction: after the data is analyzed and a decision is taken, an action is usually triggered on an actuator (e.g. an electronic switch that turns on a light or a heater). Both sensors and actuators are usually connected to controllers (microcontrollers, chips, CPUs). One piece of best-practice advice I received, and which I want to share with you, is that in the same project actuators and sensors should not be connected to the same controller; it is safer to have one controller for input (sensing) and one for output (action).

Next, the lifecycle of an IoT product. There are several ways to model it, but this article presents only the one I consider the most relevant for this project. In the image below, the 4 states of the lifecycle are represented:

  1. Measuring, during which the data is read from the sensors;

  2. Storing (which could also have been called Storing/Transporting), during which the data is persisted at the point where it will be analyzed (either locally in the environment, for hard real time systems, or possibly in a cloud solution, for more distributed systems);

  3. Analyzing, during which all the data read is processed and actions based on it are decided (during this phase the system may also interact with other systems, and the analysis can be a visual one, as it will be in our case);

  4. Reacting, during which the actions decided in the Analyzing phase are executed, usually by a layer of actuators.

Our system involves a mobile device, so we should state that, in general, the role of a mobile device in an IoT product falls within the Storing and Analyzing phases: the device is mainly used to visualize data and to select actions that will be transmitted to the actuators for execution during the Reacting phase.

We structured the solution we chose as a sensor network. It is worth detailing what a sensor network is; its main components are:

From the mobile application point of view we have 3 main elements of design:

The cloud cache provides data as JSON. There are several protocols and ways to get data from a sensor network, the most popular being MQTT; however, we chose HTTP with a JSON payload, mainly because it is enough for our system and the Electric Imp cloud provides a very easy way to create an HTTP server connector. We chose JSON as the payload format mainly because of its popularity, which makes processing it very easy and handy.

The structure of the JSON follows the application model described above. There is an array of sensor nodes, each element having an identification ("id", currently an integer, though there is a big debate about making it a string), an energy status ("voltage"), an accuracy ("secondsAgo" since "date") and a list of sensors. A sensor has a type, a version and one or more values; a value may itself contain other elements, and there is no restriction to a single value. The type and the version uniquely identify the kind of sensor and directly determine how the provided values are processed.

Below is an example of some data provided by the live/running system:

{  
   "sensorNodes":[  
      {  
         "id":0,
         "voltage":2848,
         "secondsAgo":247192,
         "date":"2015-01-24 07:10:21",
         "sensors":[  
            {  
               "type":10,
               "version":1,
               "value":787
            }
         ]
      },
      {  
         "id":1,
         "voltage":2892,
         "secondsAgo":32,
         "date":"2015-01-24 07:10:21",
         "sensors":[  
            {  
               "type":10,
               "version":1,
               "value":2243
            }
         ]
      }
   ]
}

From the Android point of view, the model of the Sensor Node is very simple and includes the elements defined above.

public class SensorNode {
    public long id;
    public int voltage;
    public long secondsAgo;
    public Date date;
    public List<Sensor> sensors = new LinkedList<>();
}

The Sensor model is an interface, as there is a lot of diversity depending on the type and the version of the sensor. Since this version is a read-only system, we only provide methods to get the data in a view-oriented manner; that is the main reason there is a single getHumanReadableValue() method for reading the data. Regarding the variants of a sensor, there are several fixed values for the type (defined in the Sensor.Type enum); as for the version, an implementing class should choose a maximum version and support all versions below the chosen one:

public interface Sensor {
    enum Type {
        TEMPERATURE(10),
        HUMIDITY(11),
        PRESSURE(12);
        …
    }

    String getHumanReadableValue();
    Type[] getSupportedTypes();
    boolean isTypeSupported(Type type);
    int getMaximumSupportedVersion();
}
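To make the interface concrete, a hypothetical implementation for the temperature type could look like the sketch below. The class name, its constructor and the assumption that the raw value encodes hundredths of a degree Celsius are all illustrative, not part of the Sensoriada framework (a minimal copy of the interface is included so the sketch stands alone):

```java
import java.util.Locale;

// Minimal copy of the Sensor interface from the article, so the sketch compiles standalone.
interface Sensor {
    enum Type { TEMPERATURE, HUMIDITY, PRESSURE }

    String getHumanReadableValue();
    Type[] getSupportedTypes();
    boolean isTypeSupported(Type type);
    int getMaximumSupportedVersion();
}

// Illustrative sketch only: the class name, the constructor and the assumption
// that the raw value is in hundredths of a degree are NOT part of Sensoriada.
class TemperatureSensor implements Sensor {
    private final int rawValue; // e.g. 2243 from the JSON payload

    TemperatureSensor(int rawValue) {
        this.rawValue = rawValue;
    }

    @Override
    public String getHumanReadableValue() {
        // Locale.US keeps the decimal point stable regardless of device locale.
        return String.format(Locale.US, "%.2f °C", rawValue / 100.0);
    }

    @Override
    public Type[] getSupportedTypes() {
        return new Type[] { Type.TEMPERATURE };
    }

    @Override
    public boolean isTypeSupported(Type type) {
        return type == Type.TEMPERATURE;
    }

    @Override
    public int getMaximumSupportedVersion() {
        return 1;
    }
}
```

With such an implementation, the sample value 2243 from the JSON above would be displayed as "22.43 °C" (again, assuming the hundredths-of-a-degree encoding).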

The world of automotive has taught us that things should be kept very simple, otherwise testing becomes very difficult; and if IoT systems are not properly tested, errors can have regrettable consequences because, as stated at the beginning of the article, IoT systems are very present in our lives and interact with them directly. That being said, we wanted the users of the framework to have a very easy way to use it.

Below is a code sample of usage in a classic Android application:

String nodesJson = getSomeHowTheJson();
// DataProvider in the next version

List<SensorNode> sensorNodes =
    SensorNodeUtil.parseSensorNodes(nodesJson);

int mySensorNodeIndex = someValue;
// String identification in the next version

int mySensorIndex = someValue;
// byType identification in the next version

someView.setText(sensorNodes.get(mySensorNodeIndex)
    .sensors.get(mySensorIndex)
    .getHumanReadableValue());

The magic happens in the SensorNodeUtil class, which is in charge of processing the JSON input and returning a list of SensorNode-s, each containing the proper implementation of Sensor based on type and version. To achieve this we implemented a way to configure the system: the configuration mainly holds a map between a sensor type and the name of the class that will handle the data for that type. This allows the framework to be extended by anyone, while also shipping with a default configuration out of the box.
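A minimal sketch of how such a type-to-class configuration could work is shown below. The class and method names here are assumptions for illustration, not the actual SensorNodeUtil API: a map from the sensor type code to a handler class name, with instances created through reflection so that third parties can register their own types.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of a type-to-class configuration; the names below are
// assumptions, not the actual Sensoriada API.
class SensorConfiguration {
    // Maps a sensor type code (e.g. 10 = temperature) to a handler class name.
    private final Map<Integer, String> handlers = new HashMap<>();

    // Anyone extending the framework can register a handler for a new type;
    // a default, out-of-the-box configuration would pre-populate this map.
    void register(int typeCode, String handlerClassName) {
        handlers.put(typeCode, handlerClassName);
    }

    // Creates the handler instance for a type via reflection, so the framework
    // never needs compile-time knowledge of the concrete sensor classes.
    Object createHandler(int typeCode) throws ReflectiveOperationException {
        String className = handlers.get(typeCode);
        if (className == null) {
            throw new IllegalArgumentException("No handler for type " + typeCode);
        }
        return Class.forName(className).getDeclaredConstructor().newInstance();
    }
}
```

The reflection-based lookup is what makes the extension point possible: a new sensor type only needs a class name in the configuration, not a change to the parsing code.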

Conclusions

At the beginning of the article we stated that our goal was to create a bridge that allows engineers working in IoT to work together, no matter which flavour they prefer. Whether or not we achieved this with the first version of the framework, time will tell; but we consider we made a few steps forward, and as the IoT community evolves we hope Sensoriada will help build a synergy between developers that leaves no room for conflict, only for harmony and constructive debates :-).
