IoT is pervading our lives through many of its daily-use applications, from Fitbit to smart home automation tools. But where the IoT is more likely to be revolutionary is in other sectors:
· In industry, in smart cities and in building management, smart connected IoT sensors are becoming so cheap that they'll soon pervade all critical equipment. This will allow better monitoring of that equipment, making it possible not only to react faster to failures, but to put Predictive Failure Analysis in place of the more expensive Planned Preventative Maintenance.
· In transportation and logistics, IoT may bring better control not only of the equipment but of the whole supply chain. Sensitive goods themselves will be tracked during their journey from departure to destination, with precise recording of shocks, vibrations, temperature and humidity variations, etc.
· Telemedicine, for example, is forecast to be revolutionized by IoT, producing great quantities of sensitive data.
All these applications will create a huge amount of data to be stored in databases. This is a challenge for DBAs, who will see, alongside the usual application-driven databases, less structured databases appear, fed by thousands or tens of thousands of remote IoT devices.
Most of the time, real-time analytics will have to run on this data in order to take quick decisions (whether reactive or proactive), creating challenges in terms of space and performance, and in how to historicize or purge data that is produced in such huge quantities. The more quickly a decision must be taken, the more frequently the data must be measured, and this creates a high volume of data to be transferred, stored and analyzed. While we need detailed, fresh data to take reactive decisions, most of it is of little or no use after it has been created and analyzed. To draw a daily report of a fluid temperature, a reading every 5 minutes is sufficient, while I may need one measurement per second (300 times more) to be reactive enough in my industrial process.
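A quick back-of-the-envelope calculation makes the 300x figure concrete. This is a minimal sketch, assuming a hypothetical 16-byte record per reading (timestamp plus value); real payload sizes will vary by protocol and format:

```python
# Rough data-volume estimate for a single sensor at two sampling rates.
# The 16-byte record size is an illustrative assumption, not a standard.

RECORD_BYTES = 16
SECONDS_PER_DAY = 24 * 60 * 60


def daily_volume(interval_s: float, record_bytes: int = RECORD_BYTES) -> int:
    """Bytes produced per day by one sensor read every `interval_s` seconds."""
    return int(SECONDS_PER_DAY / interval_s) * record_bytes


reporting = daily_volume(300)  # one read every 5 minutes, enough for a daily report
reactive = daily_volume(1)     # one read per second, needed for reactive control

print(reporting, reactive, reactive // reporting)  # the ratio is exactly 300
```

Multiply that by tens of thousands of sensors and the case for discarding data close to the source becomes obvious.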
So the question arises: does it make sense to centralize huge amounts of data and run centralized analytics on an infrastructure that may be difficult to up-size fast enough to follow a growth that may be exponential?
CERN, whose experiments produce a data flow of about 25 GB/s (gigabytes per second), has developed a two-stage process to handle such a huge amount of data: the first stage uses algorithms to select the events that need to be further analyzed and to discard the events that are of no interest.
This is similar to the process that can be put in place near the IoT sensors (at the "edge") thanks to the specialized IoT Edge Gateways proposed by Dell, which are able to collect IoT data and run analytics on it, take immediate decisions, and push the valuable information (and only that) to a centralized location where it will be stored for future, detailed analysis.
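The first-stage selection described above can be sketched very simply. This is a hypothetical illustration, not Dell's or CERN's actual algorithm: readings inside an expected band are discarded at the edge, and only out-of-band events are forwarded upstream:

```python
# Hypothetical sketch of an edge-side first-stage filter: discard readings
# in the normal operating band, forward only the interesting events.

from typing import Iterable, List


def edge_filter(readings: Iterable[float], low: float, high: float) -> List[float]:
    """Keep only readings outside the expected [low, high] band."""
    return [r for r in readings if r < low or r > high]


# A stream of temperature readings; only the out-of-band spikes survive.
stream = [20.1, 20.3, 95.7, 20.2, 19.9, -3.4, 20.0]
forwarded = edge_filter(stream, low=0.0, high=60.0)
print(forwarded)  # [95.7, -3.4]
```

In this toy example, five of seven readings never leave the gateway; only the two anomalies travel to the central database.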
Dell's IoT Edge Gateways are rugged, fanless gateways powered by Celeron processors that connect to wired and wireless IoT devices and, through Statistica -or the middleware of your choice- can aggregate and analyze their output, allowing you to take immediate decisions and to send only the meaningful data to your databases, also saving precious bandwidth. They can be placed in factories or buildings where temperatures are harsh, or even mounted on trucks.
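The bandwidth saving from edge aggregation can also be illustrated. This is a minimal sketch under assumed window sizes, not how Statistica actually aggregates: each fixed-size window of raw readings is reduced to a (min, mean, max) summary before anything is sent upstream:

```python
# Hypothetical sketch of edge-side aggregation: instead of pushing every raw
# reading upstream, summarize each fixed-size window and send only the
# (min, mean, max) triple, cutting traffic by roughly the window size.

from statistics import mean
from typing import List, Tuple


def summarize_windows(readings: List[float], window: int) -> List[Tuple[float, float, float]]:
    """Return (min, mean, max) for each consecutive window of readings."""
    return [
        (min(w), mean(w), max(w))
        for w in (readings[i:i + window] for i in range(0, len(readings), window))
    ]


raw = [20.0, 20.4, 20.2, 21.0, 20.8, 20.6]
print(summarize_windows(raw, window=3))  # two summaries instead of six readings
```

With a one-second sampling rate and a five-minute summary window, this kind of scheme sends one record upstream for every 300 measured, matching the 300x ratio discussed earlier.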
The Dell IoT Edge Gateways can also connect to devices through typical IoT protocols such as ModBus, BACnet, ZigBee, etc., and they allow for a high level of security in data transmission; they will therefore be a mandatory component of all IoT architectures.
In conclusion, the DBA should participate from the early stages of an IoT project, evaluate the quality and quantity of the data that will be produced, and ensure that decentralized analytics are put in place at the edge so that only meaningful data is uploaded into the databases he'll manage.
Fabrizio Faleni, PMP
CERN Data processing
Dell IoT Edge Gateways