Using InfluxDB and Grafana for v2 data visualization

We generate more data using our smart controllers than I have ever done with regular documents. I prefer SCADA-style animations and was never a big graphing fan for some mathematical reasons – but that was before I discovered horizon graphs and, lately, grafana.


We started by using mysql and django for data processing and presentation but quickly ran into scalability issues. Notably, changing hardware or sensors required changing database schemas. So we shifted to nodejs and mongodb for its flat, nosql, json-based documents.

The problem with mongodb is that time is embedded in its ObjectId, which can be cumbersome for trending. We are now using nodejs, mongodb and influxdb. Influxdb is time-series based and supplements mongo's detail-oriented documents for trending. Grafana's flexible dashboards are starting to grow on us.

Let me show you how easy it is to use arduino code on v2 and visualize it with influxdb and grafana for production-grade IoT applications such as smart aquaponics.

An overview diagram is shown below:


The interface circuits on the v2 allow you to connect most arduino or pi based sensors directly to the atmega 2560 side. Numerous sensor configurations are possible because of the switched input potential-divider resistor bank.


Since most logic runs on linux and remote database applications, the sketch running on the atmega 2560 is really simple; it only returns raw pin (sensor) values, which are decoded further upstream.

Raw Sensor JSON Data

JSON is the data format used by v2. Sensors and actuators connected to the 2560 communicate with the linux ar9331 using json through the serial port. The arduino sketch exposes the raw value of all sensors connected to the atmega 2560. The json string shown below is sent to the linux microcontroller every couple of seconds.
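A representative frame looks like the sketch below – the pin names and values here are illustrative rather than a real capture; the kijanistart/kijanistop markers and the "pins" wrapper are the parts the linux-side parser keys on:

```
kijanistart
"pins":{
	"D6":0,
	"D7":1,
	"A9":512.00
}
kijanistop
```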

Data Collection Using The ATmega 2560 on V2

A typical arduino-compatible sketch to wrap the sensors in this json-centric output on the 2560 is shown below:

//parse sensors on the 2560 and return them as a json object

//declare arduino sensor input pins
const int D6 = 6;
const int D7 = 7;
const int D8 = 8;

//program setup
int initialize;
long baudRate = 38400;

//sensor read timing (interval values assumed for this sketch)
unsigned long previousMillis = 0;
const unsigned long interval = 1000;    //base interval in milliseconds
const int measureFlowRateInterval = 2;  //sensors read every 2 * interval

/*arduino hardware initialization*/
void setup() {
  pinMode(D6, INPUT);
  pinMode(D7, INPUT);
  pinMode(D8, INPUT);

  //initialize serial ports
  Serial.begin(baudRate); //used for linux communication
}

//each getDx() reads its pin and prints it as a json "pin":value pair
void getD6(){
    int d6 = digitalRead(D6);
    Serial.print("\t\"D6\":"); Serial.print(d6); Serial.println(",");
}

void getD7(){
    int d7 = digitalRead(D7);
    Serial.print("\t\"D7\":"); Serial.print(d7); Serial.println(",");
}

void getD8(){
    int d8 = digitalRead(D8); //was digitalRead(D4) - wrong pin
    Serial.print("\t\"D8\":"); Serial.println(d8);
}

//main program loop
void loop() {
  //is it time to read sensors? ... sensors all read after 'interval'
  unsigned long currentMillis = millis();
  if(currentMillis - previousMillis > measureFlowRateInterval * interval) {
    //save the last time we read the sensors
    previousMillis = currentMillis;

    //wrap the readings in the markers the linux-side parser looks for
    Serial.println("kijanistart");
    Serial.println("\"pins\":{");
    getD6();
    getD7();
    getD8();
    Serial.println("}");
    Serial.println("kijanistop");
  }
}

Compile the sketch using the arduino ide. Flick the programming switch on the board to the 2560 side and upload the program to the v2 controller using a usb cable (just like an arduino).


Test that it is working correctly using screen with the following syntax; you should see the atmega json sensor strings flowing through:

screen /dev/ttyUSB0 38400

To return from screen use:

Ctrl-a k, followed by 'y'

Pushing Sensor Data To InfluxDB Using Python

Suppose you had a light dependent resistor (LDR) connected on analogue pin A9 that measured 2.5v. Suppose you also had a float switch measuring a high logic level on digital pin 48. The atmega 2560 would represent these as the following json string to the ar9331 side:


  "A9": 2.5,

  "D48": 1,


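As an aside, the 2560's analogue pins report a raw 10-bit count (0–1023) rather than a voltage, as you will see in the raw output later; converting a count to volts is a one-liner. A minimal python sketch, assuming the usual 5 V reference (`adc_to_volts` is a hypothetical helper, not part of the v2 code):

```python
def adc_to_volts(raw, vref=5.0, max_count=1023.0):
    """Convert a raw 10-bit ATmega ADC count to a voltage."""
    return raw * vref / max_count

# a mid-scale reading of 512 counts is roughly 2.5 v
print(round(adc_to_volts(512), 2))
```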
To log this into influxdb as data from, say, a controller called 'kj_v2_03', the json would appear as follows, with columns representing pin names and points representing the respective raw data values:



    "name" : "kj_v2_03",

    "columns" : ["A9", 'D48'],

    "points" : [[2.5,1]]



A python script processes the json object from the 2560 and pushes it to influxdb. Influxdb can be installed on your localhost and configured to run without an internet connection. The script is shown below:

import serial
import re
import os
import socket

print "v2 influx parser"

usbport = '/dev/ttyATH0'

host = ''  #influxdb server base url, e.g. 'http://localhost:8086'
hostname = socket.gethostname()

ser = serial.Serial(usbport, 38400)

line = ""
names = []
columns = []

#parsers for json strings from atmega2560
a = re.compile("kijanistart")
b = re.compile('^"pins":{$')            #"pins":{
c = re.compile('^\t?"(.*?)":(.*?),$')   #'\t"A10":513.00,'
r = re.compile('^\t?"(.*?)":(.*?)$')    #'\t"A10":513.00'
d = re.compile("kijanistop")

while True:
    data = ser.read()
    if (data == "\r"):
        line = line.lstrip('\n')
        line = line.rstrip('\r')
        if (a.match(line)):
            #start of a new frame - reset the collected names and values
            names = []
            columns = []
        elif (b.match(line)):
            line = line
        elif (c.match(line) or r.match(line)):
            m = c.match(line) or r.match(line)
            name = '"' + m.group(1) + '"'
            column = m.group(2)
            names.append(name)
            columns.append(column)
        elif (d.match(line)):
            namesStr = ",".join(names)
            columnsStr = ",".join(columns)
            #curl -X POST -d '[{"name":"foo","columns":["val"],"points":[[23]]}]' 'http://" + host + "/db/mydb/series?u=root&p=root'
            post = "curl -X POST -d '[{\"name\":\"" + hostname + "\",\"columns\":[" + namesStr + "],\"points\":[[" + columnsStr + "]]}]' '" + host + "/db/kijani/series?u=root&p=root'"
            print post
            os.system(post) #fire off the post
        line = ""
    else:
        line = line + data
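The pin-matching regular expression used by the script can be exercised on its own. A minimal sketch – the sample line mirrors the format the 2560 emits, and `parse_pin` is a hypothetical helper:

```python
import re

# optional leading tab, quoted pin name, value, trailing comma
c = re.compile(r'^\t?"(.*?)":(.*?),$')

def parse_pin(line):
    """Return (pin, value) for a sensor line, or None if it does not match."""
    m = c.match(line)
    return (m.group(1), m.group(2)) if m else None

print(parse_pin('\t"A10":513.00,'))
```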

The output from this python script is an http post of sensor data to the influxdb server, as shown below:

root@kj_v2_03:~# python

The script processes the sensor data and formats it for influxdb posting as shown below:

v2 influx parser

curl -X POST -d '[{"name":"kj_v2_03","columns":["initialize","baudRate","version","atmegaUptime","D3","D4","D5","D6","D7","D8","D9","D22","D23","D25","D28","D29","D30","D31","D32","D33","D34","D35","D36","D37","D38","D39","D40","D42","D43","D44","D45","D46","D47","D48","D49","A1","A2","A3","A4","A5","A6","A7","A8","A9","A10","A11","A12","A13","A14","A15","UART2","UART3","flow_rate_sensor","rtc"],"points":[[0,38400,"v2.0.0","01:00:11:36",0,1,0,0,0,1,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,0,0,0,0,0,0,0,0,324.00,419.00,368.00,362.00,337.00,385.00,383.00,352.00,358.00,395.00,376.00,360.00,352.00,391.00,389.00,0,0,0.00,"2000/1/10 22:47:52"]]}]' ''

Of course this should fail, as we have not installed or configured our influxdb server yet.

Using InfluxDB for logging and visualizing v2 data

You can easily install your own influxDB server rather than use a hosted one. To do so:

# for 64-bit ubuntu/debian systems

Download influxDB


Install influxDB

sudo dpkg -i influxdb_latest_amd64.deb

Start the influxDB daemon

sudo /etc/init.d/influxdb start

Using an internet browser, point to the influxDB admin interface – you should see a screen similar to the one below:


Login as the user root, password root, and create a database called kijani (the database name the python script posts to) using the database details window.


Click on explore data, then enter and execute the following query to see data from the analogue sensor connected to pin A9:

select A9 from kj_v2_03 where time > now() - 1h

Influx responds with the following:


Try the following query.

select * from kj_v2_03

Depending on the power of your computer and how much data you have been collecting with your v2 controller – perhaps now is a good time to go get some coffee…
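While that runs, note that influxdb 0.8 also supports aggregate functions, which are handy for downsampling long trends – for example (a sketch, assuming the same series name):

```
select mean(A9) from kj_v2_03 group by time(10m) where time > now() - 1d
```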

Using Grafana for visualization and trending v2 sensors and devices

The more I use influxdb and grafana for looking into my data, the more I like them. They are very easy to install and configure. We will be using apache for serving grafana's dashboards. Apache is a little heavy for this, but I already had it running on my server.

Install apache:

sudo apt-get install apache2

Install grafana:

cd /tmp/


tar xavf grafana-1.9.1.tar.gz

cd /var/www/

sudo mv /tmp/grafana-1.9.1 .

cd grafana

cp config.sample.js config.js

Enable grafana to use influxdb by editing config.js and changing the influxdb section as shown below:

    /* Data sources
      * ========================================================
      * Datasources are used to fetch metrics, annotations, and serve as dashboard storage
      *  - You can have multiple of the same type.
      *  - grafanaDB: true    marks it for use for dashboard storage
      *  - default: true      marks the datasource as the default metric source (if you have multiple)
      *  - basic authentication: use url syntax http://username:password@domain:port
      */

      // InfluxDB example setup (the InfluxDB databases specified need to exist)
      /*
      datasources: {
        influxdb: {
          type: 'influxdb',
          url: "http://my_influxdb_server:8086/db/database_name",
          username: 'admin',
          password: 'admin',
        },
        grafana: {
          type: 'influxdb',
          url: "http://my_influxdb_server:8086/db/grafana",
          username: 'admin',
          password: 'admin',
          grafanaDB: true
        },
      },
      */

      // Graphite & Elasticsearch example setup

To the following (notice that the surrounding /* and */ comment markers are deleted as well):

      // InfluxDB example setup (the InfluxDB databases specified need to exist)
      datasources: {
        influxdb: {
          type: 'influxdb',
          url: "http://localhost:8086/db/kijani",
          username: 'root',
          password: 'root',
        },
        grafana: {
          type: 'influxdb',
          url: "http://localhost:8086/db/grafana",
          username: 'root',
          password: 'root',
          grafanaDB: true
        },
      },

      // Graphite & Elasticsearch example setup

Restart apache for the grafana settings to take effect:

sudo service apache2 restart

Now point your browser to your grafana install – you should see the grafana home screen as shown below:


On the lower panel, click on 'title to edit', then click on 'edit'. This is shown below:


On the query line, click on the series list and enter (or select) the controller 'kj_v2_03'. Next, click on 'value' in the select statement and select (or type) A9. The drop-down lists for controllers and sensors will only show if influx is connected correctly and you have collected some data.


The following visualization of sensor A9 on garden kj_v2_03 will be displayed by grafana:


Easy, yeah? That's it for now – we will get into creating flexible v2 smart controller dashboards using grafana shortly.