I've started measuring temperatures in and around the house, just for fun; I might use this in the future for some home automation. I based most of it on this page, and of course I changed some things.

I bought a bundle of 10 waterproof DS18B20 sensors with a 3 meter lead on them. I planned a linear topology, but halfway through decided to test the stubbed topology first. It works flawlessly, so I see no reason to change it. The total network contains 8 sensors, two of them on one 10 meter cable, one on a 15 meter cable. I still have enough 100 Ohm resistors lying around in case problems pop up. The total length is well within specs.


  • Connected the DS18B20 sensors to the Raspberry Pi. Used a 4.7 kOhm (4700 Ohm) pull-up resistor.
  • Installed the sensors: two outside, one in the crawlspace, one in the living room, one in the attic, and three on the central heating (warm water out, heating out, heating return). The central-heating sensors are zip-tied to the pipes with a little heatsink compound I had lying around.
  • Loaded the 1-wire kernel modules (w1-gpio and w1-therm) on the Raspberry Pi 2 Model B.
  • Made the wireless network of the Raspberry Pi reconnect automatically (second answer) by changing /etc/ifplugd/action.d/ifupdown.
  • Created a MySQL database and two tables on my Linux machine.
  • Made (borrowed and adapted) a Python script that:
    - scans for sensors instead of reading from a fixed list
    - reads each sensor's temperature 3 times, checking the CRC each time
    - reports the average of the valid readings for each sensor
  • Added the script to cron on the Raspberry Pi, to be executed every minute. I've configured NTP on the Raspberry Pi, but since the records are inserted using the central server's time, that's not really important.
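The measuring script above can be sketched roughly like this. This is an illustration, not the actual script; the sysfs path is the standard one the w1-therm kernel module exposes, and the helper names are my own.

```python
# Sketch of the measuring script: scan /sys/bus/w1/devices for DS18B20
# sensors (family code 28), read each one three times, keep only the
# CRC-valid readings, and report the average per sensor.
import glob
import statistics

W1_GLOB = "/sys/bus/w1/devices/28-*/w1_slave"  # DS18B20 family code is 28

def parse_w1_slave(text):
    """Return the temperature in degrees C, or None if the CRC check failed."""
    lines = text.strip().splitlines()
    # the kernel ends the first line with "crc=.. YES" or "crc=.. NO"
    if len(lines) < 2 or not lines[0].endswith("YES"):
        return None
    _, _, raw = lines[1].partition("t=")
    return int(raw) / 1000.0  # value is reported in millidegrees

def read_sensor(path, samples=3):
    readings = []
    for _ in range(samples):
        with open(path) as f:
            temp = parse_w1_slave(f.read())
        if temp is not None:
            readings.append(temp)
    return round(statistics.mean(readings), 1) if readings else None

def scan_and_report():
    for path in glob.glob(W1_GLOB):   # scan, no fixed sensor list
        serial = path.split("/")[-2]  # e.g. 28-000005e2fdc3
        print(serial, read_sensor(path))
```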

I made two scripts that report in two different ways: one directly calls a MySQL stored procedure, the other posts the value to a web page that executes the same stored procedure. I still have to decide which one is better.
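The two reporting paths could look roughly like this. Everything here is a placeholder sketch: the URL, the PHP page name, the database credentials and the exact driver (I'm assuming mysql-connector-python) are my assumptions, not the actual setup.

```python
# Two ways to hand a measurement to the central server: via a web page,
# or by calling the stored procedure directly over MySQL.
import urllib.parse
import urllib.request

def encode_measurement(serial, temp):
    """Build the form body for the web variant."""
    return urllib.parse.urlencode({"serial": serial, "temp": temp}).encode()

def report_via_web(serial, temp, url="http://server/add_measurement.php"):
    # POST to a small page that executes the same stored procedure
    with urllib.request.urlopen(url, data=encode_measurement(serial, temp),
                                timeout=10) as resp:
        return resp.status

def report_via_mysql(serial, temp):
    # direct call; assumes a MySQL driver is installed on the Pi
    import mysql.connector
    conn = mysql.connector.connect(host="server", user="logger",
                                   password="...", database="temperatures")
    try:
        conn.cursor().callproc("add_measurement", (serial, temp))
        conn.commit()
    finally:
        conn.close()
```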


The central MySQL database on my Linux machine has two tables:

- Keeps the sensorid, serialcode, last temperature, last-seen date-time, last-value date-time and a description; one row per sensor

- Keeps the sensorid, date-time and temperature; one row per measurement
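A minimal stand-in for the two tables, using sqlite3 purely for illustration (the real database is MySQL); the column names are guesses based on the description above.

```python
# Two tables: one row per sensor, one row per measurement.
import sqlite3

SCHEMA = """
CREATE TABLE sensors (
    sensorid    INTEGER PRIMARY KEY,
    serialcode  TEXT UNIQUE,  -- e.g. 28-000005e2fdc3
    last_temp   REAL,
    last_seen   TEXT,         -- last time the sensor reported
    last_value  TEXT,         -- last time a measurement row was written
    description TEXT
);
CREATE TABLE measurements (
    sensorid    INTEGER,
    moment      TEXT,
    temperature REAL
);
"""

def open_db(path=":memory:"):
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```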

The stored procedure named add_measurement:
- checks whether the sensor exists; if not, it adds the sensor
- checks whether the temperature is the same as the last one; if it is, it won't add it
- checks whether 830 seconds have passed since the last recorded temperature; if so, it adds the temperature anyway, which creates a datapoint roughly every 15 minutes
- if the temperature has changed, it adds the previous temperature with the previous timestamp and the current temperature with the current timestamp, so the graphs display the temperature over time correctly

The stored procedure also keeps the last-seen and last-value date-times and the last temperature up to date.
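The insert rules can be restated in Python to make the storage-saving logic easy to follow. This is a sketch only (the real thing is a MySQL stored procedure): a dict and a list stand in for the two tables, and the guard against recording the same point twice is my own assumption.

```python
THRESHOLD = 830  # seconds, from the text; just under 15 minutes

def add_measurement(state, rows, serial, now, temp):
    """state: per-sensor dict (stand-in for the sensors table);
    rows: list of (serial, timestamp, temp) (stand-in for measurements)."""
    s = state.get(serial)
    if s is None:
        # unknown sensor: register it and record the first value
        state[serial] = {"last_temp": temp, "last_seen": now, "last_value": now}
        rows.append((serial, now, temp))
        return
    if temp == s["last_temp"]:
        # unchanged: only record a point once the threshold has passed
        if now - s["last_value"] >= THRESHOLD:
            rows.append((serial, now, temp))
            s["last_value"] = now
    else:
        # changed: close off the old level at its last-seen time (unless
        # that point was already recorded), then record the new value,
        # so the graph shows a correct step
        if s["last_seen"] > s["last_value"]:
            rows.append((serial, s["last_seen"], s["last_temp"]))
        rows.append((serial, now, temp))
        s["last_temp"] = temp
        s["last_value"] = now
    s["last_seen"] = now
```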

Temperatures are rounded to 1 decimal. The accuracy of the sensors doesn't give more detail anyway, and this way I get fewer records because there are more identical temperatures in a row.

Storage saving:

If the temperature doesn't change, I get one record per 15 minutes: 96 records per day per sensor. If the temperature changes every minute, I get 1440 values per day per sensor. With the storage saving I now average 463 measurements per sensor per day, a factor of 3 saved. The measurements table currently contains 434,076 rows in 10.1 MB; that's 117 days of data for 8 sensors.
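A quick back-of-the-envelope check of those figures:

```python
# Verify the storage-saving numbers from the text.
rows, sensors, size_mb = 434076, 8, 10.1
days = rows / (sensors * 463)               # at ~463 rows/sensor/day
saving = 1440 / 463                         # vs. one row per minute
bytes_per_row = size_mb * 1024 * 1024 / rows
print(round(days), round(saving, 1), round(bytes_per_row))
```

This confirms roughly 117 days of data, a factor of about 3.1 saved, and around 24 bytes of storage per row.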


After this I added a few graphs, of course I started with the one from this page. I changed a few things, and I will continue to do so until I'm satisfied:
- added a third thermometer at the top
- made almost everything configurable
- changed the logic for the timezone: added a little PHP part that calculates the current offset from UTC server-side and adds that to the graphs. The sample web page just always adds 2 hours, which is mostly correct (in this part of Europe) but not always, depending on daylight saving time.
- cleaned up the JSON output sent to the JavaScript part that creates the chart, to minimize the amount of data sent to the client. Mostly removed spaces.
- added a filled-in-part (fillAlphas) in the graph for the difference between the central heating output and the central heating return. It shows the amount of energy the central heating puts into my house.
- created a few stored procedures that output the data for the graphs. Because I saved on storage there isn't a datapoint for every minute, so the stored procedure creates the datapoints between two stored points, one per minute, with the seconds rounded to zero
- added parameters to the stored procedures so I could select another interval instead of 24 hours
- added parameters to the stored procedures so I could select another ending date instead of now.
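The filling-in of minute datapoints can be sketched like this. The real work happens in a MySQL stored procedure; this Python version (with a function name of my own) just shows the linear interpolation between two stored points.

```python
# Between each pair of stored points, emit one interpolated value per
# minute, with the seconds rounded to zero.
def fill_minutes(points):
    """points: list of (unix_seconds, temp), ascending in time."""
    out = []
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        t0 -= t0 % 60  # round seconds down to zero
        t1 -= t1 % 60
        for t in range(t0, t1, 60):
            frac = (t - t0) / (t1 - t0)
            out.append((t, round(v0 + frac * (v1 - v0), 1)))
    if points:  # close with the final stored point
        last_t, last_v = points[-1]
        out.append((last_t - last_t % 60, last_v))
    return out
```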

The performance of the graphs is still a bit sluggish: if I select a week's worth of data for five sensors, it takes 2.1 seconds to calculate the 43,200 datapoints in the stored procedure. For 24 hours it takes 0.4 seconds.

[Graph: 5 sensors (living room, outside high, outside low, attic, crawlspace)]
[Graph: the modulating central heating (living room, outside high, outside low, heating out, heating return, warm water)]

[Photo: Raspberry Pi testing a DS18B20 sensor]
[Photo: DS18B20 sensors on the central-heating pipes]

Condensed graphs:

I've created a second table containing one averaged row per 5 minutes; it holds a fifth of the rows of the original table. With this table I can easily create graphs for a complete month in a reasonable time. A cron job fills the five-minute table.
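The aggregation that cron job performs can be sketched like this; the real job runs against MySQL, and the function name is my own.

```python
# Bucket measurements on floor(timestamp / 300) and average each bucket
# to produce one row per sensor per 5 minutes.
from collections import defaultdict
from statistics import mean

def five_minute_averages(rows):
    """rows: iterable of (sensorid, unix_seconds, temp) tuples."""
    buckets = defaultdict(list)
    for sensor, ts, temp in rows:
        buckets[(sensor, ts - ts % 300)].append(temp)
    return sorted((sensor, ts, round(mean(temps), 1))
                  for (sensor, ts), temps in buckets.items())
```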


- I've ordered two DHT22s, so I'll start measuring indoor humidity as well.