Home Energy Monitor Series – Internet of Things

Recently I started an Internet of Things series on my experiences installing, using and analysing data from a smart electrical meter.  This included a BC Hydro smart meter, an Eagle monitoring gateway from Rainforest Automation, and a cloud-based analytics service from Bidgely.

I’ve collated all the posts on the topic for you below.  More will be added as I write them.  Enjoy!


Continue reading Home Energy Monitor Series – Internet of Things

IoT Day 4: Bidgely Cloud Energy Monitor Dashboard

After a week of collecting smart meter readings, I’m now ready to show results in a cloud-based energy monitor system – Bidgely – complete with graphs showing readings, cost and machine learning results breaking down my usage by appliance.


This is part 4 of a series of posts about the Internet of Things applied to Home Energy Monitoring.  I have a smart meter from BC Hydro, an Eagle energy monitor and various cloud apps helping me understand it all.

See my earlier posts: Day 1 – getting started, Day 2 – connecting to cloud services, and Day 3 – viewing data.


Three Value-Added Parts of Bidgely

In this post I’ll show you the three parts of Bidgely that I’ve found most helpful:

  1. Usage dashboard
  2. Cost dashboard
  3. Appliance breakdown (best for last!)

Usage Dashboard

Bidgely Energy Monitor – Usage Dashboard

Continue reading IoT Day 4: Bidgely Cloud Energy Monitor Dashboard

IoT Day 2: Cloud Services for Energy Monitoring

Energy monitoring isn’t only about knowing what’s happening right now, but also about understanding what happened historically.  Often that means knowing not just what happened, but also when and why.  Enter cloud services for energy monitoring.  They range from simple charts and graphs to predicting your usage over time – essentially storing, analysing and enriching your raw data.
In this post I review two cloud services that I’ve tried so far and show you what you get from them. Continue reading IoT Day 2: Cloud Services for Energy Monitoring

Drinking from the (data) Firehose of Terror

Between classic business transactions and social interactions and machine-generated observations, the digital data tap has been turned on and it will never be turned off. The flow of data is everlasting. Which is why you see a lot of things in the loop around real time frameworks and streaming frameworks. – Mike Hoskins, CTO Actian

From Mike Hoskins to Mike Richards (yes we can do that kind of leap in logic, it’s the weekend)…

Oh, Joel Miller, you just found the marble in the oatmeal!   You’re a lucky, lucky, lucky little boy – because you know why?  You get to drink from… the firehose!  Okay, ready?  Open wide! – Stanley Spadowski, UHF

Firehose of Terror

I think you get the picture – a potentially frightening picture for those unprepared to handle the torrent of data coming down the pipe.  Unfortunately, the unprepared will not merely be overwhelmed by the disaster.  Quite the contrary – I believe they will be consumed by irrelevancy.

If you’re still with me, let me explain. Continue reading Drinking from the (data) Firehose of Terror

Neo4j Cypher Query for Graph Density Analysis

Graph analysis is all about finding relationships. In this post I show how to compute graph density – the ratio of the relationships a graph actually has to the maximum number it could have – using a Cypher query with Neo4j. This is a follow-up to the earlier post: SPARQL Query for Graph Density Analysis.

Installing Neo4j Graph Database

In this example we launch Neo4j and enter Cypher commands into the web console… Continue reading Neo4j Cypher Query for Graph Density Analysis
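To give you a feel for where the post ends up, here’s a minimal Cypher sketch of the density calculation (a simplification, not necessarily the exact query from the post): count the nodes, count the directed relationships, and divide by the maximum number of relationships the graph could have.

```cypher
// Count all nodes, then all directed relationships, then divide by the
// maximum possible n * (n - 1) directed relationships.
// (For an undirected reading of the graph, halve the denominator.)
MATCH (n)
WITH count(n) AS nodes
MATCH ()-[r]->()
WITH nodes, count(r) AS rels
RETURN nodes, rels, rels * 1.0 / (nodes * (nodes - 1)) AS density
```

A density of 1.0 would mean every node is connected to every other node; values near 0 mean a sparse graph.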

Graph relations in Neo4j – simple load example

In preparation for a post about doing graph analytics in Neo4j (paralleling SPARQLverse from this earlier post), I had to learn to load text/CSV data into Neo.  This post just shows the steps I took to load nodes and then establish edges/relationships in the database.

My head hurt trying to find a simple example of loading the data from my earlier example, mostly because I was new to the Cypher language.  I also got hung up on previewing the data in the Neo4j visualiser: all my nodes showed only ID numbers, which had me convinced my name properties weren’t loading when it was really just a visualisation setting (more on that another time).  Anyway, enough distractions… Continue reading Graph relations in Neo4j – simple load example
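If you just need the shape of the solution, this is roughly what the two-step load looks like in Cypher – the file names and column headers here are made up for illustration, not taken from my actual data:

```cypher
// Step 1: create one node per row of a (hypothetical) people.csv with a "name" column
LOAD CSV WITH HEADERS FROM "file:///people.csv" AS row
CREATE (:Person { name: row.name });

// Step 2: read a (hypothetical) relations.csv with "from" and "to" columns,
// match the two endpoint nodes by name and create the relationship between them
LOAD CSV WITH HEADERS FROM "file:///relations.csv" AS row
MATCH (a:Person { name: row.from })
MATCH (b:Person { name: row.to })
CREATE (a)-[:KNOWS]->(b);
```

The key idea is loading the nodes first, then matching them by a property in a second pass to build the edges.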

SPARQL Query for Graph Density Analysis

SPARQLcity Graph Analytics Engine
SPARQLverse is the graph analytics engine produced by SPARQLcity. Standards compliant and super fast!

I’ve been spending a lot of time this past year running queries against the open source SPARQLverse graph analytic engine.  It’s amazing how simple some queries can look and yet how much work is being done behind the scenes.

My current project requires building up a set of query examples that allow typical kinds of graph/network analytics – starting with the kinds of queries needed for Social Network Analysis (SNA), i.e. find friends of friends, graph density and more.

In this post I go through computing graph density in detail. Continue reading SPARQL Query for Graph Density Analysis
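Before you click through, here’s a rough SPARQL sketch of the same idea (a simplification, not the exact query from the post): count the edges, count the distinct nodes, and divide.

```sparql
# Edges: triples whose object is a resource (literal values are skipped).
# Nodes: distinct resources appearing as a subject or as an object.
SELECT ((?edges * 1.0) / (?nodes * (?nodes - 1)) AS ?density)
WHERE {
  { SELECT (COUNT(*) AS ?edges)
    WHERE { ?s ?p ?o . FILTER(isIRI(?o)) } }
  { SELECT (COUNT(DISTINCT ?n) AS ?nodes)
    WHERE { { ?n ?p1 ?o1 . } UNION { ?s2 ?p2 ?n . FILTER(isIRI(?n)) } } }
}
```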

Data Sharing Saved My Life – or How an Insurer Reduced My Healthcare Claim Costs

It’s not every day that you receive snail mail with life-changing information in it, but when it does come, it can come from the unlikeliest sources.

Healthcare data shown in a list of bio sample test results
My initial test results showing problems with the liver

A year ago, when doing a simple change of health insurance vendors, I had to give the requisite blood sample.  I knew the drill… nurse comes to the house, takes blood, a month later I get new insurance documents in the mail.

But this time the package included something new: the results of my tests.

The report was a list of 13 metrics and their values, including a brief description of what they meant and what my scores should be.  One in particular was out of the norm.  My ALT score, an indicator of liver malfunction, was about 50% above the expected range.

Simple Data Can Be Valuable

Here is the key point: I then followed up with my family doctor, with data in hand.  I did not have to wait to see symptoms of a systemic issue and get him to figure it out. We had a number, right there, in black and white. Something was wrong.

Naturally, I had a follow-up test to see if it was just a blip.  However, my second test showed even worse results – twice as high, in fact!  This led to an ultrasound and more follow-up tests.

In the end, I had non-alcoholic fatty liver disease.  Since fatty liver is most commonly seen in alcoholics, it came as a surprise – I don’t drink.  It was due solely to my diet and the weight I had put on over several years.

It was a breaking point for my system and the data was a big red flag calling me to change before it was too late.

WellnessFX chart of ALT
A chart showing some of my earlier tests – loaded into WellnessFX.com for visualisation.

Not impressed with my weight or my other scores, I made simple but dramatic changes to improve my health.*  The changes were so dramatic that my healthcare provider was very curious about my methods.

By changing nothing but my diet I was able to get my numbers to a healthy level in just a few months.  In the process I lost 46 pounds in 8 months and recovered from various other symptoms.  The impending train wreck was averted.

Long Term Value in Sharing Healthcare Data

It’s been one year this week, so I’m celebrating – and it’s thanks to Manulife, or whoever runs their lab tests, for taking the initiative to send me my results.

It doesn’t take long to see the business value in doing so, does it?   I took action on the information and now I’m healthier than I have been in almost 20 years.  I have fewer health issues, will use their systems less, will cost them less money, etc.

Ideally it benefits the group plan I’m in too, since I’m now a lower-cost user of the system.  I hope both insurers and employers take this to heart and follow suit, giving their people the data they need to make life-changing, cost-reducing decisions like this one.

One final thought… how many people are taking these tests right now?  Just imagine what you could do with a bit of data analysis across their results.  With these kinds of test results, companies could be making health predictions for their customers and health professionals to review.  That’s why I’m jumping onto “biohacking” sites like WellnessFX.com these days – to track all my scores and get expert advice on next steps, or access to additional services.

I’m happy with any data sharing, but why hand me only the raw data and leave me to interpret it?  I took the initiative to act on my results, but what if I had needed more incentive?  If I had been told “lower your ALT or your premiums will be 5% higher”, I would have appreciated that.

What’s your price?  If your doctor or insurer said “do this and save $100” – would you do it?  What if they laid the data out before you and showed you where your quality of life was headed, would it make a difference to you?

I’m glad I had this opportunity to improve my health, but at this point I just say thanks for the data … and pass the salad please!

Tyler


* I transitioned to a whole-food, plant-based diet (read Eat to Live and The China Study).  You can read more about the massive amount of nutrition science coming out every year at NutritionFacts.org, or read the research papers yourself.

Analytics Dashboard – Kibana 3 – a few quick tips

After you’ve loaded log files into Elasticsearch you can start to visualise them using the Kibana web app and build your own dashboard. While using Kibana for a week or so, I found it tricky to track down docs or tutorials to get me up to speed quickly with some of the more advanced/hidden features.

In this Kibana dashboard video I show how to:

  1. build TopN automated classification queries
  2. view the TopN values of a particular column from the table panel
  3. manually create multiple queries to appear as series in your charts (see the example below)
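For item 3, the short version: Kibana 3 queries are plain Lucene query strings, and panels such as the histogram chart each query as its own series. A hypothetical example, assuming a web access log index with a status field:

```text
status:200
status:404
status:500 OR status:503
```

Entered as three separate queries, each one shows up as its own series in the chart.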

From zero to HDFS in 60 min.

(Okay, so you can be up and running quicker if you have a better internet connection than mine.)

Want to get your hands dirty with Hadoop-related technologies but don’t have time to waste?  I’ve spent way too much time trying to get HBase, for example, running on my MacBook with Brew, and wish I had just tried this VirtualBox approach first.

In this short post I show how easy it was for me to get an NFS share mounted on OS X – so I could transparently and simply copy files onto HDFS without needing any special tools.   Here are the details… Continue reading From zero to HDFS in 60 min.
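The gist, for the impatient (the IP address and paths below are placeholders, not the ones from my setup): with the HDFS NFS gateway running on the Hadoop VM, the mount is a one-liner on the Mac side.

```bash
# Mount HDFS from the (hypothetical) Hadoop VM at 192.168.56.101;
# resvport and nolock are the options the OS X NFS client typically needs.
sudo mkdir -p /mnt/hdfs
sudo mount -t nfs -o vers=3,proto=tcp,nolock,resvport 192.168.56.101:/ /mnt/hdfs

# After that, HDFS behaves like an ordinary folder:
cp ~/Downloads/sample.csv /mnt/hdfs/tmp/
ls /mnt/hdfs/user/
```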
