Day 1 – Building a 3D Game using Geo Data (Hobby project)

We are a team of three guys (so far) with a design idea for a simple yet challenging racing/strategy game based on real-world geospatial data coupled with a 3D gaming engine. Tonight we started building it together in Unity after downloading a bunch of public geospatial datasets. Here’s a summary of what we’ve done and some basic screenshots. The goal is to make progress every couple of days to show what is possible with a little work and a combination of creative, technical and group-management skillsets.

First Images

Day 1 - Unity 3D real time rendering of geodata

Starting with Geography

The landscape of our 3D game is built using real-world geographic information – aka geospatial data. We wanted this so that we could build games around real locales and market the game to local users. One side benefit is that players will learn a little about Canadian geography!

We’ve got a great collection of geo data for all of Canada available through the government’s Geogratis website. After deciding which map tile numbers we needed, we grabbed the CDED data – a TIFF elevation image file with geographic coordinates included.
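We didn’t script this part, but if you have the GDAL command-line tools installed, a quick sanity check confirms the georeferencing really is embedded in the file (the tile name below is just illustrative):

# Print the CDED tile's projection, extent and pixel size
gdalinfo 093K.tif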

We also found a BC provincial shaded relief map layer that was pretty nice as a starter. As both of these files are geolocated, we can load them into GIS (geographic information system) software from QGIS.org and combine them with any other data we have for the region. In our case we have a “populated places” point file from naturalearthdata.org that we show as stars on top of the shaded relief map.

Then we export the elevation data and the relief data (including the stars) as two files for Unity to ingest: one PNG becomes our texture and the other the heightmap for the terrain.
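We did the export by hand in QGIS, but a couple of GDAL one-liners could script the same step – a sketch only, with made-up filenames and an elevation range you would read off your own data:

# Shaded relief (with the stars burned in) straight to PNG for the texture
gdal_translate -of PNG relief_with_stars.tif texture.png

# Elevation stretched to 16-bit greyscale for the heightmap
# (0 and 2500 stand in for the real min/max elevations in metres)
gdal_translate -of PNG -ot UInt16 -scale 0 2500 0 65535 elevation.tif heightmap.png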


Loading into Unity

We created a terrain object with some pretty huge dimensions and height. There are mountains in the region we are working with, so it’s pretty cool to see. We apply the heightmap using a commonly used Unity script (I’ll have to put the link here when I remember where I got it!).

Then we add the shaded relief map as a texture to the terrain et voilà! We threw in a plane of basic water and raised it to the level where it fills the main river valleys. As our game is going to start as a river racing game, we want water in from the very beginning.

We added a car, parented it to the camera and were racing down our waterways within two hours of starting. We spent a lot of time adjusting terrain and texture sizes to try to match real-world scale. We have some further ideas for optimising this, as well as for nailing down a workflow to easily ingest new geodata for other regions (as we had to manually export and adjust things in the GIS).


Reposted from our Indiedb blog, hence writing for a slightly non-geo audience: http://www.indiedb.com/members/1tylermitchell/blogs/edit/1mitchellco-day-1-cross-country-river-run-race-game

Generate terrain with flowing water from DEM in Unity

Just a quick video follow-up, based on a reader asking how I did what I showed in my previous post with the Unity (http://unity3d.com) game engine.

The video uses the Surface Wave asset and the built-in Unity Terrain generator, plus a script for taking DEMs and creating Terrains easily. I’m really new at this, by the way, but I have a brilliant teacher showing me this stuff in my spare time 🙂

 

Parsing GPS with Arduino – retro-post

I dug up one of my oldest blog posts from about 7 years ago. In the post, I show how I connected my Garmin eTrex GPS receiver to an Arduino board and used it to control a camera in a desktop application.

After pumping the data into the Arduino, I parsed the raw GPS data coming from the eTrex and streamed it out over a serial port to a Python app on the desktop. I used the output to position a virtual camera in the OSSIMPlanet platform, by formatting XML in Python and sending it to a listener built into the OSSIMPlanet app. (OSSIMPlanet was a sort of Google Earth on steroids that predates Google Earth.)
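The Arduino sketch itself lives in the old post, but the parsing idea is simple enough to show on the desktop side alone. Here is a rough Bash sketch – the device path is an assumption, and the eTrex speaks NMEA 0183 at 4800 baud:

# Configure the serial port the GPS is plugged into
stty -F /dev/ttyUSB0 4800 raw

# Pull latitude/longitude out of each $GPGGA fix sentence as it streams in
grep --line-buffered '^\$GPGGA' /dev/ttyUSB0 |
while IFS=',' read -r _ utc lat ns lon ew _; do
  echo "lat=${lat}${ns} lon=${lon}${ew}"
done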

I haven’t used it for years, but thought some of the methods from this old post may still apply if you have an Arduino + GPS or an Arduino -> Python streaming requirement. Enjoy the flashback – I know it’s inspired me to pick up a new Arduino and continue where I left off.


 

GPS Controls OSSIMPlanet - With OSSIMPlanet’s nifty camera control and listener functionality, as demonstrated in my last post, you’ve got so many neat opportunities. A couple of nights ago I got a basic GPS NMEA parser working. Here’s a pic of the ultra-professional connection method I use to hook it to my Arduino board 🙂 Oops, just realised that the picture … Continue reading GPS Controls OSSIMPlanet

Two wires to hook an eTrex data cable up to the Arduino


Review of 3 Recent Internet of Things (IoT) Announcements

Working in the big data and analytics space, I’m always interested in parts of the Internet of Things (IoT) that will produce more data, require more backend systems, and help users/customers get on with their day better.

The past week brought a few interesting announcements relating to Internet of Things topics. Here are a few that jumped out at me, either because they inspired me or because they left me wondering what they would really mean.

TL;DR?  Summary: While IBM is “getting started” (oops, I meant “getting serious”) and Facebook has big plans to “take over”, Amazon comes out with a consumer-focused solution.

Continue reading Review of 3 Recent Internet of Things (IoT) Announcements

Drinking from the (data) Firehose of Terror

Between classic business transactions and social interactions and machine-generated observations, the digital data tap has been turned on and it will never be turned off. The flow of data is everlasting. Which is why you see a lot of things in the loop around real time frameworks and streaming frameworks. – Mike Hoskins, CTO Actian

From Mike Hoskins to Mike Richards (yes we can do that kind of leap in logic, it’s the weekend)…

Oh, Joel Miller, you just found the marble in the oatmeal!   You’re a lucky, lucky, lucky little boy – because you know why?  You get to drink from… the firehose!  Okay, ready?  Open wide! – Stanley Spadowski, UHF

Firehose of Terror

I think you get the picture – a potentially frightening picture for those unprepared to handle the torrent of data that is coming down the pipe.  Unfortunately, for those who are unprepared, the disaster will not merely overwhelm them.  Quite the contrary – I believe they will be consumed by irrelevancy.

If you’re still with me, let me explain. Continue reading Drinking from the (data) Firehose of Terror

Google wants “mobile-friendly” – fix your WordPress site

TheNextWeb reports: “Google will begin ranking mobile-friendly sites higher starting April 21”.  It’s always nice having advance warning, so use it wisely – here’s how to tweak WordPress to increase your mobile-friendliness.

Google Mobile-Friendly Check

I use a self-hosted WordPress site and wanted to make sure it was ready for action.  I already thought it was, because I’ve accessed it on a mobile device very often and it worked okay.

I even went into Google Webmaster Tools and its mobile usability check said things were fine, but… Continue reading Google wants “mobile-friendly” – fix your WordPress site

Geospatial Power Tools Reviews [Book]

Thinking of buying my latest book?  We’ve finally got a few reviews on Amazon that might help you decide.  See my other post for more about the book.  Buy the PDF on Locate Press.com.

Reader Reviews

Geospatial Power Tools book cover
From Amazon.com

5.0 out of 5 stars This book makes a great reference manual for using GDAL/OGR suite of command line …,
January 24, 2015 By Leo Hsu
“The GDAL Toolkit is chock-full of ETL command-line tools for working with 100s of spatial (and not-so-spatial) data sources. Sadly, the GDAL website only provides the basic API command switches, with very few examples to get a user going. I was really excited when this book was announced and purchased it as soon as it came out. This book makes a great reference manual for using the GDAL/OGR suite of command line utilities.
Chapters are devoted to each command-line tool, explaining what it’s for, the switches it has, and several examples of how to use each one. You’ll learn how to work with both vector (and basic non-vector) data sources and how to convert from one vector format to another. You’ll also learn how to work with raster data, how to transform from one raster data source to another, and various operations you can perform on these.”

 

Continue reading Geospatial Power Tools Reviews [Book]

HBase queries from Bash – a couple simple REST examples

Learn how to do some simple queries to extract data from the Hadoop/HDFS-based HBase database using its REST API.

Are you getting stuck trying to figure out HBase queries via the REST API?  Me too.  The main HBase docs are pretty limited in terms of examples, but I guess it’s all there – just not that easy for new users to understand.

As an aside, during my searches for help I also wanted to apply filters – if you’re interested in HBase filters, you’ll want to check out Marc’s examples here.

What docs do you find most useful?  Leave a comment.  Should someone write more books or something else?

My Use Cases

There were two things I wanted to do: query HBase via REST to see if a table exists (before running the rest of my script, for example), and then grab the latest timestamp from that table.  Here we go…

Does a specific table exist in HBase?

First, checking if a table exists can be done in a couple of ways.  The simplest is to request the table name with the “exists” path appended and see what result comes back.

$ curl -i http://localhost/existing_table/exists

HTTP/1.1 200 OK
Cache-Control: no-cache
Content-Type: text/plain
Content-Length: 0

$ curl -i http://localhost/bad_table_name/exists
HTTP/1.1 404 Not Found
Content-Length: 11
Content-Type: text/plain

Not found

Here I use curl’s “-i” option to include the response headers so I can see the HTTP status codes (200 vs 404).  The plain-text body is either blank (if the table exists) or “Not found” if it does not.

Let’s roll it into a simple Bash script and use a wildcard match to see whether the negative status was returned:

# -s keeps curl's progress meter out of the captured output
CMD=$(curl -s http://localhost/table_name/exists)

if [[ $CMD == *"Not found"* ]]; then
  echo "Not found"
else
  echo "Found"
fi
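Since the status code already tells the whole story, an alternative (again, just a sketch) is to have curl print the code itself via its -o and -w options and test that, instead of matching body text:

# -o /dev/null discards the body; -w prints only the HTTP status code
STATUS=$(curl -s -o /dev/null -w '%{http_code}' http://localhost/table_name/exists)

if [[ $STATUS == "200" ]]; then
  echo "Found"
else
  echo "Not found"
fi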

Extract a timestamp from an HBase scanner/query

Now that I know the table exists, I want to get the latest timestamp value from it.  I thought I’d need to use filter attributes like I do in the HBase shell:

scan 'existing_table', {LIMIT=>1}

To do this with curl, you use HBase’s scanner technique (covered in what seems to be the shortest section of the official docs).

It’s a two-stage operation – first you initialise a scanner session, then you request the results.  Bash can obviously help pull the results together easily for you (there’s a combined sketch at the end of this post), but let’s go step by step:

curl -vi -H "Content-Type: text/xml" -d '<Scanner batch="1"/>' "http://localhost/existing_table/scanner"

...
Location: http://localhost/existing_table/scanner/12120861925604d3b6cf3
...

Note the XML chunk in the statement that tells it how many records to return in the batch.  That’s as simple as it gets here!

Amongst the results of this command you’ll see the Location value returned – this is the URL to use to access the results of the query.  The results below are truncated and line-broken so you can see the meaningful bits:

$ curl http://localhost/existing_table/scanner/12120861925604d3b6cf3

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
 <CellSet>
  <Row key="AfrV65UNXD0AAAAs">
   <Cell column="ZXZ...mNl" 
    timestamp="1218065508852">
    L2hv...Y3N2
   </Cell>
  </Row>
 </CellSet>

Ugh, XML… if you want JSON instead, just add an Accept header:

$ curl -H "Accept: application/json" http://localhost/existing_table/scanner/12120861925604d3b6cf3

{"Row":[{"key":"AfrV65UNXD0AAAAs","Cell":[{"column":"ZXZ...mnl","timestamp":1218065508852,"$":"L2hv...Y3N2"}]}
...

For now we’ll hack some sed to get to the value we want – first for the JSON response, second for the XML response.  Just pipe the curl command into the matching sed:

# For application/json
$ curl ... | sed 's/.*\"timestamp\"\:\([0-9]\{13\}\).*/\1/'

# For default XML results
$ curl ... | sed 's/.*timestamp=\"\([0-9]\{13\}\).*/\1/'

Now you can create a basic script that grabs the latest timestamp from the HBase query and decides what to do with it.  Here we just assign it to a variable and leave the rest of the implementation to you.

$ LAST_TIME=$(curl ... | sed 's/.*\"timestamp\"\:\([0-9]\{13\}\).*/\1/')
$ echo $LAST_TIME

1218065508858
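To tie it all together, here is a rough sketch of the whole scanner dance as one Bash function – the base URL and table name are the same placeholders as above, and the final DELETE closes the scanner session when we are done:

# Hypothetical helper: open a one-row scanner, read the newest cell's
# timestamp, then clean up after ourselves
latest_timestamp() {
  local base="http://localhost" table="existing_table"

  # Stage 1: create the scanner and capture the Location header it returns
  local scanner
  scanner=$(curl -si -H "Content-Type: text/xml" \
      -d '<Scanner batch="1"/>' "$base/$table/scanner" |
    awk '/^Location:/ {print $2}' | tr -d '\r')

  # Stage 2: fetch the row as JSON and sed out the 13-digit timestamp
  curl -s -H "Accept: application/json" "$scanner" |
    sed 's/.*"timestamp":\([0-9]\{13\}\).*/\1/'

  # Close the scanner session
  curl -s -X DELETE "$scanner" > /dev/null
}

LAST_TIME=$(latest_timestamp)
echo "$LAST_TIME"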

Thanks for reading!

If you like this, follow me on Twitter at http://twitter.com/1tylermitchell or with any of the other methods below.

 

Scripts to beat up Windows – using PowerShell

This is just a shoutout to Luke Brennan’s TechNet blog. This old post of his gave me some simple-to-use and easy-to-understand PowerShell scripts for stress-testing a Windows machine.

They saturate the CPU and fill up all your RAM easily – head over to his post for the scripts themselves.

Thanks for sharing Luke!

Noob compiling tip – “make -j” is your friend

In today’s super-duper multi-core world, if you are still only running the standard trio:

./configure 
make
make install

.. then you’re really missing out. Well, that is, unless you like having extra time to drink your coffee while waiting for compilation jobs to finish.

The make command comes with an option to spawn multiple processes while building, which can dramatically reduce compilation time. For example, building Quantum GIS from source (for use with our upcoming Ingres Geospatial release) took about 12 minutes the traditional way. But if you take the number of cores your system has, add 1 and pass that along to the make command, then you’re really cooking…:

make -j 5

.. it took only 3.5 minutes.
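If you would rather compute the job count than hard-code it, something like this works (assuming GNU coreutils’ nproc is available; on macOS, sysctl -n hw.ncpu plays the same role):

# Number of cores + 1, detected at run time
make -j "$(( $(nproc) + 1 ))"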

Try it, you’ll like it!
