It’s been a long time since my last post, but I wanted to highlight some of the new tooling I’m learning while digging into the OmniSci platform – namely, extreme geospatial map rendering and analytics powered by server-side GPUs.
In this video I load a ~12M point dataset from geonames.org and explore it using both QGIS and OmniSci – just to get a feel for how easily it fits into a typical GIS exploration workflow.
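If you’d rather script a load like this than click through the UI, here’s a minimal sketch using the pymapd Python client. The connection settings, table name and file path are illustrative assumptions, and the target table is assumed to already exist with columns matching the file:

from pymapd import connect

# Connect to a local OmniSci server -- host, port, credentials and
# database name here are illustrative defaults; adjust to your setup.
con = connect(user="admin", password="HyperInteractive",
              host="localhost", port=6274, dbname="omnisci")

# Bulk-load the geonames file server-side. The table name and file path
# are hypothetical, and the table must already exist to match the file.
con.execute("COPY geonames FROM '/data/allCountries.txt' "
            "WITH (delimiter = '\\t', header = 'false');")

# Quick sanity check on the row count (~12M points expected).
for row in con.execute("SELECT COUNT(*) FROM geonames;"):
    print(row)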
See OmniSci in action with truly large datasets (we usually talk in billions of rows) using some interactive demos at: https://www.omnisci.com/demos
A couple of years ago you may have read a great post from boredpanda about a research paper that applied deep learning algorithms to art. This opened up the possibility of, say, taking a photo and having it re-imagined as a painting by an old master.
It’s actually pretty easy to do this now using a site called deepdreamgenerator.com. I’ve done quite a few experiments on the site using a variety of images from the web and found it pretty fun.
I’ve also started to download some of the deep learning toolkits that are available (e.g. Berkeley Caffe); hopefully I can do higher-resolution work with these in the future.
But today I thought I’d take a basic QGIS map built from the Natural Earth dataset and have it “re-cartofied” in the style of an old Mercator map – and show you how I did it.
Step 1 – Make a world map with QGIS
Using the awesome data from the Natural Earth website, I put together a very simple world map.
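If you like the QGIS Python console, a minimal PyQGIS sketch for this step might look like the following – the shapefile path is an assumption, pointing at the Natural Earth 1:110m countries download:

# Run from the QGIS Python console; loads a Natural Earth countries
# shapefile as a vector layer. The path is a made-up example.
layer = iface.addVectorLayer(
    "/data/ne_110m_admin_0_countries.shp", "countries", "ogr")
if not layer:
    print("Layer failed to load -- check the path")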
Step 2 – Grab an old map
The way this deep “dream” imaging site works is that it can use a second image as what they call a style. So I first grabbed a basic Mercator map of Scotland – lots are available on David Rumsey’s awesome map library site!
At first I used the full image without any processing, but it produced weird artifacts (lines, etc.) in the resulting map. Here are a few of my test images; if you select them from this page you can see the input/source images.
Step 3 – Deep Learning Cartography Magic – Voilà!
The final version of my map turned out well enough, though there are several more settings I could tweak. Note, those are not compression artifacts – they are mountains! Yes, in the ocean 🙂
Tips…
To get this far I also had to crop the Mercator source map to remove page borders, legend, surrounding text, etc. This helped reduce the set of input style items to things like mountains, water, and of course CASTLES! 🙂 Here is the image I used for styling.
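If you’d rather script that cropping than open an image editor, a tiny Pillow sketch does the job – the filename and pixel box below are made-up values, so measure your own crop box against the scan:

from PIL import Image

# Crop away the page borders, legend and surrounding text from the scan.
# The box is (left, upper, right, lower) in pixels -- hypothetical values.
img = Image.open("mercator_scotland.jpg")
cropped = img.crop((120, 150, 2400, 1800))
cropped.save("mercator_scotland_cropped.jpg")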
I hope you found this interesting; if you make something similar, please share it with me on Twitter @1tylermitchell.
I have a latent interest in stargazing but hadn’t done much about it for a long time – until this past weekend, when I had a bit of time and wondered if I could create a star map using QGIS. I found a couple of tutorials and Stack Exchange questions that referenced David Nash’s HYG Database from www.Astronexus.com. Some of the tutorials showed equations for computing latitude and longitude from the star position values – right ascension and declination, specifically. However, I found that the latest version of the dataset already has X and Y columns, which makes it easy to map. Here’s how.
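For reference, the conversion those tutorials describe is simple: right ascension (measured in hours) scales to longitude, and declination maps directly to latitude. A quick sketch in Python, in case your copy of the data lacks the x/y columns:

# Convert right ascension (hours) and declination (degrees) into
# lon/lat-style map coordinates, as the older tutorials describe.
def star_to_lonlat(ra_hours, dec_degrees):
    lon = ra_hours * 15.0        # 24 hours of RA spans 360 degrees
    if lon > 180.0:              # wrap into the -180..180 range
        lon -= 360.0
    return lon, dec_degrees

print(star_to_lonlat(6.75, -16.72))  # Sirius, roughly (101.25, -16.72)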
Download the Star Location Data
From the HYG Database page, grab the HYG version 3.0 dataset:
HYG 3.0: Database containing all stars in Hipparcos, Yale Bright Star, and Gliese catalogs (almost 120,000 stars, 14 MB)
The download extracts to a 34 MB CSV file containing about 20 columns and 119,615 rows.
Select Layer -> Add Layer -> Add Delimited Text Layer
Browse to the file location, select CSV as the File Format radio button, and the rest should take care of itself.
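If you prefer the Python console to the dialog, the equivalent load looks roughly like this – the file path is an assumption, and the x/y field names match the columns in my copy of the CSV:

from qgis.core import QgsVectorLayer, QgsProject

# Build a delimited-text URI pointing at the HYG CSV; xField/yField tell
# QGIS which columns hold the coordinates.
uri = ("file:///data/hygdata_v3.csv?type=csv&delimiter=,"
       "&xField=x&yField=y")
layer = QgsVectorLayer(uri, "hyg_stars", "delimitedtext")
if layer.isValid():
    QgsProject.instance().addMapLayer(layer)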
Tweak the Map
If you use a typical WGS84 projection you’ll get a ton of points arranged in a circle. I took a few minutes to zoom in to a meaningful scale (for me it was around 1:5M), change the background colour, and then use the lum attribute to scale the relative size of the stars. I also changed the colours to shades of white. Here is what I was able to produce. It’s not all that meaningful to me yet, but I know that when I need to dig into the data further it will be available!
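In case you want to reproduce the size-by-luminosity trick programmatically, here’s a hedged PyQGIS sketch, continuing from the layer loaded above – the field name and scaling bounds are assumptions to tune against your data:

from qgis.core import QgsProperty

# Drive marker size from the lum attribute. scale_linear rescales values
# from the assumed 0..10 input range onto 0.2..3 mm marker sizes.
symbol = layer.renderer().symbol()
symbol.setDataDefinedSize(
    QgsProperty.fromExpression('scale_linear("lum", 0, 10, 0.2, 3)'))
layer.triggerRepaint()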
We are a team of three guys (so far) with a design idea for a simple yet challenging racing/strategy game based on real-world geospatial data coupled with a 3D gaming engine. Tonight we started building it together in Unity after downloading a bunch of public geospatial datasets. Here’s a summary of what we’ve done, along with some basic screenshots. The goal is to make progress every couple of days to show what is possible with a little work and a combination of creative, technical and group management skillsets.
First Images
Starting with Geography
The landscape of our 3D game is built using real-world geographic information – aka geospatial data. We wanted this so that we could build games around real locales and market the game to local users. One side benefit is that players will learn a little about Canadian geography!
There’s a great collection of geo data for all of Canada available through the government’s Geogratis website. After deciding which map tile numbers we needed, we grabbed the CDED data – a TIFF elevation image file with geographic coordinates included.
We also found a BC provincial shaded relief map layer that made a pretty nice starter. As both of these data files are georeferenced, we can load them into GIS (geographic information system) software from QGIS.org and combine them with any other data we have for the region. In our case we have a “populated places” point file from naturalearthdata.org, which we show as stars on top of the shaded relief map.
Then we export the elevation data and the relief data (including the stars) into two files for Unity to ingest: a PNG that will be our texture, and a heightmap for the terrain.
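Unity’s terrain importer expects a square 16-bit RAW heightmap, so the elevation export needs a conversion step. Here’s a rough sketch with GDAL’s Python bindings – the file names, the 0–2500 m elevation range and the 1025-pixel size are all assumptions (Unity likes power-of-two-plus-one heightmap resolutions):

from osgeo import gdal

# Convert the CDED GeoTIFF into a 16-bit raw grid Unity can import.
# scaleParams stretches the assumed elevation range across 0..65535;
# ENVI output is a headerless raw file plus a small .hdr sidecar.
gdal.Translate(
    "heightmap.raw", "cded_tile.tif",
    format="ENVI", outputType=gdal.GDT_UInt16,
    width=1025, height=1025,
    scaleParams=[[0, 2500, 0, 65535]])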
Loading into Unity
We created a terrain object with some pretty huge dimensions and height – there are mountains in the region we’re working with, so it’s pretty cool to see. We applied the heightmap using a commonly used Unity script (I’ll have to put the link here when I remember where I got it!).
Then we add the shaded relief map as a texture to the terrain, et voilà! We threw in a plane of basic water and raised it to the level where it filled the main river valleys. As our game is first going to be a river racing game, we want to have water from the very beginning.
We added a car, parented it to the camera, and were racing down our waterways within 2 hours of starting. We spent a lot of time adjusting the sizes of the terrain and texture to try to match real-world scale. We have some further ideas for optimising this, as well as for nailing down a workflow to easily ingest new geodata for other regions (this time we had to manually export and adjust things in the GIS).
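On the scale matching, the arithmetic is simple once you know the DEM’s cell size. Here’s a hedged GDAL sketch – it assumes a raster in a projected, metre-based CRS (CDED is actually in geographic coordinates, so you’d reproject first):

from osgeo import gdal

# Work out real-world terrain dimensions from the DEM's georeferencing.
ds = gdal.Open("cded_tile_utm.tif")       # hypothetical reprojected tile
gt = ds.GetGeoTransform()                 # cell sizes live in gt[1], gt[5]
width_m = ds.RasterXSize * gt[1]          # columns x cell width
height_m = ds.RasterYSize * abs(gt[5])    # rows x cell height (gt[5] < 0)
print(f"Unity terrain size: {width_m:.0f} m x {height_m:.0f} m")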
Just a quick video follow-up, based on a reader asking how I did what I did in my previous post with the Unity (http://unity3d.com) game engine.
It uses the Surface Wave asset and the built-in Unity terrain generator, plus a script for easily turning DEMs into terrains. I’m really new at this, by the way, but I have a brilliant teacher showing me this stuff in my spare time 🙂
After a week of collecting smart meter readings, I’m now ready to show results in a cloud-based energy monitoring system – Bidgely – complete with graphs showing readings and costs, plus machine learning results that break down my usage by appliance.
This is part 4 of a series of posts about the Internet of Things applied to Home Energy Monitoring. I have a Smart Meter from BC Hydro, an Eagle energy monitor and various cloud apps helping me understand it all.
The Eagle energy monitor from Rainforest Automation is a very handy device. It reads the wireless signal from my electricity meter and makes it available through a web interface – both a graphical environment and a RESTful API. In this post we look at the standard graphical screens and the data download option. Next time we’ll look at the RESTful API for programmers to use.
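As a teaser for that next post, here’s roughly what polling the Eagle from Python could look like. Treat all the specifics as assumptions – the endpoint path, command name and credential scheme are placeholders, so check the Rainforest Automation API docs for your firmware before using them:

import requests

# Hypothetical sketch of the Eagle's local XML-over-HTTP API. The URL,
# command name and auth values are placeholders, not verified specifics.
EAGLE_IP = "192.168.1.50"
command = """<LocalCommand>
  <Name>get_instantaneous_demand</Name>
  <MacId>0xFFFFFFFFFFFFFFFF</MacId>
</LocalCommand>"""

resp = requests.post(
    f"http://{EAGLE_IP}/cgi-bin/cgi_manager",
    data=command,
    auth=("cloud_id", "install_code"),  # placeholder credentials
    timeout=10)
resp.raise_for_status()
print(resp.text)  # XML containing the current demand reading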
This is part 3 of a series of posts about the Internet of Things applied to Home Energy Monitoring.
Energy monitoring isn’t only about knowing what’s happening right now; it’s also about understanding what happened historically. Often that means knowing not just what was happening, but when and why. Enter cloud services for energy monitoring: they range from simple charts and graphs to predicting your usage over time – essentially storing, analysing and enriching your raw data.
In this post I review two cloud services that I’ve tried so far and show you what you get from them.
In my next series of blog posts we’ll explore an Internet of Things topic – home energy monitoring – from a first-person perspective. Join me as I install, use and hack a monitor (and related cloud services) in my new home.
This is part 1 of a series of posts about the Internet of Things applied to Home Energy Monitoring.