Monthly Archives: August 2012

Denver Open Data

A few weeks back Brian Timoney tweeted about some recent Denver/Colorado open data efforts. OpenColorado hosts over 400 data sets from DRCOG, the City and County of Denver, Boulder and a handful of others.

I decided to see what data was available and what I could do with it. The Public Art data set seemed interesting and small enough to create something without too much effort.

I downloaded the shapefile and uploaded it to my hosted geo service of choice, CartoDB. I use CartoDB because, with the exposed SQL API, my options are almost endless.

Within a few hours (over a few days) I put together a quick “Denver Art Finder” application. It uses Leaflet for the web mapping API and Handlebars as a simple templating engine. The application works by checking your location (you can fake it if you’re not in Denver) and searching for public art within 1km of you via CartoDB’s SQL API. If we find some, we throw some pins on the map that you can click (tap) to find the title of the piece, the artist’s name, and how far away it is. Very simple stuff.
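The browser code itself is JavaScript (Leaflet plus Handlebars), but the heart of the app is a single PostGIS query run through CartoDB's SQL API. Here's a rough Python sketch of that query; the account name, table name (public_art) and column names (title, artist) are placeholders rather than the app's actual schema.

```python
import requests


def art_near(lat, lon, account="your-account", radius_m=1000):
    # Find art within radius_m meters of (lat, lon), nearest first.
    # Table and column names are assumptions; the_geom is CartoDB's
    # standard geometry column.
    sql = (
        "SELECT title, artist, "
        "ST_Distance(the_geom::geography, "
        "ST_SetSRID(ST_MakePoint({lon}, {lat}), 4326)::geography) AS meters "
        "FROM public_art "
        "WHERE ST_DWithin(the_geom::geography, "
        "ST_SetSRID(ST_MakePoint({lon}, {lat}), 4326)::geography, {r}) "
        "ORDER BY meters"
    ).format(lat=lat, lon=lon, r=radius_m)
    url = "https://{0}.cartodb.com/api/v2/sql".format(account)
    return requests.get(url, params={"q": sql}).json()["rows"]


if __name__ == "__main__":
    # Downtown Denver as an example location.
    for piece in art_near(39.7392, -104.9903):
        print("%s (%s), %dm away" % (piece["title"], piece["artist"], piece["meters"]))
```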

You can grab the source code or view the application at the links below:

Fun with ScraperWiki and CartoDB

I’d been trying to contain my excitement about the Great American Beer Festival until I actually had tickets in hand. Luckily, I was one of the few who got them.

I couldn’t help but wonder about the spatial distribution of all of the breweries attending the festival. Will there be a Colorado bias? I wonder who’s coming from my home state (North Carolina)? Anyone coming from Alaska, Hawaii? (yup)

So, being a geo geek, I headed to the festival’s website in hopes of finding a map. No luck. But there was a well-formatted table of all 566 breweries. Nice.

I vaguely remembered some Twitter chatter from a few months back about ScraperWiki, but didn’t know much about it. Within a few minutes, even with my marginal Python skills, I was able to write a recipe that scraped the name, city and state of every brewery. The end product? A CSV.
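The actual recipe is linked at the bottom of this post; the sketch below just shows the general shape of it, using requests and BeautifulSoup rather than ScraperWiki's own helpers. The URL and the table markup are placeholders, so the selectors would need adjusting to match the real page.

```python
import csv

import requests
from bs4 import BeautifulSoup

URL = "http://www.greatamericanbeerfestival.com/breweries/"  # placeholder URL

html = requests.get(URL).text
soup = BeautifulSoup(html, "html.parser")

rows = []
for tr in soup.find("table").find_all("tr")[1:]:       # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if len(cells) >= 3:
        rows.append(cells[:3])                          # name, city, state

with open("gabf_breweries.csv", "w") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "city", "state"])
    writer.writerows(rows)

print("Wrote %d breweries" % len(rows))
```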

Time to head to CartoDB. Importing my CSV was straightforward, and geocoding the points through their interface was just as easy. You can even concatenate multiple columns to geocode against – in my case {city}, {state}, USA.
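If you'd rather build that geocoding string yourself before upload instead of concatenating columns in the CartoDB interface, a quick pre-processing pass over the CSV does the same thing. This is a hypothetical alternative step, not part of my original workflow, and the file and column names match the sketch above.

```python
import csv

# Add a geocode_string column of the form "{city}, {state}, USA" to the CSV.
with open("gabf_breweries.csv") as src, open("gabf_breweries_geocoded.csv", "w") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames + ["geocode_string"])
    writer.writeheader()
    for row in reader:
        row["geocode_string"] = "%s, %s, USA" % (row["city"], row["state"])
        writer.writerow(row)
```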

More exciting than the map itself is the fact that it took about 20 minutes to do all the things above. Thanks ScraperWiki and CartoDB!

The Map

The Recipe