The Building Of radiationinfo.org

First, check out Radiation Info.

Since the earthquake in Japan I have been closely following the news to keep up with what is happening. Of particular interest to me (and initially, concern) were the events unfolding at the Fukushima nuclear power plants. As I tried to follow along, I grew increasingly frustrated that the reporting never effectively communicated the severity of the sensationalist radiation numbers it quoted. So I hatched an idea: assign graded risk values to radiation levels and build a piece of JavaScript that could either be embedded in a page or injected with a bookmarklet to show severity information about radiation figures on any site.

This is the story of that project.

Planning

To be completely honest, this phase didn’t last nearly as long as it should have. Excited to have something to do, I quickly threw together a one-page content mockup of everything I imagined would be on the site and sent off emails to get help with the design. Let me pause here and express my undying gratitude to Lindsay Burtner for the design work.

On a technical level, I decided that I wanted everything to be transparent and to build it all in client-side code, so that neither the embed nor the bookmarklet would have to make any additional requests that could be tracked. My hope in doing so was to allay concerns about including the embed in a site. Imagining the planning was done, I started building out the functionality.

The Bookmarklet

First up was the bookmarklet. I needed to gracefully enable embedding into any website without requiring site owners to change their markup. The solution I came up with was to traverse document.body, visiting each HTML text node and running it through a regular expression that looked for radiation units. A match meant I needed to replace that text node with a special span tag to which I could later attach an event that pops up a gadget. This required the first piece of my radiation library, which came together pretty quickly: unit identification and conversion.
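
A rough sketch of that traversal might look something like this (the regular expression, class name, and helper name here are illustrative, not the library’s actual code):

    // Walk every text node under document.body and wrap any radiation
    // measurement (e.g. "400 mSv") in a span we can attach events to later.
    var UNIT_PATTERN = /\d+(?:\.\d+)?\s*(?:millisieverts?|microsieverts?|sieverts?|mSv|µSv|Sv)\b/i;

    function wrapRadiationText(node) {
      if (node.nodeType === 3) {                      // text node
        if (UNIT_PATTERN.test(node.nodeValue)) {
          var span = document.createElement('span');
          span.className = 'radiationinfo-match';
          span.textContent = node.nodeValue;
          node.parentNode.replaceChild(span, node);
        }
      } else if (node.nodeType === 1) {               // element node
        // Copy the child list first, since we replace children as we go.
        var children = Array.prototype.slice.call(node.childNodes);
        for (var i = 0; i < children.length; i++) {
          wrapRadiationText(children[i]);
        }
      }
    }

    wrapRadiationText(document.body);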

The Site

Lindsay got the design back to me super-fast and I threw the HTML/CSS together for the site. Because it made the most sense, the scale for the radiation is logarithmic, bounded on the low end by 1 microsievert and on the high end by 10 sieverts, these numbers chosen for their relatively extreme values. The text input allows values outside of that range, but it pegs the slider at the top or bottom of the scale. Since I was simply using a jQuery UI Slider, I had to bolt on support for sliding over logarithmic values. This was the second part of the radiation library and, after a bit of mental gymnastics to wrap my head around it, it was finished. In what turned out to be a mistake, I decided to put off the more complicated logic of generating the informational data and moved on to hosting.
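
Mapping the slider’s linear position onto that logarithmic scale boils down to exponentiating the position. A minimal sketch, assuming a jQuery UI Slider and made-up element IDs:

    // Scale runs from 1 µSv (1e-6 Sv) to 10 Sv, so the exponent runs from -6 to 1.
    var MIN_EXP = -6, MAX_EXP = 1;

    $('#dose-slider').slider({
      min: 0,
      max: 1000,                                  // arbitrary handle resolution
      slide: function (event, ui) {
        // Linear position 0..1000 -> exponent -6..1 -> dose in sieverts.
        var exponent = MIN_EXP + (ui.value / 1000) * (MAX_EXP - MIN_EXP);
        $('#dose-input').val(Math.pow(10, exponent));
      }
    });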

Hosting

Hosting needed to be fast, reliable, scalable (in case everybody and their uncle started linking to the site), and simple. Because of all my recent work with Amazon Web Services, this was a no-brainer. I registered the domain and set up Google Apps, Amazon S3, Amazon CloudFront, and Amazon EC2.

Google Apps

The first thing I do with any domain is set up a Google Apps account to take care of email. I’m always careful to create a completely separate online identity for each new project of mine, which, aside from leaving me with 320 email accounts, works wonders. Having done it so many times now, this part is simple and straightforward.

EC2

Moving on to actual hosting, the first thing I did was spin up an EC2 micro instance to handle redirection from the “naked” domain to the “www” domain. It has an Elastic IP assigned to it, and I created an A record in DNS for the “naked” domain that points to it. Amazon, take note: this redirection is all I’m using EC2 for. It is incredibly simple but needed functionality, and AWS could offer a much simpler way to handle it. As a footnote, I also put a full copy of the site in the instance’s webroot in case I elect to run the site from EC2, but it is currently unused.
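
The redirect itself is trivial; something along the lines of this tiny Node.js server (a sketch, not what actually runs on the instance) covers it:

    // Send every request on the naked domain to the www host.
    var http = require('http');

    http.createServer(function (req, res) {
      res.writeHead(301, { 'Location': 'http://www.radiationinfo.org' + req.url });
      res.end();
    }).listen(80);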

S3

Amazon S3 has a neat new trick where you can run a bucket as a website. This requires specially named and configured buckets, so I set those up: cdn.radiationinfo.org and www.radiationinfo.org. After a bit of configuration (including setting a proper bucket policy so that anybody can read from them), Amazon kindly provides a URL for each bucket that you can CNAME to, and I added those records to the DNS for both www.radiationinfo.org and cdn.radiationinfo.org.
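
For reference, the public-read part is a standard S3 bucket policy along these lines (shown here for the www bucket; the exact statement used on the site may differ):

    {
      "Version": "2008-10-17",
      "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::www.radiationinfo.org/*"
      }]
    }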

CloudFront

This was my first time using Amazon CloudFront, but if you conceptually understand how edge caching works it is a breeze to set up. I told CloudFront that my hostname was going to be cdn.radiationinfo.org and set the cdn.radiationinfo.org S3 bucket as the origin server. This, paired with my bucket-as-a-website setup, makes everything super-easy because it means I can seamlessly transition from S3 to CloudFront and back by only making changes in the DNS. I’m not presently using CloudFront, but it remains available to me if I need it.
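
Concretely, that switch is just a matter of which target the cdn CNAME points at; roughly (the endpoint hostnames below are placeholders):

    ; Serving straight from the S3 website bucket
    cdn.radiationinfo.org.  CNAME  cdn.radiationinfo.org.s3-website-us-east-1.amazonaws.com.

    ; Or serving through CloudFront instead
    cdn.radiationinfo.org.  CNAME  d1234example.cloudfront.net.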

A Responsive Web Design Detour

Up to this point, everything I’ve built on the web has been focused on desktop environments. While building this site I realized it would be relatively simple to adjust the way it was displayed so that it not only looked good on a mobile browser, but could possibly be even more effective at communicating the information. This led to a fair amount of reading up on responsive web design and media queries, the discovery that there are hard minimums on body tag width, and finally a bit of fiddling to end up with a design that works for most display dimensions. I’m quite pleased with the results.
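
The mechanics come down to a handful of media queries; a simplified sketch (the breakpoint and selectors are invented, not the site’s actual values):

    /* Default, desktop-oriented layout */
    #scale { width: 600px; margin: 0 auto; }

    /* Narrow screens: let the scale fill the viewport */
    @media screen and (max-width: 640px) {
      #scale { width: 100%; margin: 0; }
    }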

Backtracking

With the site mostly built, I went off to find the data for assigning risk, only to learn that we don’t really have a strong enough understanding of the effects of radiation on our bodies to accurately assign risk values. Oops. Content first next time. Newly educated, I realized this necessitated a design change to describe radiation levels in context, which meant collecting a bunch of data. I decided to take on the design tweaks myself and enlisted help with research and data collation. Let me pause here and apologize to Lindsay for messing up her design by not properly planning. Sticking with my goal of transparency, all of the data that was collected ended up in a Google Spreadsheet that anybody can see.

This was the last and most fiddly part of the entire project: I had to turn the data I had into context. Paired with wanting to make sure the information the site provided wasn’t unnecessarily alarmist while still communicating an accurate understanding of the radiation level, I was stumped.

Not knowing what to do, I went with my normal approach: build a bad version and iterate. Version one was an absolutely horrendous string-building exercise that manually constructed strings for each possible case and appended them under the correct header. The results were untenable, both in the code and in a visitor’s ability to understand the data. After a fair amount more fiddling, I finally arrived at the approach I’m using currently, which assigns an “icon” and simply prefixes the description string from the source data with a bit of text describing scale.
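
In spirit, the current approach is just a lookup plus string prefixing. A hypothetical sketch (the icon names, thresholds, and data shape are invented for illustration):

    // Given a dose in sieverts and a matching row from the source data,
    // pick a severity "icon" and prefix the description with scale text.
    function describe(sieverts, sourceRow) {
      var icon, scaleText;
      if (sieverts < 0.001) {                 // below 1 mSv
        icon = 'negligible';
        scaleText = 'This is a very small dose: ';
      } else if (sieverts < 0.1) {
        icon = 'caution';
        scaleText = 'This is a notable dose: ';
      } else {
        icon = 'danger';
        scaleText = 'This is a large dose: ';
      }
      return { icon: icon, text: scaleText + sourceRow.description };
    }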

Happy enough with my second version, I went in, hooked up all the events for the changing sliders, and connected them to the radiation processing library I had built. Things were pretty much ready to go. If you have any comments or critiques on this approach, do let me know!

The Build Script

Just about finished now, I had a whole lot of source data that needed to be processed and included in the tool, minification that needed to be done, and two possible environments to build for (local and hosted). This was becoming unwieldy to handle manually, so I wrote a simple build script that functions almost as a preprocessor. It first grabs the source data from the spreadsheet (using code from a script I originally wrote to help name the company I’m a cofounder of, Typewire), replaces all the URLs with the appropriate locations, and minifies all the CSS and JS. Simple and effective, it also makes sure that my deploy process is easily repeatable, reducing mistakes.
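
As a rough sketch of the shape of such a script (the URLs, file names, and tool invocations are placeholders, not the actual build):

    // Preprocessor-style build: pull spreadsheet data, rewrite URLs for the
    // target environment, then minify.
    var fs = require('fs');
    var execSync = require('child_process').execSync;

    var TARGET = process.argv[2] || 'local';               // 'local' or 'hosted'
    var BASE_URL = TARGET === 'hosted'
      ? 'http://www.radiationinfo.org/'
      : 'http://localhost:8000/';

    // 1. Fetch the published spreadsheet as CSV (URL is a placeholder).
    execSync('curl -s -o build/data.csv "https://example.com/spreadsheet.csv"');

    // 2. Swap a URL token in the sources for the right base URL.
    ['index.html', 'js/radiation.js'].forEach(function (file) {
      var out = fs.readFileSync('src/' + file, 'utf8')
                  .replace(/__BASE_URL__/g, BASE_URL);
      fs.writeFileSync('build/' + file, out);
    });

    // 3. Minify (tool choice is arbitrary here).
    execSync('uglifyjs build/js/radiation.js -o build/js/radiation.min.js');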

The Gadget

It was time to revisit the bookmarklet to make sure that it, too, contained the correct data, and to wire up its events. The approach I took simply checks whether jQuery is already loaded on the page; if not, it loads a copy into my own namespace for handling events and positioning the popup. Otherwise it uses the site’s existing version of jQuery! I can get away with this because I only rely on core functionality that has been around in jQuery forever, and it means the bookmarklet plays nicely by reducing its total impact on the page.
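
The check itself is straightforward; a minimal sketch (the script URL, variable, and helper names are placeholders):

    // Reuse the host page's jQuery if present; otherwise load our own copy
    // and keep it out of the page's globals with noConflict(true).
    function withJQuery(callback) {
      if (window.jQuery) {
        callback(window.jQuery);
        return;
      }
      var script = document.createElement('script');
      script.src = 'http://www.radiationinfo.org/js/jquery.min.js';
      script.onload = function () {
        callback(window.jQuery.noConflict(true));
      };
      document.getElementsByTagName('head')[0].appendChild(script);
    }

    withJQuery(function ($) {
      // $ is now a private jQuery reference for the gadget's events and positioning.
    });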

Conclusion

All in all it has been a fun project. I’ve enjoyed building this and learning along the way and am quite happy with the results. I feel like I’ve achieved my goals of providing information about radiation levels and doing so in an open manner. And I also feel like I’m not sitting idly by while things are a mess in Japan.

All of the code for the entire project is available on GitHub for your perusal. If you notice bugs or problems, send a pull request! Look for follow-up posts discussing my experiences with, and “best” practices for, Amazon Web Services, as I’ve become quite comfortable in that environment recently.

You can join the discussion on Hacker News.