Year: 2008

  • Kansas Primary 2008 recap

    I’m winding down after a couple of very long days preparing for our coverage of the 2008 Kansas (and local) primaries. As always, it’s been an exhausting but rewarding time. We’ve come a long way since the first election I wrote software for and was involved with back in 2006, when election night involved someone accessing an AS/400 terminal and shouting numbers at me for entry. Our election app has become a lot more sophisticated, our data import process more refined, and election night is a whole lot more fun and loads less stressful than it used to be. I thought I’d go over some of the highlights while they’re still fresh in my mind.

    [Screenshot: Douglas County Commission 2nd District Democratic primary section]

    Our election app is definitely a success story for both the benefits of structured data and incremental development. Each time the app gets a little more sophisticated and a little smarter. What once wasn’t used until the night of the election has become a key part of our election coverage both before and after the event. For example, this year we had an overarching election section and also sections for individual races, like this section for the Douglas County Commission 2nd District Democratic primary. These sections tie together our coverage of the individual races: stories, photos and videos about the race, our candidate profiles, any chats we’ve had with the candidates, campaign finance documents, and candidate selectors, an awesome app that has been around longer than I have and lets users see which candidates they most agree with. On election night they’re smart enough to display results as they come in.

    [Screenshots: Election results start coming in; Results rolling in; County commission races almost done]

    This time around, the newsroom also used our tools to swap out which races were displayed on the homepage throughout the night. We led the night with results from Leavenworth County, since they were the first to report. The newsroom spent the rest of the night swapping one or more races onto the homepage as they saw fit. This was a huge improvement over past elections, where we chose ahead of time which races would be featured on the homepage. It was great to see the newsroom exercise editorial control throughout the night without anyone having to edit templates.

    [Screenshot: More results]

    On the television side, 6 News Lawrence took advantage of some new hardware and software to display election results prominently throughout the night. I kept catching screenshots during commercial breaks: the name of the race appeared on the left-hand side of the screen, with results paging through along the bottom. The new hardware and software allowed them to use more screen real estate to provide better information to our viewers. In years past we’ve had to jump through some hoops to get election results on the air, but this time was much easier. We created a custom XML feed of election data that their new hardware/software ingested continuously and pulled results from. As soon as results were in our database they were on the air.
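
    For the curious, here’s a rough idea of what a feed like that can look like. This is a hedged sketch, not our actual feed: the Race and Result models, their field names, and the elections app are assumptions made up for illustration.

    ```python
    # Hedged sketch of an election results XML feed as a Django view.
    # The Race model (and its related Result objects) is hypothetical.
    from xml.etree import ElementTree as ET

    from django.http import HttpResponse

    from elections.models import Race  # hypothetical app and model


    def results_feed(request):
        root = ET.Element("results")
        for race in Race.objects.all():
            race_el = ET.SubElement(root, "race", name=race.name)
            for result in race.result_set.all():
                ET.SubElement(
                    race_el,
                    "candidate",
                    name=result.candidate_name,
                    votes=str(result.votes),
                    precincts_reporting=str(result.precincts_reporting),
                )
        return HttpResponse(ET.tostring(root), content_type="application/xml")
    ```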

    The way that election results make their way into our database has also changed for the better over the past few years. We have developed a great relationship with the Douglas County Clerk, Jamie Shew, and his awesome staff. For several elections now they have provided us with timely access to detailed election results that allow us to provide precinct-by-precinct results. It’s also great to be able to compare local results with statewide results in state races. We get the data in a structured and well-documented fixed-width format and import it using a custom parser we wrote several elections ago.
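
    The parser itself is the easy part once the column offsets are documented. Here’s a minimal sketch of the approach; the offsets and field names below are made up, since the real layout comes from the county’s documentation.

    ```python
    # Hedged sketch of fixed-width parsing; column offsets are illustrative only.
    def parse_line(line):
        return {
            "precinct": line[0:30].strip(),
            "race": line[30:80].strip(),
            "candidate": line[80:120].strip(),
            "votes": int(line[120:128].strip() or 0),
        }


    def parse_results(path):
        # Skip blank lines and parse everything else.
        return [parse_line(line) for line in open(path) if line.strip()]
    ```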

    State results flow in via a short script that uses BeautifulSoup to parse and import data from the Kansas Secretary of State site. That script ran every few minutes throughout the night and was updating results well after I went to bed. In fact it’s running right now while we wait for the last few precincts in Hodgeman County to come in. This time around we did enter results from a few races in Leavenworth and Jefferson counties by hand, but we’ll look to automate that in November.
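
    The scraping approach is nothing fancy. Here’s a hedged sketch of the general pattern rather than the actual script: the URL and the table structure are placeholders, since the real markup belongs to the Secretary of State’s site.

    ```python
    # Hedged, Python 2-era sketch of a BeautifulSoup results scraper.
    # The URL and table layout are placeholders, not the real site's markup.
    import urllib2

    from BeautifulSoup import BeautifulSoup  # BeautifulSoup 3

    URL = "http://www.example.com/election-results.html"  # placeholder


    def scrape_results():
        soup = BeautifulSoup(urllib2.urlopen(URL).read())
        results = []
        for row in soup.findAll("tr"):
            cells = ["".join(td.findAll(text=True)).strip()
                     for td in row.findAll("td")]
            if len(cells) >= 3:
                candidate, party, votes = cells[:3]
                results.append((candidate, party, int(votes.replace(",", ""))))
        return results
    ```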

    As always, election night coverage was a team effort. I’m honored to have played my part as programmer and import guru. Once again, it was great to watch Christian Metts take the data and make it both beautiful and meaningful in such a short amount of time. Many thanks go out to the fine folks at Douglas County and all of the reporters, editors, and technical folks who made our coverage last night possible.

  • DjangoCon!

    I’m a little late to the announcement party, but I’ll be attending DjangoCon and sitting on a panel about Django in Journalism with Maura Chace and Matt Waite. The panel will be moderated by our own Adrian Holovaty.

    I think the panel will be pretty fantastic, but I can’t help but be just as terrified as my fellow panelists. I love that we’ll have both journalist-programmers and programmer-journalists on the panel, and I love that Django is so often the glue that brings the two together.

    DjangoCon is going to be awesome.

  • Natalie Anne Croydon

    [Photo: Passed out]

    Last weekend, our first child, Natalie Anne Croydon, was born. I’ve been trying to keep up with Flickr photos and updating my Twitter feed a lot during the labor and delivery process (what a geek!). Thanks to everyone for their kind words and congratulations.

    For more pictures, check my Flickr archive starting on May 24 or my photos tagged “Natalie”.

  • Arduino: Transforming the DIY UAV Community

    It’s been pretty awesome watching the homebrew UAV community discover and embrace Arduino. Back in January, community leader Chris Anderson discovered and fell in love with Arduino. Today he posted information and the board design for an Arduino-powered UAV platform. Because everything is open, it’s very easy to combine functionality from other boards in order to reduce the cost:

    The decision to port the Basic Stamp autopilot to Arduino turned out to be an unexpected opportunity to make something really cool. I’ve taken Jordi’s open source RC multiplexer/failsafe board, and mashed it up with an Arduino clone to create “ArduPilot”, perhaps the cheapest autopilot in the world. ($110! That’s one-third the price of Paparazzi)

    As with their other projects, the UAV schematics, board design, and Arduino control software are being released openly, even before they’re done. It’s quite awesome to realize just how cheap the Arduino-based autopilot is:

    That’s a $110 autopilot, thanks to the open source hardware. By comparison, the Basic Stamp version of this, with processor, development board and failsafe board, would run you $300, and it’s not as powerful

    I’ve been quite impressed by how quickly the Arduino autopilot has gotten off the ground (pun only slightly intended). The decision to port the existing Basic Stamp code to Arduino was made just over a week ago. While I haven’t seen the control code, it looks like the team is well on its way.

    I love it when geek topics collide, and this is about as good as it gets. I’ll be keeping a close eye on the ArduPilot, and I can’t wait to see it in the skies.

  • This whole number reuse thing has gone too far

    This madness needs to stop!

    Espoo, Finland – Nokia today unveiled a trio of mobile devices that balance stunning and sophisticated looks with the latest in mobile functionality. All three devices, the Nokia 6600 fold, the Nokia 6600 slide and the Nokia 3600 slide present a smooth, minimalist design and an appealing array of easy-to-use features. The devices range in price from 175 EUR to 275 EUR before taxes and subsidies and are expected to start shipping during the third quarter of 2008.

    I know that Nokia has a finite set of product names when we’re talking about four-digit numbers. Aside from the Nseries and Eseries and a handful of other products, Nokia is pretty keen on assigning four-digit numbers as product names. While often confusing, at least it avoids product names like RAZR or ENv. I don’t quite get the naming of the 6600 fold and the 6600 slide, though. Either someone in Espoo has the attention span of a goldfish, or they expect that S60 consumers do.

    We S60 owners are a pretty loyal and knowledgeable bunch. We do our research and know our history. I may be wrong, but I’d venture that a good number of S60 users could name a dozen or more S60 models from the 7650 to the N-Gage to the N95. Surely a good chunk of us would rattle off the 6600 in the process. We might also remember the 3600 as the awkward American cousin of the 3650.

    You know, that business phone from 2003 that brought significant hardware and software upgrades to the table compared to the 7650 and the 3650. I sure remember it as if it were yesterday.

    Every once in a while someone raises a stink about Nokia reusing a product number. Usually it’s a product number from the ’80s or ’90s, and the word “Classic” is attached to the new phone. I’m OK with that. I just think that it’s a little early to be reusing a product code from 2003 in a market segment of geeks and power users.

  • Python for S60: back in the saddle

    I had the opportunity to meet Jürgen Scheible and Ville Tuulos, authors of the Mobile Python book at PyCon a few weeks ago. They graciously gave me a copy of their book, which is an absolutely fantastic guide to writing S60 apps in Python. It seems like every time I look away from Python for S60 it gets better, and this time was no exception. Everything is just a little more polished, a few more APIs are supported (yay sensor API!), and the community and learning materials available have grown tremendously.

    While I didn’t get a chance to hang out too long during the sprints, I did pull together some code for a concept I’ve wanted to do for a long time: a limpet webcam that I can stick on something and watch it ride around the city. Specifically I thought it would be cool to attach one to a city bus and upload pictures while tracing its movements.

    So here’s my quick 19-line prototype that simply takes a picture using the camera API and uploads the saved photo using ftplib copied over from the Python 2.2.2 standard library. It’s called webcam.py. I haven’t run it since PyCon, so the most recent photo is from the PyS60 intro session.
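
    The script really is that simple. Here’s a hedged reconstruction of the idea rather than the exact 19 lines; the host, credentials, and file path are placeholders.

    ```python
    # Hedged PyS60 sketch: snap a photo and upload it over FTP.
    # Host, credentials, and paths are placeholders.
    import camera
    import ftplib

    PHOTO_PATH = u"e:\\Images\\webcam.jpg"

    # Take a photo with the device camera and save it to the memory card.
    img = camera.take_photo(size=(640, 480))
    img.save(PHOTO_PATH)

    # Upload the saved photo using ftplib.
    ftp = ftplib.FTP("ftp.example.com")   # placeholder host
    ftp.login("username", "password")     # placeholder credentials
    photo = open(PHOTO_PATH, "rb")
    ftp.storbinary("STOR webcam.jpg", photo)
    photo.close()
    ftp.quit()
    ```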

    Working with PyS60 again was absolutely refreshing. I write Python code (using Django) at work but writing code for a mobile device again got the creative juices flowing. I’m trying to do more with less in my spare time, but I definitely need to make more time for PyS60 in my life.

  • PyCon 2008

    I’m headed out the door to PyCon 2008. Yay!

  • Covering Kansas Democratic Caucus Results

    I think we’re about ready for caucus results to start coming in.

    We’re covering the Caucus results at LJWorld.com and on Twitter.

    Turnout is extremely heavy. So much so that they had to split one of the caucus sites in two because the venue was full.

    Later…

    How did we do it?

    We gained access to the media results page from the Kansas Democratic Party on Friday afternoon. On Sunday night I started writing a scraper/importer using BeautifulSoup and roughing out the Django models to represent the caucus data. I spent Monday refining the models, helper functions, and front-end hooks that our designers would need to visualize the data. Monday night and into Tuesday morning were spent finishing off the importer script, exploring Google Charts, and making sure that Ben and Christian had everything they needed.
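
    To give a flavor of the data model, here’s a simplified, Python 2-era sketch rather than the production code; the model and field names are assumptions.

    ```python
    # Simplified sketch of caucus result models; field names are illustrative,
    # not the actual production schema.
    from django.db import models


    class CaucusLocation(models.Model):
        name = models.CharField(max_length=200)
        county = models.CharField(max_length=100)
        reported = models.BooleanField(default=False)

        def __unicode__(self):
            return self.name


    class CandidateResult(models.Model):
        location = models.ForeignKey(CaucusLocation)
        candidate = models.CharField(max_length=100)
        delegates = models.IntegerField(default=0)

        class Meta:
            ordering = ["-delegates"]
    ```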

    After a few hours of sleep, most of the morning was spent testing everything out on our staging server, fixing bugs, and improving performance. By early afternoon Ben was wrapping up KTKA and Christian was still tweaking his design in Photoshop. Somewhere between 1 and 2 p.m. he started coding it up, and pretty soon we had our results page running on test data on the staging server.

    While the designers were finishing up I turned my focus to the planned Twitter feed. Thanks to some handy wrappers from James, I wrote a quick script that generated a short message based on the caucus results we had, compared it to the last version of the message, and sent a post to Twitter if the message had changed.
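
    The core of that script is just “build a message, compare it to the last one, post only if it changed.” Here’s a hedged sketch of that logic; the message format, the file path, and the post_update callable are placeholders standing in for the real Twitter wrapper.

    ```python
    # Hedged sketch of the compare-and-post logic; post_update() stands in for
    # whatever Twitter wrapper actually sends the update.
    import os

    LAST_MESSAGE_FILE = "/tmp/last_caucus_update.txt"  # placeholder path


    def build_message(results):
        # results: list of (candidate, delegates) tuples, highest first.
        pieces = ["%s %d" % (name, delegates) for name, delegates in results]
        return "Kansas caucus results so far: " + ", ".join(pieces)


    def post_if_changed(message, post_update):
        # Only send an update when the message differs from the last one posted.
        last = ""
        if os.path.exists(LAST_MESSAGE_FILE):
            last = open(LAST_MESSAGE_FILE).read()
        if message != last:
            post_update(message)
            open(LAST_MESSAGE_FILE, "w").write(message)
    ```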

    Once results started coming in, we activated our coverage. After fixing one quick bug, I’ve been spending most of the evening watching importers feed data into our databases and watching the Twitter script send out updates. Because we’ve been scraping the Kansas Democratic Party media results all night and showing them immediately, we’ve been picking up caucuses seconds after they’ve been reported and have been ahead of everything else I’ve looked at.

    Because we just recently finished moving our various Kansas weekly papers to Ellington and a unified set of templates, it was quite trivial to include detailed election results on the websites for The Lansing Current, Baldwin City Signal, Basehor Sentinel, The Chieftain, The De Soto Explorer, The Eudora News, Shawnee Dispatch, and The Tonganoxie Mirror.

    While there are definitely things we could have done better as a news organization (there always are), I’m quite pleased at what we’ve done tonight. Our servers hummed along quite nicely all night, we got information to our audience as quickly as possible, and generally things went quite smoothly. Many thanks to everyone involved.

  • We’re hiring!

    Wow, the Django job market is heating up. I posted a job opening for both junior and senior-level Django developers on djangogigs just a few days ago, and it has already fallen off the front page.

    So I’ll mention it again: We’re hiring! We’re growing and we have several positions open at both the junior and senior level. We’d love to talk to you if you’ve been working with Django since back in the day when everything was a tuple. We’d love to talk to you if you’re smart and talented but don’t have a lot of (or any) Django experience.

    Definitely check out the listing at djangogigs for more, or feel free to drop me a line if you’d like to know more.

  • Google apps for your newsroom

    I like to think that I’m pretty good at recognizing trends. One thing that I’ve been seeing a lot recently in my interactions with the newsroom is that we’re no longer exchanging Excel spreadsheets, Word files, and other binary blobs via email. Instead we’re sending invites to spreadsheets and documents on Google Docs, links to data visualization sites like Swivel and ManyEyes, and links to maps created with Google MyMaps.

    Using these lightweight webapps has definitely increased productivity on several fronts. As much as we would love every FOIA request and data source to come in a digital format, we constantly see data projects start with a big old stack of paper. Google Spreadsheets has allowed us to parallelize and coordinate data entry in a way that just wasn’t possible before. We can create multiple spreadsheets and have multiple web producers enter data in their copious spare time. I did some initial late-night data entry for the KU flight project (Jacob and Christian rocked the data visualization house on that one), but we were able to take advantage of web producers to enter the vast majority of the data.

    Sometimes the data entry is manageable enough (or the timeline is tight enough) that the reporter or programmer can handle it on their own. In this case, it allows us to quickly turn spreadsheet-style data entry into CSV, our lingua franca for data exchange. Once we have the data in CSV form we can visualize it with Swivel or play with it in ManyEyes. If all we’re looking for is a tabular listing of the data, we’ve written some tools that make that easy and look good too. On larger projects, CSV is often the first step to importing the data and mapping it to Django objects for further visualization.
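
    That last step is usually just a few lines of the csv module plus the ORM. Here’s a hedged sketch; the data_entry app, the Record model, and the column names are made up for illustration.

    ```python
    # Hedged sketch of the CSV-to-Django step; the app, model, and column names
    # are hypothetical.
    import csv

    from data_entry.models import Record  # hypothetical app and model


    def import_csv(path):
        for row in csv.DictReader(open(path)):
            # get_or_create keeps re-runs of the import from duplicating rows.
            Record.objects.get_or_create(
                name=row["name"],
                defaults={"value": row["value"]},
            )
    ```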

    Awesome webapps that increase productivity aren’t limited to things that resemble spreadsheets from a distance. A few weeks back we had a reporter use Google’s awesome MyMaps interface to create a map of places to enjoy and avoid while traveling from Lawrence, KS to Miami, FL for the Orange Bowl. We pasted the KML link into our Ellington map admin and instantly had an interactive map on our site. A little custom template work completed the project quite quickly.

    It all boils down to apps that facilitate collaboration, increase productivity, and foster data flow. Sometimes the best app for the job sits on the desktop (or laptop). Increasingly, I’ve found that those apps live online, accessible anywhere, anytime.