When the Army buys rifles and trucks, it wants equipment that any mechanic in the field can fix. Depending on GM to fly its own mechanics to an overseas base or the middle of a battlefield to repair Army equipment, all while keeping their mechanical knowledge secret from the soldiers using that equipment, would be bad policy.

But right now, that’s how most electronic voting systems work. States and counties purchase proprietary hardware and software and have no way of opening up the systems to determine whether they are recording votes properly, let alone to fix problems.

Open source to the rescue: the Open Source Digital Voting Foundation is promoting the use of open-source software in our elections. Eight states — California, Washington, Oregon, New York, Connecticut, New Hampshire, Vermont and North Dakota — are participating in OSDV’s Trust the Vote Project, which is building and sharing software to support voter registration, ballot design, ballot tabulation, and auditing.

On a function as central to democracy as elections, it makes sense to use hardware and software created in a democratic spirit. Open source software allows us to improve the electoral process with technology while also creating an avenue for greater participation, accountability, and citizen ownership of the ballot box.
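To make “citizen ownership of the ballot box” concrete: when the counting code is open, any voter who can read a program can check how votes are tallied. Here is a deliberately tiny Python sketch of a tabulator. To be clear, this is my own illustration, not code from the Trust the Vote Project; the CSV layout and the "choice" column are assumptions.

    # Toy ballot tabulator -- an illustration, not Trust the Vote code.
    # Assumes ballots arrive as a CSV file with one row per ballot and a
    # "choice" column naming the voter's pick; both are assumptions here.
    import csv
    from collections import Counter

    def tabulate(path):
        """Count one vote per ballot row and return per-candidate totals."""
        totals = Counter()
        with open(path, newline="") as ballots:
            for ballot in csv.DictReader(ballots):
                totals[ballot["choice"]] += 1
        return totals

    if __name__ == "__main__":
        for candidate, votes in tabulate("ballots.csv").most_common():
            print(f"{candidate}: {votes}")

A real system adds voter anonymity, chain of custody, and audit logs, but the principle stands: with open source, the counting rules sit in plain sight rather than inside a vendor's black box.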

We’re a democracy: open that data!

President Obama still has lots of work to do, but under his open data initiative, the United States federal government is moving toward a culture of greater information openness:

Moves by President Barack Obama’s administration to open up government data in the US need to go further but have succeeded in creating a culture of greater openness, according to an independent report.

The report ( http://bit.ly/ds4TIy ), from non-partisan think-tank the Information Technology and Innovation Foundation, looks at how technology has been used to implement the Open Government Directive, a memorandum written by President Obama on his first day in office focusing on three areas: transparency; public participation; and collaboration ( http://bit.ly/duz9JA ).

The first of these areas, transparency, has been the biggest success of the directive, the report says, with an “unprecedented” volume of data available to citizens online. However further moves are needed, it says, including addressing problems with access to government datasets through the central portal data.gov ( http://bit.ly/9yd810 ) [Tristan Parker, “U.S. Open Data Moves ‘Have Created New Culture’,” E-Government Bulletin, 2010.04.01].

Now we just have to hook everyone up to broadband and make sure their education includes healthy doses of media literacy and civics so they know how to use all that data.

…of course it does!

Almost every city (and every other governmental entity, from the local sewer district on up to Congress and the U.N.) can benefit from a Web 2.0 makeover. Russell Nichols at GovTech.com highlights Code for America, a new effort backed by Web 2.0 progenitor Tim O’Reilly to help local governments use the Web to “improve transparency, efficiency and citizen participation.”

Transparency, efficiency, and citizen participation–who doesn’t want more of that?

A couple things catch my attention about Code for America. First, CFA sees its mission as more than simply making more government data available to citizens. They recognize that online government tools can’t just be G2C (government-to-citizen); they also need to be C2G (citizen-to-government). In other words, CFA recognizes that government isn’t something done to us; government is something we all do. Citizens aren’t just consumers of government data; we are producers of data that we can use to improve our communities.

Also warming the cockles of my Web 2.0 heart is Code for America’s recognition of the importance of context, of place. CFA won’t just throw programmers in a room and ask them to make widgets. According to Nichols, CFA’s Web teams will spend a month living in their client cities. They’ll get to know their cities, the citizens their widgets will serve, and the problems those widgets will address. When it comes to local government, the World Wide Web needs to feel like a part of the neighborhood.

E-Gov Bulletin from the UK suggests that social networks could sweep away political parties. Dr. Ian Kearns, former Head of the e-Government Programme at the Institute for Public Policy Research, tells the House of Commons’ Parliamentary IT Committee that Web 2.0 is giving people the tools to recognize and use their power to organize and campaign.

Dr. Kearns is speaking in the British parliamentary context, where third parties have a reasonable shot at making a difference. I’m not sure social networks would have as easy a time upending one of our two dominant parties. However, his point that parties can (and must!) take advantage of the technology is proven by the Howard Dean and Barack Obama presidential campaigns. The Internet and social apps (plus a good spreadsheet) put as much organizing power in the hands of two local advocates in a back office as a national campaign office could have mustered a couple decades ago.

While the technology is powerful, Dr. Kearns emphasizes that the big shift is in how we use the technology, how we expect to be involved in the information process. Yes, it’s the consumer-producer-conducer paradigm shift! Politicians need to get out of “broadcast” mode and recognize that politics is much more a two-way, participatory endeavor. The new politics is all about openness and engagement. If you’re running for office, you can’t just put up a website; you have to invite your voters in to build that website—to build its content—for you.

…hat tip to Deane at Gadgetopia!

After some test-driving a couple summers ago, I chose the open-source Drupal platform for my various online experiments (RealMadison.org, the Lake Herman Sanitary District, and my online dissertation). I wouldn’t be able to offer a strong techie defense of that choice: I just liked the look and feel of Drupal better than Joomla or some of the other tools I played with.

So I can’t help feeling my choice affirmed, just a little, by the news that the White House is replacing the Bush-era proprietary content management system with Drupal:

The great Drupal switch came about after the Obama new media team, with a few months of executive branch service (and tweaking of WhiteHouse.gov) under their belts, decided they needed a more malleable development environment for the White House web presence. They wanted to be able to more quickly, easily, and gracefully build out their vision of interactive government. General Dynamics Information Technology (GDIT), the Virginia-based government contractor who had executed the Bush-era White House CMS contract, was tasked by the Obama Administration with finding a more flexible alternative. The ideal new platform would be one where dynamic features like question-and-answer forums, live video streaming, and collaborative tools could work more fluidly together with the site’s infrastructure. The solution, says the White House, turned out to be Drupal. That’s something of a victory for the Drupal (not to mention open-source) community [Nancy Scola, “WhiteHouse.gov Goes Drupal,” Personal Democracy Forum, 2009.10.24].

Anyone care to draw parallels between open-source software choices and the future of democracy?

ThisWeKnow.org is one of three finalists in the Sunlight Labs Apps for America contest. It’s nothing big, really, just trying to make every bit of federal data available in super-searchable Semantic Web format:

Our long-term vision for ThisWeKnow is to model the entire data.gov catalog and make it available to the public using Semantic Web standards as a large-scale online database. ThisWeKnow will provide citizens with a single destination where they can search and browse all the information the government collects. It will also provide other application developers with a powerful standards-based API for accessing the data.

Loading governmental databases into a single, flexible data store breaks down silos of information and facilitates inferences across multiple data stores. For example, inferences can be made by combining census demographic data from the Agency of Commerce, factory information from the Environmental Protection Agency, information about employment from the Department of Labor, and so on. We can’t even begin to imagine the discoveries that will become possible after all these data are loaded into an integrated repository.

Tim Berners-Lee describes a Semantic Web with data distributed across the Internet that is readable by people while simultaneously being available and manipulable by software agents. To that end, in addition to building our Web pages for browsing and searching these data, we will expose all the data in the catalog to computers as RDF that can be retrieved via the SPARQL Query Language [ThisWeKnow.org About, 2009].
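For the technically curious, here is a minimal sketch of what querying such a repository could look like from Python, using the SPARQLWrapper library. The endpoint URL and the ex: vocabulary are stand-ins I have invented for illustration; ThisWeKnow’s actual schema may differ, so treat this as the shape of a SPARQL query, not a working call against their service.

    # Sketch of a SPARQL query joining two governmental "silos" -- EPA
    # facility records and Census population counts -- through a shared
    # county resource. Endpoint URL and ex: predicates are hypothetical.
    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://sparql.example.gov/query")  # hypothetical
    sparql.setQuery("""
        PREFIX ex: <http://example.gov/vocab#>
        SELECT ?county ?population ?facility
        WHERE {
            ?county   ex:population  ?population .
            ?facility ex:locatedIn   ?county ;
                      ex:regulatedBy ex:EPA .
        }
        LIMIT 10
    """)
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for row in results["results"]["bindings"]:
        print(row["county"]["value"], row["population"]["value"],
              row["facility"]["value"])

A single query like this spans what would otherwise be two separate agency databases, which is exactly the cross-silo inference the ThisWeKnow team describes.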

ThisWeKnow.org is a good first step toward consolidating diverse data stores and providing better comprehensive pictures of communities. But it will still take many eyes and many coders crunching more data together to produce Semantic Web apps that yield insights not currently available.

I pick up my latest Interlibrary Loan, and even when the new librarian tells me the title, The Structural Transformation of the Public Sphere, I can’t remember the author. I carry it across campus, settle into my night-class seat, and open the cover.

Habermas! Oh yeah! Good old Jürgen Habermas, who gets cited in every other e-government paper I read. I love it when a core text in my research area isn’t an infoSys paper.

And then I flip to the table of contents: “Introduction: Preliminary Demarcation of a Type of Bourgeois Public Sphere.” I laugh with pleasure at the reading ahead.