It’s been five years since Tim O’Reilly published his screed on Government as Platform. In that time, we’ve seen “civic tech” and “open data” gain in popularity and acceptance. The Federal Government has an open data platform, data.gov, and so do states and municipalities across America. Code for America is the hottest thing around, and the healthcare.gov fiasco made fixing public technology a top concern in government. We’ve successfully laid the groundwork for a new kind of government technology. We’re moving toward a day when, rather than building user-facing technology, the government opens up interfaces to data, allowing the private sector to create applications and websites that consume public data and surface it to users.

However, we appear to have stalled in our progress toward government as platform. It is incredibly difficult to ingest the data into successful commercial products. The kaleidoscope of data formats in open data portals like data.gov might politely be called ‘obscure’, and perhaps more accurately, ‘perversely unusable’. Some of the data hasn’t been updated since first publication and is far too stale to use. If documentation exists, most of the time it’s incomprehensible.

Measuring impact is hard, but I feel safe in saying that the number of real people who have benefited from the open data released in these portals is smaller than we would have liked. Data.gov cheerleaders will point to the sheer quantity of data now available that was never available before, and I applaud that work. However, if we want to move beyond releasing data for data’s sake, we need to focus on data quality over data quantity.

The real problem behind our data quality issues is that the people who have the power to fix the data don’t have an incentive to understand the problem or improve it. Government officials are lovely people who work hard in under-resourced offices. Although many of them believe deeply in transparency and citizen engagement, these portals tend to generate additional burdens that get in the way of their primary functions. When data is stale or inaccurate, someone has to take the time to update or fix it. It is difficult for any one group to see beyond the limits of its own projects. The real trick is to align incentives. What we actually need is for Uncle Sam to start dogfooding his own open data.

For those of you who aren’t familiar with the term, dogfooding is engineering slang for using your own product. For example, Google employees use Gmail and Google Drive to organize their own work. The term also applies to engineering teams that consume their own public APIs to access internal data. Dogfooding helps teams deeply understand their own work from the same perspective as external users. It also provides a keen incentive to make products work well.

Dogfooding is the golden rule of platforms, and currently, open government portals are flagrantly violating it. I’ve asked around, and I can’t find a single example of a government entity consuming the data it publishes. (I’m sure there are examples, but it’s not a common occurrence.) For the most part, inter-agency data sharing doesn’t happen at all. When it does, the agencies set up non-public back channels to push and pull information. If the CFPB needs Census data, officials call up their counterparts and ask for it directly; if the data isn’t in quite the right format, or is lacking coverage when more recent information exists, they ask the Census to fix it directly. The upshot is that the public never benefits from the improved data. Moreover, if data is important to other agencies, that’s a good signal it will be important to third-party developers; if it hasn’t already been released, it should be considered for inclusion in the open data effort. Because the data portals aren’t a channel for inter-agency data flow, the portal operators are currently missing these signals.

Tim O’Reilly likes to offer up Amazon Web Services as the model for the concept of government as platform. It’s a great model. But Amazon didn’t start off as a platform. According to Steve Yegge, Jeff Bezos realized Amazon needed to become a platform after Amazon was already a large company and Jeff then issued a Big Mandate:

His Big Mandate went something along these lines:

1) All teams will henceforth expose their data and functionality through service interfaces.

2) Teams must communicate with each other through these interfaces.

3) There will be no other form of interprocess communication allowed: no direct linking, no direct reads of another team’s data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network.

4) It doesn’t matter what technology they use. HTTP, Corba, Pubsub, custom protocols — doesn’t matter. Bezos doesn’t care.

5) All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions.

6) Anyone who doesn’t do this will be fired.

7) Thank you; have a nice day!

(If you haven’t yet read Steve Yegge’s Platform Rant, you should go read it now. It’s highly entertaining, and it also levies the criticisms I’m making here at Google.) Amazon slowly, painfully transformed from product to platform, and in the process created a world-class developer platform. If we want to continue to see government develop as a platform, officials will have to learn to treat each other as they would external developers, with the same support, data, interfaces, and documentation.

Don’t get me wrong, I’m a realist. I don’t think President Obama could realistically micro-manage the Federal Government the way Jeff Bezos can manage Amazon. But the dogfooding principle can be applied as a pilot and expanded over time. If I could offer some unsolicited advice to the new US CTO, Megan Smith, I would suggest she focus on coaxing, encouraging, bartering with, and otherwise chivvying government agencies into committing to use the data they publish as their sole source for internal purposes.

If government as platform is where we’re headed, let’s make sure all of us, governmental and non-governmental actors, are using the same platform to build our technologies. Aligning the incentives of public officials with third-party developers will help ensure we all have access to the high quality data we need to build technology for everyone. It’s time for Uncle Sam to start eating his own dogfood.

Suggestions for a Dogfood Pilot in the Government

  1. Identify the data most often shared through intra-agency and inter-agency transfers. This data might currently be shared as emailed spreadsheets or through sophisticated APIs; the current method of sharing isn’t relevant.
  2. Work with a cross-functional team of public officials and outside stakeholders to prioritize the data most likely to be valuable in products built for public consumption.
  3. Work with agency officials to create service interfaces for the prioritized data. It doesn’t matter what technology is used, but the service must be designed so that the interface could be exposed to the outside world.
  4. After the service interface is finished, issue a directive that all interprocess communication must be through the service interfaces. No back-channeling of data allowed.
  5. Make the service interfaces and documentation available to the developer community.
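To make the pattern in steps 3 and 4 concrete, here is a minimal sketch in Python of what “internal consumers go through the public interface” might look like. Everything here is invented for illustration — the dataset name, the field names, and the population figures are hypothetical, not a real agency API. The point is only the shape: the internal report draws its numbers through the same service interface an outside developer would call, with no back-channel reads of the underlying data store.

```python
import json

# Hypothetical dataset behind the service interface. In a real agency,
# this would be the internal data store that back channels used to read.
_DATASETS = {
    "population_estimates": [
        {"state": "VT", "year": 2014, "population": 626562},
        {"state": "NH", "year": 2014, "population": 1326813},
    ]
}


def get_dataset(name: str) -> str:
    """Public service interface: returns the dataset as JSON,
    exactly as an outside developer would receive it."""
    if name not in _DATASETS:
        raise KeyError(f"unknown dataset: {name}")
    return json.dumps(_DATASETS[name])


def internal_report(name: str) -> int:
    """Internal consumer ("dogfooding"): totals populations by going
    through the public interface, never by reading _DATASETS directly."""
    records = json.loads(get_dataset(name))
    return sum(r["population"] for r in records)
```

If the JSON coming out of `get_dataset` is stale, badly formatted, or missing coverage, the internal report breaks too — which is exactly the incentive alignment the pilot is after.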
