Imagine if YOU controlled YOUR data!

3 07 2009

This post is a suggestion for the @gov2taskforce and may be a lot more technical than some people would like this discussion to be.  However, I believe it describes a simple and tangible solution that could easily be prototyped and explored.

Imagine if YOU controlled YOUR data.

If you could update it in one single place and Government Departments/Agencies could just collect it from there.

Imagine if this central data store gave you MORE privacy.  You could control which fields different Departments/Agencies were able to access and you could control if they could personally identify you or just use generic information like your age or postcode.

Imagine if it automatically created a log of each time a Department/Agency accessed your data, and you could choose to approve each access yourself – imagine it sent you an email or SMS and you could allow or deny their use of important parts of your private data.

Imagine if you could start by just putting in simple information that you currently have to re-enter a million times like your home address, your phone number and email address, your age, gender and marital status.  Then, if you chose to, you could add other information like your Medicare number and TFN – but it was YOUR choice – and YOU could turn on and off a Department’s access to those specific fields.
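To make the last few paragraphs concrete, here is a minimal sketch of what one user's record in such a data store might look like. Every name in it (the fields, the agency identifiers, the permission flags, the log entries) is invented purely for illustration, not a proposed schema.

// Hypothetical sketch of a single user's record: the user's fields, the
// per-agency permissions they control, and the access log described above.
var userRecord = {
  fields: {
    homeAddress:    "1 Example St, Canberra ACT 2600",
    postcode:       "2600",
    phone:          "+61 2 0000 0000",
    email:          "citizen@example.org",
    age:            34,
    gender:         "F",
    maritalStatus:  "married",
    medicareNumber: null,   // optional, only added if the user chooses to
    tfn:            null    // optional, only added if the user chooses to
  },
  // Per-agency, per-field permissions controlled entirely by the user.
  permissions: {
    "ato.gov.au":        { fields: ["homeAddress", "tfn"], identifiable: true },
    "health.gov.au":     { fields: ["age", "postcode"],    identifiable: false },
    "centrelink.gov.au": { fields: [],                     identifiable: false }  // access currently denied
  },
  // Every read is logged; "pending" entries trigger an email/SMS approval request.
  accessLog: [
    { agency: "ato.gov.au",    field: "homeAddress",    when: "2009-07-01T10:15:00Z", status: "allowed" },
    { agency: "health.gov.au", field: "medicareNumber", when: "2009-07-02T14:30:00Z", status: "pending" }
  ]
};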

Imagine if every government web form you went to gave you the option to auto-populate your details – IF YOU WANTED TO – but it was completely your choice.

From the Department/Agency perspective.

Imagine if you could rely on easily getting people’s up-to-date contact details in a simple and secure way.

Imagine if you could integrate it into your existing websites/web forms without changing any of your pages or back-end systems – unless you wanted to.

Imagine if you could improve the auditability of your use of personal information while improving the quality and freshness of your data too.

Imagine if you put your users in control of their own data…but it also made your life simpler and better!

Imagine if you had a long-term strategic vision of user data management that you could deliver today, but that would help you evolve and adapt over the next decade and beyond.

But do people really want this?

According to the 2008 review Interacting with Government – Australians’ use and satisfaction with e-government services, I believe they do.

two-thirds (68%) would still prefer the convenience of updating information (such as change of address) for government only once

http://www.finance.gov.au/publications/interacting-with-government/08-security-and-privacy-issues.html

Sure, over half (57%) claim they would prefer complete anonymity and are happy to re-enter their data – but I believe that’s because they’re NOT AWARE of any options that can deliver both improved privacy/security AND convenience. Surely a secure system that was optional and also provided an audit log of when Government Departments/Agencies accessed your information would be something a civil libertarian would embrace.

What magical solution would achieve this flight of fantasy?
There are many ways this cat could be skinned – here I’d like to propose one of them at a very high level. If enough interest is shown in this idea then I’d be happy to map out the architecture and key sequence diagrams and user journeys to take this discussion to the next level.

Here’s one way it could be achieved.
I believe that a simple OAuth data store could be set up that would enable much of this functionality. It could be wrapped in a simple Mobile and PC web application that allowed users to control and manage the OAuth tokens they authorise.
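As a hedged sketch only: the user-facing side of that token management could be little more than a list of authorised tokens with a revoke button against each. The endpoint paths, field names and behaviour below are assumptions made up for illustration, not a specification.

// Hypothetical token-management screen in the user-facing web app.
// Lists every OAuth token the user has authorised, one row per Department/Agency.
$(function () {
  $.getJSON('/api/tokens', function (tokens) {
    $.each(tokens, function (i, token) {
      $('<li/>')
        .text(token.agency + ' can read: ' + token.fields.join(', '))
        .append(
          $('<button>Revoke</button>').click(function () {
            // Revoking the token immediately cuts off that agency's access.
            $.post('/api/tokens/' + token.id + '/revoke', function () {
              alert('Access revoked for ' + token.agency);
            });
          })
        )
        .appendTo('#authorised-tokens');
    });
  });
});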

I also believe it would be possible to create a simple jQuery plugin that could simply be integrated into existing Gov. Department/Agency webpages by just adding a single line of HTML code. This is very similar to the simple User Voice feedback buttons that are spreading across sites like wildfire – exactly because they are so simple to integrate. This plugin would add a visible button or element to the page (much like the User Voice feedback tab) that would offer the user the chance to pre-populate the form on the page they are currently on. The plugin would then manage all of the OAuth data store’s signup and token creation/authentication processes using Ajax and DHTML overlays. It would then map the common fields from the user’s OAuth data store into the fields on the page (e.g. using something like JSONT rules that could be quickly customised for each form). In this way the underlying form and server-side scripts would not need to be changed at all.
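Here is a very rough sketch of how that single line of HTML and the plugin behind it might hang together. The script URL, plugin name, data-store endpoints and mapping rules are all invented for illustration, and a real version would also need to run the full OAuth authorisation dance inside the overlay rather than assuming a token already exists.

// The only change to an existing agency page would be a single line of HTML, e.g.:
//   <script src="https://datastore.example.gov.au/autofill.js"></script>
// Everything below is a hypothetical sketch of what that script might do.
(function ($) {
  $.fn.govAutofill = function (options) {
    var settings = $.extend({
      storeUrl: 'https://datastore.example.gov.au',
      // JSONT-style mapping from data-store field names to this form's inputs,
      // customised once per form by the page owner.
      mapping: {
        'homeAddress': 'input[name=address]',
        'email':       'input[name=email_address]',
        'phone':       'input[name=contact_phone]'
      }
    }, options);

    return this.each(function () {
      var $form = $(this);

      // Add a visible "Pre-fill my details" tab, much like the User Voice feedback tab.
      $('<a href="#" class="gov-autofill-tab">Pre-fill my details</a>')
        .click(function (e) {
          e.preventDefault();
          // A real implementation would run the OAuth sign-in/authorisation in an overlay here;
          // this sketch simply fetches whatever fields the user has permitted (via JSONP).
          $.getJSON(settings.storeUrl + '/api/me?callback=?', function (data) {
            $.each(settings.mapping, function (field, selector) {
              if (data.fields && data.fields[field] !== undefined) {
                $form.find(selector).val(data.fields[field]);
              }
            });
          });
        })
        .insertBefore($form);
    });
  };
}(jQuery));

// Usage on an existing form, with the underlying form and server-side script untouched:
// $('#change-of-address-form').govAutofill();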

If this model then did gain traction, over time Government Departments/Agencies could upgrade and integrate their back-end systems more closely with the central OAuth data store.

There is a lot of technical and user experience detail that needs to be discussed, however I’m confident that a simple proof-of-concept or prototype system could be created in a very short time. I also believe that all of this could be completed as an Open Data and Open Source project that would allow for peer-review for security enhancement and API based integration for developers to extend and enhance.

Personally, this is something that I would use; however, I’m very concerned that the output of the Gershon Review and the existing Government Architecture are very “big vendor” and “internally” focused. Opening up control and externalising this core data seems to be currently sitting in the “too hard” basket.

Sure, not everyone would use it and they wouldn’t have to. And sure, not everyone has a JavaScript-enabled browser. But if you don’t, then you’re not likely to want to even think about a central data store and audit logs either.

Now, feel free to tell me I’m a dreamer – but please be prepared to back that up with detailed descriptions of WHY this wouldn’t work!



APIs, Accessibility and Mobility

3 07 2009

Recently the Department of Finance and Deregulation asked for feedback on the latest version of their Web Publishing Guide.  This post is a tangible recommendation for this and also relates to the broader #publicsphere discussion currently taking place.

I would recommend changes to two sections within the AGIMO Web Publishing Guide.

First, I’d remove RSS from the Technical Development section and create a separate section entitled Open Data and Application Programming Interfaces or just Open Data and APIs.

This would then be the logical home for the RSS link, and it should be fleshed out to include Open Data expectations or policies and discussion around common Web Service Interface topics such as SOA, REST, XML-RPC, JSON-RPC, etc., and even SOAP for legacy systems and, from my perspective, SOAPjr for modern Ajax-driven applications 8)
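As a small, purely illustrative example of the kind of material the new section could cover, the same piece of open data might be exposed as a REST-style resource or wrapped in a JSON-RPC envelope. The URLs, method names and fields below are invented.

// A REST-style resource (hypothetical):
//   GET https://data.example.gov.au/api/v1/postcodes/2600/services
// could return plain JSON that any client (web, mobile or screen-reader front-end) can consume:
var exampleRestResponse = {
  postcode: "2600",
  services: [
    { name: "Medicare office",   suburb: "Canberra City" },
    { name: "Centrelink office", suburb: "Braddon" }
  ]
};

// The same data over JSON-RPC is simply wrapped in a request/response envelope:
var exampleJsonRpcRequest  = { method: "getServicesByPostcode", params: ["2600"], id: 1 };
var exampleJsonRpcResponse = { result: exampleRestResponse, error: null, id: 1 };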

I won’t propose the exact structure here, as that’s really a call for AGIMO to make. However, there is a wealth of existing information in this area, and I, along with a wide range of other developers, would happily engage in an Open Data discussion to help this process along.

The second recommendation I would make would be to bind the content of the Accessibility and Equity section to this new Open Data and APIs section in a deep and intimate way, both at the guideline content level and at the policy level.

Open Data and APIs are the unrecognised foundation of true accessibility and equitable access.  If you simply encourage developers to create WCAG-compliant sites then you are fixing your accessibility benefits in cement – at a certain cultural point in time.  However, if you encourage or require developers to first deliver Open Data and APIs, then you have enabled any other group or developer to create new and targeted services that meet the needs of the group they are interested in or are a part of.
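To illustrate the point: once an open API exists (the endpoint below is invented), a community developer could build a large-print, keyboard-only or mobile-friendly view of the same data in a handful of lines, without waiting for the agency to build it for them.

// Hedged sketch: fetch office locations from a hypothetical open API (via JSONP)
// and render them in a deliberately simple, large-print list for low-vision users.
$.getJSON('https://data.example.gov.au/api/v1/offices?postcode=2600&callback=?', function (offices) {
  var $list = $('<ul class="large-print"/>');
  $.each(offices, function (i, office) {
    $('<li/>').text(office.name + ' - ' + office.address).appendTo($list);
  });
  $('#nearest-offices').append($list);
});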

This clichéd old proverb seems relevant and insightful again in this particular context:


Give a [person] a fish and you feed them for a day
Teach them how to fish and you feed them for a lifetime

This is also increasingly relevant as the “browsing device” landscape fractures even further.  Plain Old Mobiles (POMs), iPhones and the flood of new devices will be enabled by this and our burgeoning new market will be supported instead of constrained.

There are two follow-up points to this discussion.

1. I am NOT recommending that focus be removed from WCAG compliance.  I am merely suggesting that it should come AFTER Open Data and API requirements and I believe that overall Accessibility in all forms will be improved by this.

2. Some people may argue that this is shifting the onus from Government developers out into the community.  I would argue that this can be viewed differently.  It’s not “onus” but “power and freedom”.  Only allowing me to access the Public Data we have all paid for through applications and user interfaces developed by Government agencies is limiting my world to “what they think I want” and “how fast they can develop”.  By starting with opening up APIs you are letting me run alongside these agencies or even run ahead of them and letting me decide how, when and where I mash up this data.  For me this provides more freedom and also may free up some of the internal Government resources so they can focus on implementing these new Open Data policies instead of just trying to second guess what the increasingly diverse web user audiences really want.

As someone who runs an Innovation Lab I am aware how hard it is to predict what technologies and applications will actually be adopted and achieve wide diffusion.  For large Government Departments to also try to do this seems challenging at best.