Tag Archives: digital

There Are Only 10 Types of People

I made an off-the-cuff remark about the tendency of people to use this image to illustrate articles about ‘digital’. It appears to depict some binary instructions traversing the walls of some kind of weird quantum tunnel. Or perhaps the tunnel is supposed to represent an Internet connection, who knows?


The Binary Tunnel

In the same publication, on the same day, we find a similar image used to illustrate a different story – this time we seem to be looking at a funky binary cube.


In response to my plea to photo editors to step away from these abstractions of ‘digital’, Jeni Tennison asked the (not unreasonable) question: “What are people supposed to use instead?!?”


I started to reply to Jeni but only got as far as “Well, they could always use…”

What could they use? What image conveys ‘digital’? My problem with the binary cube/tunnel is that binary code is literally (and I mean that ‘literally’ literally) as far from the end user experience as it’s possible to get. Binary is the bottom layer of computing – the language of the machines. Very few people in the world have ever or could ever code in binary. It’s an old joke:

“There are only 10 types of people – those who understand binary and those who don’t”

OK, Mr Smarty Pants, what image could we use to put at the top of all those articles about ‘digital’? Well, I’m as guilty as the rest – this is a logo for the Sheffield City Region CIO Forum that I designed a while back. Spot the binary?

To my shame, the ones and zeroes mean something – 010000110100100101001111 codes for the text string ‘CIO’. Sorry.
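For the curious, the decoding is easy to check: ASCII stores one character per 8-bit group. A minimal sketch (the bit string is taken straight from the logo):

```python
# Decode the logo's bit string: each 8-bit group is one ASCII character.
bits = "010000110100100101001111"
text = "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))
print(text)  # CIO

# And the old joke holds up: "10" read as binary is two.
print(int("10", 2))  # 2
```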

The image below is my attempt at representing ‘digital’. It’s not very good (I’m no designer) but it does try to depict digital in use in a way that most humans would recognise. Photo editors everywhere – I give it to you, copyright free. You are welcome.

DigitalImage

I Hold These Truths to be Self Evident – A Constitution for the Digital Engine Room

I’m yet to be convinced about the merits of creating a Digital Strategy or even an ICT Strategy (more on this here).

What you absolutely must have, however, is a set of principles which govern the way you do things or, as I prefer to think of it, a ‘constitution’. If you give yourself a set of golden rules to dictate the way in which you respond to any request then the actual project list becomes significantly less important.

Your constitution – your philosophy – is your strategy.

The list below is my constitution – I hold these truths to be self-evident.

  • Secure by default: information security will be designed in to all our systems, changes and processes right from the start and throughout;
  • Information, not Infrastructure: Local Authorities should not be in the IT business – hardware and software are ancillary to any Council’s core activity (serving the public). Whilst the information we create and use is of ever growing strategic importance, we can be less concerned about the infrastructure. We will continue to minimise our local infrastructure through a strong preference for systems being vendor/cloud hosted wherever possible. We will review every significant application, starting with the largest, and attempt to have them vendor hosted regardless of their current contractual state. Our aim is to quickly and safely reduce the equipment in our data centre, and the associated support activity, to the absolute minimum;
  • Open Standards and open data: we will continue to pursue the use of published, open standards for data exchange. Open standards reduce the likelihood of supplier lock-in and allow the transfer of services or suppliers without significant cost or loss of data. We will publish as much of our data as possible openly, online, for reuse by citizens, the private sector and other public sector organisations;
  • Share and reuse: most Local Authorities do the same things in the same ways – this includes IT. This approach has resulted in an enormous duplication of effort and investment across the sector. We will always seek to join up with others and share services and our aspiration is to move away from each Council having its own IT department. We will learn from others and reuse software, processes and ideas;
  • Browser delivered and browser agnostic: the web browser is already the de facto standard application delivery interface. Traditional software which runs via separate client installations will soon be a thing of the past. Wherever possible we will buy/build applications that run in a web browser and are agnostic to the type of browser and device in use;
  • Any device, anywhere, any time: the traditional model of only being able to access Council applications from Council owned devices connected to the Council network is long gone. We will configure our network such that we can allow access from any device to authorised content whilst maintaining strong security;
  • Buy, don’t build: our default approach is to buy ‘off the shelf’ software (rented where possible) rather than building in-house. We will only develop bespoke ‘in house’ software as a last resort. This removes support overheads and makes it easier to move software to the cloud. We no longer have the resources to commit to developing lots of software and, more significantly, we absolutely do not have the resources to continue to support and develop the software we create;
  • Best of breed, not ERP: Local Authorities are some of the most complex and diverse organisations in the world. The wide range of services we deliver means that ‘one size fits all’ is never appropriate in the IT sense. We will take software procurement on a case by case basis always preferring best of breed point solutions over unwieldy enterprise-wide platforms that stifle agility and hamper cloud adoption;
  • Integration and APIs: regardless of where the systems we use are hosted we will always work to ensure that the systems can ‘talk to each other’ and are integrated. This will allow us to move away from the traditional silo approach and give us a holistic view of the data we hold. Where systems are provided by a third party we will insist that APIs (application programming interfaces) are available and provided.
  • Desktop and server virtualisation: physical infrastructure will be minimised through the use of server and desktop virtualisation. This allows us to extend the life of hardware and reduce the investment required in servers and laptops. In those instances when it is not possible to have a supplier host a system we will host the software in our own data centre. If possible we will use virtual servers to do so. Virtual servers are cheaper, greener, more efficient, more resilient and easier to support than traditional physical servers. We have already virtualised around 60% of our server fleet and we will review all our servers with a view to virtualising as many as possible. This will make provisioning easier and will also lead to a big energy efficiency gain as we decommission older servers which tend not to be very green;
  • Rent, don’t own: where possible we will lease licences and hardware rather than buying assets outright. This allows us to respond more quickly to changing demands and removes the inertia that comes with sunk investment in assets. A move towards a more ‘rental’ model for the majority of our software will make cloud adoption easier, and will allow the uptake and standardisation of new software versions to reduce support costs and improve user satisfaction;
  • Vanilla by default: unlike the historic position, where software was customised to fit existing business processes, the working assumption will be that software is used largely out of the box as a standard or ‘vanilla’ version. Large scale or complex customisations to exactly meet business requirements will be avoided wherever possible; rather, the expectation will be that business processes will be modified to meet the procured software’s approach to a process. This will significantly reduce the whole life cost of the software and enable the timely upgrade to new versions;
  • Show, don’t tell – prototypes not slideware: sometimes we will have to build things in house because this is the only option. In these cases, new ideas and proposed changes are best communicated by demonstrating how they will work. We will eschew bullet points in presentations in favour of quick and simple wire frames, prototypes and proofs of concept.
  • Minimum viable product: regardless of whether we are buying a product or building something ourselves we will adhere to the ‘minimum viable product’ principle (MVP). Rather than buying/building huge, complex unwieldy applications we will start small and move quickly. A MVP is the most pared down version of a product that can still be used and be useful.
  • ICT Professionalism and organisational resilience: as the Council and customers increasingly rely on ICT, and expect more from it, the skills of the Corporate ICT department need to be managed carefully to ensure they are fit for purpose. Membership of recognised professional bodies, such as the British Computer Society or Socitm, by Corporate ICT staff will be encouraged. Further, we will reduce the risk of downtime and data loss by ensuring that the ICT organisation has sufficient resilience through strength in depth.
  • Technology confidence in the wider workforce: we will create a skilled, technology-confident workforce through investing in learning, development and training opportunities for our own staff. We will, through training, enable staff to get the most benefit from our investment in technology;
  • Open source software: open source options will always be considered during procurement. Open source products are rarely ‘free’ – there are usually support and productivity costs – but they will always be evaluated alongside proprietary alternatives.
  • ICT Self-service: we will deploy self-service web tools to allow our customers to raise incidents and request changes and to give them the ability to log on to check the status of their requests. We will design these tools such that our customers will prefer this channel over telephone contact. At the same time we will reduce the number of hours during which we are contactable via phone.
  • Business case driven: there will be a business case associated with everything we do. Usually this will be a financial business case (i.e. we will ‘invest to save’) but there are other types of case for change and some things are self-evidently the right thing to do.
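To make the ‘Open Standards and open data’ principle above concrete, here is a minimal sketch of publishing the same record in two openly specified, machine-readable formats. The record and its field names are invented for illustration:

```python
import csv
import io
import json

# A hypothetical service-request record as a council system might hold it.
record = {"id": 1001, "service": "missed bin collection", "postcode": "S60 1AE"}

# JSON: the de facto open standard for API data exchange.
as_json = json.dumps(record, sort_keys=True)

# CSV: the simplest open format for bulk reuse of published data.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=sorted(record))
writer.writeheader()
writer.writerow(record)
as_csv = buf.getvalue()

print(as_json)
```

Because both formats are openly specified, any consumer – a citizen, a supplier, another council – can read the data back without needing the system that produced it, which is exactly what reduces supplier lock-in.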

What do you think?

Not just a digital strategy – it’s a business strategy for a digital age…

By fixating on the need for a digital strategy document we can often forget that the real goal is a set of digitally enabled business strategies or service plans.

You don’t just need a digital strategy – you need all of your service areas to write service plans and strategies that exploit digital.

Digital shouldn’t be handed down from the top of the organisation – it should bubble up naturally from the people who are tasked with delivering services. The business areas should be irrepressibly enthusiastic about using technology – they shouldn’t need coercing.

This is why our new strategy was co-produced with the various business units in the Council, with the aim of pulling together all aspects of the organisation’s digital work under one roof.

And here it is – our Digital Council Strategy 2016 to 2019:

Click here to see RMBC’s new Digital Council Strategy

Darwinian Forces are at Work in the Public Sector

Local Government has spent 5 years cutting back. We haven’t just cut to the bone, we’re now scraping the bone. This activity has taken massive amounts of cost out of Local Government and we have achieved savings on a scale that the 2010 versions of ourselves would have derided for being so large as to be the stuff of fantasy.

Despite this we find ourselves with a set of 2015 Local Authorities which are not so different from their 2010 counterparts. Certainly our organisations are very much smaller in terms of headcount – but we persist in doing the same things in the time honoured fashions.

It’s time for us to call an end to the half-decade of inward focus – of salami slicing and thinning out – that work is at an end. We must now look outwards as we enter the second half of Local Government’s Decade Horribilis with our eyes on the real prize – changing the way we work by blurring, or entirely removing, the boundaries between many of the public sector organisations.

Like it or not ‘The State’ is in rapid retreat and the shrinkage won’t be reversed anytime soon. There are Darwinian forces at work here and only through the clever use of data and technology will the smart organisations have the requisite tools to evolve their way back from the brink.

It’s all about information – how we create it, how we store it, how we protect it, how we share it and how we exploit it. Clever exploitation of information and technology will help Local Government and the wider public sector to, not just survive, but thrive.

Embrace Shadow Tech or Die

‘Shadow Tech’ – we IT folk do enjoy our cool-sounding names, don’t we?

The term Shadow Tech refers to the use of consumer technology by the workforce, usually in a way that is not sanctioned by ‘Corporate IT’.

Generally speaking the IT department does not like the business to use any technology that they (IT) haven’t selected, procured and learned how to support. The instinct of your average IT team is to declare shadow tech verboten, often citing ‘security’ as the reason.

Today’s shadow tech manifests itself in the form of iOS and Android devices. Our employees have them at home, love them and can see how they will help at work – but the IT team says “No, it’s not safe“.

‘ICT as Denier’ is a dangerous role to adopt. I’ve already written about the battle with pernicious ‘security’ but when it comes to shadow tech the real threat is to the IT department – the threat of irrelevance.

We can illustrate the risk by casting our minds back to Shadow Tech 1.0. Less than 30 years ago the IT Dept. was about number crunching and centralised computing – the mainframe was still king. Then in the late 80s and early 90s the Personal Computer (PC) started appearing in people’s living rooms. “Wow!” the people thought “These PCs are great, I can see how this would really help me with my work.”   

The IT Department said “No – that’s not how we do it – if you want some computing doing come to us and we’ll sort it out for you”. So the business promptly ignored IT and went out and bought PCs.

Shadow Tech

PCs proliferated on desktops throughout the organisation and a person in each area would often adopt the mantle of ‘the guy who knows about computers’. Within a few years we had mini-IT Departments all over the place, less control at the centre and an uncoordinated ad hoc approach to technology exploitation.

Wind forwards 20 years and many organisations have now managed to wrest control of ICT back to the centre. The PC (laptop) is ubiquitous but that’s OK because it is now approved and controlled by the IT team. Meanwhile, in the data centre, the mainframes have gone and Windows servers hum away contentedly – all is well with the world.

But hark! Here comes Shadow Tech 2.0: the iPhone and iPad started appearing in people’s living rooms. “Wow!” the people thought “These mobile gadgets are great, I can see how this would really help me with my work.”

The IT Department said “No – that’s not how we do it – they are not safe. If you want some computing doing come to us and we’ll give you a proper computer”. So the business promptly ignored IT and went out and bought iPads.

Shadow Tech

We know what happens next because we’ve been here before.

(There’s something here that needs exploring around the importance of ‘Institutional Memory‘ in helping us avoid repeating the mistakes of the past – but that’s a post for another day)

Again Corporate ICT loses control – but this time the stakes are far higher. Both the volumes of data and the sensitivity of the data in use are hugely increased compared with 25 years ago. If the end users succeed in bypassing the IT Department then there’s a real risk of a security breach (and near certain compliance problems) – and the users will find a way to use these devices at work because people are clever.

The CIO’s role in this is to act as a trusted advisor to the business. IT should be a door-opener not a gate-keeper. The IT team need to get ahead of the curve and work out how to use these amazing new devices safely. Buy lots of different models and trial different management software then go back to your business users and say “Hey, look – we’ve worked out a way that you can use these things at work.”

But it doesn’t stop there – Shadow Tech 3.0 is already upon us and its name is Software as a Service (SaaS). I am a huge advocate of cloud computing and SaaS and I’ve written about this before. SaaS is so good (easy to use, cheap and easy to deploy) that your users will already be eyeing it or using it. Most SaaS tools require little more than a browser – your users are able to purchase their subscription and be up and running on the new application without IT ever knowing about it. This represents a serious threat to the organisation’s data as it is unlikely that the user will have checked that (e.g.) the data is being stored in the EEA.

The CIO’s job here is not to issue a diktat “Staff must not sign up to cloud based software tools.” rather we need to educate staff as to the risk and request that they run all such proposals past the ICT Governance Team so that they can do the due diligence/legwork around security. Sometimes there will be a genuine reason why a SaaS application should be blocked in the corporate world (e.g. DropBox *shudders*) – but usually this stuff is safe to use.

This is ICT adding value to the organisation and it’s a pointer towards ICT’s new role (the inexorable movement from the management of tin and wires to the management of data and risk).

So, be a door-opener not a gate-keeper because, guess what, Shadow Tech 4.0 will be along any day now…

Door-opener not gate-keeper

 

Let the Information Flow

Information is the lifeblood of any organisation and, as is the case with actual blood, the consequences of the flow being blocked are just as serious as the consequences of some leaking out.

Clearly it’s essential that public sector organisations protect the data that they hold – that’s a given. None of us want to fall foul of the ICO and, much more importantly, we have a duty of care around our citizens’ data. Getting one’s data security house in order is a key concern of any CIO.

But we have another duty – a duty to share information. Public sector organisations must share data internally and externally. Information is what we do – it’s our currency, our raw material, our tool and our product.

Back in 1999 Bill Gates wrote a book called ‘Business at the Speed of Thought‘ in which he set out his vision of how technology could transform an organisation. The point that Gates made repeatedly is that the speed at which we do business is largely dictated by the speed at which we exchange data.

We do business at the speed of information exchange – if the information flow is blocked then we are in big trouble.

We should view any change to the organisation which impedes information flow with suspicion. This includes changes which are made in the name of ‘security’.

One of my favourite quotes is:

“A ship is safe in the harbour, but that’s not what ships are for” 

(There is some debate about the origin of this quote but it seems likely that it was first used by John A. Shedd in 1928. I use this quote too often at work and at home – but it’s applicable to so many situations.)

“Information is safe on a server, but that’s not what information is for” 

If you want your data to be completely secure you should put it on a server, locked in a data centre and pull out the network lead. But that’s not what data is for.

Over-zealous security measures will impede the flow of information and your organisation will be less effective. Most of the controls in the PSN code of connection, for example, are very sensible – but a few are draconian or ill-conceived and have been implemented to the detriment of organisational effectiveness.

A big part of the role of the modern CIO is to be organisational warfarin – an antidote to the coagulating effects of pernicious ‘security’. If a security measure seems over the top to you then you need to push back.

Information is the lifeblood of the organisation – let it flow freely.

Your Call is (Probably) Not Important to Me – In Praise of Email

We all have a strained relationship with our email inbox.

For many people Outlook is work (other email clients are available :-)).

That little orange icon on our desktop can contain a world of pain and email has been getting a bad press recently. Some commentators have bemoaned the fact that people often think that they are ‘doing’ work when they are ‘doing’ email.

Strength of feeling is such that there’s even been a suggestion that email should be turned off outside of working hours.

You’ll hear no such complaints from me – I like email. The beauty of email is that it is an asynchronous communication mechanism. An email lands, you glance at it, triage it and if it’s urgent you act upon it. More often than not you delete it or leave it alone until it needs attention.

Compare this to that most cursed of devices – the telephone. “Answer me now!” it screams.

Bye bye to an old enemy.

How often have you seen colleagues chatting around a desk – discussing a work issue – when the desk phone rings and the owner of the phone interrupts the conversation to answer it? What this person is actually saying is “I don’t know who is calling or what they want but this unknown person is more important to me than you are.” How rude!

Even worse than the phone is the Johnny-come-lately of synchronous attention seekers – INSTANT MESSAGING (IM). IM is basically email that you have to reply to immediately. It allows people to tip-tap-tap on your computer screen “Pay me attention, talk to me, stop what you are doing and talk to ME ME ME!” No matter how trivial the message – you have to drop everything and get involved in a text exchange.

My time is precious and I have to focus on the important stuff. Email allows me to do this, the telephone (and IM) don’t.

So I’ve long since abandoned the use of landlines. As I’ve written before, I have now diverted my work landline to a message asking the caller to email me. There are, of course, people who I need to be very responsive to, so my boss (etc.) is now in the habit of calling my personal mobile. More often than not people will now email me with a quick “Rich, plz call me” – because they know this is the quickest way to get hold of me.

You’ve got mail.

I like email – it allows me to control my day. I recommend you try giving up on your desk phone – it’s liberating.

An Open Letter to Software Suppliers – 13 Ways to Help the Public Sector to the Cloud

Dear Software Supplier,
The public sector is haemorrhaging money.
We all use the same systems, but instead of joining up and sharing we are each deploying identical solutions in isolation.
This is costing us a fortune at a time when (by the way) money’s too tight to mention.
A first step to joining things up is to get all this software out of our data centres and in to the cloud.
We’re trying to get to the cloud – we really are – but you’re not making it easy for us. Here are 13 steps you can take to facilitate our journey to the cloud whilst simultaneously making yourselves some money.
1. Be Cloud Ready
This may seem odd coming from a Head of ICT – but we don’t want to have our own IT departments.
We don’t want any tin on-premise. The public sector should not be in the IT business. An analogy – we consume electricity but we don’t run our own power station. We consume data – but we don’t want to run our own data centre – we want Software as a Service (SaaS).
At my local authority we’ve had a ‘cloud first’ policy since 2007. Every time we buy an application or renew a contract we try to get the vendor to host it for us. This is harder to do than you might think.
More often than not the question “Will you host it for us?” is greeted by the exchange of bemused/panicky looks between the vendor’s sales folk.
“Um, we could install some servers in your data centre and then look after it remotely?” Noooooooooo!
Vendors need to understand that, within a year or two, if you can’t deliver your application as SaaS then we are not buying it.
Vendors should be cloud ready – we shouldn’t have to start from scratch every time. I want you to spin up my instance of your application at the touch of a button. So you’ll need to…
2. Invest Upfront in the Platform
You need to have your infrastructure built and ready to go before you enter pre-sales.
Regardless of whether the servers are in your own data centre or Rackspace’s – they should be humming away before we sign the contract.
The tin is not the customer’s problem.
3. Avoid Short-termism (understand our business case)
There are many reasons why cloud/SaaS is attractive to the public sector – perhaps the foremost being that SaaS should be cheaper than on-premise. I say ‘should’ because it usually isn’t.
We tried to move 2 of our largest, mission critical, systems to the cloud last year. These systems are sold by 2 of the largest software houses in the world. In each case the total cost of ownership over 5 years was TWICE as high under a cloud/SaaS model when compared with traditional on-premise hosting. So we bought the tin, put it in our data centre and it’ll now be at least 5 years before we have an opportunity to look again at the cloud for these systems. Frustrating.
The reason that these two vendors were unable to make their SaaS offering compete with on-premise is that they did not take the long view.


As with most services, the bulk of the cost of any IT department is in salaries. In theory IT departments should be able to reduce headcount as a direct result of moving systems in to the cloud. This, of course, is why many IT teams have nephophobia (fear of clouds). IT departments think clouds are evil. 
The problem is, though, we can’t reduce headcount as a result of just a couple of systems going SaaS. If we put some of our databases in the cloud we’re still going to need DBAs because we still have lots of databases. This is a major challenge to the SaaS business case.
What the vendors need to do is reduce profit margins for these early SaaS forays. By taking a hit now they will be paving the way for organisations to put more of their systems in the vendor’s cloud. You need to be at least matching, if not beating, the on-premise cost – otherwise, why would we do it?
Short-term pain for long-term gain.
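The arithmetic behind that frustration is worth spelling out. With invented figures (only the shape of the calculation matters), a toy five-year comparison shows how a SaaS quote priced without the long view ends up at double the on-premise cost:

```python
# Toy 5-year total-cost-of-ownership comparison.
# All figures are invented for illustration.
years = 5
on_prem = 200_000 + years * 40_000   # upfront tin + annual support
saas = years * 160_000               # annual subscription, no upfront cost
print(saas / on_prem)  # 2.0
```

Under these assumed numbers the vendor would need to roughly halve the subscription just to match on-premise – which is exactly the margin hit that buys them the customer’s next ten systems.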
4. Be Secure
It seems that there are a surprising number of software companies who don’t know much about IT security, PSN controls or the Data Protection Act (DPA).
You need to be PSN experts. You must understand this stuff inside out.
The UK DPA states that our data must be stored within Europe. IF YOUR SERVERS ARE IN CALIFORNIA THEN WE CAN’T DO BUSINESS.
We spend an inordinate amount of time verifying that the vendor’s cloud is secure – endless surveys and toing and froing – sometimes we even have to visit the vendor’s data centre. On one occasion we found that a very large supplier intended to host our sensitive, precious, data in a crumbling Victorian warehouse by a river. No thanks.
You need to get together with other vendors and with Cabinet Office and come up with an accreditation scheme. Some kind of SaaS ‘kite mark’. All we want to have to do, as the customer, is ask to see a copy of your certificate and be comforted that you know what you’re doing.
5. Build in Disaster Recovery as Standard
One of the big attractions of the cloud is that it is immune to localised emergencies. Yet it is surprisingly common to find SaaS offerings that don’t include back-up to a second location as standard.
We shouldn’t have to ask whether you back-up our data to a geographically distant location – and we certainly don’t want it included as an optional (chargeable) extra. It should go without saying that you’ve thought about disaster recovery and this would be a condition of getting your SaaS ‘kite mark’.

6. Make Updates/Patches/Releases Invisible to the Customer

When Google adds a new feature to Gmail, or whatever, they do it without any fuss. We generally don’t know that an upgrade has happened until after the fact.

The same should be true of SaaS releases. This work needs to happen behind the scenes without the need for any downtime.
Similarly, improvements and innovations that you’ve developed for one customer should be quickly shared with all your customers (at no extra cost).


7. Get on a Framework
Make your product easy to procure by getting yourselves on a framework such as G-Cloud. Public Sector procurement rules are hugely restrictive and prescriptive – so a product that is easy to procure is very attractive to us.
8. Think About Our Capex vs Opex Problem
Many public sector ICT projects are funded using capital monies – often from prudential borrowing.
We can only spend capital money if we can demonstrate that the investment leads directly to the creation of an asset (tangible or intangible). This is fine for a traditional on-premise delivery model because it’s easy to demonstrate that a tangible asset has been created.
But we can’t use capital money to fund the creation of a SaaS solution because there is no asset being created. At the end of the contract there is nothing that the public sector organisation ‘owns’. Sometimes this isn’t a problem as some organisations will find it easier to lay their hands on revenue (opex) monies, but sometimes it can be a deal breaker.
You can help us here by thinking of ways in which the SaaS deal can lead to the creation of an asset that the customer owns. Could you, perhaps, nominate a piece of your cloud infrastructure as belonging to us, the customer, and write this in to the contract? In reality, should the relationship come to an end, we probably wouldn’t want to go to the trouble of availing ourselves of this clause – we don’t want the tin – but the clause’s existence may be enough to convince our accountants that this SaaS project is a genuine candidate for capital investment.
9. Make Short Contracts More Attractive
The days of 5 and 7 year contracts are over. New disruptive technologies mean that we need to be able to react faster than ever before. Look at the way the iPhone has changed the landscape – we have contracts that are still running that were signed before anyone had ever heard of an ‘app’.

Predicting technology futures is harder than ever – we don’t know what we’ll need in 3 years time so we don’t want to sign 3 year contracts.

I appreciate that this is a commercial challenge – you’re losing a guaranteed revenue stream – but you need to find a way to make short contracts attractive to us.
10. Encourage Multi-Organisation Contracts
This is our responsibility as much as yours – but it’s important to construct contracts in such a way that other public sector organisations can get on board at a later date without going around the procurement rigmarole again.
When selling your wares you should encourage your customers to contact their partners/neighbours to see if they would like to be cited in the contract as part of a procurement consortium – they’ll thank you for it.
11. Be Device Independent
devices
Hopefully this goes without saying these days, but mobile is king, and whatever your SaaS product is for, it should work just as well on an iPad as it does on a laptop.
12. Be Open
Let us at our data! We’ve got big plans for big data so you need to build your cloud offering with openness in mind. We want APIs included as standard that allow us to easily extract data to work on elsewhere.
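The “let us at our data” idea can be sketched in a few lines. Everything below is a hypothetical illustration – the paginated page shape (`{"records": [...], "next_page": n or None}`) is an assumption, not any real supplier’s interface – but it shows the kind of bulk extraction an open API should make trivial:

```python
from typing import Callable, Iterator


def extract_records(fetch_page: Callable[[int], dict]) -> Iterator[dict]:
    """Yield every record from a paginated API, page by page.

    `fetch_page(n)` stands in for an HTTP call to a hypothetical
    SaaS endpoint; it returns a dict shaped like
    {"records": [...], "next_page": 2 or None}.
    """
    page: int | None = 1
    while page is not None:
        body = fetch_page(page)
        yield from body["records"]
        page = body.get("next_page")


# Stub fetcher standing in for the real HTTP call:
def fake_fetch(page: int) -> dict:
    pages = {
        1: {"records": [{"id": 1}, {"id": 2}], "next_page": 2},
        2: {"records": [{"id": 3}], "next_page": None},
    }
    return pages[page]


records = list(extract_records(fake_fetch))  # all three records, in order
```

The point of injecting the fetch function is that the same drain-everything loop works whatever the transport – which is exactly the property a standard extraction API should give us.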
13. Allow for Online Ratings from Customers
I’ve never come across an SLA that, when push comes to shove, was fit for purpose. Most SLAs are carefully worded so that they are very hard to breach and the criteria for triggering penalties are rarely met.

onlinerating

The purpose of contracts and SLAs is to drive a certain kind of behaviour – i.e. to increase quality, responsiveness and uptime. But an SLA is a crude tool for this. Much better is a system of open online reviews and ratings by your customers.
This is scary, I know – but your product will definitely get better when everyone is able to talk about it openly.

I appreciate that there’s a lot of pain involved in getting yourselves cloud-ready – but if you don’t do it soon some disruptive new SaaS player will come along and take your business. By the way, if you are that disruptive new player – please get in touch – we’ve got some money for you.

Yours sincerely,
The Public Sector

Let’s Replace Council Websites with Local.Gov.Uk – a GDS for Local Government

140 characters is not a lot of space, but sometimes a tweet can contain a very big idea. In December 2013 Dominic Campbell (@dominiccampbell) tweeted:

dctweet

“I reckon it would be possible to build a GDS platform for all #localgov for the price of the new Birmingham Library website” 

If you’re not sure what GDS is then click here.

GDS certainly seem to have no appetite to attempt to tackle local gov – they have too much on their plate already. They have offered to share code, standards, APIs, frameworks etc – the philosophy being that we create a service of ‘small pieces loosely joined’ (a phrase which was originally used as an analogy to describe the Internet) – this means that responsibility for implementing this stuff would be devolved to individual Councils. It’s nice of the GDS to offer to share this knowledge, but I don’t think it’s quite the right approach – we’re already a community of small pieces, loosely joined, and we’re in a mess, we’re fragmented.

Rather than being handed a set of tools and the message – “This is how we did it for Central Gov – knock yourself out!” – I would like to see the creation of a Local Government Digital Service which oversees the standardisation and improvement of all things digital in Councils. For the purpose of this discussion I’m defining a Local Government Digital Service as simultaneously being a philosophy, an IT strategy and a central team of people capable of delivering it.

So what problems would a Local Government Digital Service solve, what would the service look like and how hard would it be to create it?

477px-England_Adminstrative_2010

What problems would Local GDS solve?

This bit’s easy – there are 326 Local Authorities (LAs)/Councils in England (http://en.wikipedia.org/wiki/Districts_of_England).

That’s 326 organisations doing, pretty much, the same thing. In terms of IT this means 326 websites, 326 email systems, 326 social care systems, 326 planning systems, 326 education systems etc etc.

This is not quite true as not all LAs have, for example, responsibility as a LEA – but you get the idea.

I estimate that an average-sized Council will be running around 75 different ‘line of business’ applications – by which I mean the ‘serious’ software that’s used to underpin service delivery; I’m excluding client installs such as CAD and pseudo-systems like MS Access databases and spreadsheets.

326 x 75 = 24,450 software applications.

So the first benefit of a Local GDS is obvious – increased efficiency through removal of expensive duplication.

Print

The second benefit is around the user experience. Council websites vary in quality enormously – by implementing a single site which features the beautiful design principles of Gov.uk we can standardise content and quality and thereby vastly improve the user experience. Local.Gov.Uk anyone? (The LGA own the local.gov.uk address at the moment – we’d need to shift them)

OK – so it’s a no-brainer, if we could make Local GDS happen then there are serious benefits to be had. But how do we make it happen?

What would Local GDS look like and how might it be brought about?

Part 1: Serving Out Information from Local.Gov.Uk

Councils don’t need to have a website each – we can replace them all with a central Local.Gov.Uk site. Many visitors to Council sites are looking for information rather than wanting to interact/transact with the Council. The same is true of Gov.uk, which is largely about information dissemination – GDS went through the various departmental websites, binned a lot of dross and then re-presented the important info in an accessible way. This bit is relatively easy to replicate for a Local.Gov.Uk:

  1. Identify those bits of information which are common across Local Gov.
  2. Create a Local.Gov.Uk site (same look and feel as Gov.uk) and populate it with the important information.
  3. Cull the old Council sites which are now obsolete.
  4. Save a fortune on Content Management Systems and hosting costs.

LocalGov

Clearly we will need to have a site which recognises that not all parts of the country are the same – some Councils have coastline, some have ports, others have zoos, some have motor racing circuits – the list goes on, and all these things bring with them policy and service delivery implications which are not standard across all Councils. Furthermore, as already mentioned, not all tiers of Local Gov have the same statutory responsibilities – not all Councils act as the LEA, for example. These things shouldn’t be a barrier to Local.Gov.Uk though – any transaction/search would begin by capturing the citizen’s postcode and the resulting information can be tailored accordingly. Imagine how great it would be for the user of the service to not have to care about whether their area is covered by more than one Local Authority, each with different responsibilities. It reflects poorly on us that we expect our customers to concern themselves with this kind of organisational detail.

Part 2: Transactional Services via Local.Gov.Uk 

A single national web presence for local services would be a huge achievement, yet it would still be just the first step on a much longer journey. Standardising the information we push out is the easy part – delivering transactional services online is where the big challenge is – but this is also where the big savings can be realised.

Most Councils have already started implementing some variant of the ‘My Account’ or ‘Your Account’ service. Often these have the Council’s name appended – ‘My Sheffield’ or ‘My Manchester’ – and these services will give the citizen some ability to interact with their Council in a way which directly replaces the need to make contact via other channels (telephone and face to face).

This is great news – Digital by Default, Channel Shift – excellent – it’s where we need to be and it’s self-evidently the right thing to do. But it’s no small task to make a ‘back end line of business system’ accessible to customers – it’s hard to do and costs a huge amount of money. There are integration tools to buy, APIs to buy, then you have to think about authentication (this is tricky) and finally the Council will create a new website (branded to look like its main site) from which the customer gains access to the back-end data.

Typically it might take a Council 2 years and hundreds of thousands of pounds to get to this point. That’s OK though because as we all know the cost of an online transaction is a fraction of that of its face-to-face counterpart – the investment pays for itself quickly and many times over.

Fine – but ALL the Councils are on this journey – we’re all building identical architecture to do the same thing. We’re all trying to bring about channel shift in isolation.

duplication

This is clearly bonkers – but Local.Gov.Uk gives us a way out. We can begin rationalising this model a layer at a time.

First, as already discussed, we remove the need for Councils to host and maintain their own websites. We replace this layer with the elegant simplicity of Local.Gov.Uk:

New1

Next the Local GDS team uses the GDS’ well documented iterative development techniques to write integration with the Council’s back-end systems. This would be done starting with those systems that are most common and/or have the highest volume of transactions. This is not as onerous a task as it might sound – the mission critical systems in Local Gov are shared between just 4 or 5 suppliers. In terms of authentication – we’d also jump on GDS’ ready made identity and authentication tools to crack that (thorny) problem – by the time Local.Gov.Uk goes live nearly all our customers will have registered with Gov.Uk for one service or another.

New2

Once we’ve got to this point it becomes clear that Councils no longer need to procure 326 different instances of each system – why don’t we work together to get bigger, better, cheaper contracts from our suppliers? Delivered as SaaS of course – we don’t need any tin in our data centres:

New3

I’m conscious that I haven’t mentioned the Public Services Network (PSN) in this discussion yet. PSN would be a key enabler of Local GDS – PSN is the secure network that joins us all together and, potentially, could be the place where many of the SaaS systems are hosted – in effect PSN would be a secure cloud for Local GDS.

Next Steps?

I’ve been ‘doing’ digital in Local Government for (too) many years – so I appreciate that all of the above will be hard to bring about. A significant challenge to Local.Gov.Uk/Local GDS will be convincing all authorities to get on board. In a presentation at the 2013 SOCITM conference (see video at the end of this post) GDS’ Mike Bracken (@MTBracken) said – “It was the devil’s own job to get 24 departments to agree to adopt Gov.uk”. Imagine that challenge scaled up to 326 Councils? Ouch. I pity the shepherd who gets the job of herding those cats.

A second major challenge to the Local GDS model is that it threatens the profits of the major software suppliers. The big suppliers – you know who they are – are very happy to sell the same software to 300 customers. Much less attractive is a joined up Local Gov wanting to buy a small number of shared instances of these applications. The procurement and legal dimensions will be complex – but maybe G-Cloud can help us with this?

A further challenge will be in resourcing Local GDS – but logic tells us that there must be a way to do this by better using existing resources across Local Gov. Let’s assume that each Council has a web team of, say, 4 people – some are bigger, many are smaller, but 4 feels about right – that’s roughly 1,300 people currently involved in maintaining Council websites.

Add the various IT departments into this and you’re looking at a standing army of over 20,000 people already employed in local digital services. If we could avail ourselves of just 0.1% of this resource (20 people) then we’d be able to create a nascent Local GDS. Or, and this is probably more realistic, if each Council contributed a small amount each year we would have ample funds to make Local GDS a reality.

So, to return to where we started – could we build a GDS platform for all Local Gov for the price of Birmingham’s library website? Well that website cost £1.2m to create and £190k per year to run (http://www.bbc.co.uk/news/uk-england-birmingham-25033651) – this feels like more than we’d need for Local GDS in year 1, but not enough in terms of ongoing costs. If each Council agreed to subscribe to Local GDS and paid just £3,000 per year we’d be able to deliver a Local.Gov.Uk platform which would:

  • Remove the need for individual Council websites.
  • Significantly reduce software support and maintenance costs for a range of systems.
  • Allow for headcount reductions in web/digital/IT teams.
  • Begin to move away from local data centres.

That’s what we used to call an ‘invest to save’ business case in the olden days.

Who could lead on Local GDS? It’s got to be SOCITM hasn’t it? A ready made team of experts in digital government who know what’s needed to transform Local Gov and who are champing at the bit to get cracking.

You may say that I’m a dreamer – but I’m not the only one. If we start small, keep it simple and take baby steps we can do it.

Here’s Mike Bracken’s presentation at SOCITM 2013, well worth a watch:

Your Phone has Killed Your Phone – This is a Good Thing

For a long time I’ve been telling anyone who will listen that the desk phone/landline’s days are numbered – hardly earth shattering news.

Bye bye to an old enemy.

Then a BBC news story (http://www.bbc.co.uk/news/magazine-23448353) prompted me to walk-the-walk and not just talk-the-talk and I’ve now abandoned the desk phone altogether. But more of that later – first, let’s rewind a bit.

The desk phone has had a remarkable run – 130 years of service with greater than 99.99999999% uptime – wow!

Then email came along. I’m old enough to remember when email first arrived in the public sector. We fresh-faced IT people would try to explain to managers across the business why they needed an email account. “I don’t have time for email” they’d say “I’ve got work to do!”

We tried to explain that email was the job or, at very least, a vital tool to help you do the job better. In the end they understood.

Then Twitter came along. IT and comms people tried to explain to managers across the business why they needed to use Twitter. “I don’t have time for Twitter” they’d say “I’ve got work to do!”

We tried to explain that Twitter was the job or, at very least, a vital tool to help you do the job better. They are beginning to understand.

At the moment my organisation is testing Yammer – an internal social media tool from Microsoft. Guess what people are saying? “I don’t have time for Yammer”….you get the picture.

Something has got to give, I thought. We can’t keep adopting new, better, faster communications channels without dropping some outdated technologies – if we try to do everything we’ll end up doing not very much.

So I spent a month or so considering how I use communication tools for my work and found that the tools I use are, in order of importance…

  1. Email (from my personal smartphone linked to Exchange using a BYOD app)
  2. Twitter (on my smartphone)
  3. Yammer (on my smartphone)
  4. Email (from Outlook on my laptop)
  5. Mobile Phone (my smartphone again)
  6. Desk phone in the office

Our smartphones are actually small, powerful computers – the fact that they can be used to make voice calls is secondary to their ability to allow us to stay online and do amazing things with the tap of a right thumb.

So what’s my desk phone for?

Not much, as it turns out. A brief audit of incoming calls over the course of 3 weeks revealed that 85% of calls were cold calls from sales people (spam). The remainder were from contacts who’d have received a much faster response (minutes instead of hours) if they’d emailed me instead. Needless to say, all my most important contacts – my managers, my peers, my team – have my personal mobile number.

I’ve now diverted my desk phone to a voice message asking people to email me and it’s working well. I get a bit more email spam I guess – but that’s much quicker and easier to deal with than trying to end a call with a pushy sales guy who wants to convince me to buy something that the organisation really doesn’t need.

We have 1200 desks in our office – each with a VOIP phone (expensive pieces of kit). When the phones reach end of life it will cost at least £80,000 to replace them all (for the hardware alone) – I think I’ll be arguing the case against.

For those people who must have a traditional landline – contact centre people for example – a softphone client and USB headset will do the job well.

So it seems that my smartphone with all its wonderful apps has relegated my desk phone to the scrapheap. Hurrah!