Interesting couple of days last week at the Economist’s Technology Frontiers conference (where ‘technology’ means ‘digital technology’).


The most significant thing I heard was the opportunity that exists when ‘the internet of things’ enables ‘high empathy’ product or service propositions.


Charlie Leadbeater provided what I found to be the most thought-provoking presentation of the first day.  He suggests that high-system / high-empathy technologies will be the most successful.  For example, a high-system / low-empathy technology (his example of Ryanair raised a laugh) is not embraced with enthusiasm.  A low-system / high-empathy instance (for example a farmers’ market) isn’t scalable.  But high-system (= high effectiveness and high efficiency) coupled with high-empathy is powerful.  And, yes, Apple is the inevitable example.  He then classified a number of approaches to making a technology ‘high empathy’.  I think he’s onto something, but it’s early days yet.


I think he’s onto another useful distinction in technology: technology as tool versus technology that supports who you are as a person.  So, for example, his favourite technology objects include a hand-made fountain pen and his bicycle, but not his phone.  I’ve met people for whom their customised phone is an expression of their way of life and a badge of status.  So if we are entering a post-consumer world, here’s a new way of thinking about product (and service) differentiation.


The other big area that emerged was the internet of things, as presented by Andy Hobsbawm of EVRYTHNG.  Never mind the machine-to-machine interactions; look at the opportunities for product-to-owner interaction coupled with rich digital media and social networks.  He’s coined the phrase ‘Product Relationship Management’, which hints at some of the opportunities.


And when you link that to Charles Leadbeater’s thinking about ‘high empathy’ products I think that you can unearth some radical innovation opportunities to build loyalty, community and propositions that are very compelling.  Better still, this is the embodiment of the post-consumer society moving to the experience society.  Add mobility to the mix for yet richer opportunities.  This has the potential to be really big.  And first or early movers will have the best chance of building a loyalty lead that becomes very valuable.



Other thoughts.  Bran Ferren of Applied Minds LLC gave a very polished presentation.  One point was the importance of focusing on the core function.  Wax cylinders, vinyl, cassettes, CDs and now downloads are just ‘ways to make music available anywhere any time’.  I liked the non-obvious point that the development of search engines broadened the appeal of publishing on the web.  It’s now worthwhile because the pearls of wisdom can be found.


He also focused on the importance of storytelling as a core to human existence – be it for business, entertainment or education.  But he maintains that reading and writing are just a fad of a few hundred years – simply the best way of telling stories for now, to be overtaken in due course.  Sir John Hegarty (of BBH) reiterated the importance of the story as central to communication; indeed this message cropped up a number of times.


Another gem: I knew the internet evolved from the US Department of Defense work to build a resilient network of computers, but I hadn’t heard that the US Interstate highway system was funded by the government to make the transport of missiles easy.  If it’s true, it’s a nice example of the externalities of investment in infrastructure – a lesson we could learn in the UK.  Incidentally, over lunch I asked Bran if there’d been any surprises for him at the event – ‘none so far’ was the dry reply.


I’d heard the old one about ‘technology is something that doesn’t work yet’, but Charlie Leadbeater went on to suggest that a good measure of technology progress is the rate at which it ‘disappears into day to day life’.  A bit of a challenge to operationalize as a measure – but it certainly tells you where to look to develop next.


Hugh Herr’s demonstration of the state of the art (and the potential) of prosthetics was just stunning (see his talk on TedMed).


Vint Cerf (of Google) posited virtually arbitrary amounts of bandwidth in the next decade (though he doesn’t live in a Suffolk village).  But Mai Fyfield of BSkyB, describing their business model in the face of new technology, focused on ever broader content within a television model.  So it’s not clear that people know (or are admitting to know) what to do with the extra bandwidth.


Which leads me to a few disappointments.  The debate about business models wasn’t really a debate – more a litany of assurances from the panel about the importance of focusing on the consumer.  Really no new models, or insights about models enabled by new technology.  The discussion of big data offered an assurance of massive opportunities if only one abandons a schema-based view and thinks differently – but little insight into what ‘different’ might look like.  I was surprised that the panel discussing ‘how competition is changing’ ended up clashing over the relative importance of ‘creativity’ and of ‘analysis and simulation’.  It seems a naïve contrast when they’re surely complementary – but perhaps in this case it was just a self-serving stance on the ‘magic’ of the creatives.


And as is so often the case, fascinating insights and opportunities from the conversations over coffee, lunch and dinner (but the best are too commercially sensitive for an open blog).


The discussion sessions and the panel plenary at last week’s CSaP conference on Risk and Uncertainty showed that there remain big issues in communicating both the concepts and the degrees of risk and resilience.

Although the first session was themed around the precautionary principle, the questions revolved around how to communicate the degree of hazard faced by the public from different courses of action.  To vaccinate or not to vaccinate?  To evacuate or not to evacuate?  To spend or not to spend?

The second session on resilience covered such topics as the wisdom of coupling together networked systems (using the internet to control the power system, anybody?), insurance markets and food production.  This time the debate was how to assess resilience against low probability high impact events.  The debate was somewhat hampered by the different interpretations of the word ‘resilience’.

Sir John Beddington described the outcomes of the Blackett review of High Impact Low Probability risks.  Wryly, he observed that ‘low probability’ events now occur about every 12 months (H5N1, the volcano, floods etc), while still the question is, ‘what’s the next one we haven’t thought of?’

And the final panel session on communication highlighted the emergent theme of the whole day: even when you’ve identified, assessed and even evaluated the risk, how do you communicate it effectively?  Although there was a lot of debate about the role of scientists communicating with the public, a key minority strand to the discussion was the question of communicating effectively with policy makers, CEOs and those with the cheque book.

There was also a debate about ‘wilful blindness’ – people’s apparent unwillingness to engage in or act upon debate about risk.  Which links well to recent neuroscience research indicating humans’ propensity for optimism, and that people really do discount bad news.

Two comments really struck me during the day.  The first, from Prof Andy Stirling, pointed out that the best environment for debate, where dissent is allowed and divergence tolerated, is an environment that few leaders seek, preferring instead consensus. (Or as Margaret Thatcher is reputed to have said ‘I like my Ministers to form a consensus behind my opinion’.)

The second comment, from the floor (and I’m sorry I didn’t catch his name), was a question around ‘resilience of what?’  A diversity of scales (temporal and spatial), of business models, and even of survival rates actually helps the resilience of the whole, because local failures prevent contagion across the whole.

So, where are we focusing – resilience of what, and what are we willing to sacrifice, either via controlled failure or via precautionary cost?

And what means do we have to understand the issues at hand and the potential costs?

And how do you ensure adequate and meaningful debate?




An illuminating article in the Economist, “Playing with Fire”, contains a fascinating timeline of financial innovation from 3000BC to today.  One of the commentators also points out that commodities futures trading started in the Osaka rice market around 1700.

I think the article might have made a little more of the difference between products for ‘risk transfer’ and the effect of ‘risk creation’ – a creation arising from the user’s inability to evaluate the product.  Social impact products then add the difficulty of evaluating the outcome to the problems of evaluating risk during the life of the product.

Ignoring, for a moment, products designed specifically for ‘risk creation’ – like betting on horse racing!

Came across a really insightful graphical representation of resilience yesterday, from a 2010 presentation by Dr James Kimmance of Parsons Brinckerhoff, which I repeat below.

Area ‘A’ represents the cost incurred by a system failure or degradation from its nominal (100%) capability.  The vertical extent of the triangle is the degree to which the system is degraded (a measure of ‘robustness’, I suggest?) and the horizontal extent is the recovery time.

I like this – I find it intuitively appealing and it clarifies the trade-offs.

The idea can be extended as follows (see the diagram below).

One might choose to redesign or operate the system at some level below its theoretical optimum in exchange for improved resilience.  In other words, there’s an enduring sacrifice of performance (Area ‘B’ in the diagram above).

However, system resilience is improved, represented by a smaller triangle (Area ‘C’) compared with that of a system optimised for normal operation (Area ‘A’).

So the key question becomes the relative sizes of the shaded areas.  Can you design the ‘resilient’ system on the left so the enduring cost is small and the cost of failure is much reduced, bearing in mind the probability of failure?
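The comparison can be made concrete with a back-of-envelope sketch.  Here is a minimal Python illustration: every number in it (capability drops, recovery times, the annual failure probability, the time horizon) is an invented assumption for illustration, not a figure from the presentation.

```python
# Resilience-triangle trade-off, sketched with made-up numbers.
# Units are '% of capability' x days, purely as a cost proxy.

def triangle_area(capability_drop_pct, recovery_time_days):
    """Failure cost proxy: lost capability integrated over the recovery period."""
    return 0.5 * capability_drop_pct * recovery_time_days

# System optimised for normal operation: runs at 100% but fails hard (Area 'A').
area_a = triangle_area(capability_drop_pct=80, recovery_time_days=30)   # 1200.0

# 'Resilient' design: run at 99%, an enduring 1% sacrifice paid all year (Area 'B')...
area_b = 1 * 365                                                        # 365

# ...in exchange for a gentler, quicker failure (Area 'C').
area_c = triangle_area(capability_drop_pct=30, recovery_time_days=5)    # 75.0

# The answer turns entirely on the assumed probability of failure over the horizon.
p_failure = 0.5  # per year -- pure assumption

expected_cost_optimised = p_failure * area_a            # 600.0
expected_cost_resilient = area_b + p_failure * area_c   # 402.5

print(expected_cost_optimised, expected_cost_resilient)
```

With these particular assumptions the resilient design wins; shrink `p_failure` (or the size of Area ‘A’) and the conclusion flips, which is exactly the debate the diagram frames.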

I realise that this still leaves the problem of choosing and then quantifying the ‘functionality’ or ‘capability’ dimension, and gives little detail of the probability of failure.  But now there’s a clear basis for the debate, and for the dimensions of its topics.


March 2, 2012

In this case, a convergence of two topics that fascinate me – platforms as a basis for competition and graphics that tell a story well.

Spotted on the Economist blog “Graphic detail” a picture of the ‘platform wars’ in the computer industry:

They describe the shake-out in the computer industry as a war between platforms and show a fascinating chart.  (At the risk of being pedantic, I think they mean units shipped per annum.)

And see also Asymco: interesting, concise and well-presented analysis.

Don’t miss Asymco’s priorities – I like them:


  • Learn by writing. Teach by listening.
  • Improve. Move the intellectual ball forward.
  • Illuminate topics which are bereft of analysis.
  • Be notable. “The proliferation consideration.” How likely is the idea to being (sic) widely re-published?
  • Review. Encourage participation by reading all comments and reply to as many as possible. Police comments with zero tolerance.
  • Repair. Declare and correct errors.
  • Select. Publish only when the contribution is unique. Avoid redundancy, clutter and noise. Don’t waste reader time

And see also the clarity of their process.