Turing Lecture 2015: The Internet Paradox (links updated)

Following a move, I’m no longer close enough to London to easily attend the BCS and IET’s prestige Turing lecture in person. So this year, for the first time, I will be attending online.

Robert Pepper is VP Global Technology Policy at Cisco. His topic: The Internet Paradox: How bottom-up beat(s) command and control. The publicity promises “a lively discussion on how the dynamics of technology policy and largely obscure decisions significantly shaped the Internet as the bottom-up driver of innovation we know today … Dr. Pepper will cover the next market transition to the Internet of Everything and the interplay between policy and technology and highlighting early indicators of what the future may hold for the Internet.”

I’m expecting a good objective discussion. As I learned many years ago, listening to Peter Cochrane when he was head of BT’s research centre, those who provide technical infrastructure don’t have a reason to hype up the different services which will run on it. Quite the opposite: they need to assess investment to satisfy demand, but not exceed it. Let’s see what we see. I’ll update this blog as we go, and probably abbreviate it tomorrow.

Starting on time: Liz Bacon, BCS President, is on stage. An unexpected extra: Daniel Turing, Alan Turing’s nephew, is introducing the Turing Trust with a mention of The Imitation Game, the Turing film, and of the BCS’s role in rebuilding Turing’s codebreaking machine (“the Bombe”). The Trust recycles used computers to less well-off countries. In our move last year, I passed quite a lot of old equipment to Recycle-IT, who ethically re-use kit or responsibly dispose of what can’t be re-used.

Now the main speaker (bio online). He describes himself as a “recovering regulator”; regulation is the intersection of policy and technology. Big iron to nano-compute, and we haven’t even seen the Apple Watch yet! This (and the cost/power changes) drives decentralisation of computing. Alongside, 1969: 4 “internet” locations (packet switched) on the west coast. By 1973, extended outside continental USA (London, Hawaii). 1993: global.

In 1994–95 the US Government outsourced (privatised) the network. NSFNET had been created; restrictions were dropped to permit commercial use, and other governance structures were put in place. In the diagram, the biggest nodes (most traffic) are Google and Facebook; but China is coming up fast!

An alternative view, in stages: 1, connectivity (email, search); 2, the networked economy; 3, immersive. 99% of things in the physical world, though, are still unconnected. There were 1,000 devices with IP addresses in 1984; the forecast is 20bn by 2020, or 50bn if you include non-IP devices such as RFID chips. The Internet of Everything will encompass people, processes, data and things. For example: by 2018, four IP modules on each of 256 million connected cars; or sensor clothing for athletes. I have a 1986 news clip from the MIT Media Lab about the prototypes for exactly this. The quote was: “Your shoes may know more about you than your doctor does”.

Things create data which, through process, can positively affect people. But only 0.5% of data is being analysed for insights! There’s an example from nutrition. Take a photo of a product in the supermarket, and see if it’s appropriate (for example, no alcohol with your prescription). Or the “Proteus pill” to help with older people’s medication, which the FDA has already approved. Or the Uber cab app.

So that’s the technology. Now, on to policy and governance.

Internet governance developed bottom-up and is not centralised; it’s a multi-stakeholder global ecosystem of private companies, governments (lots of them!), intergovernmental bodies, providers, researchers, academics and others. There’s a diagram of those actually involved, which will be quite useful when I can retrieve it readably. The first RFC came from the ARPAnet community in 1969. The first IETF meeting was in 1986. The ITU’s World Conference in 2012 saw proposals from some member states to regulate the Internet, and these were rejected. In 2014 the US Department of Commerce proposed transitioning IANA to a multi-stakeholder global body, so that the US finally cedes control of the network it inaugurated.

Now: as many of us know, the international standards process we currently have is done by consensus and can take years. Contrariwise, the IETF works by “rough consensus and running code” (everlasting beta). Much faster. It’s based on RFCs that come in, and on a combination of online and face-to-face meetings. There are NO VOTES (Quakerism works in a similar way); “rough consensus” in the IETF is assessed by hum!

Robert shows a slide of a “Technology Hourglass” (citing Steve Deering, 2001; Deering is also a Cisco person. I can’t find the actual reference). IP, at the centre, is in essence the controlling/enabling standard. Above it (applications) and below it (infrastructure) there can be innovation and differentiation. (My comment: in the same way, both 19th-century rolling stock and modern trains can run on today’s rail network.) The suggestion: it’s a martini glass, because at the top there’s a party going on!
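My own illustration of that narrow waist, not from the lecture: a short Python sketch in which two quite different applications, one over TCP and one over UDP, converge on the same IP layer through the same sockets API. The hostname and ports are examples only.

```python
# Two applications above the "waist", both riding the same IP layer.
import socket

# Application 1: an HTTP-style request over TCP.
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp.connect(("example.com", 80))
tcp.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
print(tcp.recv(64))
tcp.close()

# Application 2: a UDP datagram (the transport DNS uses), same IP layer.
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.sendto(b"ping", ("example.com", 9))  # port 9 ("discard"), illustration only
udp.close()

# Below the waist, the infrastructure (Ethernet, Wi-Fi, fibre) can change
# freely; above it, applications keep innovating; IP in the middle stays put.
```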

There’s no need to ask permission to innovate! This is the Common Law approach: you can do anything that’s not prohibited. The UK has almost 1.5 million people working in this area, and they are here because of Common Law: most continental European countries have the reverse (you need permission). The information economy now dominates the previous waves of service, industry and agriculture.

Internet is a General Purpose Technology, like printing and transport and the telephone. Other things are built on it. Increasing broadband provision links to growth: this is not correlational, it is causal. Digital-technology innovation drives GDP growth in mature economies (McKinsey); the impact is on traditional sectors enabled by the digital.

Third: the paradox. There’s decentralisation of compute, to individuals, to nanodevices, and to stakeholders. But right now, governments want to reverse this approach and take control: to re-create silos and to force localisation of standards, content and devices. This is already the case with some classes of data in some countries.

The issues: (1) extending connectivity to those who are not connected; (2) safety, security and privacy – where there clearly is a role for government, but be clear that these are not just internet issues. Others appear on a slide about the Internet of Everything. Some governments are well-intentioned but not well informed; others, more dangerously, are the reverse. And old-tech assumptions (how you charge for phone service, for example) don’t match the new realities; the product is connectivity (not voice).

A Swedish study: if you can’t transfer data, you can’t trade (nor have global companies). Localisation of data will have a severe impact on the global economy. Note: the Economist Intelligence Unit looked at some of the proposals; 90% of authoritarian regimes voted for new internet regulations on a multilateral basis, and 90% of democracies against. Enough! We are at a crossroads where the Net could take either direction, and they are not equal.

Final quote, from Niels Bohr: “How wonderful that we have met with a paradox. Now we have some hope of making progress!”

I’m not going to try and capture Q&A. Heading over to Twitter. Watch the webcast; I’ll post the URL in an amendment when it’s up on the IET website.

Has it been an objective discussion? In one sense yes. But in another, Robert Pepper clearly has a passionate belief in the model of governance which he is promoting. What’s been shared is experience, insight and vision. Well worth a review.

Links:
• BCS/IET Turing Lecture 2015: online report (BCS); or view the webcast replay from The IET
• Proteus Digital Health, including a video on their ingestible sensor
• Watching the Waist of the Protocol Hourglass, Steve Deering, seminar 18 Jan 1998 at Carnegie Mellon University (abstract only)
• Turing Trust
• Recycle-IT (don’t be confused; other organisations with similar names exist on the web)

Master Data Management: sources and insights

Tomorrow I will be facilitating my last Corporate IT Forum event. After five years or so I’m standing down from the team, having valued the Forum first as a member and then, since my first retirement, as part of the team. Tomorrow’s event is a webinar, presenting a member’s case study on their journey with Master Data Management (MDM).

There was a phase of my career when I was directly concerned with setting up what we’d now call Master Data for a global oil company. We were concerned to define the entities of interest to the enterprise, so that when systems (databases and the associated applications) were set up to hold live data and answer day-to-day or strategic questions, we could avoid the confusions that so easily arise. Everyone thinks they know what a particular entity is. It ain’t necessarily that simple.

A couple of examples.

When we began the journey, we thought we’d start with a simple entity: Country. There are fewer than a couple of hundred countries in the world. We needed to know which country owned, licensed and taxed exploration and production. And everyone knows what a country is, don’t they?

Well, no. Just from our own still-almost-united islands: a simple question. Is Scotland (topically) a country? Is the Isle of Man? Is Jersey? In all those cases, there are some areas (e.g. foreign policy) where the effective answer is no; they are part of the single entity the United Kingdom. But in others (e.g. tax, legal systems, legislature) they are quite separate. And of course the list of countries is not immutable.

So: no single definitive list of countries. No standard list of representative codes either: again, do we use GB or UK? Do we use international vehicle country codes, or Internet domain codes, or … What codes would be used in data coming in from outside? And finally: could we find an agreed person or function within the Company who would take responsibility for managing and maintaining this dataset, and whose decisions would be accepted by everyone with an interest and their own opinions?
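To make the code question concrete, here is a minimal sketch (not the Company’s actual system; the sample values are illustrative): one master record per country carries the competing code schemes, and a reverse index resolves incoming data using any of them to a single entity.

```python
# Master country data: one record, several external code schemes.
MASTER_COUNTRIES = {
    "GBR": {                      # internal master key (ISO 3166-1 alpha-3 here)
        "name": "United Kingdom",
        "iso_alpha2": "GB",       # ISO 3166-1 alpha-2
        "internet_tld": "uk",     # country-code top-level domain
        "vehicle_code": "GB",     # international vehicle registration code
    },
}

# Reverse index: any known external code resolves to the master key.
CODE_INDEX = {
    record[scheme].upper(): master_key
    for master_key, record in MASTER_COUNTRIES.items()
    for scheme in ("iso_alpha2", "internet_tld", "vehicle_code")
}

def resolve_country(code):
    """Map an incoming code ('GB', 'UK', 'uk', ...) to a master record key."""
    return CODE_INDEX.get(code.strip().upper())

print(resolve_country("uk"))   # GBR, via the internet domain code
print(resolve_country("GB"))   # GBR, via the ISO or vehicle code
```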

And talking of data coming in from outside: I carried out a reconciliation exercise between two external sources of data on exploration activities in the UK North Sea. You’d think that would be quite well defined: the geological provinces, the licence blocks, the estimates of reserves and so on. Record-keeping in the UK would surely be up to the job.

But no: the two sources didn’t even agree on the names and definitions of the reservoirs. Bringing the data from these sources together was going to be a non-trivial task requiring geological and commercial expertise.
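As a toy illustration of why (the names below are well-known North Sea fields used purely as examples, not the actual datasets): even the first, mechanical step of pairing up names by string similarity needs a threshold and a route to refer doubtful cases to an expert.

```python
# Pair names from two sources by similarity; flag weak matches for review.
from difflib import SequenceMatcher

source_a = ["Brent (Main)", "Forties Alpha", "Piper"]
source_b = ["BRENT MAIN", "Forties A", "Claymore"]

def similarity(a, b):
    """Crude similarity score on lower-cased names, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for name_a in source_a:
    best = max(source_b, key=lambda name_b: similarity(name_a, name_b))
    score = similarity(name_a, best)
    verdict = "probable match" if score >= 0.7 else "REFER TO EXPERT"
    print(f"{name_a!r} -> {best!r}  ({score:.2f}: {verdict})")
```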

Then again, we went through a merger and discovered that two companies could allocate responsibility for entities (and for the data which represented them) quite differently within their organisations.

So: this is a well-developed topic in information systems. Go back to a Forrester blog from 2012: analyst Michele Goetz maintains forcefully that MDM is not about providing (in some IT-magic way) a Single Source of Truth. There ain’t no such animal. MDM is a fundamental tool for reconciling different data sources, so that the business can answer useful questions without being confused by different people who think they are talking about the same thing but aren’t, really.

It may be a two-year-old post, but it’s still relevant, and Michele Goetz is still one of Forrester’s lead analysts in this area. Forrester’s first-ever Wave for MDM solutions came out in February this year. It’s downloadable from some of the leading vendors (such as SAP or Informatica). There’s also a recent Wave on Product Information Management which is tagged “MDM in business terms” and might be worth a look too. Browse for some of the other stuff.

Gartner have a toolkit of resources. Their famed Magic Quadrant exists in multiple versions, e.g. for Product Information and for Customer Data. I’m unsure how the principles of MDM vary between domains, so (without studying the reports) I’m not clear why the separation. You might do better with the MDM overview, which also dates from 2012. You will find RFP templates, a risk framework, and market guides. Bill O’Kane and Marcus Collins are key names. For Gartner subscribers, a good browse and an analyst call will be worthwhile.

Browse more widely too. Just one caution: MDM these days also means Mobile Device Management. Don’t get confused!
Links:
• Master Data Management Does Not Equal The Single Source Of Truth, Michele Goetz, Forrester blog, 26 Oct 2012
• The Forrester Wave™: Master Data Management Solutions, Q1 2014, 3 Feb 2014 (download from Informatica, link at foot of page)
• PIM: MDM on Business Terms, Michele Goetz, 6 Jun 2014
• Master Data Management, Marcus Collins, Gartner, 9 Jul 2012

Benefits realisation: analyst insight

I’m facilitating an event tomorrow on “Optimising the benefits life cycle”. So as always I undertook my own prior research to see what the mainstream analysts have to offer.

Forrester was a disappointment. “Benefits Realization” (with a z) turns up quite a lot, but the research is primarily labelled “Lead to Revenue Management” – that is, it’s about sales. There is some material on the wider topic, but it dates back several years or more. Though it’s always relevant to remember Forrester’s elevator project pitch from Chuck Gliedman: we are doing A to make B better, as measured by C, which is worth X dollars (pounds, euros …) to the organisation.

There is a lot of material from both academic researchers and organisations like PMI (Project Management Institute). But in the IT insight market, there seems to be remarkably little (do correct me …) except that the Corporate IT Forum, where I’ll be tomorrow, has returned to the issue regularly. Tomorrow’s event is the latest in the series. The Forum members clearly see this as important.

But so far as external material is concerned, this blog turns into a plug for a recent Gartner webinar by Richard Hunter, who (a fair number of years ago) added considerable value to an internal IT presentation I delivered on emerging technologies for our enterprise. I’m not going to review the whole presentation, because it’s on open access from Gartner’s On Demand webinars. But to someone who experienced the measurement-oriented focus of a Six-Sigma-driven IT team, it’s no real surprise that Richard’s key theme is to identify and express the benefits before you start: in business terms, not technology-oriented language, and with an expectation that you will know how to measure and harvest the benefits. It’s not about on-time-on-budget; it’s about the business outcome. Shortening a process cycle from days to hours; reducing the provision for returns; and so on.
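As a sketch only (the field names and example figures are mine, not Richard’s, Gliedman’s or Gartner’s), the elevator-pitch pattern translates naturally into a structured benefits-register entry that can be tracked and measured after go-live, not just promised up front:

```python
from dataclasses import dataclass

@dataclass
class Benefit:
    action: str          # A: what we are doing
    outcome: str         # B: what gets better, in business terms
    measure: str         # C: how we will know, and who measures it
    annual_value: float  # X: worth to the organisation, per year

    def pitch(self):
        return (f"We are doing {self.action} to make {self.outcome} better, "
                f"as measured by {self.measure}, which is worth "
                f"{self.annual_value:,.0f} a year to the organisation.")

example = Benefit(
    action="automating returns processing",
    outcome="the order-to-refund cycle",
    measure="average cycle time, reported monthly by Finance",
    annual_value=250_000,  # illustrative figure only
)
print(example.pitch())
```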

If this is your topic, spend an hour reviewing Richard’s presentation (complete with family dog in the background). It will be time well spent.

Links:
• Getting to Benefits Realization: What to Do and When to Do It, Richard Hunter, Gartner, 7 Aug 2014 (go to Gartner Webinars and search for Benefits Realization)
• Corporate IT Forum: Optimising the Benefits Lifecycle (workshop, 16 Sep 2014)

Analyst Directory update

It’s a long time since the InformationSpan blog index was updated – not since February. To be fair, I had a look in May, but there were too few changes to be significant. However, there’s now enough to report, and the index has been thoroughly reviewed and updated.

First, Gartner: a handful of new analysts have appeared. The main comments, though, relate to past acquisitions.

I’ve finally removed almost all references to AMR, but in true Gartner fashion there are some inconsistencies. If you look on Gartner’s Research marketing page, there is of course Gartner for Supply Chain Professionals, created out of the former AMR service. All traces of AMR seem to have disappeared until you look at the Gartner for Enterprise Supply Chain Leaders service. The flyer for this service is headed “AMR Enterprise Supply Chain Leaders” and is replete with references to AMR services. It’s dated 2010, just after the acquisition, but it’s still on the system. I did not find any other reference to a service called Gartner for Enterprise Supply Chain Leaders.

The Burton services have also been fully absorbed; most of the Burton analysts have left, the IT1 tag seems to have disappeared too, and one of the remaining accessible legacy blogs has become inaccessible. However, six Burton blogs can still be found, and I’ve discovered there are also TypePad profiles linked to them. There’s also still one accessible (but moribund) Gartner IT1 blog, and a fair sprinkling (as always) of blogs left over from other analysts who have left.

There have been more changes to the Forrester page. First, perhaps most significantly: Forrester seem to have shed their Business Technology tag. It was a good one, but didn’t catch on; and I suppose George Colony has decided to go with the market. These services are now referred to as Technology Management.

There have also been some changes within Forrester’s categories. Business Process and Content & Collaboration seem to have become moribund (no new content for over two years), and a number of still-extant blog names redirect somewhere else (and have done so for some time). Interestingly, within the Marketing & Product Strategy group, the Consumer Product Strategy blog, dormant since 2008, has recently acquired a new posting. Forrester seem better than Gartner at tidying up when analysts leave, but there are three or four still-extant blogs from departed analysts.

I reviewed the Others page too. I haven’t added any new analyst sources (suggestions??) but Erica and Sam Driver’s ThinkBalm content has now been lost. Charlene Li’s Altimeter group now has a fully integrated blog section within the main website (not new, but I haven’t noted it before) as well as personal blogs maintained by Charlene herself and some colleagues. I have, though, included Euan Semple’s The Obvious which offers so many of us great insights and ideas. If George Colony hadn’t already grabbed Counterintuitive as his blog title, it would be a good alternative for Euan!

No Links here, but click the link at the head or right hand side of this blog to go to the InformationSpan Analyst Blogs Index.

Growth, Innovation and Leadership: Frost & Sullivan

I’m on a Frost and Sullivan webinar: Growth, Innovation and Leadership (GIL: a major Frost theme). It’s a half-hour panel to discuss successful types of innovation and examples of future innovative technologies with Roberta Gamble, Partner, Energy & Environmental Markets, and Jeff Cotrupe, Director, Stratecast. David Frigstad, Frost’s Chairman, is leading. The event recording will be available in due course.

Frigstad asserts that most industries are undergoing a cycle of disrupt, collapse, transform (or die: Disrupt or Die is an old theme of mine). We start with a concept called the Serendipity Innovation Engine. It’s based on tracking nine technology clusters; major trends; industry sectors; and the “application labs” undertaking development (which include real labs and also standards bodies and others). And all of this is in the context of seven global challenges: education, security, environment, economic development, healthcare, infrastructure, and human rights.

Handover to Gamble. This is a thread on industry convergence in energy and environment, seen as a single sector. Urbanisation, and the growth of emerging economies, are major influences on demand growth here.

We do move to an IT element: innovation in smart homes and smart cities, with integration between sensor/actuator technology and social/cloud media; emphasising this, Google has just bought a smart home company (Nest Labs). City CIOs and City Managers are mentioned as key people – a very US-centric view, when most urbanisation is not occurring in the developed world. We do return to implications for developing economies, where the message is that foundations for Smart (which includes effective, clean energy use) should be laid now, while there is a relatively uncluttered base to start from.

Frigstad poses a question based on the idea that Big Data is one of the most disruptive trends in this market. Gamble suggests parking as an example. Apps to find a parking spot, based on data from road sensors or connected parking meters, are being piloted in San Francisco, though not only there. Similar developments in the UK were mentioned at a Corporate IT Forum event I supported earlier this year.

It’s a segue into the next section: an introduction for Cotrupe, whose field is Big Data and Analytics. Examples of disruption here include the Google car: who would have thought Google would become an automotive manufacturer? Is your competitor someone you wouldn’t expect? An old question, of course. The UK’s canal companies competed with each other, and perhaps with the turnpike roads; they mainly didn’t foresee the railways.

Cotrupe’s main question is: what is Big Data really? He posits it as an element of data management, together with Analytics and BI. I’d want to think about that equation; it’s not intuitively the right way round. But high-volume, rapidly moving data does have to be managed effectively for its benefit to be realised: delivering the data users need, when they need it, but not in such volume as to overwhelm them. And this means near real-time. It’s IT plus Data Science.

Frost suggest they are more conservative than some, because they see growth of the Big Data market held back by the sheer cost of large-scale facilities.

We’ve reached the promised half hour for the primary conversation, but we’re still going strong, basically talking with Cotrupe about various industry sectors where Big Data has potential: to support, for example, a move from branch-based banking to personal service in an online environment. There’s some discussion of Big Data in government: how will this affect the style of government over, perhaps, the next 20 years? Cotrupe mentions a transformation in the speed of US immigration processing in recent years, where data is pre-fetched and the process takes minutes instead of hours. He’s advocating opening up and sharing of information in other industries too: for example, not being frozen by HIPAA requirements in (US) healthcare or, perhaps, EU data protection requirements. I have personal experience of obstructive customer service people trying to hide behind those, and in fact parading their lack of actual knowledge.

Cotrupe talks about privacy, not least in the wake of Snowden and what’s been learned about sharing between the NSA and the UK agencies. Cotrupe would like to see this ease of sharing brought to bear in other areas, but asks: how do we manage privacy here? There are companies which are leading the way in collecting data in consumer-sensitive ways, and this needs to become standard practice. In any case, not collecting data you don’t need will reduce your data centre (should that be Data Center?) footprint.

As we come to a close, with a commercial for the September event in Silicon Valley, I have to say I’m not convinced this webinar was wholly coherent.

If you call something a Serendipity Innovation Engine I want to know how it relates to serendipity: that is, the chance identification of novel discoveries.

If you present a layered model, I expect the layers to relate (probably hierarchically) to one another. It would be more valuable to talk about the four elements of this model separately and be clearer about what each represents. For example, “Health and Wellness” occurs as a Technology Cluster (why?). It’s also a Mega Trend in a layer where Social Trends also sits; surely people’s concern about Health and Wellness is a social trend? Each layer seems to mix social, technical and other concerns.

I learned a more useful framework when teaching the OU’s Personal Development course. This really is layered. The two internal zones (this is for personal development) are your immediate environment, and the other elements of your working organisation. Zone 3 (near external) encompasses competitors, customers/clients, suppliers and local influences. Zone 4 (far external) includes national and international influences: social, technological, economic, environmental and political (STEEP). On this framework you can chart all the changes discussed in today’s webinar and, I think, more easily draw conclusions!

Links:
• Frost & Sullivan Growth Innovation & Leadership
• Google buys Nest Labs for $3.2bn …, The Guardian, 13 Jan 2014
• STEEP framework: Sheila Tyler, The Manager’s Good Study Guide (third edition, 2007). The Open University. Pages 198-202

Link: Heartbleed update

A quick follow up, back from a few days away.

Huffington Post have a recent update which notes that the OpenSSL vulnerability affects major products from Cisco and Juniper Networks. They also repeat what’s becoming the consensus on passwords: change your passwords for services which you know were vulnerable but have now been patched. There’s no point in changing a password which might still be at risk.
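A quick local check of my own (not from the article, and no substitute for vendor patches): Heartbleed affected OpenSSL 1.0.1 through 1.0.1f, with 1.0.1g as the fix, and Python’s ssl module will report which OpenSSL build it is linked against.

```python
import re
import ssl

version = ssl.OPENSSL_VERSION  # e.g. "OpenSSL 1.0.1f 6 Jan 2014"
print("Linked against:", version)

# 1.0.1 with no letter suffix, or 1.0.1a-1.0.1f, fall in the affected range;
# 1.0.1g and the 0.9.8 / 1.0.0 branches were not vulnerable.
if re.search(r"\b1\.0\.1([a-f]\b|\s)", version):
    print("In the Heartbleed-affected range: update OpenSSL.")
else:
    print("Not in the known-affected range (but check your servers too).")
```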

They reference the Mashable resource on what’s been patched and copy the patched list: Google (and Gmail), Yahoo (and Yahoo Mail), Facebook, Pinterest, Instagram, Tumblr, Etsy, GoDaddy, Intuit, USAA, Box, Dropbox, GitHub, IFTTT, Minecraft, OKCupid, SoundCloud and Wunderlist. A quick look, though, suggests that the Mashable article was a one-off and the list is not being kept updated.

The article also recommends turning off external access to your home network: the sort of capability, for example, that you might use for remote access through LogMeIn, TeamViewer or similar. If you’re not using this kind of facility, disable it. Your firewall should already be holding the line on this.

And check what your Internet provider is doing, and the status of your wireless router. Being a BT user, with a BT Home Hub, I tried searching the bt.com website for information on Heartbleed, but nothing surfaced. It would be nice to know.

Huffington suggests that, at the moment, public WiFi has to be treated as an unknown quantity, since you can’t tell what infrastructure a hotspot uses or whether it has been patched. BT again doesn’t have any information on the impact of Heartbleed on BT Wifi (Openzone, as was), but it does say that user details are encrypted when you log in to their service. It’s perhaps ironic that they offer free Cisco VPN software, which you can download when connected to one of their hotspots. I didn’t know this; I’ll take it up for my laptop.

I also have an O2 Wifi locator app on my phone. There’s nothing about security on their website. Anyone with other Wifi-finder apps? Please check their sites and post a comment here about what you find.

Links:
• The Heartbleed Bug Goes Even Deeper Than We Realized – Here’s What You Should Do, Alexis Kleinman, The Huffington Post, 11 Apr 2014
• Security when using BT’s Wi-fi hotspots, BTWifi.com, with link to the Cisco offer
• The Heartbleed Hit List, Mashable, 9 Apr 2014
• What to make of Heartbleed? ITasITis, 4 Apr 2014

Facebook at 10, Microsoft at 40

OK, a slight stretch for a snappy headline, but these have been two of the lead stories of the last few days.

Others will comment with more depth and more knowledge than I can on either Facebook’s tenth anniversary or the appointment of Satya Nadella to succeed Steve Ballmer (and, of course, Bill Gates) at the head of Microsoft. But I was remembering, quite a while ago now, a META Group event in London when the Web was just arriving and disintermediation was a new word. The speaker took a look at the banking industry, with new on-line start-ups starting to eat the lunch of the established financial institutions.

The point was this. The new entrants invested, typically, in just two things: infrastructure, and software development. Existing players had institutional weight; they had enterprises to keep in existence, with all the corporate overheads that accumulate over time, with shareholders and stockmarket expectations and dividends. They needed to cut costs to compete with the new lean players. And (doesn’t it still happen?) they would target the IT budget. So the area of investment which differentiated their new competitors was precisely where they were dis-investing.

Microsoft is fast approaching 40. It’s a solid, established player with corporate overheads, strategies, shareholders. Is it still as lean and sharp as the company which turned on a sixpence (a dime, if you’re American; a 5p piece for the youngsters) when it “got” the Internet and realised that MSN and AOL were not going to be where most of the traffic went? Enter Internet Explorer, competing with Netscape; and the rest is history.

Well … we can look at areas in the recent past where that hasn’t been repeated. Smartphones? A lot of Windows phones have been sold, but Android and iPhone are the big players, and an Office 365 subscription gives access to Office mobile software on those platforms as well as Windows. But on the other hand: Office 365 is a good model, for both consumers and Microsoft, because it converts intermittent capital costs for what is still essential software into predictable operational costs. And while capital versus operational is the language of the enterprise, where Microsoft’s heart arguably is these days, the concept works for individual licences too. There are undoubtedly challenges, but a CEO with an Indian background may have the right insight and vision to work round all that unavoidable corporate baggage.

What about Facebook? Facebook has got to the stage where it is acquiring the corporate baggage (shareholders and so on). It’s had to face up to public perception, particularly over issues like personal online security. Both companies now find themselves covered in the main news sections and financial pages, like any other corporation, rather than only in geek-tech reporting. They’ve gone mainstream.

So Facebook has new competitors in the social media space, sharper and newly innovative where Facebook is unavoidably solidifying. Microsoft is in a stable, continuing enterprise market which it understands; it appears not to understand the consumer market so well. Facebook is in precisely that consumer market, although a lot of enterprises use it to communicate with their own consumers. It’s a fashion market. What’s coming next? and how can Mark Zuckerberg stay ahead of the game?

No links here; just a personal opinion, and you can find lots of links with some easy searching!