Location services move indoors: Apple’s iBeacon

An incidental headline in Outsell’s information market monitoring email brought my attention to Apple’s new iBeacon technology, announced last year.

We’ve long been used to the idea that the smart devices we carry around with us can detect nearby things of interest: for example, alerting us to an offer from a store nearby. Location services, based on GPS, on your current WiFi connection, or on triangulation from your mobile signal, do this. So can active RFID.

But indoor location is difficult. Current technology is an updated version of the old nautical dead reckoning: it notes where you are when you lose your accurate GPS/cellular/WiFi positioning, and uses motion sensors to track your movement from there.
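
A minimal sketch of the idea in Python (the fixed step length and clean headings are illustrative assumptions; real implementations fuse accelerometer, gyroscope and compass data far more carefully):

```python
import math

def dead_reckon(fix, segments, step_length_m=0.7):
    """Update an (x, y) position from a last known fix using step counts
    and compass headings -- the essence of indoor dead reckoning.

    fix: (x, y) last accurate position, in metres
    segments: list of (heading_degrees, step_count) walked since the fix
    """
    x, y = fix
    for heading, steps in segments:
        rad = math.radians(heading)
        distance = steps * step_length_m
        x += distance * math.sin(rad)  # heading 0 = north (+y), 90 = east (+x)
        y += distance * math.cos(rad)
    return x, y

# Walk 10 steps east, then 10 steps north, from a fix at the door (0, 0)
print(dead_reckon((0, 0), [(90, 10), (0, 10)]))  # → roughly (7.0, 7.0)
```

The obvious weakness, and the reason beacons are attractive, is that every step accumulates error; there is no fresh fix to correct against until you regain a signal.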

iBeacon is different. It’s a nearer-proximity application based on Bluetooth detection of your smartphone. Apple says: “Instead of using latitude and longitude to define the location, iBeacon uses a Bluetooth low energy signal, which iOS devices detect.” So you need Bluetooth turned on as well as having an appropriate app loaded. This leaves you a modicum of control, I guess.
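
For the curious, the over-the-air format is simple. This Python sketch is for illustration only (a real iOS app would use the CoreLocation API rather than parse packets itself): it unpacks the manufacturer-specific data of an iBeacon advertisement, which carries Apple’s company ID, a type and length byte, a 16-byte proximity UUID, major and minor numbers, and a TX-power calibration byte.

```python
import struct
import uuid

def parse_ibeacon(mfr_data: bytes):
    """Parse the manufacturer-specific data of an iBeacon advertisement:
    Apple's company ID (0x004C, little-endian on the wire), type 0x02,
    length 0x15, a 16-byte proximity UUID, big-endian major and minor,
    and a signed TX-power calibration byte."""
    if len(mfr_data) != 25 or mfr_data[:4] != b"\x4c\x00\x02\x15":
        return None  # not an iBeacon frame
    proximity_uuid = uuid.UUID(bytes=mfr_data[4:20])
    major, minor = struct.unpack(">HH", mfr_data[20:24])
    (tx_power,) = struct.unpack("b", mfr_data[24:25])
    return proximity_uuid, major, minor, tx_power

# A made-up frame: example UUID, major 1, minor 42, calibrated -59 dBm
frame = (b"\x4c\x00\x02\x15"
         + uuid.UUID("e2c56db5-dffb-48d2-b060-d0f5a71096e0").bytes
         + struct.pack(">HHb", 1, 42, -59))
print(parse_ibeacon(frame))
```

The UUID identifies whose beacons these are (say, a retail chain), while major/minor typically identify the store and the spot within it; the TX-power byte is what lets the receiver estimate proximity from signal strength.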

What alerted me was Outsell’s note that London-based online community specialist Verve has added Apple’s iBeacon technology to its Community Panel app, allowing it to track individual members as they travel into and around stores fitted with the iBeacon device. The report, from “MrWeb”, is firmly in the market research space. This is very much a retailer’s app; it tracks the device in detail through a store, identifying where the user spends time – and how long they stay there – and possibly triggering instant marketing surveys on that basis.

Verve is a newish (2008) company. They describe themselves as “The community panel for research”. Their business is the creation of community panels, acting as consultants to companies needing consumer-focussed research. There’s no indication, therefore, of what incentives are offered to users to join panels; but one might assume instant offers would be the least of it. There is some client information in their “About Us” section (but one client is T-Mobile, which hasn’t existed independently since around the time Verve were formed, so one wonders …).

Apple’s developer website suggests a range of applications:

“From welcoming people as they arrive at a sporting event to providing information about a nearby museum exhibit, iBeacon opens a new world of possibilities for location awareness, and countless opportunities for interactivity between iOS devices and iBeacon hardware.”

A link will take you through to a video from the 2014 Worldwide Developers Conference (WWDC). This is awkward to get at: unless you’re using Safari on a recent MacOS you will need to download the file to play it. But it’s worth it; it takes you on a journey from existing RF triangulation, adding motion sensors when indoors and out of effective range, to the new beacon-based technology. And on the way it suggests more user-oriented applications, such as finding your way round Heathrow Airport, or through an unfamiliar hospital on a family visit. Watch about the first 15 minutes, before it routes to coding stuff for developers.

Technically interesting: a new twist on location services. Practically useful too; but watch out (as always) for what it may do to your privacy. As they say: enjoy!

Links:
• iOS: understanding iBeacon, Apple
• iBeacon for Developers, Apple Developer website
• Verve Adds iBeacon Tech to Panel App, MrWeb Daily Research News Online, 5 Mar 2015
• Verve: community panel research
• Taking Core Location Indoors, Nav Patel, Apple WWDC, June 2014. Page down to find the expanded link

How complex can it be to open a new savings account?

We’ve recently gone through the exercise of opening savings accounts, looking for online instant-access accounts with something more than a derisory rate of interest. The exercise has been instructive and at times extraordinarily frustrating. Terms and Conditions varied from a couple of pages to around forty. It’s worth sharing a few observations which relate, it seems to me, to pseudo-security and to not thinking from the customer’s perspective.

There was one genuine complication. We have recently moved house. Online identity confirmation uses electoral registers, so we don’t show up: and most providers therefore asked for some form of additional confirmation. I don’t have a problem with that, but some make it easy and some don’t!

I’ll name one provider: Virgin Money. Their online process ran like clockwork, their checks were easily completed, and we were up and running in better than even time. The documentation was brief and a model of clarity. And, since they provide the account with an “ordinary” sort code and account number, the initial deposit could be made easily by the third party who was holding our funds.

It’s a pity the others couldn’t take a leaf out of Virgin’s book.

Most of them asked for paper documentation, which is fair enough: typically a certified copy of a passport and a driving licence would do. Certification, like a passport photo, could be done by pretty much any professional: but our first attempt, asking our own bank to do it, met with a refusal. They will only do it for their own products – not even their own customers. The Post Office will do it, for a fee, which is a good solution if you’re new to an area and haven’t yet acquired a wide circle of professional friends. One provider, linked to a major supermarket (one which is somewhat in the news at the moment) wouldn’t even tell us what documents they would ask for until the account had been opened and the initial deposit made. Some were quite quick to send postal correspondence, others much slower. Access codes of course also arrived in the post: fair enough, I count that as good practice.

Then there’s the “linked account” issue. Many savings providers, especially the ones that aren’t clearing banks, require that you nominate a “linked” bank account which must already exist in your name. Some insist that you sign a direct debit in their favour from this account, so you’re not transferring money to them; they’re claiming it off you and you’re subject to their processes. I guess this may avoid the limit which most banks quite properly put on online transfers.

And the rules vary. Some will only accept deposits from this linked account. Some will only pay out to it. Some will only pay interest into it, and some will only add interest to the deposit. All these arcane rules get in the way of what you actually want to do, which is to deposit a sum of money and earn interest.

Third, with one account we had persistent problems trying to get through the login sequence using Internet Explorer on Windows 8 – hardly an uncommon platform. Firefox on Mac was fine! With another, we persistently failed to get to the starting gate on the online system at all, even after three separate interactions with their tech helpdesk; guess what, they didn’t get the business.

So don’t ever believe a provider which claims a deposit account only takes half an hour to set up. For a start, do make sure you read the T&Cs, and that you can live with how you will be able to deposit money and get it back (including on account closure). Expect to spend up to an hour reading the T&Cs, and another hour working through the setup process. Expect the security checks, other confirmations and postal correspondence to take at least a week and possibly two.

But here’s the key question. If Virgin can make it quick, easy and efficient – and yet, presumably, secure and compliant – why does any other organisation have to make it so complex and frustrating? IT people: don’t let your organisation swamp your interface work with unnecessary complexity!

Links (just one this week):
• Virgin Money: Instant Access e-Saver. See how simple it is!

Master Data Management: sources and insights

Tomorrow I will be facilitating my last Corporate IT Forum event. After five years or so I’m standing down from the team, having valued the Forum first as a member and then, since my first retirement, being on the team. Tomorrow’s event is a webinar, presenting a member’s case study on their journey with Master Data Management (MDM).

There was a phase of my career when I was directly concerned with setting up what we’d now call Master Data for a global oil company. Our aim was to define the entities of interest to the enterprise. When systems (databases and the associated applications) were set up to hold live data and answer day-to-day or strategic questions, we wanted to avoid the confusions that could so easily arise. Everyone thinks they know what a particular entity is. It ain’t necessarily that simple.

A couple of examples.

When we began the journey, we thought we’d start with a simple entity: Country. There are fewer than a couple of hundred countries in the world. We needed to know which country owned, licensed and taxed exploration and production. And everyone knows what a country is, don’t they?

Well, no. Just from our own still-almost-united islands: a simple question. Is Scotland (topically) a country? Is the Isle of Man? Is Jersey? In all those cases, there are some areas (e.g. foreign policy) where the effective answer is no; they are part of the single entity the United Kingdom. But in others (e.g. tax, legal systems, legislature) they are quite separate. And of course the list of countries is not immutable.

So: no single definitive list of countries. No standard list of representative codes either: again, do we use GB? or UK? Do we use international vehicle country codes, or Internet domain codes, or …? What codes would be used in data coming in from outside? And finally: could we find an agreed person or function within the Company who would take responsibility for managing and maintaining this dataset, and whose decisions would be accepted by everyone with an interest and their own opinions?
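
To make the problem concrete, here is a hypothetical sketch of the kind of alias table a master data team ends up owning (the canonical keys and the choice of aliases are invented for illustration; the point is the principle, not the codes):

```python
# Hypothetical master mapping: one canonical key per country, plus the
# aliases seen in external feeds (ISO two-letter codes, internet
# domains, vehicle plates). "GB" and "UK" and ".uk" all mean the same
# country -- but only if someone has decided so, and maintains the list.
ALIASES = {
    "GB": "GBR", "UK": "GBR", ".uk": "GBR",
    "DE": "DEU", "D": "DEU", ".de": "DEU",
}

def canonical_country(code: str) -> str:
    """Map an incoming country code to the master key, or raise --
    unmapped codes should go to the data steward, not be guessed at."""
    try:
        return ALIASES[code.strip()]
    except KeyError:
        raise ValueError(f"unmapped country code: {code!r}")

print(canonical_country("UK"))   # → GBR
print(canonical_country("GB"))   # → GBR
```

The interesting part is not the lookup but the governance question in the comment: who owns the table, and what happens to codes it doesn’t know.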

And talking of data coming in from outside: I carried out a reconciliation exercise between two external sources of data on exploration activities in the UK North Sea. You’d think that would be quite well defined: the geological provinces, the licence blocks, the estimates of reserves and so on. Record keeping in the UK would surely be up to the game.

But no: the two sources didn’t even agree on the names and definitions of the reservoirs. Bringing the data from these sources together was going to be a non-trivial task requiring geological and commercial expertise.
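
As a sketch of what the first, mechanical pass of such a reconciliation might look like (the names below are illustrative, not the actual source data), Python’s standard difflib can propose candidate pairings for an expert to confirm or reject:

```python
from difflib import get_close_matches

# Hypothetical field names from two external data feeds
source_a = ["Brent", "Forties", "Piper", "Ninian"]
source_b = ["BRENT FIELD", "Forties (North)", "Piper Alpha", "Magnus"]

def suggest_matches(names_a, names_b, cutoff=0.4):
    """First-pass reconciliation: propose likely pairings between two
    sources. Fuzzy matching narrows the search; it cannot replace the
    geological and commercial judgement the exercise actually needs."""
    proposals = {}
    norm_b = {n.lower(): n for n in names_b}  # case-insensitive index
    for name in names_a:
        hits = get_close_matches(name.lower(), norm_b, n=1, cutoff=cutoff)
        proposals[name] = norm_b[hits[0]] if hits else None  # None = refer to an expert
    return proposals

print(suggest_matches(source_a, source_b))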

Then again, we went through a merger and discovered that two companies could allocate responsibility for entities (and for the data which represented them) quite differently within their organisations.

So: this is a well developed topic in information systems. Go back to a Forrester blog in 2012: analyst Michele Goetz maintains forcefully that MDM is not about providing (in some IT-magic way) a Single Source of Truth. There ain’t no such animal. MDM is a fundamental tool for reconciling different data sources, so that the business can answer useful questions without being confused by different people who think they are talking about the same thing but aren’t, really.

It may be a two-year-old post, but it’s still relevant, and Michele Goetz is still one of Forrester’s lead analysts in this area. Forrester’s first-ever Wave for MDM solutions came out in February this year. It’s downloadable from some of the leading vendors (such as SAP or Informatica). There’s also a recent Wave on Product Information Management which is tagged “MDM in business terms”, and might be worth a look too. Browse for some of the other stuff.

Gartner have a toolkit of resources. Their famed Magic Quadrant exists in multiple versions, e.g. for Product Information and for Customer Data. I’m unsure how the principles of MDM vary between domains, so (without studying the reports) I’m not clear why the separation. You might do better with the MDM overview, which also dates from 2012. You will find RFP templates, a risk framework, and market guides. Bill O’Kane and Marcus Collins are key names. For Gartner subscribers, a good browse and an analyst call will be worthwhile.

Browse more widely too. Just one caution: MDM these days also means Mobile Device Management. Don’t get confused!

Links:
• Master Data Management Does Not Equal The Single Source Of Truth, Michele Goetz, Forrester blog, 26 Oct 2012
• The Forrester Wave™: Master Data Management Solutions, Q1 2014, 3 Feb 2014 (download from Informatica, link at foot of page)
• PIM: MDM on Business Terms, Michele Goetz, 6 Jun 2014
• Master Data Management, Marcus Collins, Gartner, 9 Jul 2012

Benefits realisation: analyst insight

I’m facilitating an event tomorrow on “Optimising the benefits life cycle”. So as always I undertook my own prior research to see what the mainstream analysts have to offer.

Forrester was a disappointment. “Benefits Realization” (with a z) turns up quite a lot, but the research is primarily labelled “Lead to Revenue Management” – that is, it’s about sales. There is some material on the wider topic, but it dates back several years or more. Though it’s always relevant to remember Forrester’s elevator project pitch from Chuck Gliedman: “We are doing A to make B better, as measured by C, which is worth X dollars (pounds, euros …) to the organisation.”
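
Gliedman’s formula is almost a data structure in its own right. A hypothetical sketch (all the field values are invented) of how a project office might capture it, one field per blank:

```python
from dataclasses import dataclass

@dataclass
class BenefitPitch:
    """Forrester's elevator pitch: 'We are doing A to make B better,
    as measured by C, which is worth X to the organisation.'"""
    action: str   # A - what we are doing
    outcome: str  # B - the business thing made better
    measure: str  # C - how we will know
    value: str    # X - what it is worth

    def pitch(self) -> str:
        return (f"We are doing {self.action} to make {self.outcome} better, "
                f"as measured by {self.measure}, which is worth {self.value} "
                f"to the organisation.")

# Invented example values
p = BenefitPitch("warehouse automation", "order fulfilment",
                 "average dispatch time", "£2m a year")
print(p.pitch())
```

The discipline is that none of the four fields may be left blank, and none may be filled in with technology-speak: each forces a business answer before the project starts.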

There is a lot of material from both academic researchers and organisations like PMI (Project Management Institute). But in the IT insight market, there seems to be remarkably little (do correct me …) except that the Corporate IT Forum, where I’ll be tomorrow, has returned to the issue regularly. Tomorrow’s event is the latest in the series. The Forum members clearly see this as important.

But so far as external material is concerned, this blog turns into a plug for a recent Gartner webinar by Richard Hunter, who (a fair number of years ago) added considerable value to an internal IT presentation I delivered on emerging technologies for our enterprise. I’m not going to review the whole presentation because it’s on open access from Gartner’s On Demand webinars. But to someone who experienced the measurement-oriented focus of a Six-Sigma driven IT team, it’s not a real surprise that Richard’s key theme is to identify and express the benefits before you start: in business terms, not technology-oriented language, and with an expectation that you will know how to measure and harvest the benefits. It’s not about on-time-on-budget; it’s about the business outcome. Shortening a process cycle from days to hours; reducing the provision for returns; and so on.

If this is your topic, spend an hour reviewing Richard’s presentation (complete with family dog in the background). It will be time well spent.

Links:
• Getting to Benefits Realization: What to Do and When to Do It, Richard Hunter, Gartner, 7 Aug 2014 (go to Gartner Webinars and search for Benefits Realization)
• Corporate IT Forum: Optimising the Benefits Lifecycle (workshop, 16 Sep 2014)

Dark Web: good, bad, or amoral?

Last night I watched BBC’s Horizon programme reviewing the history and impact of what’s become known as the Dark Web. Here seems to be the scenario.

In the beginning was the Internet. In the early days of the Web I wrote a strategic report for my company which triggered the adoption of web technology and internet email. One of the things I pointed out was that, in the precursors such as newsgroups, no-one was anonymous. Traffic carried identifiers or, at least, IP addresses attached to it. People knew who you were, and your company’s reputation hinged on your behaviour online. As the Internet of Things expands, the amount of information about individuals that can be analysed out of internet traffic expands exponentially with it.

Governments, particularly the US, recognised the potential for traffic analysis to compromise security, and the response was TOR (The Onion Router), which passes traffic through a number of nodes to disguise its origin. The project moved to Open Source and has become widely used in response to the growing levels of surveillance of internet traffic, revealed most notably of course by Edward Snowden. Wikileaks uses TOR to facilitate anonymous contributions: it wasn’t tracking which identified Snowden, or Manning. It has been used extensively in recent events in the Middle East.
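
The principle can be sketched in a few lines. This toy uses XOR in place of real cryptography (Tor itself uses proper layered public-key encryption), purely to show why it’s called onion routing: the sender wraps one layer per relay, and each relay can peel exactly one.

```python
import itertools

def xor(data: bytes, key: bytes) -> bytes:
    """Toy 'encryption': XOR data against a repeating key."""
    return bytes(b ^ k for b, k in zip(data, itertools.cycle(key)))

def onion_wrap(message: bytes, hop_keys):
    """Wrap a message in one layer per relay. The innermost layer is
    for the last hop, so no single node sees both who sent the message
    and what (or where) it finally is."""
    for key in reversed(hop_keys):
        message = xor(message, key)
    return message

keys = [b"entry", b"middle", b"exit"]  # one shared key per relay (toy)
cell = onion_wrap(b"hello, hidden service", keys)

# Each relay in turn strips only its own layer
for key in keys:
    cell = xor(cell, key)
print(cell)  # → b'hello, hidden service'
```

The entry node knows who you are but not what you said; the exit node knows the plaintext but not who sent it. That split is the whole point.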

So at this point, governments are trying to put the genie back in the bottle: they invented TOR, but they don’t like it being used to hide information from them. Moreover, it is being used for criminal transactions on a substantial scale: and at this point Bitcoin becomes part of the picture, because (unlike conventionally banked money) it too is not inherently traceable.

There’s no firm conclusion drawn in the programme, and surely that’s right. Technology of this kind isn’t inherently good or bad: it is, in the strict sense of the word, amoral. But the uses people make of it, as with almost any technology, are not amoral. And the programme raises strong issues about the balance of privacy and security, both in their widest senses. The sources used are strong and reputable: Oxford University’s Internet Institute; Julia Angwin, an established technology researcher and writer; key individuals in the development of these technologies; Julian Assange of WikiLeaks; and not least Tim Berners-Lee, who admits to having been perhaps naive in his early assessment of these issues.

While it’s still on iPlayer, it’s worth a watch.

Links:
• Inside the Dark Web, BBC Horizon, 3 Sep 2014 (available on iPlayer in the UK until 15 Sep)
• Tor Project online, and Wikipedia article
• Oxford Internet Institute
• Julia Angwin

Growth, Innovation and Leadership: Frost & Sullivan

I’m on a Frost & Sullivan webinar: Growth, Innovation and Leadership (GIL: a major Frost theme). It’s a half-hour panel to discuss successful types of innovation and examples of future innovative technologies with Roberta Gamble, Partner, Energy & Environmental Markets, and Jeff Cotrupe, Director, Stratecast. David Frigstad, Frost’s Chairman, is leading. The event recording will be available in due course.

Frigstad asserts that most industries are undergoing a cycle of disrupt, collapse, transform (or die: Disrupt or Die is an old theme of mine). We start with a concept called the Serendipity Innovation Engine. It’s based on tracking nine technology clusters; major trends; industry sectors; and the “application labs” undertaking development (which includes real labs and also standards bodies and others). And all of this is in the context of seven global challenges: education, security, environment, economic development, healthcare, infrastructure, and human rights.

Handover to Gamble. This is a thread on industry convergence in energy and environment, seen as a single sector. Urbanisation, and the growth of upcoming economies, are major influences here in demand growth.

We do move to an IT element: innovation in smart homes and smart cities, with integration between sensor/actuator technology and social/cloud media: emphasising this, Google has just bought a smart home company (Nest Labs). City CIOs and City Managers are mentioned as key people – a very US-centric view when most urbanisation is not occurring in the developed world … we do return to implications for developing economies, where the message is that foundations for Smart (which includes effective, clean energy use) should be laid now while there is a relatively uncluttered base to start from.

Frigstad poses a question based on the idea that Big Data is one of the most disruptive trends in this market. Gamble suggests that parking is an example. Apps to find a parking spot, based on data from road sensors or connected parking meters, are not, though, only being piloted in San Francisco. Similar developments in the UK were mentioned at a Corporate IT Forum event I supported earlier this year.

It’s a segue into the next section: an introduction for Cotrupe, whose field is Big Data and Analytics. Examples of disruption around here include the Google car: who would have thought Google would be an automotive manufacturer? Is your competitor someone you wouldn’t expect? An old question, of course. The UK’s canal companies competed with each other and perhaps with the turnpike roads; they mainly didn’t foresee the railways.

Cotrupe’s main question is: what is Big Data really? He posits it as an element of data management, together with Analytics and BI. I’d want to think about that equation; it’s not intuitively the right way round. But high-volume, rapidly moving data does have to be managed effectively for its benefit to be realised – delivering the data users need, when they need it, but not so much as to overwhelm them. And this means near real-time. It’s IT plus Data Science.

Frost suggest they are more conservative than some, because they see growth of the Big Data market held back by the sheer cost of large-scale facilities.

We’re on the promised half hour for the primary conversations, but still going strong, basically talking with Cotrupe about various industry sectors where Big Data has potential: to support, for example, a move from branch-based banking to personal service in an online environment. There’s some discussion of Big Data in government: how will this affect the style of government over perhaps the next 20 years? Cotrupe mentions a transformation in the speed of US immigration in recent years, where data is pre-fetched and the process takes minutes instead of hours. He’s advocating opening up and sharing information: in other industries too, for example not being frozen by HIPAA requirements in (US) healthcare or, perhaps, EU data protection requirements. I have personal experience of obstructive customer service people trying to hide behind those, and in fact parading their lack of actual knowledge.

Cotrupe talks about privacy, not least in the wake of Snowden and what’s been learned about sharing between the NSA and the UK agencies. He would like to see this ease of sharing brought to bear in other areas: but asks, how do we manage privacy here? There are companies which are leading the way in collecting data in consumer-sensitive ways, and this needs to become standard practice. In any case, not collecting data you don’t need will reduce your data centre (should that be Data Center?) footprint.

As we come to a close, with a commercial for the September event in Silicon Valley, I have to say I’m not convinced this webinar was wholly coherent.

If you call something a Serendipity Innovation Engine I want to know how it relates to serendipity: that is, the chance identification of novel discoveries.

If you present a layered model, I expect the layers to relate (probably hierarchically) to one another. It would be more valuable to talk about the four elements of this model separately and be clearer about what each represents. For example, “Health and Wellness” occurs as a Technology Cluster (why?). It’s also a Mega Trend in a layer where Social Trends also sits; surely people’s concern about Health and Wellness is a social trend? Each layer seems to mix social, technical and other concerns.

I learned a more useful framework when teaching the OU’s Personal Development course. This really is layered. The two internal layers (this is for personal development) are your immediate environment, and the other elements of your working organisation. Then Zone 3 (near external) encompasses competitors, customers/clients, suppliers and local influences. Zone 4 (far external) includes national and international influences: social, technological, economic, environmental and political (STEEP). On this framework you can chart all the changes discussed in today’s webinar and, I think, more easily draw conclusions!

Links:
• Frost & Sullivan Growth Innovation & Leadership
• Google buys Nest Labs for $3.2bn …, The Guardian, 13 Jan 2014
• STEEP framework: Sheila Tyler, The Manager’s Good Study Guide (third edition, 2007). The Open University. Pages 198-202

What to make of Heartbleed?

I watched the BBC News report last night about the security hole in OpenSSL. With its conclusion that everyone should change all their passwords, now … and the old chestnut that you should keep separate passwords for every service you use, never write them down, and so on. Thankfully by this morning common sense is beginning to prevail. The Guardian passes on advice to check whether services have been patched first, and offers a link to a tool that will check a site for you.

First, as they say, other Secure Sockets Layer implementations are available. While a lot of secure web connections do rely on OpenSSL, it’s by no means universal.

Second, as always, dig behind the news. As Techcrunch did. This is the first vulnerability to have its own website and “cool logo”; this was launched by Codenomicon in Finland, which started by creating notes for its own internal use and then took what it calls a “Bugs 2.0” approach to put their information out there. I remember doing something similar way back in Year 2000 days. Incidentally, the OpenSSL report (very brief) credits Google Security for discovering the bug. It also identifies the versions which are vulnerable. (There’s a note there that says that if users can’t upgrade to the fixed version, they can recompile OpenSSL with -DOPENSSL_NO_HEARTBEATS which, I’m guessing, gives a clue as to the naming of the bug.)
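
The bug itself is easy to model. This toy Python sketch (not OpenSSL’s actual C code; the memory layout and “secret” are invented) shows the essence of it: the heartbeat reply trusts the length the requester claims, rather than the length of the payload actually received.

```python
# Toy model of Heartbleed (CVE-2014-0160). A heartbeat request carries a
# payload plus a claimed payload length; the reply is meant to echo the
# payload back. The flaw: copying 'claimed_length' bytes without checking
# it against what was actually sent.
SERVER_MEMORY = bytearray(b"\x00" * 16 + b"secret-session-key" + b"\x00" * 16)

def heartbeat_reply(payload: bytes, claimed_length: int, patched: bool):
    buf = SERVER_MEMORY
    buf[0:len(payload)] = payload  # the request lands in server memory
    if patched and claimed_length > len(payload):
        return None  # the fix: silently discard mismatched lengths
    return bytes(buf[0:claimed_length])  # unpatched: reads past the payload

# Attacker sends 4 bytes but claims 40: the unpatched reply drags
# adjacent memory -- here, the planted "key" -- back over the wire.
leak = heartbeat_reply(b"ping", 40, patched=False)
print(b"secret-session-key" in leak)               # → True
print(heartbeat_reply(b"ping", 40, patched=True))  # → None
```

In the real attack the over-read returns up to 64KB of whatever happens to sit next to the request buffer, which can include private keys, session cookies and passwords; that, not the protocol design, is why the bug matters.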

If you want real information, then, go to Heartbleed.com. The Codenomicon Q&A is posted there. In brief: this is not a problem with the specification of SSL/TLS; it’s an implementation bug in OpenSSL. It has been around a long time, but there’s no evidence of significant exploitation. A fix is already available, but needs to be rolled out.

What was clear, too, is that the BBC reporter (and some others) don’t understand the Open Source process. The Guardian asserts that “anyone can update” the code, and leads readers to suppose that someone can maliciously insert a vulnerability. Conspiracy theories suggest that this might even be part of the NSA’s attack on internet security. But of course that ain’t the case. Yes, anyone can join an Open Source project: but code updates don’t automatically get put out there. Bugs can get through, just as they can in commercial software: but testing and versioning are pretty rigorous processes.

Also, this is a server-side problem, not an end-user issue. So yes, change your passwords on key services that handle your critical resources if you’re worried; but it might be worth, first, checking whether they’re likely to be using OpenSSL. Your bank probably isn’t. There’s a useful list of possibly vulnerable services on Mashable (Facebook: change it; LinkedIn: no need; and so on).

And what do you do about passwords? We use so many online services and accounts that unless you have a systematic approach to passwords you’ll never cope. Personally, I have a standard, hopefully unguessable password I use for all low-criticality services; another, much stronger, for a small handful of critical and really personal ones; and a system which makes it fairly easy to recover passwords for a range of intermediate sites (rely on their Reset Password facility and keep a record of when this has been last used). But also, for online purchases, I use a separate credit card with a deliberately low credit limit. Don’t just rely on technology!

Links:
• Heartbleed, The First Security Bug With A Cool Logo, TechCrunch, 9 Apr 2014
• Heartbleed bug, website from Codenomicon (Finland) – use this site for onward references to official vulnerability reports and other sources
• OpenSSL project
• The Heartbleed Hit List, Mashable, 9 Apr 2014
• Heartbleed: don’t rush to update passwords, security experts warn, Alex Hearn, The Guardian, 9 Apr 2014
• Heartbleed bug: Public urged to reset all passwords, Rory Cellan-Jones (main report), BBC, 9 Apr 2014
• Test (your) server for Heartbleed, service from Filippo Valsorda as referenced in The Guardian. I’m unclear why this service is registered in the British Indian Ocean Territory (.io domain) since Filippo’s bio says he is currently attending “hacker school in NYC”. On your own head be it.