Master Data Management: sources and insights 23 Sep 2014 · Posted by Tony Law in Impact of IT, Insight services, IT is business, ITasITis, Managing IT.
Tomorrow I will be facilitating my last Corporate IT Forum event. After five years or so I’m standing down from the team, having valued the Forum first as a member and then, since my first retirement, being on the team. Tomorrow’s event is a webinar, presenting a member’s case study on their journey with Master Data Management (MDM).
There was a phase of my career when I was directly concerned with setting up what we’d now call Master Data for a global oil company. We were concerned to define the entities of interest to the enterprise. When systems (databases and the associated applications) were set up to hold live data and answer day-to-day or strategic questions, we wanted to avoid the confusions that could so easily arise. Everyone thinks they know what a particular entity is. It ain’t necessarily that simple.
A couple of examples.
When we began the journey, we thought we’d start with a simple entity: Country. There are fewer than a couple of hundred countries in the world. We needed to know which country owned, licensed and taxed exploration and production. And everyone knows what a country is, don’t they?
Well, no. Just from our own still-almost-united islands: a simple question. Is Scotland (topically) a country? Is the Isle of Man? Is Jersey? In all those cases, there are some areas (e.g. foreign policy) where the effective answer is no; they are part of the single entity the United Kingdom. But in others (e.g. tax, legal systems, legislature) they are quite separate. And of course the list of countries is not immutable.
So: no single definitive list of countries. No standard list of representative codes either: again, do we use GB? or UK? Do we use international vehicle country codes, or Internet domain codes, or …? What codes would be used in data coming in from outside? And finally: could we find an agreed person or function within the Company who would take responsibility for managing and maintaining this dataset, and whose decisions would be accepted by everyone with an interest and their own opinions?
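The code clash described above is exactly what a master-data crosswalk resolves. Here is a minimal sketch (the data and names are invented for illustration, though the GB/uk clash itself is real: ISO 3166 uses "GB" for the United Kingdom while Internet domains use ".uk"): each external source’s code is mapped to one canonical master record, so incoming data can be reconciled consistently.

```python
# Canonical master records, keyed by an internal master identifier.
MASTER = {
    "GBR": {"name": "United Kingdom"},
}

# Crosswalk: (source scheme, source code) -> master key. Note the
# clash described in the text: ISO 3166 says "GB", domains say "uk".
CROSSWALK = {
    ("iso3166_alpha2", "GB"): "GBR",
    ("internet_domain", "uk"): "GBR",
    ("vehicle_code", "GB"): "GBR",
}

def to_master(source, code):
    """Resolve an external code to the canonical master key, or None."""
    return CROSSWALK.get((source, code.strip()))

print(to_master("internet_domain", "uk"))   # GBR
print(to_master("iso3166_alpha2", "GB"))    # GBR
```

The point of the sketch is the governance question in the text, not the code: someone has to own MASTER and CROSSWALK, and everyone has to accept their rulings.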
And talking of data coming in from outside: I carried out a reconciliation exercise between two external sources of data on exploration activities in the UK North Sea. You’d think that would be quite well defined: the geological provinces, the licence blocks, the estimates of reserves and so on. Record-keeping in the UK would surely be up to the game.
But no: the two sources didn’t even agree on the names and definitions of the reservoirs. Bringing the data from these sources together was going to be a non-trivial task requiring geological and commercial expertise.
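Tooling can at least propose candidate matches between two sources’ naming schemes, leaving the expert judgement where it belongs. A rough sketch using the standard library’s fuzzy matching (the reservoir names here are invented examples, not the real datasets):

```python
import difflib

# Hypothetical names for the "same" reservoirs from two sources.
source_a = ["Brent", "Forties", "Piper Alpha"]
source_b = ["BRENT FIELD", "Forties (UKCS)", "Piper"]

def candidate_matches(names_a, names_b, cutoff=0.4):
    """Propose, for each name in source A, the closest name in source B.
    Proposals only -- an expert still confirms or rejects each pairing."""
    normalised = {n.lower(): n for n in names_b}
    proposals = {}
    for name in names_a:
        close = difflib.get_close_matches(name.lower(), normalised,
                                          n=1, cutoff=cutoff)
        proposals[name] = normalised[close[0]] if close else None
    return proposals

print(candidate_matches(source_a, source_b))
```

Even a crude matcher like this turns an open-ended comparison into a reviewable list of suggestions; the geological and commercial expertise still does the deciding.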
Then again, we went through a merger and discovered that two companies could allocate responsibility for entities (and for the data which represented them) quite differently within their organisations.
So: this is a well developed topic in information systems. Go back to a Forrester blog in 2012: analyst Michele Goetz maintains forcefully that MDM is not about providing (in some IT-magic way) a Single Source of Truth. There ain’t no such animal. MDM is a fundamental tool for reconciling different data sources, so that the business can answer useful questions without being confused by different people who think they are talking about the same thing but aren’t, really.
It may be a two year old post, but it’s still relevant, and Michele Goetz is still one of Forrester’s lead analysts in this area. Forrester’s first-ever Wave for MDM solutions came out in February this year. It’s downloadable from some of the leading vendors (such as SAP or Informatica). There’s also a recent Wave on Product Information Management which is tagged “MDM in business terms”, and might be worth a look too. Browse for some of the other stuff.
Gartner have a toolkit of resources. Their famed Magic Quadrant exists in multiple versions, e.g. for Product Information and for Customer Data. I’m unsure how the principles of MDM vary between domains, so (without studying the reports) it’s not clear to me why they are separated. You might do better with the MDM overview, which also dates from 2012. You will find RFP templates, a risk framework, and market guides. Bill O’Kane and Marcus Collins are key names. For Gartner subscribers, a good browse and an analyst call will be worthwhile.
Browse more widely too. Just one caution: MDM these days also means Mobile Device Management. Don’t get confused!
• Master Data Management Does Not Equal The Single Source Of Truth, Michele Goetz, Forrester blog, 26 Oct 2012
• The Forrester Wave™: Master Data Management Solutions, Q1 2014, 3 Feb 2014 (download from Informatica, link at foot of page)
• PIM: MDM on Business Terms, Michele Goetz, 6 Jun 2014
• Master Data Management, Marcus Collins, Gartner, 9 Jul 2012
Benefits realisation: analyst insight 15 Sep 2014 · Posted by Tony Law in Impact of IT, Insight services, IT is business, ITasITis, Managing IT, Tech Watch.
Tags: benefits, Gartner, Richard Hunter
I’m facilitating an event tomorrow on “Optimising the benefits life cycle”. So as always I undertook my own prior research to see what the mainstream analysts have to offer.
Forrester was a disappointment. “Benefits Realization” (with a z) turns up quite a lot, but the research is primarily labelled “Lead to Revenue Management” – that is, it’s about sales. There is some material on the wider topic, but it dates back several years or more. Though it’s always relevant to remember Forrester’s elevator project pitch from Chip Gliedman: We are doing A to make B better, as measured by C, which is worth X dollars (pounds, euros …) to the organisation.
There is a lot of material from both academic researchers and organisations like PMI (Project Management Institute). But in the IT insight market, there seems to be remarkably little (do correct me …) except that the Corporate IT Forum, where I’ll be tomorrow, has returned to the issue regularly. Tomorrow’s event is the latest in the series. The Forum members clearly see this as important.
But so far as external material is concerned, this blog turns into a plug for a recent Gartner webinar by Richard Hunter, who (a fair number of years ago) added considerable value to an internal IT presentation I delivered on emerging technologies for our enterprise. I’m not going to review the whole presentation because it’s on open access from Gartner’s On Demand webinars. But to someone who experienced the measurement-oriented focus of a Six-Sigma driven IT team, it’s not a real surprise that Richard’s key theme is to identify and express the benefits before you start: in business terms, not technology-oriented language, and with an expectation that you will know how to measure and harvest the benefits. It’s not about on-time-on-budget; it’s about the business outcome. Shortening a process cycle from days to hours; reducing the provision for returns; and so on.
If this is your topic, spend an hour reviewing Richard’s presentation (complete with family dog in the background). It will be time well spent.
• Getting to Benefits Realization: What to Do and When to Do It, Richard Hunter, Gartner, 7 Aug 2014 (go to Gartner Webinars and search for Benefits Realization)
• Corporate IT Forum: Optimising the Benefits Lifecycle (workshop, 16 Sep 2014)
Analyst Directory update 9 Sep 2014 · Posted by Tony Law in Impact of IT, Insight services, ITasITis, Managing IT, Technorati.
Tags: Gartner, Forrester, Semple
It’s a long time since the InformationSpan blog index has been updated – not since February. To be fair, I had a look in May but there were too few changes to be significant. However, there’s now enough to report, and the index has been thoroughly reviewed and updated.
First, Gartner: a handful of new analysts have appeared. The main comments, though, relate to past acquisitions.
I’ve finally removed almost all references to AMR, but in true Gartner fashion there are some inconsistencies. If you look on Gartner’s Research marketing page, there is of course Gartner for Supply Chain Professionals, created out of the former AMR service. All traces of AMR seem to have disappeared until you look at the Gartner for Enterprise Supply Chain Leaders service. The flyer for this service is headed “AMR Enterprise Supply Chain Leaders” and is replete with references to AMR services. It’s dated 2010, just after the acquisition; but it’s still on the system. I did not find any other reference to a service called Gartner for Enterprise Supply Chain Leaders.
The Burton services have also been fully absorbed; most of the Burton analysts have left, the IT1 tag also seems to have disappeared, and one of the remaining accessible legacy blogs has become inaccessible. However, six Burton blogs can still be found and I’ve discovered there are also TypePad profiles linked to them. There’s also still one accessible (but moribund) Gartner IT1 blog, and a fair sprinkling (as always) of blogs left over from other analysts who have left.
There have been more changes to the Forrester page. First, perhaps most significantly: Forrester seem to have shed their Business Technology tag. It was a good one, but didn’t catch on; and I suppose George Colony has decided to go with the market. These services are now referred to as Technology Management.
There have, too, been some changes within Forrester’s categories. Business Process and Content & Collaboration seem to have become moribund (no new content for over two years), and there remain a number of still-extant blog names which redirect somewhere else (and have done so for some time). Interestingly, within the Marketing & Product Strategy group, the Consumer Product Strategy blog, dormant since 2008, has recently acquired a new posting. Forrester seem better than Gartner at tidying up when analysts leave, but there are three or four still-extant blogs from departed analysts.
I reviewed the Others page too. I haven’t added any new analyst sources (suggestions??) but Erica and Sam Driver’s ThinkBalm content has now been lost. Charlene Li’s Altimeter group now has a fully integrated blog section within the main website (not new, but I haven’t noted it before) as well as personal blogs maintained by Charlene herself and some colleagues. I have, though, included Euan Semple’s The Obvious which offers so many of us great insights and ideas. If George Colony hadn’t already grabbed Counterintuitive as his blog title, it would be a good alternative for Euan!
No Links here, but click the link at the head or right hand side of this blog to go to the InformationSpan Analyst Blogs Index.
Tags: Big Data, Frost and Sullivan, Smart
I’m on a Frost and Sullivan webinar: Growth, Innovation and Leadership (GIL: a major Frost theme). It’s a half-hour panel to discuss successful types of innovation and examples of future innovative technologies with Roberta Gamble, Partner, Energy & Environmental Markets, and Jeff Cotrupe, Director, Stratecast. David Frigstad, Frost’s Chairman, is leading. The event recording will be available in due course.
Frigstad asserts that most industries are undergoing a cycle of disrupt, collapse, transform (or die: Disrupt or Die is an old theme of mine). We start with a concept called the Serendipity Innovation Engine. It’s based on tracking nine technology clusters; major trends; industry sectors; and the “application labs” undertaking development (which includes real labs and also standards bodies and others). And all of this is in the context of seven global challenges: education, security, environment, economic development, healthcare, infrastructure, and human rights.
Handover to Gamble. This is a thread on industry convergence in energy and environment, seen as a single sector. Urbanisation, and the growth of emerging economies, are major influences here in demand growth.
We do move to an IT element: innovation in smart homes and smart cities, with integration between sensor/actuator technology and social/cloud media: emphasising this, Google has just bought a smart home company (Nest Labs). City CIOs and City Managers are mentioned as key people – a very US-centric view when most urbanisation is not occurring in the developed world … we do return to implications for developing economies, where the message is that foundations for Smart (which includes effective, clean energy use) should be laid now while there is a relatively uncluttered base to start from.
Frigstad poses a question based on the idea that Big Data is one of the most disruptive trends in this market. Gamble suggests that parking is an example. Apps to find a parking spot, based on data from road sensors or connected parking meters, are being piloted in San Francisco – and not only there. Similar developments in the UK were mentioned at a Corporate IT Forum event I supported earlier this year.
It’s a segue into the next section: an introduction for Cotrupe, whose field is Big Data and Analytics. Examples of disruption around here include the Google car: who would have thought Google would be an automotive manufacturer? Is your competitor someone you wouldn’t expect? An old question, of course. The UK’s canal companies competed with each other and perhaps with the turnpike roads; they mainly didn’t foresee the railways.
Cotrupe’s main question is: What is Big Data really? He posits it as an element of data management, together with Analytics and BI. I’d want to think about that equation; it’s not intuitively the right way round. But high volume, rapidly moving data does have to be managed effectively for its benefit to be realised – delivering the data users need, when they need it, but not so much as to overwhelm them. And this means near real-time. It’s IT plus Data Science.
Frost suggest they are more conservative than some, because they see growth of the BD market held back by the sheer cost of large scale facilities.
We’re on the promised half hour for the primary conversations, but still going strong, basically talking with Cotrupe about various industry sectors where Big Data has potential: to support, for example, a move from branch based banking to personal service in an online environment. There’s some discussion of Big Data in government: how will this affect the style of government in perhaps the next 20 years? Cotrupe mentions a transformation in the speed of US immigration in recent years, where data is pre-fetched and the process takes minutes instead of hours. He’s advocating opening up, sharing of information: in other industries too, for example not being frozen by HIPAA requirements in (US) healthcare or, perhaps, EU data protection requirements. I have personal experience of obstructive customer service people trying to hide behind those, and in fact parading their lack of actual knowledge.
Cotrupe talks about privacy, not least in the wake of Snowden and what’s been learned about sharing between NSA and the UK agencies. Cotrupe would like to see this ease of sharing brought to bear in other areas: but asks how we manage privacy here? There are companies which are leading the way in data collection in consumer-sensitive ways, and this needs to become standard practice. In any case, not collecting data you don’t need will reduce your data centre (should that be Data Center?) footprint.
As we come to a close, with a commercial for the September event in Silicon Valley, I have to say I’m not convinced this webinar was wholly coherent.
If you call something a Serendipity Innovation Engine I want to know how it relates to serendipity: that is, the chance identification of novel discoveries.
If you present a layered model, I expect the layers to relate (probably hierarchically) to one another. It would be more valuable to talk about the four elements of this model separately and be clearer about what each represents. For example, “Health and Wellness” occurs as a Technology Cluster (why?). It’s also a Mega Trend in a layer where Social Trends also sits; surely people’s concern about Health and Wellness is a social trend? Each layer seems to mix social, technical and other concerns.
I learned a more useful framework when teaching the OU’s Personal Development course. This really is layered. The two internal layers (this is for personal development) are one’s immediate environment, and other elements of your working organisation. Then Zone 3 (near external) encompasses competitors, customers/clients, suppliers and local influences. Zone 4 (far external) includes national and international influences: social, technological, economic, environmental and political (STEEP). On this framework you can chart all the changes discussed in today’s webinar and, I think, more easily draw conclusions!
• Frost & Sullivan Growth Innovation & Leadership
• Google buys Nest Labs for $3.2bn …, The Guardian, 13 Jan 2014
• STEEP framework: Sheila Tyler, The Manager’s Good Study Guide (third edition, 2007). The Open University. Pages 198-202
Link: Heartbleed update 15 Apr 2014 · Posted by Tony Law in Impact of IT, ITasITis, Managing IT, Tech Watch, Technorati, Uncategorized.
Tags: Cisco, Heartbleed, security
A quick follow up, back from a few days away.
Huffington Post have a recent update which notes that the OpenSSL vulnerability applies in major products from Cisco and Juniper Networks. They also repeat what’s becoming the consensus on passwords: change your passwords for services which you know were vulnerable but have now been patched. There’s no point in changing a password which might still be at risk.
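For anyone running their own servers, the widely reported vulnerable range was OpenSSL 1.0.1 through 1.0.1f, with 1.0.1g carrying the fix and the older 0.9.8 and 1.0.0 branches unaffected. A rough sketch of that check (simplified: real builds carry extra version suffixes this ignores):

```python
def heartbleed_vulnerable(version):
    """Return True if a plain OpenSSL version string falls in the
    reported Heartbleed range: 1.0.1 through 1.0.1f."""
    if not version.startswith("1.0.1"):
        return False  # 0.9.8, 1.0.0 and 1.0.2+ branches were not affected
    suffix = version[len("1.0.1"):]
    # "" (the original 1.0.1 release) up to "f" vulnerable; "g"+ patched
    return suffix <= "f"

print(heartbleed_vulnerable("1.0.1f"))  # True
print(heartbleed_vulnerable("1.0.1g"))  # False
print(heartbleed_vulnerable("0.9.8y"))  # False
```

The same logic explains the password advice: a site on 1.0.1g (or never on the 1.0.1 branch at all) is worth re-securing; one still on 1.0.1f isn’t, yet.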
They reference the Mashable resource on what’s been patched and copy the patched list: Google (and Gmail), Yahoo (and Yahoo Mail), Facebook, Pinterest, Instagram, Tumblr, Etsy, GoDaddy, Intuit, USAA, Box, Dropbox, GitHub, IFTTT, Minecraft, OKCupid, SoundCloud and Wunderlist. A quick look, though, suggests that the Mashable article was a one-off and the list is not being kept updated.
The article also recommends turning off external access to your home network: the sort of capability, for example, that you might use for remote access through LogMeIn, TeamViewer or similar. If you’re not using this kind of facility, disable it. Your firewall should already be holding the line on this.
And check what your Internet provider is doing and the status of your wireless router. Being a BT user, with a BT Home Hub, I tried searching the bt.com website for information on Heartbleed but nothing surfaced. It would be nice to know.
Huffington suggests that, at the moment, public WiFi has to be treated as an unknown quantity since you can’t tell what infrastructure they use or whether it’s been patched. BT again doesn’t have any information on the impact of Heartbleed on BT Wifi (Openzone, as was) but it does say that user details are encrypted when you log in to their service. It’s perhaps ironic that they offer free Cisco VPN software, which you can download when connected to one of their hotspots. I didn’t know this. I’ll take it up for my laptop.
I also have an O2 Wifi locator app on my phone. There’s nothing about security on their website. Anyone with other Wifi-finder apps? Please check their sites and post a comment here about what you find.
• The Heartbleed Bug Goes Even Deeper Than We Realized – Here’s What You Should Do, Alexis Kleinman, The Huffington Post, 11 Apr 2014
• Security when using BT’s Wi-fi hotspots, BTWifi.com, with link to the Cisco offer
• The Heartbleed Hit List, Mashable, 9 Apr 2014
• What to make of Heartbleed? ITasITis, 4 Apr 2014
Insight providers and market evaluation 6 Nov 2013 · Posted by Tony Law in Impact of IT, Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
This is a slightly extended version of a response in LinkedIn to Michael Rasmussen, who has published some thoughts (“a rant”) about Gartner’s Magic Quadrant.
MQ is a highly influential and long established analyst tool. As an insight services user in enterprise IT, I made use of MQs regularly and would also review similar tools such as Forrester’s Wave when a purchasing decision was being made. Like anything else, it’s essential to know just what a tool like this is, how it’s created and what it does and does not convey. The same is true of Gartner’s Hype Cycle, as I’ve commented elsewhere.
Michael highlights several concerns about Gartner’s recently updated MQ in his own area of considerable expertise, that is, governance, risk and compliance (GRC). Do read his original, which I won’t attempt to summarise; see the link below. Here’s my response.
Michael, having read the whole post in your blog, a couple of comments from a user’s perspective. First: I wholly agree that the Forrester Wave’s value is in the open availability both of the evaluation criteria and of the base data; it would be fantastic to see the same from Gartner. This isn’t just an issue of general openness. Since a user can adjust the weightings on the Forrester evaluations, it becomes a much more practical tool.
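Why adjustable weightings matter can be shown in a toy sketch (the vendors, criteria and scores below are entirely invented): re-weighting the same published scores can change which vendor comes out on top for your priorities.

```python
# Published per-criterion scores (invented for illustration).
scores = {
    "Vendor A": {"strategy": 4.5, "current_offering": 3.0},
    "Vendor B": {"strategy": 3.0, "current_offering": 4.5},
}

def weighted_rank(scores, weights):
    """Return the vendor with the highest weighted total."""
    totals = {
        vendor: sum(weights[c] * s for c, s in criteria.items())
        for vendor, criteria in scores.items()
    }
    return max(totals, key=totals.get)

# A vision-led buyer and a capability-led buyer reach different answers
# from the same underlying data.
print(weighted_rank(scores, {"strategy": 0.7, "current_offering": 0.3}))  # Vendor A
print(weighted_rank(scores, {"strategy": 0.3, "current_offering": 0.7}))  # Vendor B
```

A closed scoring model gives you only the vendor’s dot on a chart; an open one lets you re-run this calculation with your own weights.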
Second, I remember the moment of revelation when I realised there is a whole industry out there called Analyst Relations, that is, people employed by (big) vendors to influence the analysts. Users often don’t realise that’s how the insight market works.
Third, new approaches do emerge. I’d be interested in your take on Phil Fersht’s Blueprint methodology at Horses for Sources (HfS).
My own analysis of the insight market itself classifies providers in various dimensions. One of these looks at reach, both geographic and content: from global generalists (Gartner for example) through to niche (often start-ups – you yourself have progressed from niche to global specialist since you left Forrester). Perhaps tools like the Wave or MQ should have similar dimensions so that the innovative new providers can be properly assessed.
To add a couple more points. As a technology innovation researcher, I was always well aware that small start-ups often offered innovative options which larger vendors didn’t have or hadn’t got round to. But you took the risk that the start-up would fall apart, fail to deliver, or simply fail. Experimental technologies always carry risk and the options are tactical (innovation for shorter-term business benefit) not strategic. Gartner I’m sure would assert that innovation is handled by their Vision dimension in the MQ but, as Mike points out, there are thresholds and other elements which mean that these vendors don’t make it into MQs. HfS makes innovation explicit.
Second, in business-critical areas which are highly specific to your business area it’s unlikely that an insight provider will know as much as you do. Don’t automatically assume that a MQ or any other tool will deliver the right answer. Use the tools most certainly, but be prepared to reason your way to, argue for and adopt a solution which is at odds with what the tools say. You must of course be able to justify this, but the general answer may not be right for you.
• Gartner GRC Magic Quadrant Rant, Part 3, Mike Rasmussen, GRC Pundit, 23 Oct 2013
• The HfS Blueprint Methodology Explained, Jamie Snowden and others, HfS Research, Oct 2013
• GRC 20/20 research (Mike Rasmussen)
Business Process Improvement 17 Sep 2013 · Posted by Tony Law in Impact of IT, IT is business, ITasITis, Managing IT, Technorati, Uncategorized.
Working for GlaxoSmithKline IT, after the 2000 merger, developed my familiarity with business process improvement (small letters) and with Six Sigma methods and metrics. I would never call myself an expert. Routine training was to Green Belt level, without taking the qualifying exam, and I don’t have the instincts which make a leading practitioner able to pick the right tools to adopt for any specific need.
But it taught me a lot, which can be applied well beyond IT. First: as a previous CEO used to say, “If you don’t keep score, you’re only practising”. So, to drive and verify an improvement, you need metrics. But pick the right ones, which will show you where you are. Establish your baseline before you start doing anything. Use the metrics to demonstrate the change (you hope!). And when the improved process has reached the status of business-as-usual, you can probably drop the measure. It’s no longer needed.
Second: a saying that was drummed into us. “Don’t tinker!”. Don’t make changes on the basis of “I think …” without the analysis. Don’t over-react to one-off incidents: processes have variability, and some outliers will happen naturally.
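Both lessons – baseline first, then don’t tinker – can be sketched in a few lines. This is a minimal illustration of the control-limit idea (the data is invented; real Six Sigma practice involves proper control charts, not just a ±3-sigma band):

```python
import statistics

# Step 1: establish the baseline before changing anything.
baseline = [10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.0]
mean = statistics.mean(baseline)
sd = statistics.stdev(baseline)
lower, upper = mean - 3 * sd, mean + 3 * sd

def is_signal(value):
    """True only for points outside the control limits.
    Points inside are natural process variability: don't tinker."""
    return not (lower <= value <= upper)

print(is_signal(10.4))   # False: an ordinary outlier, leave the process alone
print(is_signal(13.5))   # True: investigate before acting
```

The discipline is in what the function refuses to flag: a single odd-looking point inside the limits is not evidence that anything changed.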
And third: develop and demonstrate your own (internal IT) understanding and improvements before you try to work with the rest of the business. IT has, perhaps, a unique overview of what goes on across the company, and is almost always a participant in any business improvement project. So there’s good leverage there: but you have to gain credibility first. It takes a lot to get to the point where, when a business leader asks for an IT development, you can say “Why? What improvement are you driving? Who will own it? How will you measure it?”
Well: tomorrow I’m facilitating a Corporate IT Forum event on Business Process Improvement (BPI). I’m expecting the twin threads of, first, identifying and improving IT’s own processes; and, second, putting that experience and expertise at the service of the business as a whole. Where are the sources of information and analysis?
Gartner have a Leaders Key Initiative on BPI. The overview, as recent as July this year, has a natty graphic showing the BPI practitioner as a juggler (operations, transformation, skills, technology and innovation) under pressure from both business and technology forces. They offer a number of tools for maturity assessment “across IT disciplines” (what about the rest-of-business?); key metrics (that’s IT spending and staffing, not how to measure a process); and best practices across several competencies. It seems, though, towards the end to lapse back into business process management (BPM) not BPI.
There isn’t a lot in the Gartner blogs, but a useful post from Samantha Searle earlier this year challenges us to avoid the word “Process” (unless your business-side colleagues are process engineers or in manufacturing). That gels with the observation that Gartner probably, under the covers, maintain an IT-oriented focus, because Process is very present in the key initiative!
Similarly I don’t find a great deal in Forrester specifically around BPI. But there’s a stronger focus on the interplay of IT expertise and whole-business improvement. A recent report, for example, discusses the shift from “a tactical process improvement charter” to a more strategic role across the enterprise. This requires a plan “for optimizing the BPM practice to deliver on new strategic drivers and business objectives”. That sounds more like it.
Interestingly, a search collected a link to Cambridge University which I expected to be to the business school or computer science. But it’s to their internal management services division with a one-page (one-slide, really) graphic and definition of BPI. Take a look. But the Judge Institute of Management Studies does indeed have a Centre for Process Excellence and Innovation, also worth reviewing.
There’s a lot of material you can find by searching. Too much to survey. Assess with care!
• Business Process Improvement Leaders Key Initiative Overview, Gartner, 25 Jul 2013 (search Gartner for ID:G00251230)
• 10 New Year Resolutions for BPM Practitioners #2: Don’t Mention the “P-word …, Samantha Searle, Gartner blogs, 8 Feb 2013
• Optimize Your Business Process Excellence Program To Meet Shifting Priorities, Clay Richardson, Forrester report, 6 Jun 2013
• Business Process Improvement, University of Cambridge, Management and Information Services Division (undated)
• Centre for Process Excellence and Innovation, Judge Institute, University of Cambridge
Overdue update: Gartner blog index 9 Aug 2013 · Posted by Tony Law in Insight services, ITasITis, Managing IT, Technorati.
I’ve finally done a full update on the Gartner Blogs index published on informationspan.com. There are three significant changes (as well as the normal turnover of analysts).
- Gartner have introduced three new areas within their Markets coverage (that is, the area for IT sales professionals): Digital Marketing; Servers & Storage – Comparative Hardware; and Servers & Storage – Competitive Positioning. The technical Servers and Storage area is unchanged.
- Digital Marketing has become the first area within Gartner’s Marketing area to offer blogs.
- the former Burton Group set of services, which has been marketed as Gartner IT1, now comes under the heading Gartner for Technical Professionals. There’s only one IT1 blog at the moment. But I’ve discovered that the legacy Burton blog content, which I had thought was deleted, is mostly still accessible. Their last content was posted in early 2010 but they may still have value.
As a result of this, I’ve made changes to the structure of the blog index.
- I’ve split the index of blogs by coverage area into two: one containing the technology-related blogs and the second the remainder which now are: Gartner Services and Management; those with the Vertical Industry focus; and a new section for Marketing.
- I’ve re-introduced a page linking to the legacy Burton Group blogs; one of them (Identity and Privacy) has completely disappeared but the others are still reachable.
Gartner Services and Management currently includes a handful of blogs from Executive Program advisors; one blog from the Supply Chain service (developed after the integration of AMR, of which nothing identifiable now remains); and a long-moribund but still accessible blog by my old META Group acquaintance Val Sribar, now a Gartner GVP.
I’ve also refreshed the list of blogs indexed by the custom Google Search of Gartner blogs, which appears on the lead page. Visit http://www.informationspan.com/analystblogs.htm.
Just to remind you: you can use this index for all sorts of functions Gartner don’t provide:
- go straight to your favourite analyst’s blog
- see whether a blog has (reasonably) recent content without having to visit it
- look for blogs on specific Gartner coverage areas
- find blogs which aren’t included in Gartner’s Blog Home page
- search specifically across the entire Gartner Blog space
Please tell me how you use this index, and how it might improve.