Tags: Big Data, Frost and Sullivan, Smart
I’m on a Frost and Sullivan webinar: Growth, Innovation and Leadership (GIL: a major Frost theme). It’s a half-hour panel to discuss successful types of innovation and examples of future innovative technologies with Roberta Gamble, Partner, Energy & Environmental Markets, and Jeff Cotrupe, Director, Stratecast. David Frigstad, Frost’s Chairman, is leading. The event recording will be available in due course.
Frigstad asserts that most industries are undergoing a cycle of disrupt, collapse, transform (or die: Disrupt or Die is an old theme of mine). We start with a concept called the Serendipity Innovation Engine. It’s based on tracking nine technology clusters; major trends; industry sectors; and the “application labs” undertaking development (which includes real labs and also standards bodies and others). And all of this is in the context of seven global challenges: education, security, environment, economic development, healthcare, infrastructure, and human rights.
Handover to Gamble. This is a thread on industry convergence in energy and environment, seen as a single sector. Urbanisation, and the growth of emerging economies, are major influences on demand growth here.
We do move to an IT element: innovation in smart homes and smart cities, with integration between sensor/actuator technology and social/cloud media. Emphasising this, Google has just bought a smart home company (Nest Labs). City CIOs and City Managers are mentioned as key people – a very US-centric view when most urbanisation is not occurring in the developed world … we do return to implications for developing economies, where the message is that foundations for Smart (which includes effective, clean energy use) should be laid now, while there is a relatively uncluttered base to start from.
Frigstad poses a question based on the idea that Big Data is one of the most disruptive trends in this market. Gamble suggests parking as an example. Apps to find a parking spot, based on data from road sensors or connected parking meters, are being piloted not only in San Francisco: similar developments in the UK were mentioned at a Corporate IT Forum event I supported earlier this year.
It’s a segue into the next section: an introduction for Cotrupe, whose field is Big Data and Analytics. Examples of disruption around here include the Google car: who would have thought Google would be an automotive manufacturer? Is your competitor someone you wouldn’t expect? An old question, of course. The UK’s canal companies competed with each other and perhaps with the turnpike roads; they mainly didn’t foresee the railways.
Cotrupe’s main question is: what is Big Data really? He posits it as an element of data management, together with Analytics and BI. I’d want to think about that equation; it’s not intuitively the right way round. But high-volume, rapidly moving data does have to be managed effectively for its benefit to be realised – delivering the data users need, when they need it, without overwhelming them. And this means near real-time. It’s IT plus Data Science.
Frost suggest they are more conservative than some analysts, because they see growth of the Big Data market held back by the sheer cost of large-scale facilities.
We’re on the promised half hour for the primary conversations, but still going strong, basically talking with Cotrupe about various industry sectors where Big Data has potential: to support, for example, a move from branch based banking to personal service in an online environment. There’s some discussion of Big Data in government: how will this affect the style of government in perhaps the next 20 years? Cotrupe mentions a transformation in the speed of US immigration in recent years, where data is pre-fetched and the process takes minutes instead of hours. He’s advocating opening up, sharing of information: in other industries too, for example not being frozen by HIPAA requirements in (US) healthcare or, perhaps, EU data protection requirements. I have personal experience of obstructive customer service people trying to hide behind those, and in fact parading their lack of actual knowledge.
Cotrupe talks about privacy, not least in the wake of Snowden and what’s been learned about sharing between the NSA and the UK agencies. Cotrupe would like to see this ease of sharing brought to bear in other areas: but asks how we manage privacy here? There are companies which are leading the way in collecting data in consumer-sensitive ways, and this needs to become standard practice. In any case, not collecting data you don’t need will reduce your data centre (should that be Data Center?) footprint.
As we come to a close, with a commercial for the September event in Silicon Valley, I have to say I’m not convinced this webinar was wholly coherent.
If you call something a Serendipity Innovation Engine I want to know how it relates to serendipity: that is, the chance identification of novel discoveries.
If you present a layered model, I expect the layers to relate (probably hierarchically) to one another. It would be more valuable to talk about the four elements of this model separately and be clearer about what each represents. For example, “Health and Wellness” occurs as a Technology Cluster (why?). It’s also a Mega Trend in a layer where Social Trends also sits; surely people’s concern about Health and Wellness is a social trend? Each layer seems to mix social, technical and other concerns.
I learned a more useful framework when teaching the OU’s Personal Development course. This really is layered. The two innermost layers (this is for personal development) are your immediate environment, and other elements of your working organisation. Then Zone 3 (near external) encompasses competitors, customers/clients, suppliers and local influences. Zone 4 (far external) includes national and international influences: social, technological, economic, environmental and political (STEEP). On this framework you can chart all the changes discussed in today’s webinar and, I think, more easily draw conclusions!
• Frost & Sullivan Growth Innovation & Leadership
• Google buys Nest Labs for $3.2bn …, The Guardian, 13 Jan 2014
• STEEP framework: Sheila Tyler, The Manager’s Good Study Guide (third edition, 2007). The Open University. Pages 198-202
Heartbleed update 15 Apr 2014. Posted by Tony Law in Impact of IT, ITasITis, Managing IT, Tech Watch, Technorati, Uncategorized.
Tags: Cisco, Heartbleed, security
A quick follow up, back from a few days away.
Huffington Post have a recent update which notes that the Open SSL vulnerability applies in major products from Cisco and Juniper Networks. They also repeat what’s becoming the consensus on passwords: change your passwords for services which you know were vulnerable but have now been patched. There’s no point in changing a password which might still be at risk.
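For reference, the vulnerable range is well defined: OpenSSL 1.0.1 through 1.0.1f, with the fix arriving in 1.0.1g (the 0.9.8 and 1.0.0 series were never affected). A minimal sketch of classifying a reported version string against that range (the example strings are just illustrations):

```python
# Sketch: is a reported OpenSSL version in the Heartbleed-vulnerable
# range 1.0.1 .. 1.0.1f? (Fixed in 1.0.1g; other series unaffected.)
import re

def heartbleed_vulnerable(version: str) -> bool:
    """Return True if the version string falls in 1.0.1 .. 1.0.1f."""
    m = re.match(r"OpenSSL (\d+)\.(\d+)\.(\d+)([a-z]?)", version)
    if not m:
        raise ValueError("unrecognised version string: " + version)
    major, minor, patch, letter = int(m[1]), int(m[2]), int(m[3]), m[4]
    if (major, minor, patch) != (1, 0, 1):
        return False          # only the 1.0.1 series was affected
    return letter < "g"       # plain 1.0.1 up to 1.0.1f; 1.0.1g is fixed

print(heartbleed_vulnerable("OpenSSL 1.0.1f 6 Jan 2014"))  # True
print(heartbleed_vulnerable("OpenSSL 1.0.1g 7 Apr 2014"))  # False
print(heartbleed_vulnerable("OpenSSL 0.9.8y"))             # False
```

Of course, a patched build may still report an old version string if the fix was backported by the distribution, so treat this as a first check only.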
They reference the Mashable resource on what’s been patched and copy the patchable list: Google (and Gmail), Yahoo (and Yahoo Mail), Facebook, Pinterest, Instagram, Tumblr, Etsy, GoDaddy, Intuit, USAA, Box, Dropbox, GitHub, IFTTT, Minecraft, OKCupid, SoundCloud and Wunderlist. A quick look, though, suggests that the Mashable article was a one-off and the list is not being kept updated.
The article also recommends turning off external access to your home network: the sort of capability, for example, that you might use for remote access through LogMeIn, TeamViewer or similar. If you’re not using this kind of facility, disable it. Your firewall should already be holding the line on this.
And check what your Internet provider is doing and the status of your wireless router. Being a BT user, with a BT Home Hub, I tried searching the bt.com website for information on Heartbleed but nothing surfaced. It would be nice to know.
Huffington suggests that, at the moment, public WiFi has to be treated as an unknown quantity since you can’t tell what infrastructure they use or whether it’s been patched. BT again doesn’t have any information on the impact of Heartbleed on BT Wifi (Openzone, as was) but it does say that user details are encrypted when you log in to their service. It’s perhaps ironic that they offer free Cisco VPN software, which you can download when connected to one of their hotspots. I didn’t know this. I’ll take it up for my laptop.
I also have an O2 Wifi locator app on my phone. There’s nothing about security on their website. Anyone with other Wifi-finder apps? Please check their sites and post a comment here about what you find.
• The Heartbleed Bug Goes Even Deeper Than We Realized – Here’s What You Should Do, Alexis Kleinman, The Huffington Post, 11 Apr 2014
• Security when using BT’s Wi-fi hotspots, BTWifi.com, with link to the Cisco offer
• The Heartbleed Hit List, Mashable, 9 Apr 2014
• What to make of Heartbleed? ITasITis, 4 Apr 2014
Insight providers and market evaluation 6 Nov 2013. Posted by Tony Law in Impact of IT, Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
This is a slightly extended version of a response on LinkedIn to Michael Rasmussen, who has published some thoughts (“a rant”) about Gartner’s Magic Quadrant.
MQ is a highly influential and long established analyst tool. As an insight services user in enterprise IT, I made use of MQs regularly and would also review similar tools such as Forrester’s Wave when a purchasing decision was being made. Like anything else, it’s essential to know just what a tool like this is, how it’s created and what it does and does not convey. The same is true of Gartner’s Hype Cycle, as I’ve commented elsewhere.
Michael highlights several concerns about Gartner’s recently updated MQ in his own area of considerable expertise, that is, global risk and compliance (GRC). Do read his original, which I won’t attempt to summarise; see the link below. Here’s my response.
Michael, having read the whole post in your blog, a couple of comments from a user’s perspective. First: I wholly agree that the Forrester Wave’s value lies in the open availability both of the evaluation criteria and of the base data; it would be fantastic to see the same from Gartner. This isn’t just an issue of general openness: since a user can adjust the weightings on the Forrester evaluations, it becomes a much more practical tool.
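As an aside on why adjustable weightings matter: the same criterion scores can rank vendors differently once a buyer weights the criteria to match their own priorities. A toy sketch (vendor names, criteria and scores all invented for illustration):

```python
# Toy illustration: the same raw scores, ranked under the analyst's
# equal weights and under a buyer's own weights. All data invented.
criteria_scores = {
    "VendorA": {"strategy": 4.5, "current_offering": 3.0, "presence": 5.0},
    "VendorB": {"strategy": 3.5, "current_offering": 4.5, "presence": 2.5},
}

def weighted_score(scores, weights):
    """Weighted average of criterion scores, normalised by total weight."""
    total = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total

analyst_weights = {"strategy": 1, "current_offering": 1, "presence": 1}
# A buyer who cares most about what the product does today:
buyer_weights = {"strategy": 1, "current_offering": 3, "presence": 0.5}

for vendor, scores in criteria_scores.items():
    print(vendor,
          round(weighted_score(scores, analyst_weights), 2),
          round(weighted_score(scores, buyer_weights), 2))
# VendorA leads on equal weights (4.17 vs 3.5), but VendorB leads
# once the buyer's weights are applied (4.06 vs 3.56).
```

That re-ranking is exactly what a closed evaluation hides from you.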
Second, I remember the moment of revelation when I realised there is a whole industry out there called Analyst Relations, that is, people employed by (big) vendors to influence the analysts. Users often don’t realise that’s how the insight market works.
Third, new approaches do emerge. I’d be interested in your take on Phil Fersht’s Blueprint methodology at Horses for Sources (HfS).
My own analysis of the insight market itself classifies providers in various dimensions. One of these looks at reach, both geographic and content: from global generalists (Gartner for example) through to niche (often start-ups – you yourself have progressed from niche to global specialist since you left Forrester). Perhaps tools like the Wave or MQ should have similar dimensions so that the innovative new providers can be properly assessed.
To add a couple more points. As a technology innovation researcher, I was always well aware that small start-ups often offered innovative options which larger vendors didn’t have or hadn’t got round to. But you took the risk of the start-up falling apart, failing to deliver, or simply going under. Experimental technologies always carry risk, and the options are tactical (innovation for shorter-term business benefit) not strategic. Gartner, I’m sure, would assert that innovation is handled by the Vision dimension of the MQ but, as Mike points out, there are thresholds and other criteria which mean that these smaller vendors don’t make it into MQs at all. HfS makes innovation explicit.
Second, in business-critical areas which are highly specific to your business area it’s unlikely that an insight provider will know as much as you do. Don’t automatically assume that a MQ or any other tool will deliver the right answer. Use the tools most certainly, but be prepared to reason your way to, argue for and adopt a solution which is at odds with what the tools say. You must of course be able to justify this, but the general answer may not be right for you.
• Gartner GRC Magic Quadrant Rant, Part 3, Mike Rasmussen, GRC Pundit, 23 Oct 2013
• The HfS Blueprint Methodology Explained, Jamie Snowden and others, HfS Research, Oct 2013
• GRC 20/20 research (Mike Rasmussen)
Business Process Improvement 17 Sep 2013. Posted by Tony Law in Impact of IT, IT is business, ITasITis, Managing IT, Technorati, Uncategorized.
Working for GlaxoSmithKline IT, after the 2000 merger, developed my familiarity with business process improvement (small letters) and with Six Sigma methods and metrics. I would never call myself an expert. Routine training was to Green Belt level, without taking the qualifying exam, and I don’t have the instincts which make a leading practitioner able to pick the right tools to adopt for any specific need.
But it taught me a lot, which can be applied well beyond IT. First: as a previous CEO used to say, “If you don’t keep score, you’re only practising”. So, to drive and verify an improvement, you need metrics. But pick the right ones, which will show you where you are. Establish your baseline before you start doing anything. Use the metrics to demonstrate the change (you hope!). And when the improved process has reached the status of business-as-usual, you can probably drop the measure. It’s no longer needed.
Second: a saying that was drummed into us. “Don’t tinker!”. Don’t make changes on the basis of “I think …” without the analysis. Don’t over-react to one-off incidents: processes have variability, and some outliers will happen naturally.
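The “don’t tinker” discipline has a simple statistical form: establish the natural variation of the process, and only investigate points that fall outside it. A sketch of the classic 3-sigma control check, with invented baseline data (daily incident counts are just an example metric):

```python
# Sketch (invented data): a 3-sigma control check, the kind of test
# that distinguishes natural process variation from a genuine signal.
import statistics

daily_incident_counts = [42, 38, 45, 40, 44, 39, 41, 43, 37, 46]  # baseline

mean = statistics.mean(daily_incident_counts)      # 41.5
sigma = statistics.stdev(daily_incident_counts)    # ~3.03
upper, lower = mean + 3 * sigma, mean - 3 * sigma  # ~[32.4, 50.6]

def out_of_control(value):
    """Only values outside the 3-sigma band warrant investigation."""
    return not (lower <= value <= upper)

print(out_of_control(48))  # False: within natural variation, don't tinker
print(out_of_control(70))  # True: a genuine outlier, investigate
```

A one-off reading of 48 here is noise; reacting to it is exactly the tinkering the trainers warned against.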
And third: develop and demonstrate your own (internal IT) understanding and improvements before you try to work with the rest of the business. IT has, perhaps, a unique overview of what goes on across the company, and is almost always a participant in any business improvement project. So there’s good leverage there: but you have to gain credibility first. It takes a lot to get to the point where, when a business leader asks for an IT development, you can say “Why? What improvement are you driving? Who will own it? How will you measure it?”
Well: tomorrow I’m facilitating a Corporate IT Forum event on Business Process Improvement (BPI). I’m expecting the twin threads of, first, identifying and improving IT’s own processes; and, second, putting that experience and expertise at the service of the business as a whole. Where are the sources of information and analysis?
Gartner have a Leaders Key Initiative on BPI. The overview, as recent as July this year, has a natty graphic showing the BPI practitioner as a juggler (operations, transformation, skills, technology and innovation) under pressure from both business and technology forces. They offer a number of tools for maturity assessment “across IT disciplines” (what about the rest-of-business?); key metrics (that’s IT spending and staffing, not how to measure a process); and best practices across several competencies. It seems, though, towards the end to lapse back into business process management (BPM) not BPI.
There isn’t a lot in the Gartner blogs, but a useful post from Samantha Searle earlier this year challenges us to avoid the word “Process” (unless your business-side colleagues are process engineers or in manufacturing). That rather gels with the observation that Gartner probably maintain, under the covers, an IT-oriented focus: Process is very much present in the Key Initiative!
Similarly I don’t find a great deal in Forrester specifically around BPI. But there’s a stronger focus on the interplay of IT expertise and whole-business improvement. A recent report, for example, discusses the shift from “a tactical process improvement charter” to a more strategic role across the enterprise. This requires a plan “for optimizing the BPM practice to deliver on new strategic drivers and business objectives”. That sounds more like it.
Interestingly, a search turned up a link to Cambridge University which I expected to point to the business school or computer science. But it’s to their internal management services division, with a one-page (one-slide, really) graphic and definition of BPI. Take a look. And the Judge Institute of Management Studies does indeed have a Centre for Process Excellence and Innovation, also worth reviewing.
There’s a lot of material you can find by searching. Too much to survey. Assess with care!
• Business Process Improvement Leaders Key Initiative Overview, Gartner, 25 Jul 2013 (search Gartner for ID:G00251230)
• 10 New Year Resolutions for BPM Practitioners #2: Don’t Mention the “P-word …, Samantha Searle, Gartner blogs, 8 Feb 2013
• Optimize Your Business Process Excellence Program To Meet Shifting Priorities, Clay Richardson, Forrester report, 6 Jun 2013
• Business Process Improvement, University of Cambridge, Management and Information Services Division (undated)
• Centre for Process Excellence and Innovation, Judge Institute, University of Cambridge
Overdue update: Gartner blog index 9 Aug 2013. Posted by Tony Law in Insight services, ITasITis, Managing IT, Technorati.
I’ve finally done a full update on the Gartner Blogs index published on informationspan.com. There are three significant changes (as well as the normal turnover of analysts).
- Gartner have introduced three new areas within their Markets coverage (that is, the area for IT sales professionals): Digital Marketing; Servers & Storage – Comparative Hardware; and Servers & Storage – Competitive Positioning. The technical Servers and Storage area is unchanged.
- Digital Marketing has become the first area within Gartner’s Marketing area to offer blogs.
- the former Burton Group group of services, which has been marketed as Gartner IT1, now comes under the heading Gartner for Technical Professionals. There’s only one IT1 blog at the moment. But I’ve discovered that the legacy Burton blog content, which I had thought was deleted, is mostly still accessible. Their last content was posted in early 2010 but they may still have value.
As a result of this, I’ve made changes to the structure of the blog index.
- I’ve split the index of blogs by coverage area into two: one containing the technology-related blogs and the second the remainder which now are: Gartner Services and Management; those with the Vertical Industry focus; and a new section for Marketing.
- I’ve re-introduced a page linking to the legacy Burton Group blogs; one of them (Identity and Privacy) has completely disappeared but the others are still reachable.
Gartner Services and Management currently includes a handful of blogs from Executive Program advisors; one blog from the Supply Chain service (developed after the integration of AMR, of which nothing identifiable now remains); and a long-moribund but still accessible blog by my old META Group acquaintance Val Sribar, now a Gartner GVP.
I’ve also refreshed the list of blogs indexed by the custom Google Search of Gartner blogs, which appears on the lead page. Visit http://www.informationspan.com/analystblogs.htm.
Just to remind you: you can use this index for all sorts of functions Gartner don’t provide:
- go straight to your favourite analyst’s blog
- see whether a blog has (reasonably) recent content without having to visit it
- look for blogs on specific Gartner coverage areas
- find blogs which aren’t included in Gartner’s Blog Home page
- search specifically across the entire Gartner Blog space
Please tell me how you use this index, and how it might improve.
Enterprise grade public cloud: IDC’s take 19 Jun 2013. Posted by Tony Law in Cloud, Consumerization, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
I’m on an AT&T webcast relating to public cloud infrastructure and its growth. Allow that this is primarily a US-focussed perspective. It’s AT&T sponsored, but delivered by IDC. It’s being recorded, and I’ll add the URL when it’s available.
Much of the underlying data comes from IDC’s winter 2012 CloudTrack Survey, with around 500 respondents. Five elements: the pace of change; deployment; networking; workloads; and next-generation solutions.
IDC refer to the “third platform”, not just the second platform, with spend growing nearly 12% per year compared to less than 1% for the second platform. The third platform will account for almost 25% of this combined spend by 2020, and in the next three years spend on external services will grow to around an eighth of “traditional” IT spend. Over three quarters of North American companies are already using public cloud services.
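A quick sanity check on those growth rates: even a modest differential compounds into a large share shift. The starting spend figures below are invented for illustration (a 15% third-platform share), not IDC’s numbers:

```python
# Rough arithmetic sketch: how a ~12%/yr vs ~1%/yr growth differential
# shifts spend share over seven years. Starting figures are invented.
third, second = 150.0, 850.0   # hypothetical spend units, ~15% share
for year in range(7):          # e.g. 2013 -> 2020
    third *= 1.12              # third platform: ~12% annual growth
    second *= 1.01             # second platform: ~1% annual growth

share = third / (third + second)
print(round(share, 3))  # 0.267 -- the share nearly doubles
```

On those assumed starting points, the third platform ends up at roughly a quarter of combined spend, which is in the same territory as IDC’s 25%-by-2020 claim.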
There’s a useful categorisation of cloud deployment models, with names that speak for themselves. Self-run private or managed private; dedicated (externally) hosted or virtual private cloud; or public. Running across these are the decisions about on- or off-site, and dedicated or shared infrastructure. That eighth of spend shift over the next three years depends on these decisions.
Virtual private cloud (VPC) has clout: it offers additional security and control, better connectivity into corporate networks, and more controlled SLAs, but is built on public cloud infrastructure. AT&T believe shared services will command the lion’s share of the developing spend, although the split between dedicated and shared is more equal right now. This is what AT&T imply by “enterprise grade public cloud”.
Connectivity is crucial (remember, AT&T is a network company …) and there is an opportunity to connect VPC through an MPLS (multi-protocol label switching) high-availability cloud network rather than the public internet. Integration to the corporate network is close to seamless. IDC believe this option overcomes many enterprise objections to VPC cloud usage. And the CloudTrack survey suggests that any major workload coming up for reinvestment is at least going to be considered for cloud migration.
Noticeably, the workloads most likely to be moved are about the key elements of the “third platform”: social, big data (and analytics) and mobile. Where relevant, emerging markets also make a strong contribution to the importance of the third platform. Enterprises will need competencies across cloud and all these; they may not be tagged as cloud initiatives, but in these spaces cloud is crucial for developments to be effective, and those developments will be combinations of the four technology spaces. There’s a graphic for this; look in the webcast when it’s online (I’ll add the URL when it’s available).
On the half hour. Transition from the IDC analyst (Frank Gens, Senior Vice President and Chief Analyst) to Amy Machi, AT&T representative. This is a sales pitch for the combination of IBM’s Smart Cloud solution and AT&T’s VPN (NetBond), so you’ll get fewer notes here. But with so much discussion about the limitations of service agreements with providers, it’s interesting that IBM trail over 70 auditable automated tasks available to clients, and cloud-based ITIL processes. An important point, too, is that AT&T will scale network capability in line with the demands on the scaleable cloud resource being claimed at IBM’s end of the wire. For anyone looking seriously at this version of the Cloud option, several case studies show the variation in possibilities.
Note, too, that at present this is a US service and users need to be AT&T customers. It will extend to Europe and Asia/Pacific relatively soon.
So: in response to questions, Frank Gens believes that investment in new capabilities will swamp legacy migration onto the third platform. And IT managers (VP/SVP) are coming to accept that a reputable cloud service provider has security at least as good as their own and possibly better; but the network has remained a vulnerability. With a managed MPLS network, rather than public infrastructure, these concerns are mitigated.
Benchmarking: sources 17 Apr 2013. Posted by Tony Law in Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
Tomorrow I’m facilitating a Corporate IT Forum discussion on twenty-first-century benchmarking. It’s a wide topic. This post is a set of links and some comments, based on the InformationSpan database of 700 research and analyst firms. But I’m always grateful for updates: please comment!
The Forum itself operates a benchmarking service for clients, so there’s a declaration of interest to make but I am not myself a member of it. Primarily this is crowd sourced: it invites members to contribute their own data, and to compare themselves against their peers.
• Computer Economics provides a range of benchmarking data, not all financial. I’d consider it a primary source and worth a subscription. It provides a wide range of data. Major studies include IT Spending and Staffing Benchmarks and Worldwide Technology Trends. Their Management Advisories look at ROI and TCO, Risk Management and other topics. Too many to list here. Take a look for yourself.
• InterUnity Group “provides leading companies with strategy, competitive intelligence, and benchmarking to improve business performance.” It’s not clear what areas of benchmarking are actually covered, or whether the focus is primarily financial.
• The component services of the Corporate Executive Board will be worth investigating. Using the Researched Sharing model for content, CEB services such as the CIO Executive Board link and correlate information and tools from clients.
• Ventana Research undertakes benchmark research as one of its primary activities, drawing information from its own community, social media and the company’s “media partners”.
• The Data Warehousing Institute undertakes benchmarking in its key area, primarily business intelligence. They publish an annual BI Benchmark Report.
This is a rapid post in advance of the event. Look for a wider-ranging Coverage Report from InformationSpan when I’ve time to develop the theme.