Insight providers and market evaluation 6 Nov 2013Posted by Tony Law in Impact of IT, Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
This is a slightly extended version of a response in LinkedIn to Michael Rasmussen, who has published some thoughts (“a rant”, he calls it) about Gartner’s Magic Quadrant (MQ).
The MQ is a highly influential and long-established analyst tool. As an insight services user in enterprise IT, I used MQs regularly and would also review similar tools, such as Forrester’s Wave, when a purchasing decision was being made. Like anything else, it’s essential to know just what a tool like this is, how it’s created, and what it does and does not convey. The same is true of Gartner’s Hype Cycle, as I’ve commented elsewhere.
Michael highlights several concerns about Gartner’s recently updated MQ in his own area of considerable expertise, that is, global risk and compliance (GRC). Do read his original, which I won’t attempt to summarise; see the link below. Here’s my response.
Michael, having read the whole post in your blog, a couple of comments from a user’s perspective. First: I wholly agree that the Wave’s value lies in the open availability of both the evaluation criteria and the base data; it would be fantastic to see the same from Gartner. This isn’t just an issue of general openness: since a user can adjust the weightings on the Forrester evaluations, it becomes a much more practical tool.
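Why do adjustable weightings matter so much? A tiny sketch makes the point. The vendor names, criteria and scores below are entirely invented, not taken from any real Wave or MQ; the point is only that re-weighting the same published scores can change the ranking.

```python
# Invented per-criterion scores (0-5) for two hypothetical vendors.
scores = {
    "Vendor A": {"current offering": 4.5, "strategy": 3.0, "market presence": 4.0},
    "Vendor B": {"current offering": 3.0, "strategy": 4.8, "market presence": 4.6},
}

def weighted_total(vendor_scores, weights):
    """Combine per-criterion scores using the buyer's own weightings."""
    total_weight = sum(weights.values())
    return sum(vendor_scores[c] * w for c, w in weights.items()) / total_weight

# A published weighting might balance the criteria like this...
default = {"current offering": 0.4, "strategy": 0.3, "market presence": 0.3}
# ...but a buyer who mostly cares about today's capability can re-weight:
my_weights = {"current offering": 0.7, "strategy": 0.2, "market presence": 0.1}

for name, s in scores.items():
    print(name,
          round(weighted_total(s, default), 2),
          round(weighted_total(s, my_weights), 2))
```

With the default weighting Vendor B comes out ahead; with the buyer’s own weighting Vendor A does. A static chart can only ever show one of those answers, which is exactly why open criteria and base data make the tool more practical.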
Second, I remember the moment of revelation when I realised there is a whole industry out there called Analyst Relations, that is, people employed by (big) vendors to influence the analysts. Users often don’t realise that’s how the insight market works.
Third, new approaches do emerge. I’d be interested in your take on Phil Fersht’s Blueprint methodology at Horses for Sources (HfS).
My own analysis of the insight market itself classifies providers in various dimensions. One of these looks at reach, both geographic and content: from global generalists (Gartner for example) through to niche (often start-ups – you yourself have progressed from niche to global specialist since you left Forrester). Perhaps tools like the Wave or MQ should have similar dimensions so that the innovative new providers can be properly assessed.
To add a couple more points. As a technology innovation researcher, I was always well aware that small start-ups often offered innovative options which larger vendors didn’t have or hadn’t got round to. But you took the risk of the start-up falling apart, failing to deliver, or simply going under. Experimental technologies always carry risk, and the options are tactical (innovation for shorter-term business benefit) rather than strategic. Gartner, I’m sure, would assert that innovation is handled by the Vision dimension of the MQ but, as Mike points out, there are thresholds and other criteria which mean these vendors don’t make it into MQs at all. HfS makes innovation explicit.
Second, in business-critical areas highly specific to your own organisation, it’s unlikely that an insight provider will know as much as you do. Don’t automatically assume that an MQ or any other tool will deliver the right answer. Use the tools, most certainly, but be prepared to reason your way to, argue for, and adopt a solution which is at odds with what the tools say. You must of course be able to justify this, but the general answer may not be right for you.
• Gartner GRC Magic Quadrant Rant, Part 3, Mike Rasmussen, GRC Pundit, 23 Oct 2013
• The HfS Blueprint Methodology Explained, Jamie Snowden and others, HfS Research, Oct 2013
• GRC 20/20 research (Mike Rasmussen)
Even lightweight articles can mislead … 19 Oct 2013Posted by Tony Law in Impact of IT, Insight services, ITasITis, Tech Watch, Technorati.
Today’s inbox flags a short report in TechRepublic by Erik Eckel looking (yet again) at the Total Cost of Ownership differential between a current iMac and a midrange Windows PC (Windows 7, not Windows 8, by the way). This is something you can argue about for ever, and I’m not joining that debate.
I read it, though, since I’m a Mac user myself and like to see where these arguments are going. And my approach is always to dig behind the presentation and go back to original sources.
Well, there’s an authoritative reference here. The writer quotes Gartner to the effect that “in June, Gartner predicted that iOS/OS X will soon surpass Windows as the most popular computer platform”.
There were several issues with this reference. First and most obviously: the text carried an active link which, I assumed, would lead to a Gartner press release or to an authoritative report of Gartner’s research. No such thing. It was to a prior report in another trade publication, MacWorld. Not only that: it wasn’t a link to the article itself, only to the top-level MacWorld front page. No use at all for finding an article written back in June, when Gartner’s report came out. More on this below.
Second: the quoted text says “iOS/OS X”. The Gartner figures combine iPhones and iPads with Macs, and balance this by including Windows smartphones, which by most accounts are not the most successful technology. But the TechRepublic article does not discuss smartphones and tablets; it’s about the TCO of business desktop computers. Data should be restricted to what’s relevant.
Third: yes, the MacWorld article does report Gartner as predicting iOS/OS X overtaking Windows – by 2015. But the conversation then splits.
MacWorld itself goes on to discuss the success of Android: “sales of devices based on Google’s Android operating system [will] beat the combined sales of Apple and Windows products”. When? – this year. This is from the Gartner research.
But there’s the other follow-on. A quick search for other reports of the Gartner work turns up the expectation that Windows will bounce back and “pull away again” by 2017 – see PC World, for example. It’s worth noting that the two articles (MacWorld and PC World) are by the same IDG News Service reporter, Martyn Williams, but spun differently for the Mac and Windows audiences. As are other reports in, for example, Computerworld. Everyone’s syndicated the same piece, near enough.
Next, who are the Gartner analysts? Carolina Milanesi, the Gartner analyst quoted by Williams, works in Gartner’s Mobile and Wireless area (not mainstream PCs) and contributes to Gartner’s regular client webinars for device market forecasting, in tandem with Ranjit Atwal who covers PCs, laptops and handheld devices.
I haven’t been able to see the original Gartner research; it’s in a client report. The original reporting is from June this year: a Gartner press release from that date quotes figures up to 2014 only, and on that timescale they predict Windows remains ahead of the combination of OS X and iOS. Credit to one report by Blair Hanley Frank in GeekWire which links to the press release directly, instead of relying on someone else’s reporting. The problem with using Gartner research is that most of it is priced to limit it, in effect, to paying clients: so you have to rely on press releases, on free research which they release (and yes, it does exist), and on reports by others who’ve been at events such as Symposium. Oh yes, and on blogs: don’t forget that if you find Gartner’s Blog Network impenetrable then InformationSpan offers a full index by either analyst name or coverage area, and a search too. The link’s in the side bar here.
So using Gartner research requires a little digging, but it isn’t that hard. It’s too easy to uncritically use someone else’s reporting to support a case you’re trying to make. I’ve got nothing against persuasive writing, but this case study shows the importance of (a) going back to original sources; (b) being critical about the sources you cite; and (c) looking more widely than the reference you first thought of!
• iMac vs. a comparable Windows box: The TCO lowdown, Erik Eckel, TechRepublic, 15 Oct 2013
• The MacWorld article referenced by Eckel is: Apple devices ‘to overtake Windows by 2015′, Martyn Williams, MacWorld, 15 Jun 2013
• Alternative report: Apple OSes to narrow gap with Windows, says Gartner, Martyn Williams, PC World, 24 Jun 2013
• Android vs. Apple vs. Windows: Forecast shows shift for PCs, tablets, devices, Blair Hanley Frank, GeekWire, 24 Jun 2013
• Gartner Says Worldwide PC, Tablet and Mobile Phone Shipments to Grow 5.9 Percent in 2013 …, Gartner press release, 24 June 2013
• Gartner analysts: Carolina Milanesi and Ranjit Atwal
Enterprise grade public cloud: IDC’s take 19 Jun 2013Posted by Tony Law in Cloud, Consumerization, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
I’m on an AT&T webcast relating to public cloud infrastructure and its growth. Allow that this is primarily a US-focussed perspective. It’s AT&T sponsored, but delivered by IDC. It’s being recorded, and I’ll add the URL when it’s available.
Much of the underlying data comes from IDC’s winter 2012 CloudTrack Survey, with around 500 respondents. Five elements: the pace of change; deployment; networking; workloads; and next-generation solutions.
IDC refer to the “third platform” as distinct from the second, with spend growing nearly 12% per year compared to less than 1% for the second platform. The third platform will account for almost 25% of this combined spend by 2020, and in the next three years spend on external services will grow to around an eighth of “traditional” IT spend. Over three quarters of North American companies are already using public cloud services.
There’s a useful categorisation of cloud deployment models, with names that speak for themselves. Self-run private or managed private; dedicated (externally) hosted or virtual private cloud; or public. Running across these are the decisions about on- or off-site, and dedicated or shared infrastructure. That eighth of spend shift over the next three years depends on these decisions.
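The grid can be sketched quickly. The five model names come from the webcast; the site and infrastructure attributes attached to each are my own reading of where each model sits on the two cross-cutting decisions, not IDC’s published mapping (managed private, in particular, can arguably sit on either side of the site decision).

```python
# Sketch of the deployment-model grid described above.
deployment_models = {
    "self-run private":      {"site": "on-site",  "infrastructure": "dedicated"},
    "managed private":       {"site": "on-site",  "infrastructure": "dedicated"},
    "dedicated hosted":      {"site": "off-site", "infrastructure": "dedicated"},
    "virtual private cloud": {"site": "off-site", "infrastructure": "shared"},
    "public":                {"site": "off-site", "infrastructure": "shared"},
}

def models_matching(site=None, infrastructure=None):
    """List model names matching the given decisions (None = don't care)."""
    return [name for name, attrs in deployment_models.items()
            if (site is None or attrs["site"] == site)
            and (infrastructure is None or attrs["infrastructure"] == infrastructure)]

# The shared-infrastructure options AT&T expect to dominate spend:
print(models_matching(infrastructure="shared"))
```

The point of laying it out this way is that the on/off-site and dedicated/shared decisions drive the spend shift, with the model names following from the decisions rather than the other way round.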
Virtual private cloud (VPC) has clout: it offers additional security and control, better connectivity into corporate networks, and more controlled SLAs, but is built on public cloud infrastructure. AT&T believe shared services will command the lion’s share of the developing spend, although the split between dedicated and shared is more equal right now. This is what AT&T mean by “enterprise grade public cloud”.
Connectivity is crucial (remember, AT&T is a network company …) and there is an opportunity to connect VPC through an MPLS (multi-protocol label switching) high-availability cloud network rather than the public internet. Integration to the corporate network is close to seamless. IDC believe this option overcomes many enterprise objections to VPC cloud usage. And the CloudTrack survey suggests that any major workload coming up for reinvestment is at least going to be considered for cloud migration.
Noticeably, the workloads most likely to be moved are about the key elements of the “third platform”: social, big data (and analytics) and mobile. Where relevant, emerging markets also make a strong contribution to the importance of the third platform. Enterprises will need competencies across cloud and all these; they may not be tagged as cloud initiatives, but in these spaces cloud is crucial for developments to be effective, and those developments will be combinations of the four technology spaces. There’s a graphic for this; look in the webcast when it’s online (I’ll add the URL when it’s available).
On the half hour. Transition from the IDC analyst (Frank Gens, Senior Vice President and Chief Analyst) to Amy Machi, AT&T representative. This is a sales pitch for the combination of IBM’s Smart Cloud solution and AT&T’s VPN (NetBond), so you’ll get fewer notes. But with so much discussion about the limitations of service agreements with providers, it’s interesting that IBM trails over 70 auditable automated tasks available to clients, and cloud-based ITIL processes. Also, an important point: AT&T will scale network capability in line with the demands on the scaleable cloud resource being claimed at IBM’s end of the wire. For anyone looking seriously at this version of the cloud option, several case studies show the variation in possibilities.
Note, too, that at present this is a US-only service and users need to be AT&T customers. It will extend to Europe and Asia/Pacific relatively soon.
So: in response to questions, Frank Gens believes that investment in new capabilities will swamp legacy migration onto the third platform. And IT managers (VP/SVP) are coming to accept that a reputable cloud service provider has security at least as good as their own, and possibly better; but the network has remained a vulnerability. With a managed MPLS network, rather than public infrastructure, these concerns are mitigated.
Benchmarking: sources 17 Apr 2013Posted by Tony Law in Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
I’m facilitating tomorrow a Corporate IT Forum discussion on twenty-first century benchmarking. It’s a wide topic. This post is a set of links and some comments, based on the InformationSpan database of 700 research and analyst firms. But I’m always grateful for updates: please comment!
The Forum itself operates a benchmarking service for clients, so there’s a declaration of interest to make (though I am not myself a member). Primarily this is crowd-sourced: it invites members to contribute their own data, and to compare themselves against their peers.
• Computer Economics provides a wide range of benchmarking data, not all of it financial. I’d consider it a primary source, worth a subscription. Major studies include IT Spending and Staffing Benchmarks and Worldwide Technology Trends; their Management Advisories look at ROI and TCO, Risk Management and other topics. Too many to list here: take a look for yourself.
• InterUnity Group “provides leading companies with strategy, competitive intelligence, and benchmarking to improve business performance.” It’s not clear what areas of benchmarking are actually covered, or whether the focus is primarily financial.
• The component services of the Corporate Executive Board will be worth investigating. Using the Researched Sharing model for content, CEB services such as the CIO Executive Board link and correlate information and tools from clients.
• Ventana Research undertakes benchmark research as one of its primary activities, drawing information from its own community, social media and the company’s “media partners”.
• The Data Warehousing Institute undertakes benchmarking in its key area, primarily business intelligence. They publish an annual BI Benchmark Report.
This is a rapid post in advance of the event. Look for a wider-ranging Coverage Report from InformationSpan when I’ve time to develop the theme.
Some Open Source notes 9 Feb 2013Posted by Tony Law in Consumerization, IT marketplace, ITasITis, Tech Watch, Technorati.
In my persona as an Associate Lecturer of the Open University, I promised some brief notes on Open Source software to help a colleague who’s leading a Staff Development workshop in a couple of weeks’ time.
Educational providers always need to find workable inexpensive software to provision their students. Around 1990 I taught the first Open University course which took ICT facilities to the students in their homes, rather than requiring them to book time on terminals hosted by friendly local institutions. The DT200 course existed in the days of DOS, but it used an early on-screen word processor (FirstWordPlus on the GEM GUI), a cut-down version of the Lotus 1-2-3 spreadsheet, and the CoSy conferencing system. The configuration was an Amstrad 640 with two 5.25 inch floppy drives and no hard disk. Oh, and the mouse port was on the left hand side which is why, more than 20 years later, I still use my mouse left handed.
I promised some notes, as I said. And I thought I’d share them more widely. I use Open Source software quite freely but nothing startling. I also use other freeware and a handful of niche purchased products, such as Graphic Converter for the relatively limited image manipulation I need to do.
My main OU course now is the ICT foundation course which introduces students to a range of practical ICT tools as well as the social and global context in which the technologies operate. It uses Audacity for audio recording, which I’d been using for some time already for creating podcasts for students on another course. It uses FreeMind for mind maps. Alongside this it uses tools like Picasa for image manipulation which is free (from Google) but of course isn’t Open Source.
I use a Mac but run it sometimes as a Windows machine using BootCamp. On Windows I don’t maintain a Microsoft Office licence so I use Open Office. While there are some compatibility issues with on-screen presentation I haven’t hit any significant problems. I know there are some, but they haven’t affected anything I’ve needed to do. I use the VLC media player on Mac for Windows Media Player formats, since Microsoft no longer make a player for Mac.
The Firefox browser and other elements of the Mozilla family are of course Open Sourced and Firefox is my browser of choice. I use the internal web server on my Mac which is a version of Apache.
For application development I use Cincom Smalltalk which is a full object-oriented environment and although it’s commercially owned it’s developed by its OS community. I learned Smalltalk, also 20 years ago, when working on a collaborative academic-industry research project and I still love it.
Working in industry, as I did until recently, I encountered a lot of suspicion about Open Source. More recently I think it’s abated somewhat but it’s still there.
The debate around OS in the commercial IT sector focusses on accountability – not knowing who is accountable for quality or who can be sued (to put it bluntly) for any real problems. It’s difficult for procurement-minded professionals to accept that a community of interest is likely to have higher quality standards and to identify and fix problems more quickly than a major for-profit software supplier.
This attitude has softened over the past several years, not least because some software (such as Apache and Linux) has become widely used in the enterprise. To my reading there are (at least) two reasons. Cost (obviously) but also licensing.
It’s a lot easier to promote a web service when you don’t have to license according to the number of users. Quality has become a given for the most widely used products. Security can be easier to assure and handle when there can be access to source code. And acquiring OS software through a distributor does offer some assurance of quality. There have been some high profile espousals of OS software, such as Linux or Open Office in government departments which are supremely cost-conscious, but these haven’t had an enormous impact in the wider commercial marketplace.
What is, I think, true is that as more specialised niche requirements have been accepted within the enterprise, there’s a recognition that either open source or niche (= small startup) providers may be the only route to a solution. Someone, somewhere, has created an open source community around your need.
There are various definitions of what constitutes Open Source. By one definition, a specification is “open” if it is published, so that it can be used by other platforms – as other word processing software can create documents in Microsoft’s format. Conversely, the Open Document Format was defined through an open process, but isn’t yet accepted as the leading standard for interoperability. This is the open process I learned about through my participation in the Object Management Group’s work. Building consensus and reconciling different viewpoints, including those of commercial developers, takes time: but there is often a strong academic foundation, and academic rigour often sustains a longer-lasting and more effective standard. Or, again, there is development through an open community which brings many minds to bear on problems; which converges on useful solutions; but which can become self-perpetuating, so that the vision does not always grow or, where necessary, change.
• Sourceforge: one of the strongest groups of Open source communities
• Sourceforge is host to Audacity and to FreeMind
• Linux (of course)
• Apache (the Apache Software Foundation) also hosts Open Office
• Mozilla for Firefox, Thunderbird and more
• Smalltalk (see this page for versions)
• VLC media player
• Graphic Converter from Lemkesoft
• Picasa from Google
• Gem Desktop
Anatomy of a crash and recovery 6 Mar 2012Posted by Tony Law in ITasITis, Managing IT, Tech Watch, Technorati.
So I’ve been struggling, the last several days, through the consequences of a hard disk crash on my trusty, but five-year-old, iMac. Application of the standard tools (Apple’s Disk Utility, and Disk Warrior) maintained access for a while but the machine now simply refuses to boot MacOS, and the tools won’t recover it. It just hangs. So I’m on Plan B, with a hairy workload making the timing as inopportune as it could be.
The tendency is to assume that if you have backups (and I have – Time Machine is fantastic!) then everything’s ok. Well, it is, but I thought it might be instructive to catalogue some of the issues and problems.
I’ve bitten the bullet and ordered a new machine. I daresay that if I cleared the old one, reformatted the disk and re-installed everything it might be able to mark the bad blocks, or whatever’s the problem. But that’s not much different from re-installing to a new machine, and this one is indeed five years old and won’t be capable of running the latest OSX upgrades. So, a new iMac is on its way.
And that was the first frustration. I’m not far from an Apple Store, and hoped I’d just be able to hike over there and order what I needed. But they only stock the basic models, and won’t do in-store upgrades (memory etc.), so it’s had to be online, and a ten-day wait.
Hence, my mainstream work has to transfer to the laptop. The Bootcamp partition on the iMac is still fine, so I can boot the machine in Windows and I’m using it in that version right now to write this. Anything through a browser is fine, so my Open University email and online work just transfers over; and I can do bits in Open Office. But I don’t have most of my software on Windows.
I don’t, in any case, want to end up with work spread between two machines; and the Windows partition isn’t backed up as I haven’t, hitherto, used it for anything permanent. And I haven’t been able to get the wireless keyboard to work with it, ever (see later) so I’m on my old Apple wired keyboard with a coffee spill which has debilitated the left hand Shift and CTRL keys (which I use more than the right hand ones, wouldn’t you just know). So it’s the Macbook for most of the work.
Well, everything’s on the Time Machine backup. Simple, surely, to just haul files onto the Macbook (overnight, perhaps) and away we go.
Well, no. I haven’t figured out why, but a proportion of the files on the backup give trouble. Quite a lot transfer fine. But a high proportion flag up that I don’t have permission to write to a folder somewhere down the chain. So, initially, I’m going down the chains and copying collections of individual files, at which point I get a prompt for a password and it’s ok. I don’t figure this, as I’m an Administrator on the machine. According to the permissions I have full read/write access. And there doesn’t seem much difference between the files that transfer and the ones that won’t. But there we are.
Weird work-around coming up. I’ve got the backup disk connected to the Macbook: I can’t make the Windows iMac see it on Firewire, but I can see Windows on the network. So I use the Macbook to copy directly from the backup to the shared drive on Windows, and then copy back from Windows to the Macbook’s own hard drive. No permission problems or password prompts. Ho, hum! I now have an almost complete rebuild. I’ve had to do it in limited batches because the Windows partition is not all that big, but I can leave the copy jobs running and it works.
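The batch-copy idea generalises: rather than having one bad file abort a whole transfer, walk the tree, copy what you can, and collect the failures for a retry (by hand, or via the network-share route above). This is a sketch of that approach, not what I actually ran; the paths in the commented-out call are placeholders, not my real volume names.

```python
import os
import shutil

def copy_tree_tolerant(src, dst):
    """Copy the tree at src into dst, collecting files that fail
    (e.g. with permission errors) instead of aborting the whole job."""
    failed = []
    for root, dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        target_dir = os.path.join(dst, rel) if rel != "." else dst
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            try:
                shutil.copy2(os.path.join(root, name),
                             os.path.join(target_dir, name))
            except (PermissionError, OSError) as exc:
                failed.append((os.path.join(root, name), str(exc)))
    return failed  # list of (path, error) pairs to retry another way

# Placeholder paths, for illustration only:
# failures = copy_tree_tolerant("/Volumes/TimeMachineBackup/Docs",
#                               "/Users/me/Docs")
```

Running the copy this way turns “a high proportion of files flag permission problems” from a show-stopper into a to-do list.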
There’s some software I won’t reinstall until I have the new machine, and that’s going to be a pain because some of it’s licensed and I may need to get new licence keys (things like Classic Menu and Graphic Converter, not to mention Office and my Bootcamp Windows). And of course, there will be masses of updates to re-apply.
I have quite a lot of aliases, to provide alternative paths to some files and folders: while these appear in the right places on the rebuild, they don’t “work” until they’ve been re-assigned.
And there is, of course, a lot of information that’s in places other than my well-defined data area. Mail was ok; I moved the Office Identities folder across, and it worked. I use Apple’s Address Book and Calendar, not Microsoft’s, so I can replicate with the mobile phone using iSync. Find and copy the Address Book files across, and everything works. Calendar, not so good; I had to carefully re-import data into the Macbook calendar, one Calendar Group at a time, to maintain the structure. Websites, on the Web Server: find and copy; that’s fine. Microsoft Office templates: I know where those are so that’s ok. There’ll be more; but that’s where I am right now.
Remember the wireless keyboard? I’ve switched it to working with the Macbook. And of course I’ve now connected the Time Machine drive to the Macbook too so I’m still being backed up.
Shifting to the new machine, when it arrives, will no doubt throw up new issues. But for the moment, I’ve got work to do!
Links: none, this time