Benchmarking: sources (17 Apr 2013). Posted by Tony Law in Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
Tomorrow I’m facilitating a Corporate IT Forum discussion on twenty-first-century benchmarking. It’s a wide topic. This post is a set of links and some comments, based on the InformationSpan database of 700 research and analyst firms. But I’m always grateful for updates: please comment!
The Forum itself operates a benchmarking service for clients, so there’s a declaration of interest to make, though I’m not myself a member. Primarily it’s crowd-sourced: it invites members to contribute their own data, and to compare themselves against their peers.
• Computer Economics provides a wide range of benchmarking data, not all of it financial. I’d consider it a primary source and worth a subscription. Major studies include IT Spending and Staffing Benchmarks and Worldwide Technology Trends. Their Management Advisories look at ROI and TCO, Risk Management and other topics. Too many to list here: take a look for yourself.
• InterUnity Group “provides leading companies with strategy, competitive intelligence, and benchmarking to improve business performance.” It’s not clear what areas of benchmarking are actually covered, or whether the focus is primarily financial.
• The component services of the Corporate Executive Board will be worth investigating. Using the Researched Sharing model for content, CEB services such as the CIO Executive Board link and correlate information and tools from clients.
• Ventana Research undertakes benchmark research as one of its primary activities, drawing information from its own community, social media and the company’s “media partners”.
• The Data Warehousing Institute undertakes benchmarking in its key area, primarily business intelligence. They publish an annual BI Benchmark Report.
This is a rapid post in advance of the event. Look for a wider-ranging Coverage Report from InformationSpan when I’ve time to develop the theme.
Some Open Source notes (9 Feb 2013). Posted by Tony Law in Consumerization, IT marketplace, ITasITis, Tech Watch, Technorati.
In my persona as an Associate Lecturer of the Open University, I promised some brief notes on Open Source software to help a colleague who’s leading a Staff Development workshop in a couple of weeks’ time.
Educational providers always need to find workable, inexpensive software to provision their students. Around 1990 I taught the first Open University course which took ICT facilities to the students in their homes, rather than requiring them to book time on terminals hosted by friendly local institutions. The DT200 course existed in the days of DOS, but it used an early on-screen word processor (FirstWordPlus on the GEM GUI), a cut-down version of the Lotus 1-2-3 spreadsheet, and the CoSy conferencing system. The configuration was an Amstrad 640 with two 5.25-inch floppy drives and no hard disk. Oh, and the mouse port was on the left-hand side, which is why, more than 20 years later, I still use my mouse left-handed.
I promised some notes, as I said. And I thought I’d share them more widely. I use Open Source software quite freely but nothing startling. I also use other freeware and a handful of niche purchased products, such as Graphic Converter for the relatively limited image manipulation I need to do.
My main OU course now is the ICT foundation course which introduces students to a range of practical ICT tools as well as the social and global context in which the technologies operate. It uses Audacity for audio recording, which I’d been using for some time already for creating podcasts for students on another course. It uses FreeMind for mind maps. Alongside this it uses tools like Picasa for image manipulation which is free (from Google) but of course isn’t Open Source.
I use a Mac but run it sometimes as a Windows machine using BootCamp. On Windows I don’t maintain a Microsoft Office licence so I use Open Office. While there are some compatibility issues with on-screen presentation I haven’t hit any significant problems. I know there are some, but they haven’t affected anything I’ve needed to do. I use the VLC media player on Mac for Windows Media Player formats, since Microsoft no longer make a player for Mac.
The Firefox browser and other elements of the Mozilla family are of course Open Sourced and Firefox is my browser of choice. I use the internal web server on my Mac which is a version of Apache.
For application development I use Cincom Smalltalk, a full object-oriented environment; although it’s commercially owned, it’s developed by its open source community. I learned Smalltalk, also 20 years ago, when working on a collaborative academic-industry research project, and I still love it.
Working in industry, as I did until recently, I encountered a lot of suspicion about Open Source. More recently I think it’s abated somewhat but it’s still there.
The debate around OS in the commercial IT sector focusses on accountability – not knowing who is accountable for quality or who can be sued (to put it bluntly) for any real problems. It’s difficult for procurement-minded professionals to accept that a community of interest is likely to have higher quality standards and to identify and fix problems more quickly than a major for-profit software supplier.
This attitude has softened over the past several years, not least because some software (such as Apache and Linux) has become widely used in the enterprise. To my reading there are (at least) two reasons. Cost (obviously) but also licensing.
It’s a lot easier to promote a web service when you don’t have to license according to the number of users. Quality has become a given for the most widely used products. Security can be easier to assure and handle when there can be access to source code. And acquiring OS software through a distributor does offer some assurance of quality. There have been some high profile espousals of OS software, such as Linux or Open Office in government departments which are supremely cost-conscious, but these haven’t had an enormous impact in the wider commercial marketplace.
What is, I think, true is that as more specialised niche requirements have been accepted within the enterprise, there’s a recognition that either open source or niche (= small startup) providers may be the only route to a solution. Someone, somewhere, has created an open source community around your need.
There are various definitions of what constitutes Open Source. By one definition, a specification is “open” if it is published, so that it can be used by other platforms – as other word processing software can create documents in Microsoft’s format. Conversely, the Open Document Format was defined through an open process, but isn’t yet accepted as the leading standard for interoperability. This is the open process I learned about through my participation in the Object Management Group’s work. Building consensus and reconciling different viewpoints, including those of commercial developers, takes time: but there is often a strong academic foundation, and academic rigour often sustains a longer-lasting and more effective standard. Or, again, there is development through an open community, which brings many minds to bear on problems and converges on useful solutions, but which can become self-perpetuating, so that the vision does not always grow or, where necessary, change.
• Sourceforge: one of the strongest groups of Open source communities
• Sourceforge is host to Audacity and to FreeMind
• Linux (of course)
• Apache (the Apache Software Foundation) also hosts Open Office
• Mozilla for Firefox, Thunderbird and more
• Smalltalk (see this page for versions)
• VLC media player
• Graphic Converter from Lemkesoft
• Picasa from Google
• GEM Desktop
Anatomy of a crash and recovery (6 Mar 2012). Posted by Tony Law in ITasITis, Managing IT, Tech Watch, Technorati.
So I’ve been struggling, the last several days, through the consequences of a hard disk crash on my trusty, but five-year-old, iMac. Application of the standard tools (Apple’s Disk Utility, and Disk Warrior) maintained access for a while but the machine now simply refuses to boot MacOS, and the tools won’t recover it. It just hangs. So I’m on Plan B, with a hairy workload making the timing as inopportune as it could be.
The tendency is to assume that if you have backups (and I have – Time Machine is fantastic!) then everything’s ok. Well, it is, but I thought it might be instructive to catalogue some of the issues and problems.
I’ve bitten the bullet and ordered a new machine. I daresay that if I cleared the old one, reformatted the disk and re-installed everything it might be able to mark the bad blocks, or whatever’s the problem. But that’s not much different from re-installing to a new machine, and this one is indeed five years old and won’t be capable of running the latest OSX upgrades. So, a new iMac is on its way.
And that was the first frustration: I’m not far from an Apple Store, and hoped I’d just be able to hike over there and order what I needed. But they only stock the basic models, and won’t do in-store upgrades (memory etc.), so it’s had to be online, and a ten-day wait.
Hence, my mainstream work has to transfer to the laptop. The Bootcamp partition on the iMac is still fine, so I can boot the machine in Windows and I’m using it in that version right now to write this. Anything through a browser is fine, so my Open University email and online work just transfers over; and I can do bits in Open Office. But I don’t have most of my software on Windows.
I don’t, in any case, want to end up with work spread between two machines; and the Windows partition isn’t backed up, as I haven’t, hitherto, used it for anything permanent. And I’ve never been able to get the wireless keyboard to work with it (see later), so I’m on my old Apple wired keyboard, complete with a coffee spill which has debilitated the left-hand Shift and Ctrl keys (which I use more than the right-hand ones, wouldn’t you just know). So it’s the Macbook for most of the work.
Well, everything’s on the Time Machine backup. Simple, surely, to just haul files onto the Macbook (overnight, perhaps) and away we go.
Well, no. I haven’t figured out why, but a proportion of the files on the backup give trouble. Quite a lot transfer fine. But a high proportion flag up that I don’t have permission to write to a folder somewhere down the chain. So, initially, I’m going down the chains and copying collections of individual files, at which point I get a prompt for a password and it’s OK. I can’t figure this out, as I’m an Administrator on the machine. According to the permissions I have full read/write access. And there doesn’t seem to be much difference between the files that transfer and the ones that won’t. But there we are.
Weird workaround coming up. I’ve got the backup disk connected to the Macbook: I can’t make the iMac (booted in Windows) see it over FireWire, but I can see the Windows machine on the network. So I use the Macbook to copy directly from the backup to the shared drive on Windows, and then copy back from Windows to the Macbook’s own hard drive. No permission problems or password prompts. Ho, hum! I now have an almost complete rebuild. I’ve had to do it in limited batches, because the Windows partition is not all that big, but I can leave the copy jobs running and it works.
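For the curious, the two-hop copy amounts to something like this little Python sketch. The paths and the `two_hop_copy` name are mine, made up for illustration; the real job was done by dragging folders in Finder and Explorer, not by script.

```python
import shutil
from pathlib import Path

def two_hop_copy(backup_dir: str, share_dir: str, dest_dir: str) -> None:
    """Copy from the backup volume to the destination machine via an
    intermediate shared drive, sidestepping the permission prompts."""
    staging = Path(share_dir) / "staging"
    # Hop 1: backup volume -> the shared Windows drive
    shutil.copytree(backup_dir, staging, dirs_exist_ok=True)
    # Hop 2: shared drive -> the Macbook's own disk
    shutil.copytree(staging, dest_dir, dirs_exist_ok=True)
    # Clean up: the intermediate partition is not all that big
    shutil.rmtree(staging)
```

Doing it in limited batches just means calling this once per folder rather than pointing it at the whole backup in one go.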
There’s some software I won’t reinstall until I have the new machine, and that’s going to be a pain because some of it’s licensed and I may need to get new licence keys (things like Classic Menu and Graphic Converter, not to mention Office and my Bootcamp Windows). And of course, there will be masses of updates to re-apply.
I have quite a lot of aliases, to provide alternative paths to some files and folders: while these appear in the right places on the rebuild, they don’t “work” until they’ve been re-assigned.
And there is, of course, a lot of information that’s in places other than my well-defined data area. Mail was ok; I moved the Office Identities folder across, and it worked. I use Apple’s Address Book and Calendar, not Microsoft’s, so I can replicate with the mobile phone using iSync. Find and copy the Address Book files across, and everything works. Calendar, not so good; I had to carefully re-import data into the Macbook calendar, one Calendar Group at a time, to maintain the structure. Websites, on the Web Server: find and copy; that’s fine. Microsoft Office templates: I know where those are so that’s ok. There’ll be more; but that’s where I am right now.
Remember the wireless keyboard? I’ve switched it to working with the Macbook. And of course I’ve now connected the Time Machine drive to the Macbook too so I’m still being backed up.
Shifting to the new machine, when it arrives, will no doubt throw up new issues. But for the moment, I’ve got work to do!
Links: none, this time
Alan Turing at the Turing: 100 years old (21 Feb 2012). Posted by Tony Law in Impact of IT, ITasITis, Tech Watch, Technorati.
It’s 100 years since the birth of Alan Turing, and I’m attending the eponymous annual lecture given in his honour under the joint auspices of the British Computer Society (BCS) and the Institution of Engineering and Technology (IET). The lecturer, Prof. Ray Dolan of UCL, intends to review cognitive neuroscience as a “hidden legacy” of Turing’s definition, and investigation, of “computable numbers”.
As my wife is a Counsellor with an interest in the working of the brain, this may be of interest beyond the full-house IT constituency currently gathered.
Well, the lecture theatre was crowded so it would have been antisocial to blog as we went. Just a short retrospective, then, and I’ll link to the video replay when it’s available.
Judging from the comments prefacing audience questions after the lecture, I may be in a minority: but to me this was an opportunity missed. We certainly heard an erudite lecture on brain function. And at the start we had a short treatise on Bayesian logic. This deals with the unravelling of uncertain data.
If I understand it correctly, you start with observations which, with a degree of uncertainty, represent the state of what’s being investigated. In the case of Turing’s work at Bletchley Park, they had observations of intercepted coded messages, and were attempting to infer the settings on the coding machine that had created them. In neurobiology, researchers observe the areas of brain activity with the intention of figuring out how the brain works.
This seems like a fruitful parallel, though one of the experiments which Prof Dolan described turned out to have a non-Bayesian interpretation. There’s certainly a computational problem here, described as “Start with a theory of how certain parameters give rise to the data you observe, and attempt to go from the data to the parameters”. I can identify with that: as a spectroscopist, I used least-squares analysis to infer the component absorptions in both Mössbauer and infrared spectra of minerals. But by this time the parallels with Turing had been lost, apart from occasional references back to the original scene-setting.
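To make the “from parameters to data, and back again” idea concrete, here’s a minimal sketch of Bayesian inference on a discrete grid in Python. The coin-bias example and the function names are mine, not Prof Dolan’s: the “theory” is a coin with unknown heads-probability theta, and we go from the observed flips back to the parameter.

```python
import math

def posterior(grid, prior, data, likelihood):
    """Bayes on a grid: P(theta | data) is proportional to
    P(data | theta) * P(theta), renormalised over all candidate thetas."""
    unnorm = [p * math.prod(likelihood(x, theta) for x in data)
              for theta, p in zip(grid, prior)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

grid = [i / 10 for i in range(1, 10)]   # candidate parameter values
prior = [1 / len(grid)] * len(grid)     # flat prior: no initial preference
data = [1, 1, 1, 0, 1, 1]               # observations: five heads, one tail
post = posterior(grid, prior, data, lambda x, t: t if x == 1 else 1 - t)
best = grid[post.index(max(post))]      # most probable parameter given the data
```

Bletchley’s problem had the same shape: the machine settings play the role of theta, and the intercepted messages the role of the coin flips, just with an astronomically larger grid.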
Which was a pity, since I guess most of the people at the lecture were IT people, not neurobiologists. I’d have appreciated a lecture following both threads at once. I still wonder if it’s possible to pursue the parallel, so that Turing’s work and this undoubtedly fascinating field could be explored, as it were, in a “twisted pair” of threads.
But make your own mind up. Follow the link below to the IET’s page for the lecture, and click the link to “Play webcast”.
• From cryptanalysis to cognitive neuroscience – a hidden legacy of Alan Turing: the BCS/IET Turing Lecture, IET 20 Feb 2012
Insight coverage: Consumerisation (21 Feb 2012). Posted by Tony Law in Consumerization, Insight services, ITasITis, Managing IT, Tech Watch, Technorati.
Tomorrow I’m part of the team delivering the Corporate IT Forum’s Consumerisation Summit in London. That’s prompted me to create the latest InformationSpan insight services Coverage Report.
Coverage Reports identify the major, second tier and niche insight providers who can effectively support enterprise IT in their strategy, decision making and operational management. In the case of consumerisation, a review of our database of over 400 IT insight providers is revealing.
There’s a strong tendency for consumerisation (or, in North American coverage, “consumerization”) to be equated to the use of smart endpoint devices. Certainly the movement began with enabling cheaper, consumer-side PCs rather than corporately procured devices with a tailored enterprise desktop; and the use, now, of smartphones, tablets and other Bring Your Own devices is a key part of the topic. With, of course, its attendant concerns for appropriate use, security, information protection and so on.
But consumerisation, properly understood, must encompass the wide range of consumer-end online services and applications: freeware (such as the Open Smalltalk which I use for programming); consumer cloud services (where Google Apps started); replacements for conventional technologies (such as the fax-to-email service which provides my rarely-used fax reception capability); and much, much more. I surveyed these in a presentation a couple of years ago; see the link below.
So I define consumerisation as the use, in the enterprise, of technologies provisioned directly by users through the open consumer marketplace – or, at the least, technologies also commonly purchased and used directly by end consumers. I categorise these into: collaboration platforms; communications; research; contact management; and infrastructure.
This Coverage Report identifies who covers what, based on what I can see on their websites. While, as mentioned, a lot of coverage is confined to smart devices, there are providers who look well beyond this and take a more positive attitude (as opposed to lock-down-everything). Forrester Research, of the majors, has been looking for some years at the impact of Generation Y on the workforce and the end-user experience they bring, and this informs their coverage. Horizon Watching, as always, punches above its weight.
CSC’s Leading Edge Forum were probably the first to fully identify this trend, and have around ten years’ well developed coverage. The surprise in the survey is a second-tier provider called Info-Tech Research, who also have a range of strategy starters, tools and other resources.
For a bit more information about the report, visit InformationSpan, below. Other links to providers are in the full report, which costs £150 from informationspan.com.
• Coverage report: Consumerisation. InformationSpan, Feb 2012 (brochure)
• Can Web 2.0 run your Business? InformationSpan presentation, BCS Consultancy SIG, Jan 2010 (free download)
• Consumerisation Summit, Corporate IT Forum, 22 Feb 2012
Gartner integrates Burton; blogs index updated (11 Nov 2011). Posted by Tony Law in Insight services, ITasITis, Tech Watch, Technorati.
Over the last few months, Gartner have finally and fully integrated the Burton Group services and analysts acquired in January 2010.
The IT1 service is now referred to as Gartner IT1, although the Burton name is still attached to Gartner’s lead web page for the service. But the separate Burton Group website, which was maintained independently for a while, has now joined the AMR site in being consigned to oblivion.
Gartner’s online page outlines how they differentiate IT1 from the mainstream Gartner technical service. They pitch IT1 as adding the technical depth to the mainstream (“detailed technical insight to help your technical architects and engineers deliver outstanding results”). This was indeed the rationale for acquiring Burton: the need to provide service-oriented IT professionals with deep technical support for their architectural and implementation decisions, and an admission that Gartner, as they were, did not have the full resources needed to deliver this insight – though I’m not sure they would have admitted it before the acquisition!
At the same time, the Burton legacy blogs have also joined AMR in the Delete basket. This means InformationSpan has been able to simplify our Blogs Index for Gartner by removing references to Burton information. It’s been updated, with a few new names and other changes.
We’ve also introduced new indicators to identify blogs which are active and those which are, in various stages, dormant. Currently, of 123 Gartner analyst blogs which are accessible online, only 53 have content published within the last three months. For a further 18, the most recent post is between 3 and 6 months old; for 10, between 6 and 9; for 9, between 9 and 12; and 33 are at least a year out of date, sometimes significantly more. Also, of these 123 blogs, 24 are still on the system but don’t appear in Gartner’s own list of analysts who are blogging. Some of these relate to analysts who have left Gartner: but not all; correspondingly, not all blogs are removed when an analyst leaves. It’s a touch confusing, but our index shows clearly what’s what and who’s who.
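The bucketing behind those counts is simple enough; for what it’s worth, here’s roughly how it works as a Python sketch (the real index lives in a spreadsheet, and the function name is mine):

```python
from datetime import date

def staleness_buckets(last_posts, today):
    """Count blogs by whole months elapsed since their most recent post."""
    buckets = {"0-3": 0, "3-6": 0, "6-9": 0, "9-12": 0, "12+": 0}
    for last in last_posts:
        months = (today.year - last.year) * 12 + (today.month - last.month)
        if months < 3:
            buckets["0-3"] += 1
        elif months < 6:
            buckets["3-6"] += 1
        elif months < 9:
            buckets["6-9"] += 1
        elif months < 12:
            buckets["9-12"] += 1
        else:
            buckets["12+"] += 1
    return buckets
```

Feed it the date of each blog’s most recent post and today’s date, and the five counts (53, 18, 10, 9, 33) fall out.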
On the positive side: all the titled blogs, including Mastering the Hype Cycle (which had been dormant) have recent content. The Symposium blog is particularly worth visiting at the moment, while the Autumn cycle of Symposium events continues.
Links for PCI DSS (8 Nov 2011). Posted by Tony Law in Impact of IT, IT is business, ITasITis, Managing IT, Tech Watch, Technorati.
Tags: PCI DSS
I’m facilitating a workshop next week on PCI DSS and as usual here are some of the links I’ve identified, including some recent enforcement casework.
For the uninitiated: PCI is the Payment Card Industry and DSS is its Data Security Standard. PCI is an international body, and the standards are effectively set by the “acquirers” – that’s PCI-speak for those bodies such as card issuers and banks who “acquire” the transactions and transfer money.
National information security requirements are very much to the fore too. In the UK, the Information Commissioner’s Office (ICO) recently took enforcement action against Lush, the cosmetics firm, and their press release uses that case to emphasise that organisations must implement PCI DSS, or some equivalent standard, in order to meet the basic requirements for compliance. This issue was resolved by an undertaking from Lush, but ICO information outlines all the enforcement options and potential penalties.
Compliance to standards doesn’t replace the need to understand potential vulnerabilities, not least when using embedded page elements that can be hijacked!
PCI: Payment Card Industry
PCI DSS: PCI Data Security Standard
CSRF: Cross-Site Request Forgery
IDS: Intrusion Detection System
IPS: Intrusion Prevention System
ISA: Internal Security Assessor
QSA: Qualified Security Assessor
ISO: Independent Sales Organisation (in this context!)
• PCI SSC Data Security Standards Overview, from PCI Security Standards Council
• ICO warns retailers to implement PCI-DSS or face “enforcement action”, Security Vibes, 12 Aug 2011
• Online security must be a priority for retailers, says ICO, ICO Press Release, 9 Aug 2011
• Taking action: data protection and privacy and electronic communications, ICO information (including a list of recent prosecutions)
• PCI DSS: An Acquirer’s Guide for PCI Compliance Best Practices, from the PCI Compliance Guide (an independent PCI source)
• Cross-Site Request Forgery (CSRF), information from the Open Web Application Security Project (OWASP)
Green IT Expo: presentations published (8 Nov 2011). Posted by Tony Law in IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
Keynote presentations from the Green IT Expo (see previous postings) have now been posted. Simon Mingay’s presentation from Gartner is not available (now there’s a surprise) and be warned that the link behind the rubric “Presentation Unavailable” goes to the following presentation from Verdantix.
• Green IT Expo presentations
• A Gartner perspective on Green IT, ITasITis, 1 Nov 2011
• Green IT; encountering Connection Research, ITasITis, 1 Nov 2011
• Green 3: Andy Lawrence of 451, ITasITis, 1 Nov 2011