Trials of an upgrade

I decided, recently, to upgrade MS Office on my computers: a desktop and laptop Mac. There were a handful of reasons. First, I’d hit a limitation on my desktop machine because I was still running an old copy (Office 2004 for Mac) designed for the previous non-Intel architecture, and therefore running under emulation. Second, the versions on the two machines were out of step: I’d upgraded the laptop to 2008 but not the desktop. And third, I’d been getting a lot of mail database crashes and hoped an upgrade would cure this.

For the base Office software, no problems except the one I’d anticipated. Microsoft’s Visual Basic for Applications isn’t supported on the Mac within the new formats. What I hadn’t realised is that, so long as I continue to save in the older format, it does work. So I get an annoying prompt every time I save the spreadsheets that use it, but that’s all. Better than I’d feared.

And there are a few aggravations. The new window header takes up much more vertical space than the old one; it would be nice to be able to turn off the ribbon of function tabs. Sort in Excel has changed; the new version is more capable than the old, but a “revert to classic” option for the sort specifier would be nice. And Conditional Formatting, which I use a lot, has been substantially changed: I have to use formulas for everything (there’s no wizard) and the defaults are dreadful. Again, “revert to classic” would be nice. How did they decide which formats to offer as built-in? It’s a tedious way round to get to what I normally use.

But I daresay I’ll get used to these. The killer, though, was the replacement of a mail client called Entourage with one called Outlook. As one commentator pointed out, the name shift is highly significant: instead of a client designed to integrate with the Mac platform, the focus is now Exchange Server. Which of course I don’t have! The things that have gone missing include:
• proper replication with the Apple Address Book, which is my primary contacts directory (so that I can replicate out to my mobile phone) – I ticked the box, but it just didn’t do it (but see below …);
• the ability to duplicate messages (copy-and-paste), which I use for a handful of template messages that I keep in my Drafts folder; and
• a very useful facility which lets an Entourage user drag a mailbox to the desktop, creating an archive copy – in my case, I use this to move the mailboxes I need onto my laptop when working away from home. In Outlook this doesn’t work, and I’d have to use categories and a selective Export.

Well, you’ve guessed it. I’ll use the 2011 versions of Word, Excel and PowerPoint. But for mail I’ve gone back to Entourage. Luckily, I did very little mail work in Outlook and I’d set the system to “Leave messages on server” before I started. But the switch cost me a day’s work, because trying to get my contacts into the Outlook client created enormous numbers of replication conflicts which had to be sorted out (eventually, Outlook decided it would replicate after all). That’s the sort of problem which reverting Address Book to its backup doesn’t solve.

Is this just a rant? Not really. I remember, very many years ago, a story of a meeting of heads of University computing services with their supplier, ICL, to talk about compatibility for the Fortran compiler for the new 2900-series computers about to be installed by most of them. And in the end, one of the customers ran out of patience. He took from his bag the Fortran manual for the old computer, banged it on the desk, and defined the problem in four words: “That’s your compatibility standard”.

My Office problems are just the latest version of this issue. Microsoft aren’t especially bad at it; the big things, like changes to document formats, are well handled with filters and converters. Sure, there are things it makes sense to drop or change. But it’s the little features that mess up the users. Why is upgrade compatibility such a hard issue to understand?

Turing stays home

Or at least, the archive of his papers will be staying home. It’s been announced that Alan Turing’s papers can now be preserved at Bletchley Park, where the UK’s wartime (1940s) code-breaking efforts were centred and where Turing himself worked.

It’s probably only the technically minded who know Alan Turing’s name – including members of the BCS and the IET who perpetuate it through the annual Turing Lecture (this year’s event was recently held in London). But it should be more widely known; his contribution to the ending of the Nazi regime was enormous.

I’ve written this short note on a day when I’ve had a conversation with friend and artist Cyril Mount, some of whose paintings from the North Africa Campaign – and since – are about to be exhibited in a major event hosted by the Peace Studies Department at Bradford University. Cyril’s experiences were formative, making him, if not specifically a peace campaigner, then certainly a peace advocate. Turing ended his own life a few years after the war, having been prosecuted for a homosexual relationship. Intolerance may have worn a more savage face in the Nazi regime, but it existed in this country too and, as Cyril’s paintings remind us, it’s still part of today’s world in many places.

• Enigma genius Alan Turing papers saved for the nation, BBC News, 25 Feb 2011
• An audience with … Donald Knuth, BCS/IET Turing Lecture (1 Feb), ITasITis, 2 Feb 2011
• Ruffling Feathers, exhibition by Cyril Mount, Bradford, 4-25 March 2011

Tekrati has gone: Barbara French hasn’t

For those who haven’t noticed: Tekrati closed its doors (well, its websites) last Friday, February 11th. So the first note is: thanks to Barbara French for all that she’s done through Tekrati over the years.

Thankfully the important elements of Tekrati have moved with Barbara to become a personal project. This includes the Analyst Directory and Barbara’s insights on the insight marketplace: the latter through her blog, and the former through a new presence which is now also a blog and hence open to comments and suggestions.

Barbara, best wishes in your new manifestation and, also, my personal thanks for the friendly reception when InformationSpan started and for various interactions over the last three years.

• Sway: analyst relations, influencer marketing & the business of influence
• The new Analyst Directory
• Tekrati Closes in February 2011, Sway, 4 Feb 2011

Constellation: the next step

R “Ray” Wang launched his new venture, Constellation Research, on leaving Altimeter last November. Now, with over a dozen colleagues, a new full-featured website, and a research agenda, Constellation is preparing for an official launch “next week”.

The team have published a research agenda which their email says is focussed on eight “key disruptive areas that permeate existing and traditional coverage areas”. That’s a bit circuitous, but what I think they’re saying is that they intend to focus on disruptive technology themes that run across the conventional analyst agendas.

The eight are:
• Cloud
• Mobile
• Social
• Analytics and game theory
• Unified communications and video
• Internet of things
• Sustainability and corporate social responsibility
• Legacy optimization
plus industry-specific research in next-generation government (Gov 2.0) and in customer experience for services-based industries.

There’s a detailed plan for published research. It’s presented name-by-name, though not all fifteen analyst names feature in the list. Presenting it this way suggests that the research agenda is driven by the individuals’ interests, expertise, and pedigree from their former firms.

There’s nothing wrong with that, given the intention to create a group of high profile individuals; but I’m not entirely convinced by the assertion that these are all “disruptive” areas. Legacy optimisation, for example: Ray Wang’s own impressive list of planned research includes “SAP Optimization Options” and “Renegotiating your Oracle contracts”.

And these themes are not missing from conventional coverage. Gartner, for example, have a Key Initiative on Cloud. Green IT is everywhere. And the Internet of Things sounds like Forrester’s X Internet theme up to a couple of years ago, or Yankee Group’s Anywhere Revolution.

A better categorisation shows up on Constellation’s “Suggestions” page, where the team solicit suggestions for the research agenda. There, they offer three broad themes. Navigating through disruptive technologies is one; but they add Designing next generation business models, and Funding innovation through legacy optimization. That seems to fit rather better.

Constellation are offering Open Research: reports which will be made available free – that’s in addition to the blog postings. It will be interesting to see what is published in this channel. Currently the only report there is a version of, and explanation for, the research agenda – Constellation’s co-authored planning assumptions and some insights or recommendations. There are names against these (actually, Twitter names), which is illuminating, and one or two nice phrases (“Enterprise social software migraine”, Sameer Patel).

So am I bold enough to predict a future for Constellation?

Constellation is clearly aiming to be a coordinated team, rather than just an umbrella organisation for individuals. The group is not large enough to be a broad-ranging (generalist) insight service. It won’t be able to take on Gartner or Forrester across the board. On the other hand: neither is it specialised enough to target a clear niche in the way that, for example, Mike Rasmussen’s Corporate Integrity has done. And what will happen if, or when, the suggestions the team are soliciting through their website move into areas where the existing team don’t operate?

The business model is the Gartner/Forrester one: a research plan, advisory services and consulting engagements, and a fee service to enterprise clients (what Constellation call “Buy side”) and to vendors. As Barbara French commented in November, it’s not disruptive in the insight marketplace; so Constellation will have to make its way in this marketplace when times are hard and buyers are looking to cut costs.

I’ll take a bet on its success, simply because R “Ray” Wang himself is a luminary in his field and people will beat a path to his door. Some at least of his colleagues are on a similar level. But I suspect that his organisation, and the business model, and the agenda may have to flex a time or two.

• Constellation Research; see the Research Agenda
• Constellation Research: what can users expect?, ITasITis, 15 Nov 2010
• SageCircle podcast: conversation with R “Ray” Wang (look for the 18 Jan 2011 podcast)

Ken Olsen’s VAX computer, and me

Following the death this week of Ken Olsen, founder of Digital Equipment Corp (DEC), here’s a personal look back at his creations (the PDP and VAX computers) and their impact on the computing scene.

The PDP-8 and PDP-11 were primarily engineers’ and scientists’ machines used in laboratories and in research environments. But with the VAX (Virtual Address eXtension) minicomputer, DEC created a general-purpose machine which soon found its way out of the lab and into the enterprise.

My first encounter with the VAX was on a brief study tour of US universities in the late 1970s, when I was myself working in university computing in London. Our US counterparts, of course, charged for their services, and there were two charging models. A significant per-job charge allied to a small CPU rate favoured large jobs; departments with small-job workloads bought their own VAXes to run them. A small per-job charge with a large CPU rate was friendlier to small jobs: and guess what? The large jobs went to a departmental VAX. And it usually was a VAX. Either way, the central charging model then had to adapt to the remaining workload, so it got skewed even further in the same direction … It was relevant to us in the UK because, although we didn’t charge real money, we did allocate resources to users and we did charge usage against these non-financial “budgets”.
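The dynamic of those two charging models is easy to sketch with some invented numbers (the rates and job sizes below are purely illustrative, not anything a real centre charged):

```python
# Two charging models for a shared computing service.
# All rates and job sizes are invented for illustration.

def job_cost(cpu_seconds, per_job_fee, cpu_rate):
    """Total charge for one job: a fixed per-job fee plus CPU usage."""
    return per_job_fee + cpu_rate * cpu_seconds

small_job, large_job = 10, 10_000  # CPU seconds

# Model A: high per-job fee, low CPU rate -- favours large jobs
a_small = job_cost(small_job, per_job_fee=5.00, cpu_rate=0.001)   # 5.01
a_large = job_cost(large_job, per_job_fee=5.00, cpu_rate=0.001)   # 15.00

# Model B: low per-job fee, high CPU rate -- favours small jobs
b_small = job_cost(small_job, per_job_fee=0.10, cpu_rate=0.05)    # 0.60
b_large = job_cost(large_job, per_job_fee=0.10, cpu_rate=0.05)    # 500.10

# Under Model A, small jobs pay far more per CPU second, so the
# small-job departments buy their own machine; under Model B the
# large jobs are dearer overall, so they migrate instead.
assert a_small / small_job > a_large / large_job
assert b_large > a_large and b_small < a_small
```

Whichever workload is penalised leaves for a departmental machine, and the charging model then has to be recalibrated against what remains.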

When I moved into enterprise IT, with BP in the early 1980s, the VAX was just emerging from the scientists’ patch into the more general light of day. We ran databases. We developed a string of applications – mostly in Fortran. We supported the specific activities of our Division, though not yet the enterprise applications such as finance and HR: that came later.

And the range of machines made it possible to put larger machines in the centre, smaller ones in the overseas and outlying offices. You could be well assured that software developed on one would run on another – though I did get caught out on an overseas installation once, when the machine that “owned” the magnetic tape drive wasn’t the one intended to host the software on the tape I’d brought with me. And more important than portability of software was the portability of people that it facilitated. We could take a rig engineer or geophysicist from the Middle East, move them to a North Sea base, and they would be immediately in a familiar IT environment.

The VAX taught us a lot.

First, with networking came enterprise facilities such as shared information, online chat, and email: things we take for granted now (and Lotus Notes, which had such an enormous influence, was itself strongly influenced by these VAX applications). We could run programs, or access data, on remote machines. And, of course, we had terminal access. This was something I’d been well used to in the pioneering University centre I worked for, but which was by no means commonplace: cards and lineprinters still ruled, in many places.

Second, clustering came in as a major step forward: the ability to share workload among a group of CPUs. But the software vendors scuppered it with per-CPU licensing rather than per-user: we had to restrict some software to specific machines in a cluster, which rather missed the point! Licensing lags behind developments in technology architectures, now as then.
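A toy calculation shows why the per-CPU model bit so hard once machines were clustered (every figure here is invented for illustration; these are not our actual licence terms):

```python
# Why per-CPU licensing undermined clustering: a toy calculation.
# Every figure here is invented for illustration only.

cluster_cpus = 4        # nodes in the cluster sharing the workload
concurrent_users = 25   # people actually using the package

per_cpu_fee = 10_000    # hypothetical licence fee per CPU
per_user_fee = 400      # hypothetical per-user fee, for comparison

# Load sharing means the scheduler may place a job on any node,
# so a per-CPU licence has to cover the whole cluster:
full_cluster_cost = cluster_cpus * per_cpu_fee    # 40,000

# A per-user model would price the same usage at:
per_user_cost = concurrent_users * per_user_fee   # 10,000

# Our workaround: pin the package to a single node. The bill drops
# back to the per-user figure, but the load-sharing benefit is gone.
restricted_cost = 1 * per_cpu_fee                 # 10,000
```

Restricting the software to one machine kept the cost down, at the price of giving up exactly the capability the cluster was bought for.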

Third: we had programmable terminals, so we could set up on-screen forms for a more friendly user interface (the VT100 and its successors). No open standards here, of course; highly proprietary.

And fourth, perhaps most important: we made the transition from a small specialist unit to an enterprise IT service running a general purpose workload. We changed from being a small group of close colleagues to a more structured, more formally managed function. We adopted standards and defined architectures. We created something approaching a data management strategy. So we were educated, fairly painlessly, in what was needed for the different scale of operation.

And the VAX coped with it. It ran scientific and enterprise workloads on the same machines. In VAX/VMS it had a good general purpose job control language which meant that you could offer users a largely protected environment for an application: properly done, there was relatively little that needed to be left around after the job had run (logical names, peripheral assignments and so on). Though not all my colleagues were as careful; I came to realise that I’d learnt many lessons the hard way in a University service where almost anything might legitimately be run on the machine.

In the end, though, VAX became a victim of its own success. It was so easy to extend the installation, particularly when networking and clustering arrived, that the senior executive would cry “Not another VAX??” as the latest purchase request was tabled. Eventually there were so many machines in so many offices, in and out of machine rooms, that the spend was out of control and the estate became unmanageable. So a lot of the local machines were withdrawn. Facilities were consolidated into a small number of larger centres, accessed over the increasingly capable DECnet. Then, of course, it was easy to ditch the lot and install a different architecture. So began the decline.

Kenneth Harry Olsen, 20 February 1926 – 6 February 2011

• Ken Olsen obituary, Guardian, 9 Feb 2011
• Ken Olsen, Wikipedia (checked 10 Feb 2011)
• Ken Olsen, cofounder of Digital passes 2/6/2011, HP blog hum, 8 Feb 2011, with links
• PDP, VAX and other historical records from the William Bader museum website

For other links and information, use your favourite search engine!

Microsoft licensing: links

One of my regular blog triggers: I’m facilitating an event next week looking at the options for Microsoft licensing in the enterprise, as Microsoft changes its licence offerings and also begins to see the impact of Office 365 (aka BPOS+) on its user deals. One of the things that jumps out when researching this is the stream of advice about dealing with “new” Microsoft licensing strategies. It’s appeared regularly for at least the last ten years. The need to do it isn’t new; the specifics are different.

There’s Select, or there was, and now there’s Select Plus, or there will be. There are Open and Enterprise agreements, Client Access Licences (e.g. for SharePoint), and internet-based models. Microsoft are changing the landscape, and enterprises dependent (one way or another) on their software need to wise up.

I reported from the Eurodata seminar I attended ten days ago, but one of the sessions I didn’t describe was from Simon Vernoum of Bytes Software, who seems to get around and does a fair number of these sessions. I’ve found an old Bytes presentation on the web, but it’s a year out of date, and the point is that the model is changing.

So: where do you look? Corporate IT Forum members will of course be at the event next week. Guy Creese of Gartner (Burton Group subsection) has an interesting though brief blog post about the impact of Office 365: expect the Burton team to know quite a lot more. Gartner’s research reports don’t seem to cover licensing much; Forrester have some specifics. Gartner consultants probably know more, but they don’t publish.

And there are the vendors who will guide you through the morass and help get the right deal for you. Bytes is one; there are others such as SoftwareOne. It may surprise some people to realise that licence management companies are part of the Microsoft Partner Network.

• Microsoft Licensing Programs: Discover all the Options, Corporate IT Forum event, 15 Feb 2011 (member access)
• Software Licensing, from SoftwareOne
• Licensing solutions, from Bytes Software Solutions
• Microsoft Office 365: The Market Impact, Guy Creese, Gartner, 20 Oct 2010
• Five New Microsoft Licensing Twists That Every IT Buyer Should Know, Forrester Research, 9 Jul 2010 (client access only to complete report)
• Microsoft EA Enrollment Amendment Will Facilitate Move to the Cloud, Gartner, 4 Nov 2010 (client access only to complete report)
• Microsoft update in London, ITasITis, 28 Jan 2011

Mendele’ev’s Google (or vice versa)

Mendele’ev published the Periodic Table of the Elements in 1869. As a chemistry graduate, I once knew this intimately. It’s not just a classification; it’s a tool for understanding, and for prediction. And the underlying principles were good enough to cope with the discovery of elements Mendele’ev himself had never heard of: radium and polonium for a start (the work of Pierre and Marie Curie), and then all the trans-uranic elements which, on earth at least, have only been created artificially.

Well … a tweet from Frank Zimper (nice to tweet you, Frank, danke!) alerted me to this Periodic Table of the Google elements. I really like it as a chemist as well as an IT practitioner. It doesn’t just categorise; it observes many features of the “real” periodic table, like the groupings (alkali/alkaline earth elements, transition metals, and so on) and even follows the boundary between non-metals and metals in the B group on the right hand side.

Great piece of work! Enjoy it at