
SAPphire and Supernova: two reasons for a visit to Constellation 18 Jun 2014

Posted by Tony Law in Impact of IT, Insight services, IT marketplace, ITasITis, Tech Watch, Technorati.

R “Ray” Wang’s Constellation group is worth watching anyway. But just now there are a couple of good reasons.

First, if you’re a SAP user, they have coverage of the recent SAPphire conference. Remember that Ray’s primary expertise, from his days at Forrester, is in ERP. Just go to Constellation and search for “Sapphire 2014” for pre- and post-event analysis. There are of course also replays and other notes on the SAP website, if you want to go back to the originals.

Secondly, they are launching the call for this year’s Supernova innovation awards. Again, worth watching if your focus includes the what, how and who of innovation in business. As I’ve commented before, I’m not clear on the relationship between this Supernova event and the one formerly hosted by Kevin Werbach of the Wharton School (University of Pennsylvania), but Werbach’s Supernova hasn’t happened since 2010 and was described by him in 2012 as “on hold”.

Note, by the way, that their URL has changed from constellationrg.com to just constellationr.com.

Links:
• Constellation: search for Sapphire 2014
• Call for Applications: SuperNova Awards for leaders in disruptive technology, Courtney Sato, Constellation, 17 Jun 2014
• SAPPHIRE NOW 2014 (SAP Events)

Why I hate the new Google Maps 17 Apr 2014

Posted by Tony Law in Impact of IT, IT marketplace, ITasITis, Social issues, Tech Watch, Technorati, Uncategorized.

I finally allowed myself to be pushed into using the new Google Maps instead of the old familiar one.

Here are all the things that I cannot do as easily as I could before.

1 – have it open by default with my own location rather than the blanket map of the USA

2 – immediately find my own list of custom maps. It’s an extra click and I have to know that it appears as a drop down from the search bar. Custom maps have become a lot more complicated to create and manage, too, with “layers” and so on. And there’s a different set of marker icons, differently styled from the old ones. So modifying an existing map, such as the one I maintain for Brighton Early Music Festival, won’t be straightforward if I want to maintain consistent styling.

3 – sharing has changed. It used to be simple: create a map, and embed the HTML provided. Now, for example, the Brighton Early Music Festival map doesn’t properly display the venue markers. Never had a problem before. Still working on this one!

4 – “search nearby” was a simple click from the pin marker on the old version. The pin markers have got “smart”, which means that if I search for Victoria Coach Station and click or hover on the pin, what I get is a list of all the coach services which leave from there. If I right click, I get three options: Directions to here; Directions from here; and What’s here, which doesn’t seem to do anything. If I search for Ebury Street (essentially the same location) I get a pin with no smart hover at all. Either way, the marker no longer pops up the old nearby information with its Directions, Save and Search nearby options.

5 – no accessible help without going out to separate web pages; and even then the instructions don’t make sense. For example, Google says that “Search nearby” is on a drop-down you find by clicking the search box. No, it isn’t. Not in Firefox. It does, though, appear to work in Chrome. I don’t like being pushed to a different browser.

6 – having found Search nearby, I get given (of course) a set of strange, supposedly related, links. Well I suppose this is what Google does. But for me, it gets in the way.

7 – extra panels and drop-downs obscure parts of the map I’m trying to look at

Now all this, and more, is partly the natural response to changing a familiar application. Let’s assume that overall the product is fuller-featured and more flexible than the old version, and its links to the rest of Google’s information are more capable. But software vendors in general are not always good at user-oriented upgrades. Keep the backward compatibility unless there’s a really, really good reason not to. Icon redesigns, and added complexity in the user interface, are not good reasons.

I’m exploring alternatives. Apple’s new map application doesn’t have anywhere near the same level of functionality, and older offerings such as Streetmap haven’t really moved on either. But for (UK) route planning, for example, I’m now using either the AA or RAC route planner – which still have the simple, straightforward A-to-B interface.

Links:
• Google Maps (new version)
• How to search “nearby” in new Google Maps? Google Forum, 11 Jun 2013
• Google Removes “Search Nearby” Function From Updated Google Maps, contributor to Slashdot, 16 Jan 2014
• Route planners from the AA and RAC
• Streetmap (UK)

What to make of Heartbleed? 10 Apr 2014

Posted by Tony Law in Impact of IT, IT is business, IT marketplace, ITasITis, Social media, Tech Watch, Technorati.

I watched the BBC News report last night about the security hole in OpenSSL. With its conclusion that everyone should change all their passwords, now … and the old chestnut that you should keep separate passwords for every service you use, never write them down, and so on. Thankfully by this morning common sense is beginning to prevail. The Guardian passes on advice to check first whether services have been patched, and offers a link to a tool that will check a site for you.

First, as they say, other Secure Sockets Layer implementations are available. While a lot of secure web connections do rely on OpenSSL, it’s not by any means universal.

Second, as always, dig behind the news. As TechCrunch did. This is the first vulnerability to have its own website and “cool logo”; this was launched by Codenomicon in Finland, which started by creating notes for its own internal use and then took what it calls a “Bugs 2.0” approach to put the information out there. I remember doing something similar way back in Year 2000 days. Incidentally, the OpenSSL report (very brief) credits Google Security for discovering the bug. It also identifies the versions which are vulnerable. (There’s a note there that says that if users can’t upgrade to the fixed version, they can recompile OpenSSL with -DOPENSSL_NO_HEARTBEATS which, I’m guessing, gives a clue as to the naming of the bug.)
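For the technically minded: the published analyses describe the flaw as a missing bounds check on the length field of a TLS heartbeat message. Here’s a minimal sketch of that bug class in Python (the real code is C inside OpenSSL; the function names and data here are purely illustrative):

```python
# Illustrative sketch of the Heartbleed bug class -- not OpenSSL source code.
# A heartbeat message says "echo N bytes of my payload back to me"; the buggy
# code trusted the claimed length N rather than the bytes actually received.

def heartbeat_reply(memory: bytes, payload_start: int,
                    claimed_len: int, actual_len: int) -> bytes:
    # BUGGY: copies claimed_len bytes, so up to ~64 KB of adjacent process
    # memory (keys, passwords, session cookies) can leak into the reply.
    return memory[payload_start:payload_start + claimed_len]

def heartbeat_reply_fixed(memory: bytes, payload_start: int,
                          claimed_len: int, actual_len: int) -> bytes:
    # FIXED: silently discard any message whose claimed length exceeds
    # what actually arrived, which is what the patched version does.
    if claimed_len > actual_len:
        return b""
    return memory[payload_start:payload_start + claimed_len]

# The 8-byte payload is followed in memory by data never meant to go out:
memory = b"payload!" + b"SECRET_SESSION_KEY=deadbeef"
print(heartbeat_reply(memory, 0, 24, 8))        # leaks bytes beyond the payload
print(heartbeat_reply_fixed(memory, 0, 24, 8))  # b'' -- request dropped
```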

If you want real information, then, go to Heartbleed.com. The Codenomicon Q&A is posted there. In brief: this is not a problem with the specification of SSL/TLS; it’s an implementation bug in OpenSSL. It has been around a long time, but there’s no evidence of significant exploitation. A fix is already available, but needs to be rolled out.

What is clear, too, is that the BBC reporter (and some others) don’t understand the Open Source process. The Guardian asserts that “anyone can update” the code, and leads readers to suppose that someone can maliciously insert a vulnerability. Conspiracy theories suggest that this might even be part of the NSA’s attack on internet security. But of course that ain’t the case. Yes, anyone can join an Open Source project: but code updates don’t automatically get put out there. Bugs can get through, just as they can in commercial software: but testing and versioning is a pretty rigorous process.

Also, this is a server-side problem, not an end-user issue. So yes, change your passwords on key services that handle your critical resources if you’re worried, but it might be worth first checking whether they’re likely to be using OpenSSL. Your bank probably isn’t. There’s a useful list of possibly vulnerable services on Mashable (Facebook: change it; LinkedIn: no need; and so on).

And what do you do about passwords? We use so many online services and accounts that unless you have a systematic approach to passwords you’ll never cope. Personally, I have a standard, hopefully unguessable password I use for all low-criticality services; another, much stronger, for a small handful of critical and really personal ones; and a system which makes it fairly easy to recover passwords for a range of intermediate sites (rely on their Reset Password facility and keep a record of when it was last used). But also, for online purchases, I use a separate credit card with a deliberately low credit limit. Don’t just rely on technology!

Links:
• Heartbleed, The First Security Bug With A Cool Logo, TechCrunch, 9 Apr 2014
• Heartbleed bug, website from Codenomicon (Finland) – use this site for onward references to official vulnerability reports and other sources
• OpenSSL project
• The Heartbleed Hit List, Mashable, 9 Apr 2014
• Heartbleed: don’t rush to update passwords, security experts warn, Alex Hern, The Guardian, 9 Apr 2014
• Heartbleed bug: Public urged to reset all passwords, Rory Cellan-Jones (main report), BBC, 9 Apr 2014
• Test (your) server for Heartbleed, service from Filippo Valsorda as referenced in The Guardian. I’m unclear why this service is registered in the British Indian Ocean Territory (.io domain) since Filippo’s bio says he is currently attending “hacker school in NYC”. On your own head be it.

Horses for Sources: what’s with outsourcing 6 Feb 2014

Posted by Tony Law in Insight services, IT marketplace, ITasITis, Tech Watch, Technorati.

I’m on a webinar by HfS Research: my first direct encounter with Phil Fersht’s organisation. It’s a where-are-we-going session called “Outlook for the Extended Enterprise”. This post will update live, as we go.

Primarily we’re discussing “extended” in the sense of multiple outsourced operations, not of industry alliances and cooperative business. HfS’s own research, done in conjunction with KPMG, seems to be painting quite a poor picture of outsourcing value beyond running standard operations. “Talent, technology and analytics value”, Phil asserts, are frequently absent. Once the initial savings are off the books, value doesn’t develop in, for example, exploiting “big data”.

Business-enablement of IT is a gap. I’m beginning to feel like this conversation might have happened equally any time in the last ten, perhaps 20 years. What’s interesting is a breakdown of “BPO maturity” into quartiles. There seems to be a gap which companies are about to cross to get into the top quartile.

What are the problems? Fear of change; lack of vision; silo operations. The espoused change is to a centre-led organisation; the pros and cons of this haven’t been discussed though. The point’s already been made that perhaps not all enterprises can achieve effective globally-managed business services (which means IT, HR and so on). Maybe that should be “… nor should they”?

Microphone being passed to Ed Caso of Wells Fargo Securities. He’s a senior analyst and has just switched the screen to presenter split-screen. Finally got into proper presentation mode. He’s offering a survey, I think, of the key providers in the outsource market. It’s the sort of analysis which Gartner and the others started out in … Some comments about the financial situation in India and its impact; changes in some providers. And a note that a lot of early 10-year contracts are coming up for review and re-tender. There are visa and immigration issues in several major economies, which might drive more work offshore as it becomes harder to identify skilled staff entitled to work in the home country.

Enterprise-wide sourcing is linked to wider awareness of options, a portfolio approach (provider, location and skills) rather than single-source, hybrid cloud usage, and worries about data security post-Snowden (see my previous post on this). And the providers are further challenged by SMAC (Social, Mobile, Analytics, Cloud): opportunities for the providers, but long term contracts don’t fit the speed of technology development. There’s still a tendency to be more comfortable with deliverables-based contracting rather than value-based.

Another change of speaker: Mike Friend of HfS. Where Caso was US-focussed, Friend is looking at Europe in the context of some fiscal optimism. There’s a prediction for IT outsourcing to grow at around 3.5% through the next four years, and BPO at 6.1%, led by the UK market and particularly public sector spending. He’s mentioning a lot of individual companies.

So where do we go? Charles Sutherland of HfS takes over on process automation – that is, avoiding direct people costs – invoking more capable and “friendly” tools. This is still in the context of sourcing: looking for providers who can offer this as a way forward. It’s a potential differentiator in the market. Sutherland is encouraging buyers to look beyond simple cost. He’s suggesting what the signs might be that this is moving in the market, through 2014.

And the final speaker: Ned May of HfS on “the impact of digital”: the SMAC stack again, emphasising the need to embrace all four elements. The speaker does accept that “digital is not new”; indeed, I’d say it has been around at least since the inauguration of the Web in the mid 1990s. The examples seem to be describing how what goes round comes around, perhaps with a new view of its capabilities. Experimentation will change to planned projects, but skunkworks projects will be of value. This isn’t just a technology change, it’s a mindset change. Some people have been saying this for a long time!

And finally: workforce issues, with Christa Degnan Manning – who doesn’t seem to be accessible, emphasising the importance of a back channel for management issues on web calls! The issue is HR outsourcing as, like other areas, this moves to second/third generation outsourcing. Perhaps no longer primarily to support the HR practitioner, but to support and develop the employee.

The key question is whether this is still same-old outsourcing, or whether the trends discussed earlier apply here too. That is, to look for what the webinar regards as higher-maturity outsourcing: the role of talent, for example, and long term benefits; managing contractors and non-employees; connection through collaboration technologies and perhaps to the world of crowd-sourcing and micro-work contracting (think Amazon Mechanical Turk). I’m reminded of John Adair’s long-established Venn diagram depicting management as the intersection of Task, Team and Individual.

Webcast preview link: http://www.horsesforsources.com/the-hfs-2014-outlook_012814. A replay link when I have it.

Over time now, but a couple of quick questions to wrap up. The question of handling IP (I presume this means the IP that the outsource process generates): providers like to be able to re-use (perhaps by back-licensing) processes, for example, developed within a contract. A bit more elaboration about “digital”. I clearly need to figure out what HfS mean when they say “digital” but I think it means digitally-captured business information from, perhaps, unconventional, distributed, and big-data sources. And a question about how this works in a shared services model (which is not the same as global business services, even within the one enterprise).

Time to drop off the call. I’ll add some reflections, and tidy this up, tomorrow.

Facebook at 10, Microsoft at 40 5 Feb 2014

Posted by Tony Law in Cloud, Impact of IT, IT is business, IT marketplace, ITasITis, Managing IT, Social media, Technorati.

OK, a slight stretch for a snappy headline but these have been two lead stories in the last few days.

Others will comment with more depth and more knowledge than I can on either Facebook’s tenth anniversary or the appointment of Satya Nadella to succeed Steve Ballmer (and, of course, Bill Gates) at the head of Microsoft. But I was remembering, quite a while ago now, a META Group event in London when the Web was just arriving and disintermediation was a new word. The speaker took a look at the banking industry, with new on-line start-ups starting to eat the lunch of the established financial institutions.

The point was this. The new entrants invested, typically, in just two things: infrastructure, and software development. Existing players had institutional weight; they had enterprises to keep in existence, with all the corporate overheads that accumulate over time, with shareholders and stockmarket expectations and dividends. They needed to cut costs to compete with the new lean players. And (doesn’t it still happen?) they would target the IT budget. So the area of investment which differentiated their new competitors was precisely where they were dis-investing.

Microsoft is fast approaching 40. It’s a solid, established player with corporate overheads, strategies, shareholders. Is it still as lean and sharp as the company which turned on a sixpence (a dime, if you’re American; a 5p piece for the youngsters) when it “got” the Internet and realised that MSN and AOL were not going to be where most of the traffic went? Enter Internet Explorer, competing with Netscape; and the rest is history.

Well … we can look at areas in the recent past where that hasn’t been repeated. Smartphones? A lot of Windows phones have been sold, but Android and iPhone are the big players, and an Office 365 subscription gives access to Office mobile software on these platforms as well as Windows. But on the other hand: Office 365 is a good model, for both consumers and Microsoft, because it converts intermittent capital costs for what is still essential software into predictable operational costs. And while capital versus operational is the language of the enterprise, where Microsoft’s heart arguably is these days, the concept works for individual licences. There are undoubtedly challenges, but a CEO with an Indian background may have the right insight and vision to work round all that unavoidable corporate baggage.

What about Facebook? Facebook has got to the stage where it is acquiring the corporate baggage (shareholders and so on). It’s had to face up to public perception, particularly over issues like personal online security. Both companies now find themselves covered in the main news sections and financial pages, like any other corporation, rather than only in geek-tech reporting. They’ve gone mainstream.

So Facebook has new competitors in the social media space, sharper and newly innovative where Facebook is unavoidably solidifying. Microsoft is in a stable, continuing enterprise market which it understands; it appears not to understand the consumer market so well. Facebook is in precisely that consumer market, although a lot of enterprises use it to communicate with their own consumers. It’s a fashion market. What’s coming next? and how can Mark Zuckerberg stay ahead of the game?

No links here; just a personal opinion, and you can find lots of links with some easy searching!

Insight providers and market evaluation 6 Nov 2013

Posted by Tony Law in Impact of IT, Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.

This is a slightly extended version of a response in LinkedIn to Michael Rasmussen, who has published some thoughts (“a rant”) about Gartner’s Magic Quadrant.

MQ is a highly influential and long established analyst tool. As an insight services user in enterprise IT, I made use of MQs regularly and would also review similar tools such as Forrester’s Wave when a purchasing decision was being made. Like anything else, it’s essential to know just what a tool like this is, how it’s created and what it does and does not convey. The same is true of Gartner’s Hype Cycle, as I’ve commented elsewhere.

Michael highlights several concerns about Gartner’s recently updated MQ in his own area of considerable expertise, that is, governance, risk and compliance (GRC). Do read his original, which I won’t attempt to summarise; see the link below. Here’s my response.


Michael, having read the whole post in your blog, a couple of comments from a user’s perspective. First: I wholly agree that the value of Forrester’s Wave is in the open availability of both the evaluation criteria and the base data; it would be fantastic to see the same from Gartner. This isn’t just an issue of general openness. Since a user can adjust the weightings on the Forrester evaluations, it becomes a much more practical tool.

Second, I remember the moment of revelation when I realised there is a whole industry out there called Analyst Relations, that is, people employed by (big) vendors to influence the analysts. Users often don’t realise that’s how the insight market works.

Third, new approaches do emerge. I’d be interested in your take on Phil Fersht’s Blueprint methodology at Horses for Sources (HfS).

My own analysis of the insight market itself classifies providers in various dimensions. One of these looks at reach, both geographic and content: from global generalists (Gartner for example) through to niche (often start-ups – you yourself have progressed from niche to global specialist since you left Forrester). Perhaps tools like the Wave or MQ should have similar dimensions so that the innovative new providers can be properly assessed.
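An aside on the weightings point in that response: here’s a minimal sketch of why adjustable weightings matter. The vendors, criteria, scores and weights below are all invented for illustration; they are not Forrester’s (or anyone else’s).

```python
# Illustrative only: invented vendors, criteria, scores and weights.
# The same base data, re-weighted, can re-order the vendors -- which is why
# open criteria plus adjustable weightings make an evaluation tool practical.

scores = {  # vendor -> criterion -> score (0-5), hypothetical data
    "Vendor A": {"strategy": 4.5, "current offering": 3.0, "market presence": 4.0},
    "Vendor B": {"strategy": 3.0, "current offering": 4.5, "market presence": 3.5},
}

def rank(weights: dict[str, float]) -> list[tuple[str, float]]:
    """Order vendors by their weighted total under the buyer's own weights."""
    totals = {
        vendor: sum(weights[c] * s for c, s in crit.items())
        for vendor, crit in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# A strategy-led buyer and a delivery-led buyer get different "winners":
print(rank({"strategy": 0.5, "current offering": 0.3, "market presence": 0.2}))
print(rank({"strategy": 0.2, "current offering": 0.6, "market presence": 0.2}))
```

Run it and Vendor A tops the first ranking while Vendor B tops the second: same scores, different buyer priorities, different answer.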


To add a couple more points. As a technology innovation researcher, I was always well aware that small start-ups often offered innovative options which larger vendors didn’t have or hadn’t got round to. But you took the risk of the start-up falling apart, failing to deliver, or just failing. Experimental technologies always carry risk, and the options are tactical (innovation for shorter-term business benefit) not strategic. Gartner I’m sure would assert that innovation is handled by their Vision dimension in the MQ but, as Mike points out, there are thresholds and other elements which mean that these vendors don’t make it into MQs at all. HfS makes innovation explicit.

Second, in business-critical areas which are highly specific to your own business, it’s unlikely that an insight provider will know as much as you do. Don’t automatically assume that a MQ or any other tool will deliver the right answer. Use the tools, most certainly, but be prepared to reason your way to, argue for and adopt a solution which is at odds with what the tools say. You must of course be able to justify this, but the general answer may not be right for you.

Links:
• Gartner GRC Magic Quadrant Rant, Part 3, Mike Rasmussen, GRC Pundit, 23 Oct 2013
• The HfS Blueprint Methodology Explained, Jamie Snowden and others, HfS Research, Oct 2013
• GRC 20/20 research (Mike Rasmussen)

Enterprise grade public cloud: IDC’s take 19 Jun 2013

Posted by Tony Law in Cloud, Consumerization, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.

I’m on an AT&T webcast relating to public cloud infrastructure and its growth. Allow that this is primarily a US-focussed perspective. It’s AT&T sponsored, but delivered by IDC. It’s being recorded, and I’ll add the URL when it’s available.

Much of the underlying data comes from IDC’s winter 2012 CloudTrack Survey, with around 500 respondents. Five elements: the pace of change; deployment; networking; workloads; and next-generation solutions.

IDC refer to the “third platform”, as distinct from the second platform, with third-platform spend growing nearly 12% per year compared to less than 1% for the second. The third platform will account for almost 25% of this combined spend by 2020, and in the next three years spend on external services will grow to around an eighth of “traditional” IT spend. Over three quarters of North American companies are already using public cloud services.
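As a back-of-envelope check on those figures (my arithmetic, not IDC’s, and the starting shares are assumptions), here’s what the two growth rates imply, taking the webcast year of 2013 as the baseline:

```python
# Back-of-envelope check of the quoted growth figures -- my arithmetic, not IDC's.
# If third-platform spend grows ~12%/yr and second-platform spend <1%/yr, what
# starting share of combined spend reaches roughly 25% by 2020 (7 years out)?

def third_platform_share(start_share: float, years: int = 7,
                         g3: float = 0.12, g2: float = 0.01) -> float:
    third = start_share * (1 + g3) ** years         # third-platform spend, indexed
    second = (1 - start_share) * (1 + g2) ** years  # second-platform spend, indexed
    return third / (third + second)

for s in (0.10, 0.14, 0.18):  # assumed 2013 starting shares
    print(f"2013 share {s:.0%} -> 2020 share {third_platform_share(s):.1%}")
# A starting share of around 14% compounds to ~25% -- consistent with the claim.
```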

There’s a useful categorisation of cloud deployment models, with names that speak for themselves. Self-run private or managed private; dedicated (externally) hosted or virtual private cloud; or public. Running across these are the decisions about on- or off-site, and dedicated or shared infrastructure. That eighth of spend shift over the next three years depends on these decisions.

Virtual-private cloud (VPC) has clout, through additional security and control, better connectivity into corporate networks, and more controlled SLAs, but is built on public cloud infrastructure. AT&T believe shared services will command the lion’s share of the developing spend, although the split between dedicated and shared is more equal right now. This is what AT&T imply by “enterprise grade public cloud”.

Connectivity is crucial (remember, AT&T is a network company …) and there is an opportunity to connect VPC through an MPLS (multi-protocol label switching) high-availability cloud network rather than the public internet. Integration to the corporate network is close to seamless. IDC believe this option overcomes many enterprise objections to VPC cloud usage. And the CloudTrack survey suggests that any major workload coming up for reinvestment is at least going to be considered for cloud migration.

Noticeably, the workloads most likely to be moved are about the key elements of the “third platform”: social, big data (and analytics) and mobile. Where relevant, emerging markets also make a strong contribution to the importance of the third platform. Enterprises will need competencies across cloud and all these; they may not be tagged as cloud initiatives, but in these spaces cloud is crucial for developments to be effective, and those developments will be combinations of the four technology spaces. There’s a graphic for this; look in the webcast when it’s online (I’ll add the URL when it’s available).

On the half hour. Transition from the IDC analyst (Frank Gens, Senior Vice President and Chief Analyst) to Amy Machi, AT&T representative. This is a sales pitch for the combination of IBM’s Smart Cloud solution and AT&T’s VPN (NetBond), so you’ll get fewer notes. But with so much discussion about the limitations of service agreements with providers, it’s interesting that IBM trail over 70 auditable automated tasks available to clients, and cloud-based ITIL processes. Also, an important point is that AT&T will scale network capability in line with the demands on the scaleable cloud resource being claimed at IBM’s end of the wire. For anyone looking seriously at this version of the Cloud option, several case studies show the variation in possibilities.

Note, too, that at present this is a US service and users need to be AT&T customers. It will extend to Europe and Asia/Pacific relatively soon.

So: in response to questions, Frank Gens believes that investment in new capabilities will swamp legacy migration onto the third platform. And IT managers (VP/SVP) are coming to accept a reputable cloud service provider as having security at least as good as their own and possibly better, but the network has remained a vulnerability. With a managed MPLS network, rather than public infrastructure, these concerns are being mitigated.

Benchmarking: sources 17 Apr 2013

Posted by Tony Law in Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.

Tomorrow I’m facilitating a Corporate IT Forum discussion on twenty-first century benchmarking. It’s a wide topic. This post is a set of links and some comments, based on the InformationSpan database of 700 research and analyst firms. But I’m always grateful for updates: please comment!

The Forum itself operates a benchmarking service for clients, so there’s a declaration of interest to make, but I am not myself a member. Primarily this is crowd-sourced: it invites members to contribute their own data, and to compare themselves against their peers.

• Computer Economics provides a wide range of benchmarking data, not all of it financial. I’d consider it a primary source and worth a subscription. Major studies include IT Spending and Staffing Benchmarks and Worldwide Technology Trends. Their Management Advisories look at ROI and TCO, Risk Management and other topics. Too many to list here. Take a look for yourself.
• InterUnity Group “provides leading companies with strategy, competitive intelligence, and benchmarking to improve business performance.” It’s not clear what areas of benchmarking are actually covered, or whether the focus is primarily financial.
• The component services of the Corporate Executive Board will be worth investigating. Using a shared-research model for content, CEB services such as the CIO Executive Board link and correlate information and tools from clients.
• Ventana Research undertakes benchmark research as one of its primary activities, drawing information from its own community, social media and the company’s “media partners”.
• The Data Warehousing Institute undertakes benchmarking in its key area, primarily business intelligence. They publish an annual BI Benchmark Report.

Major insight firms also cover benchmarking. Gartner‘s IT Topics include Cost Optimization and IT Metrics. A search on the Forrester website also shows a wide range of coverage.

This is a rapid post in advance of the event. Look for a wider-ranging Coverage Report from InformationSpan when I’ve time to develop the theme.

ICT professional standards in the UK: what a mess 11 Apr 2013

Posted by Tony Law in IT marketplace, ITasITis, Managing IT, Technorati.

I teach a couple of Open University courses. In one of them, I’ve just got to the point where we encourage the students to work through the industry skills frameworks. The aim is to benchmark their skills and to identify both longer term career direction and short term professional development targets.

A few years ago it was confusing, but manageable. My first contact with this area was quite some years ago when the British Computer Society began to develop from an academic interest group into the professional organisation it is today. It began to review applications for membership. To benchmark (that word again) applicants’ status and career progression, it needed a framework. Out of this grew the Industry Structure Model, which identified a number of career tracks. This developed into the Skills Framework for the Information Age (SFIA), which is still a great set of definitions for ICT career people. More below, about SFIA.

When I first came back to this teaching, five years ago, the then government had created an enormous, wide-ranging family of National Occupational Standards (NOS). These were divided among a number of defined industry sectors and Sector Skills Councils. Some of the areas were fairly obvious, like Engineering. Others, perhaps less so, like Contact Centres. The general principle was a good one: that in the main, skills were only defined once. So, anyone whose role included management looked to the Management framework. It wasn’t re-defined in every profession. Anyone who used IT (and I mean, used as a user) could benchmark those skills against the IT User NOS standard. These “generic” skills were, as it were, imported into the professional portfolio which defined actual roles in real organisations.

Well, what have we now?

1. Originally, there was the overall IT Professional Competency model (e-skills Procom). This has been discontinued so far as I can tell. It now exists only in the National Archive – under the “NVQ” section although Procom is not an NVQ framework (!).

Procom provided a framework of seven disciplines:

  1. Sales and marketing
  2. Business change
  3. Programme and project management
  4. Solutions architecture
  5. Solution development and implementation
  6. Information management and security
  7. IT service management and delivery

2. Of these, disciplines 4, 5, 6, 7 are represented in the IT/Telecom Professional NOS of 2009. The SSC, e-skills UK, still exists and this framework is still current on the e-skills website. These are, though, hidden in a link right at the bottom of the page. Currently, look for “NOS” in the purple footer.

The IT/Telecom Professional framework categorises capabilities at five levels: Junior Technician; Associate Professional; Professional; Lead Professional; Senior Professional. It categorises its criteria according to Performance; Knowledge; and Understanding.

Alongside this, e-skills maintains the IT User NOS, which is valuable for almost anyone; we all use IT user skills. This framework defines three levels: Foundation, Intermediate, and Advanced. The Advanced level overlaps into the IT Professional framework, covering user application development (Access, say, or Excel). This is also the framework where you’ll find user skills with software, be they office tools or specialised business applications.

3. The Skills Framework for the Information Age (SFIA) still exists and is now at version 5. It’s available as a spreadsheet download.

SFIA defines the following skill areas:

  • Strategy and architecture
  • Business Change
  • Solution development and implementation
  • Service Management
  • Procurement & management support
  • Client interface (i.e. sales & marketing)

It defines levels from 1 (junior) to 7 (which equates to senior management or CIO). Not all cells in the model have definitions at all levels: for example, within Strategy & Architecture the cell “Corporate governance of IT” begins at level 6. SFIA does have the advantage that it encompasses management to the most senior levels as well as technical capabilities.
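Because not every skill is defined at every level, SFIA is effectively a sparse grid, which is easy to see if you sketch it as a data structure. In the sketch below, only the “Corporate governance of IT begins at level 6” entry comes from the paragraph above; the other skills and level ranges are invented for illustration.

```python
# SFIA as a sparse skill-by-level grid. Only "Corporate governance of IT"
# reflects the text above; the other entries are invented examples.

sfia_levels = {
    "Corporate governance of IT": {6, 7},    # begins at level 6 (per the example)
    "Systems development": {2, 3, 4, 5, 6},  # hypothetical range
    "IT strategy and planning": {5, 6, 7},   # hypothetical range
}

def next_target(skill: str, current_level: int) -> int | None:
    """The lowest defined level above your current one, or None if topped out."""
    higher = [lvl for lvl in sfia_levels.get(skill, set()) if lvl > current_level]
    return min(higher) if higher else None

print(next_target("Systems development", 3))         # -> 4
print(next_target("Corporate governance of IT", 4))  # -> 6: nothing defined at 5
```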

4. Since late 2012 there is the IT Skills Academy. It is itself confusing.

First, it references a full set of role descriptions in its Standards section. The rubric says that “The IT Professional Standards have been organised and aligned to the relevant SFIA skills and levels.” What this actually means is that the Standards are not aligned to SFIA, but there is a correlation table showing where matches have been identified.

They are not aligned to the NOS either. Again, some areas map across although the names are not quite the same. The disciplines here are:

  • Architecture, Analysis & Design
  • Business Change
  • Information Management and Security
  • IT Project Management
  • IT Service Management and Delivery
  • Sales & Marketing
  • Solution Development & Implementation
  • Transferable Competencies (three flavours: Personal, Business and Leadership).

The sub-categories of each discipline have definitions from level 3 to level 6. The definitions are, like the NOS, divided as Performance; Knowledge; Understanding.

The Transferable section is well worth having. With the change to the NOS database overall, these general skills are now much harder to find elsewhere.

5. The Skills Academy website also offers the Professional Profile. This matches the categories and levels (3-6) of the Framework, but the descriptions are considerably simplified with a handful of “Do you do these things?” criteria.

6. Finally there is what you get to from the new NOS website. Searching this website is now far inferior to what used to be provided. The Search delivers only PDF documents for individual “cells” in the overall model, with titles such as “Software Development Level 5 Role”. Note the use of “Level 5”, which is not the categorisation used in the NOS. The content appears to be cloned from the NOS, but the sub-elements have been reorganised and you have to look at the content to infer that Level 5 equates to Professional.

There’s no link, as there used to be, back from these framework documents to the Sector Council or to the overall Suite, and there’s no search which will identify appropriate suites for a capability (as was the case on the old NOS website). The Search page links to indexes for both “Occupations” and “Suites”, but this assumes you already know what you’re looking for …

This is a horribly confused and confusing situation.

Links:
• IT Professional Competency model (e-skills Procom), in the National Archive
• e-skills NOS page: look for links to IT/Telecom Professional and IT User frameworks
• Skills Framework for the Information Age (SFIA)
• IT Skills Academy: IT Professional Standards, and the simplified My IT Professional Profile tool
• See: National Skills Academy framework backed by UK employers, Computer Weekly, 4 Oct 2012
• The NOS website is now maintained by the UK Commission for Employment and Skills (UKCES). The former URL (ukstandards.org.uk) redirects here.
• The NOS Search page offers indexes, not searches. It has tabs for Organisations, Occupations and Suites.

Some Open Source notes 9 Feb 2013

Posted by Tony Law in Consumerization, IT marketplace, ITasITis, Tech Watch, Technorati.

In my persona as an Associate Lecturer of the Open University, I promised some brief notes on Open Source software to help a colleague who’s leading a Staff Development workshop in a couple of weeks’ time.

Educational providers always need to find workable inexpensive software to provision their students. Around 1990 I taught the first Open University course which took ICT facilities to the students in their homes, rather than requiring them to book time on terminals hosted by friendly local institutions. The DT200 course existed in the days of DOS, but it used an early on-screen word processor (FirstWordPlus on the GEM GUI), a cut-down version of the Lotus 1-2-3 spreadsheet, and the CoSy conferencing system. The configuration was an Amstrad 640 with two 5.25 inch floppy drives and no hard disk. Oh, and the mouse port was on the left hand side which is why, more than 20 years later, I still use my mouse left handed.

I promised some notes, as I said. And I thought I’d share them more widely. I use Open Source software quite freely but nothing startling. I also use other freeware and a handful of niche purchased products, such as Graphic Converter for the relatively limited image manipulation I need to do.

My main OU course now is the ICT foundation course which introduces students to a range of practical ICT tools as well as the social and global context in which the technologies operate. It uses Audacity for audio recording, which I’d been using for some time already for creating podcasts for students on another course. It uses FreeMind for mind maps. Alongside this it uses tools like Picasa for image manipulation which is free (from Google) but of course isn’t Open Source.

I use a Mac but sometimes run it as a Windows machine using Boot Camp. On Windows I don’t maintain a Microsoft Office licence, so I use Open Office. While there are some compatibility issues with on-screen presentation, I haven’t hit any significant problems. I know there are some, but they haven’t affected anything I’ve needed to do. I use the VLC media player on Mac for Windows Media Player formats, since Microsoft no longer makes a player for Mac.

The Firefox browser and other elements of the Mozilla family are of course Open Source, and Firefox is my browser of choice. I use the internal web server on my Mac, which is a version of Apache.

For application development I use Cincom Smalltalk, which is a full object-oriented environment; although it’s commercially owned, it’s developed by its OS community. I learned Smalltalk, also 20 years ago, when working on a collaborative academic-industry research project and I still love it.

Working in industry, as I did until recently, I encountered a lot of suspicion about Open Source. More recently I think it’s abated somewhat but it’s still there.

The debate around OS in the commercial IT sector focusses on accountability – not knowing who is accountable for quality or who can be sued (to put it bluntly) for any real problems. It’s difficult for procurement-minded professionals to accept that a community of interest is likely to have higher quality standards and to identify and fix problems more quickly than a major for-profit software supplier.

This attitude has softened over the past several years, not least because some software (such as Apache and Linux) has become widely used in the enterprise. To my reading there are (at least) two reasons. Cost (obviously) but also licensing.

It’s a lot easier to promote a web service when you don’t have to license according to the number of users. Quality has become a given for the most widely used products. Security can be easier to assure and handle when there can be access to source code. And acquiring OS software through a distributor does offer some assurance of quality. There have been some high profile espousals of OS software, such as Linux or Open Office in government departments which are supremely cost-conscious, but these haven’t had an enormous impact in the wider commercial marketplace.

What is, I think, true is that as more specialised niche requirements have been accepted within the enterprise, there’s a recognition that either open source or niche (= small startup) providers may be the only route to a solution. Someone, somewhere, has created an open source community around your need.

There are various definitions of what constitutes Open Source. By one definition, a specification is “Open” if it is published, so that it can be used by other platforms – as other word processing software can create documents in Microsoft’s format. Conversely, the Open Document Format was defined through an open process: but isn’t yet accepted as the leading standard for interoperability. This is the open process I learned about through my participation in the Object Management Group’s work. Building consensus and reconciling different viewpoints, including those of commercial developers, takes time: but there is often a strong academic foundation, and academic rigour often sustains a longer-lasting and more effective standard. Or, again, there is development through an open community which brings many minds to bear on problems; which converges on useful solutions; but which can become self-perpetuating so that the vision does not always grow or, where necessary, change.

Links:
• Sourceforge: one of the strongest groups of Open source communities
• Sourceforge is host to Audacity and to FreeMind
• Linux (of course)
• Apache (the Apache Software Foundation) also hosts Open Office
• Mozilla for Firefox, Thunderbird and more
• Smalltalk (see this page for versions)
• VLC media player
Commercial products:
• Graphic Converter from Lemkesoft
• Picasa from Google
History:
• Gem Desktop
