Insight providers and market evaluation 6 Nov 2013 Posted by Tony Law in Impact of IT, Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
This is a slightly extended version of a response in LinkedIn to Michael Rasmussen, who has published some thoughts (“a rant”) about Gartner’s Magic Quadrant.
The MQ is a highly influential and long-established analyst tool. As an insight services user in enterprise IT, I made use of MQs regularly and would also review similar tools such as Forrester’s Wave when a purchasing decision was being made. Like anything else, it’s essential to know just what a tool like this is, how it’s created, and what it does and does not convey. The same is true of Gartner’s Hype Cycle, as I’ve commented elsewhere.
Michael highlights several concerns about Gartner’s recently updated MQ in his own area of considerable expertise, that is, global risk and compliance (GRC). Do read his original, which I won’t attempt to summarise; see the link below. Here’s my response.
Michael, having read the whole post in your blog, a couple of comments from a user’s perspective. First: I wholly agree that the value of Forrester’s Wave lies in the open availability of both the evaluation criteria and the base data; it would be fantastic to see the same from Gartner. This isn’t just an issue of general openness. Since a user can adjust the weightings on the Forrester evaluations, it becomes a much more practical tool.
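To see why adjustable weightings matter, here is a minimal sketch of the idea. The criteria, vendors and scores are invented for illustration; this is not Forrester’s actual model, just the general weighted-scoring technique it embodies.

```python
# Illustrative sketch of a re-weightable vendor-evaluation matrix.
# Criteria names and scores are invented, not Forrester data.

def rank_vendors(scores, weights):
    """Return vendors sorted by weighted score, highest first."""
    totals = {
        vendor: sum(weights[c] * s for c, s in criteria.items())
        for vendor, criteria in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

scores = {
    "Vendor A": {"strategy": 4, "current offering": 3, "market presence": 5},
    "Vendor B": {"strategy": 3, "current offering": 5, "market presence": 2},
}

# The analyst's default weights favour market presence...
default = {"strategy": 0.3, "current offering": 0.3, "market presence": 0.4}
# ...but a buyer who cares most about the current offering can re-weight:
adjusted = {"strategy": 0.2, "current offering": 0.6, "market presence": 0.2}

print(rank_vendors(scores, default))    # Vendor A comes out on top
print(rank_vendors(scores, adjusted))   # Vendor B comes out on top
```

The point is that the same base data yields a different ranking under the buyer’s own weights, which is exactly why open criteria and data make a tool practical rather than merely indicative.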
Second, I remember the moment of revelation when I realised there is a whole industry out there called Analyst Relations, that is, people employed by (big) vendors to influence the analysts. Users often don’t realise that’s how the insight market works.
Third, new approaches do emerge. I’d be interested in your take on Phil Fersht’s Blueprint methodology at Horses for Sources (HfS).
My own analysis of the insight market itself classifies providers in various dimensions. One of these looks at reach, both geographic and content: from global generalists (Gartner for example) through to niche (often start-ups – you yourself have progressed from niche to global specialist since you left Forrester). Perhaps tools like the Wave or MQ should have similar dimensions so that the innovative new providers can be properly assessed.
To add a couple more points. As a technology innovation researcher, I was always well aware that small start-ups often offered innovative options which larger vendors didn’t have or hadn’t got round to. But you took the risk of the start-up falling apart, failing to deliver, or simply failing. Experimental technologies always carry risk, and the options are tactical (innovation for shorter-term business benefit) not strategic. Gartner I’m sure would assert that innovation is handled by their Vision dimension in the MQ but, as Mike points out, there are thresholds and other elements which mean that such vendors don’t make it into MQs. HfS makes innovation explicit.
Second, in business-critical areas which are highly specific to your own business, it’s unlikely that an insight provider will know as much as you do. Don’t automatically assume that an MQ or any other tool will deliver the right answer. By all means use the tools, but be prepared to reason your way to, argue for and adopt a solution which is at odds with what the tools say. You must of course be able to justify this, but the general answer may not be right for you.
• Gartner GRC Magic Quadrant Rant, Part 3, Mike Rasmussen, GRC Pundit, 23 Oct 2013
• The HfS Blueprint Methodology Explained, Jamie Snowden and others, HfS Research, Oct 2013
• GRC 20/20 research (Mike Rasmussen)
Business Process Improvement 17 Sep 2013 Posted by Tony Law in Impact of IT, IT is business, ITasITis, Managing IT, Technorati, Uncategorized.
Working for GlaxoSmithKline IT, after the 2000 merger, developed my familiarity with business process improvement (small letters) and with Six Sigma methods and metrics. I would never call myself an expert. Routine training was to Green Belt level, without taking the qualifying exam, and I don’t have the instincts which make a leading practitioner able to pick the right tools to adopt for any specific need.
But it taught me a lot, which can be applied well beyond IT. First: as a previous CEO used to say, “If you don’t keep score, you’re only practising”. So, to drive and verify an improvement, you need metrics. But pick the right ones, which will show you where you are. Establish your baseline before you start doing anything. Use the metrics to demonstrate the change (you hope!). And when the improved process has reached the status of business-as-usual, you can probably drop the measure. It’s no longer needed.
Second: a saying that was drummed into us. “Don’t tinker!”. Don’t make changes on the basis of “I think …” without the analysis. Don’t over-react to one-off incidents: processes have variability, and some outliers will happen naturally.
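The “don’t tinker” rule has a simple quantitative form: act only when a data point falls outside the process’s natural variation. A minimal sketch, using the conventional three-sigma control limits from Six Sigma practice (the metric and figures here are invented for illustration):

```python
# "Don't tinker": treat a point as a signal only if it lies outside the
# process's natural variation (conventional 3-sigma control limits).
# The metric (ticket close time in days) and data are invented.
from statistics import mean, stdev

def out_of_control(history, new_value, k=3.0):
    """True only if new_value lies outside mean +/- k standard deviations."""
    m, s = mean(history), stdev(history)
    return abs(new_value - m) > k * s

ticket_close_days = [4.1, 3.8, 4.4, 4.0, 3.9, 4.3, 4.2, 3.7]
print(out_of_control(ticket_close_days, 4.6))   # False: natural variation
print(out_of_control(ticket_close_days, 9.0))   # True: a genuine signal
```

A one-off 4.6 is just the process breathing; reacting to it would be tinkering. The 9.0 is worth investigating.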
And third: develop and demonstrate your own (internal IT) understanding and improvements before you try to work with the rest of the business. IT has, perhaps, a unique overview of what goes on across the company, and is almost always a participant in any business improvement project. So there’s good leverage there: but you have to gain credibility first. It takes a lot to get to the point where, when a business leader asks for an IT development, you can say “Why? What improvement are you driving? Who will own it? How will you measure it?”
Well: tomorrow I’m facilitating a Corporate IT Forum event on Business Process Improvement (BPI). I’m expecting the twin threads of, first, identifying and improving IT’s own processes; and, second, putting that experience and expertise at the service of the business as a whole. Where are the sources of information and analysis?
Gartner have a Leaders Key Initiative on BPI. The overview, as recent as July this year, has a natty graphic showing the BPI practitioner as a juggler (operations, transformation, skills, technology and innovation) under pressure from both business and technology forces. They offer a number of tools for maturity assessment “across IT disciplines” (what about the rest-of-business?); key metrics (that’s IT spending and staffing, not how to measure a process); and best practices across several competencies. Towards the end, though, it seems to lapse back into business process management (BPM) rather than BPI.
There isn’t a lot in the Gartner blogs, but a useful post from Samantha Searle earlier this year challenges us to avoid the word “Process” (unless your business-side colleagues are process engineers or in manufacturing). That rather gels with the observation that Gartner probably maintain, under the covers, an IT-oriented focus: Process is very present in the key initiative!
Similarly I don’t find a great deal in Forrester specifically around BPI. But there’s a stronger focus on the interplay of IT expertise and whole-business improvement. A recent report, for example, discusses the shift from “a tactical process improvement charter” to a more strategic role across the enterprise. This requires a plan “for optimizing the BPM practice to deliver on new strategic drivers and business objectives”. That sounds more like it.
Interestingly, a search turned up a University of Cambridge link which I expected to point to the business school or computer science. But it’s to their internal management services division, with a one-page (one-slide, really) graphic and definition of BPI. Take a look. The Judge Institute of Management Studies does, though, have a Centre for Process Excellence and Innovation, also worth reviewing.
There’s a lot of material you can find by searching. Too much to survey. Assess with care!
• Business Process Improvement Leaders Key Initiative Overview, Gartner, 25 Jul 2013 (search Gartner for ID:G00251230)
• 10 New Year Resolutions for BPM Practitioners #2: Don’t Mention the “P-word …, Samantha Searle, Gartner blogs, 8 Feb 2013
• Optimize Your Business Process Excellence Program To Meet Shifting Priorities, Clay Richardson, Forrester report, 6 Jun 2013
• Business Process Improvement, University of Cambridge, Management and Information Services Division (undated)
• Centre for Process Excellence and Innovation, Judge Institute, University of Cambridge
Overdue update: Gartner blog index 9 Aug 2013 Posted by Tony Law in Insight services, ITasITis, Managing IT, Technorati.
I’ve finally done a full update on the Gartner Blogs index published on informationspan.com. There are three significant changes (as well as the normal turnover of analysts).
- Gartner have introduced three new areas within their Markets coverage (that is, the area for IT sales professionals): Digital Marketing; Servers & Storage – Comparative Hardware; and Servers & Storage – Competitive Positioning. The technical Servers and Storage area is unchanged.
- Digital Marketing has become the first area within Gartner’s Marketing area to offer blogs.
- the former Burton Group family of services, which has been marketed as Gartner IT1, now comes under the heading Gartner for Technical Professionals. There’s only one IT1 blog at the moment. But I’ve discovered that the legacy Burton blog content, which I had thought was deleted, is mostly still accessible. Their last content was posted in early 2010, but they may still have value.
As a result of this, I’ve made changes to the structure of the blog index.
- I’ve split the index of blogs by coverage area into two: one containing the technology-related blogs, and the second the remainder, which now comprises Gartner Services and Management; those with the Vertical Industry focus; and a new section for Marketing.
- I’ve re-introduced a page linking to the legacy Burton Group blogs; one of them (Identity and Privacy) has completely disappeared but the others are still reachable.
Gartner Services and Management currently includes a handful of blogs from Executive Program advisors; one blog from the Supply Chain service (developed after the integration of AMR, of which nothing identifiable now remains); and a long-moribund but still accessible blog by my old META Group acquaintance Val Sribar, now a Gartner GVP.
I’ve also refreshed the list of blogs indexed by the custom Google Search of Gartner blogs, which appears on the lead page. Visit http://www.informationspan.com/analystblogs.htm.
Just to remind you: you can use this index for all sorts of functions Gartner don’t provide:
- go straight to your favourite analyst’s blog
- see whether a blog has (reasonably) recent content without having to visit it
- look for blogs on specific Gartner coverage areas
- find blogs which aren’t included in Gartner’s Blog Home page
- search specifically across the entire Gartner Blog space
Please tell me how you use this index, and how it might improve.
Enterprise grade public cloud: IDC’s take 19 Jun 2013 Posted by Tony Law in Cloud, Consumerization, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
I’m on an AT&T webcast relating to public cloud infrastructure and its growth. Bear in mind that this is primarily a US-focused perspective. It’s AT&T sponsored, but delivered by IDC. It’s being recorded, and I’ll add the URL when it’s available.
Much of the underlying data comes from IDC’s winter 2012 CloudTrack Survey, with around 500 respondents. Five elements: the pace of change; deployment; networking; workloads; and next-generation solutions.
IDC refer to the “third platform”, as distinct from the second, with spend growing nearly 12% per year compared with less than 1% for the second platform. The third platform will account for almost 25% of this combined spend by 2020, and in the next three years spend on external services will grow to around an eighth of “traditional” IT spend. Over three quarters of North American companies are already using public cloud services.
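Those modest-sounding growth rates compound quickly. A back-of-envelope sketch, where the starting split (15/85, in arbitrary units) is my own assumption for illustration, not an IDC figure:

```python
# Back-of-envelope compounding of the growth rates quoted above:
# roughly 12%/yr for third-platform spend against 1%/yr for the second
# platform. The 15/85 starting split is an assumption, not IDC data.
third, second = 15.0, 85.0
for _ in range(8):          # compound 2013 through 2020
    third *= 1.12
    second *= 1.01
share = third / (third + second)
print(f"third-platform share of combined spend by 2020: {share:.0%}")
```

With these assumed starting shares the third platform ends up around 29% of combined spend, the same ballpark as the figure IDC quote; the arithmetic shows how a 12% versus 1% growth gap does most of the work.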
There’s a useful categorisation of cloud deployment models, with names that speak for themselves. Self-run private or managed private; dedicated (externally) hosted or virtual private cloud; or public. Running across these are the decisions about on- or off-site, and dedicated or shared infrastructure. That eighth of spend shift over the next three years depends on these decisions.
Virtual-private cloud (VPC) has clout, through additional security and control, better connectivity into corporate networks, and more controlled SLAs, but is built on public cloud infrastructure. AT&T believe shared services will command the lion’s share of the developing spend, although the split between dedicated and shared is more equal right now. This is what AT&T imply by “enterprise grade public cloud”.
Connectivity is crucial (remember, AT&T is a network company …) and there is an opportunity to connect VPC through an MPLS (multi-protocol label switching) high-availability cloud network rather than the public internet. Integration with the corporate network is close to seamless. IDC believe this option overcomes many enterprise objections to VPC cloud usage. And the CloudTrack survey suggests that any major workload coming up for reinvestment is at least going to be considered for cloud migration.
Notably, the workloads most likely to be moved are about the key elements of the “third platform”: social, big data (and analytics) and mobile. Where relevant, emerging markets also make a strong contribution to the importance of the third platform. Enterprises will need competencies across cloud and all these; they may not be tagged as cloud initiatives, but in these spaces cloud is crucial for developments to be effective, and those developments will be combinations of the four technology spaces. There’s a graphic for this; look in the webcast when it’s online (I’ll add the URL when it’s available).
On the half hour. Transition from the IDC analyst (Frank Gens, Senior Vice President and Chief Analyst) to Amy Machi, AT&T representative. This is a sales pitch for the combination of IBM’s Smart Cloud solution and AT&T’s VPN (NetBond), and you’ll get fewer notes. But with so much discussion about the limitations of service agreements with providers, it’s interesting that IBM trail over 70 auditable automated tasks available to clients, and cloud-based ITIL processes. Also, an important point is that AT&T will scale network capability in line with the demands on the scalable cloud resource being claimed at IBM’s end of the wire. For anyone looking seriously at this version of the Cloud option, several case studies show the variation in possibilities.
Note, too, that at present this is a US-only service, and users need to be AT&T customers. It will extend to Europe and Asia/Pacific relatively soon.
So: in response to questions, Frank Gens believes that investment in new capabilities will swamp legacy migration onto the third platform. And IT managers (VP/SVP) are coming to accept a reputable cloud service provider as having security at least as good as their own and possibly better, but the network has remained a vulnerability. With a managed MPLS network, rather than public infrastructure, these concerns are mitigated.
Benchmarking: sources 17 Apr 2013 Posted by Tony Law in Insight services, IT marketplace, ITasITis, Managing IT, Tech Watch, Technorati.
Tomorrow I’m facilitating a Corporate IT Forum discussion on twenty-first century benchmarking. It’s a wide topic. This post is a set of links and some comments, based on the InformationSpan database of 700 research and analyst firms. But I’m always grateful for updates: please comment!
The Forum itself operates a benchmarking service for clients, so there’s a declaration of interest to make, though I am not myself a member. Primarily this is crowd-sourced: it invites members to contribute their own data, and to compare themselves against their peers.
• Computer Economics provides a wide range of benchmarking data, not all financial. I’d consider it a primary source and worth a subscription. Major studies include IT Spending and Staffing Benchmarks and Worldwide Technology Trends. Their Management Advisories look at ROI and TCO, Risk Management and other topics. Too many to list here. Take a look for yourself.
• InterUnity Group “provides leading companies with strategy, competitive intelligence, and benchmarking to improve business performance.” It’s not clear what areas of benchmarking are actually covered, or whether the focus is primarily financial.
• The component services of the Corporate Executive Board will be worth investigating. Using the Researched Sharing model for content, CEB services such as the CIO Executive Board link and correlate information and tools from clients.
• Ventana Research undertakes benchmark research as one of its primary activities, drawing information from its own community, social media and the company’s “media partners”.
• The Data Warehousing Institute undertakes benchmarking in its key area, primarily business intelligence. They publish an annual BI Benchmark Report.
This is a rapid post in advance of the event. Look for a wider-ranging Coverage Report from InformationSpan when I’ve time to develop the theme.
I teach a couple of Open University courses. In one of them, I’ve just got to the point where we encourage the students to work through the industry skills frameworks. The aim is to benchmark their skills and to identify both longer term career direction and short term professional development targets.
A few years ago it was confusing, but manageable. My first contact with this area was quite some years ago when the British Computer Society began to develop from an academic interest group into the professional organisation it is today. It began to review applications for membership. To benchmark (that word again) applicants’ status and career progression, it needed a framework. Out of this grew the Industry Structure Model, which identified a number of career tracks. This developed into the Skills Framework for the Information Age (SFIA), which is still a great set of definitions for ICT career people. More below, about SFIA.
When I first came back to this teaching, five years ago, the then government had created an enormous, wide-ranging family of National Occupational Standards (NOS). These were divided among a number of defined industry sectors and Sector Skills Councils. Some of the areas were fairly obvious, like Engineering. Others, perhaps less so, like Contact Centres. The general principle was a good one: that in the main, skills were only defined once. So, anyone whose role included management looked to the Management framework. It wasn’t re-defined in every profession. Anyone who used IT (and I mean, used as a user) could benchmark those skills against the IT User NOS standard. These “generic” skills were, as it were, imported into the professional portfolio which defined actual roles in real organisations.
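The “define once, import” principle behind the NOS can be sketched in a few lines. The framework and skill names below are illustrative inventions, not official NOS entries; the point is only that a role’s portfolio references shared frameworks rather than redefining each skill.

```python
# Sketch of the NOS "define once, import" principle: a role portfolio
# pulls generic skills from shared frameworks rather than redefining
# them per profession. Names here are invented for illustration.
frameworks = {
    "Management": ["Plan workloads", "Lead a team"],
    "IT User": ["Use spreadsheets", "Manage files"],
    "IT Professional": ["Design a database", "Test software"],
}

def build_role(imports):
    """Assemble a role's portfolio from (framework, skill) references,
    keeping only references that resolve to a defined skill."""
    return [skill for fw, skill in imports if skill in frameworks.get(fw, [])]

# A hypothetical service-desk lead imports from three shared frameworks:
service_desk_lead = build_role([
    ("Management", "Lead a team"),        # imported, not redefined
    ("IT User", "Use spreadsheets"),
    ("IT Professional", "Test software"),
])
print(service_desk_lead)
```

Because every role resolves its management or IT-user skills against the same shared definition, the benchmark stays consistent across professions, which was the strength of the original NOS design.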
Well, what have we now?
1. Originally, there was the overall IT Professional Competency model (e-skills Procom). This has been discontinued so far as I can tell. It now exists only in the National Archive – under the “NVQ” section although Procom is not an NVQ framework (!).
Procom provided a framework of seven disciplines:
1. Sales and marketing
2. Business change
3. Programme and project management
4. Solutions architecture
5. Solution development and implementation
6. Information management and security
7. IT service management and delivery
2. Of these, disciplines 4, 5, 6, 7 are represented in the IT/Telecom Professional NOS of 2009. The SSC, e-skills UK, still exists and this framework is still current on the e-skills website. They are, though, hidden behind a link right at the bottom of the page. Currently, look for “NOS” in the purple footer.
The IT/Telecom Professional framework categorises capabilities at five levels: Junior Technician; Associate Professional; Professional; Lead Professional; Senior Professional. It categorises its criteria according to Performance; Knowledge; and Understanding.
Alongside this, e-skills maintains the IT User NOS, which is valuable for almost anyone; we all use IT user skills. This framework defines three levels: Foundation, Intermediate, and Advanced. The Advanced level overlaps into the IT Professional framework, covering user application development (Access, say, or Excel). This is also the framework where you’ll find user skills with software, be they office tools or specialised business applications.
3. The Skills Framework for the Information Age (SFIA) still exists and is now at version 5. It’s available as a spreadsheet download.
SFIA defines the following skill areas:
- Strategy and architecture
- Business Change
- Solution development and implementation
- Service Management
- Procurement & management support
- Client interface (i.e. sales & marketing)
It defines levels from 1 (junior) to 7 (which equates to senior management or CIO). Not all cells in the model have definitions at all levels: for example, within Strategy & Architecture the cell “Corporate governance of IT” begins at level 6. SFIA does have the advantage that it encompasses management to the most senior levels as well as technical capabilities.
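A SFIA-style matrix is naturally sparse: skills define descriptors only at certain levels, so any tooling over it must cope with missing cells. A minimal sketch (the second skill name and all level ranges and descriptor texts are illustrative, not quoted from SFIA; the “Corporate governance of IT” starting level comes from the observation above):

```python
# Hypothetical fragment of a SFIA-style skills matrix. Each skill
# defines descriptors only at some of levels 1-7, so lookups must
# handle missing cells. Descriptor texts are invented placeholders.
matrix = {
    # Begins at level 6, as noted above:
    "Corporate governance of IT": {6: "descriptor...", 7: "descriptor..."},
    # An assumed mid-range skill for contrast:
    "Example development skill": {2: "...", 3: "...", 4: "...", 5: "..."},
}

def descriptor(skill, level):
    """Return the descriptor, or None where the framework defines no cell."""
    return matrix.get(skill, {}).get(level)

print(descriptor("Corporate governance of IT", 6) is not None)  # cell exists
print(descriptor("Corporate governance of IT", 3) is None)      # no cell below 6
```

This sparseness is a feature, not a gap: it encodes the fact that some capabilities simply have no meaningful junior-level form.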
4. Since late 2012 there is the IT Skills Academy. It is itself confusing.
First, it references a full set of role descriptions in its Standards section. The rubric says that “The IT Professional Standards have been organised and aligned to the relevant SFIA skills and levels.” What this actually means is that the Standards are not aligned to SFIA, but there is a correlation table showing where matches have been identified.
They are not aligned to the NOS either. Again, some areas map across although the names are not quite the same. The disciplines here are:
- Architecture, Analysis & Design
- Business Change
- Information Management and Security
- IT Project Management
- IT Service Management and Delivery
- Sales & Marketing
- Solution Development & Implementation
- Transferable Competencies (three flavours: Personal, Business and Leadership).
The sub-categories of each discipline have definitions from Level 3 to Level 6. The definitions are, like the NOS, divided as Performance; Knowledge; Understanding.
The Transferable section is well worth having. With the change to the NOS database overall, these general skills are now much harder to find elsewhere.
5. The Skills Academy website also offers the Professional Profile. This matches the categories and levels (3-6) of the Framework, but the descriptions are considerably simplified with a handful of “Do you do these things?” criteria.
6. Finally there is what you get to from the new NOS website. Searching this website is now far inferior to what used to be provided. The Search delivers only PDF documents for individual “cells” in the overall model, with titles such as “Software Development Level 5 Role”. Note the use of “Level 5”, which is not the categorisation used in the NOS. The content appears to be cloned from the NOS, but the sub-elements have been reorganised and you have to look at the content to infer that Level 5 equates to Professional.
There’s no link, as there used to be, back from these framework documents to the Sector Council or to the overall Suite, and there’s no search which will identify appropriate suites for a capability (as was the case on the old NOS website). The Search page links to indexes for both “Occupations” and “Suites”, but this assumes you already know what you’re looking for …
This is a horribly confused and confusing situation.
• IT Professional Competency model (e-skills Procom), in the National Archive
• e-skills NOS page: look for links to IT/Telecom Professional and IT User frameworks
• Skills Framework for the Information Age (SFIA)
• IT Skills Academy: IT Professional Standards, and the simplified My IT Professional Profile tool
• See: National Skills academy framework backed by UK employers, Computer Weekly, 4 Oct 2012
• The NOS website is now maintained by the UK Commission for Employment and Skills (UKCES). The former URL (ukstandards.org.uk) redirects here.
• The NOS “Search” page offers indexes, not search; it has tabs for Organisations, Occupations and Suites.
Anatomy of a crash (2) 4 Apr 2012 Posted by Tony Law in ITasITis, Managing IT, Technorati.
So … New iMac with OSX Lion, installed and working. I’m taking the time to reinstall stuff as needed, and keeping a system audit as I go.
In no particular order, here are a few significant issues.
Problem: new machine has a FireWire 800 port, not FireWire 400. Need to connect to the backup disk to restore stuff. Old FireWire cable doesn’t fit; then discovered there’s more than one FW 800 connector and bought the wrong cable online. Went into the Brighton Apple Store and got the correct cable. Sent the old one back.
Problem: when opening a document with any software (Word, Excel, Preview, anything …) multiple “old” documents open with it. Cause: Lion has a new “feature” which, when an application is opened, “restores” its old windows. Aggravating. Cure: in system settings, turn the feature off.
Problem: Blackboard Collaborate (Elluminate), which is crucial for my Open University work, isn’t fully compatible with Lion. Application sharing causes Elluminate to crash, which my students didn’t appreciate. Temporary fix: present sessions from my laptop, which is still on Snow Leopard. Cure: wait for the vendor to fix this; it’s a known problem.
Problem (this one was anticipated): installing Windows under Boot Camp causes a licence problem. Through my old machine I have a licence for XP, and it would be legitimate to transfer this to the new machine. However, Apple tell me XP won’t install on Boot Camp under Lion, so I bought a Windows 7 upgrade pack. As I expected, activation doesn’t recognise either the old XP code or the new Win7 code. This is despite Microsoft’s advice that upgrading from XP needs to be a clean install. Asked Microsoft for help; so far, they’ve referred me to a US West Coast call centre though, to be fair, it does come on stream at 5 a.m. their time (so 4pm here, as they haven’t gone to Summer Time yet). Ongoing.
Something I expected to experience as a problem that isn’t: I decided to bite the bullet, abandon the old Entourage Microsoft mail client and upgrade to the Office 2011 version now called Outlook. I’ve stayed on Entourage 2004, primarily because of a useful feature. If I drag a mailbox to the desktop, it saves an archive copy. When I’m going to an event, I use this to transfer the relevant email threads to my laptop in case of questions. Entourage 2008 didn’t have it. But hey presto, Outlook 2011 has brought it back. And I like the new client. Unexpected benefit.
I did look at Apple’s migration assistant. But it’s not sufficiently granular for the selective migration actions I want to take. So some things like Calendar and Address Book get manually migrated. Address Book is easy; just move the folder, and get used to the new Apple interface which actually, once adjusted, is ok. Calendars get migrated one calendar group at a time; this requires some careful adjustment of preferences (“Put imported events into …”) but I only have a handful of calendar groups so it’s not a big deal. Here, though, not so sure about the new interface. The list of calendar groups is a drop-down, not a permanent panel, and on the new panel I can’t pre-select a calendar group to create a new item. Not so friendly.
More to come, no doubt; but the main things are migrated now. Most software I’m looking for new versions as I go; things like Graphic Converter, Audacity, Audio Hijack, and so on. VisualWorks, my Smalltalk application development platform, will probably be a challenge if there’s a new version out. We’ll see.
Links? Well, you can probably work them out.
Anatomy of a crash and recovery 6 Mar 2012 Posted by Tony Law in ITasITis, Managing IT, Tech Watch, Technorati.
So I’ve been struggling, the last several days, through the consequences of a hard disk crash on my trusty, but five-year-old, iMac. Application of the standard tools (Apple’s Disk Utility, and Disk Warrior) maintained access for a while but the machine now simply refuses to boot MacOS, and the tools won’t recover it. It just hangs. So I’m on Plan B, with a hairy workload making the timing as inopportune as it could be.
The tendency is to assume that if you have backups (and I have – Time Machine is fantastic!) then everything’s ok. Well, it is, but I thought it might be instructive to catalogue some of the issues and problems.
I’ve bitten the bullet and ordered a new machine. I daresay that if I cleared the old one, reformatted the disk and re-installed everything it might be able to mark the bad blocks, or whatever’s the problem. But that’s not much different from re-installing to a new machine, and this one is indeed five years old and won’t be capable of running the latest OSX upgrades. So, a new iMac is on its way.
And that was the first frustration: I’m not far from an Apple Store, and hoped I’d just be able to hike over there and order what I needed. But they only stock the basic models, and won’t do in-store upgrades (memory etc), so it’s had to be online and wait ten days.
Hence, my mainstream work has to transfer to the laptop. The Bootcamp partition on the iMac is still fine, so I can boot the machine in Windows and I’m using it in that version right now to write this. Anything through a browser is fine, so my Open University email and online work just transfers over; and I can do bits in Open Office. But I don’t have most of my software on Windows.
I don’t, in any case, want to end up with work spread between two machines; and the Windows partition isn’t backed up as I haven’t, hitherto, used it for anything permanent. And I haven’t been able to get the wireless keyboard to work with it, ever (see later) so I’m on my old Apple wired keyboard with a coffee spill which has debilitated the left hand Shift and CTRL keys (which I use more than the right hand ones, wouldn’t you just know). So it’s the Macbook for most of the work.
Well, everything’s on the Time Machine backup. Simple, surely, to just haul files onto the Macbook (overnight, perhaps) and away we go.
Well, no. I haven’t figured out why, but a proportion of the files on the backup give trouble. Quite a lot transfer fine. But a high proportion flag up that I don’t have permission to write to a folder somewhere down the chain. So, initially, I’m going down the chains and copying collections of individual files, at which point I get a prompt for a password and it’s ok. I don’t figure this, as I’m an Administrator on the machine. According to the permissions I have full read/write access. And there doesn’t seem much difference between the files that transfer and the ones that won’t. But there we are.
Weird work-around coming up. I’ve got the backup disk connected to the Macbook: I can’t make the Windows iMac see it on Firewire, but I can see Windows on the network. So I use the Macbook to copy directly from the backup to the shared drive on Windows, and then copy back from Windows to the Macbook’s own hard drive. No permission problems or password prompts. Ho, hum! I now have an almost complete rebuild. I’ve had to do it in limited batches because the Windows partition is not all that big, but I can leave the copy jobs running and it works.
There’s some software I won’t reinstall until I have the new machine, and that’s going to be a pain because some of it’s licensed and I may need to get new licence keys (things like Classic Menu and Graphic Converter, not to mention Office and my Bootcamp Windows). And of course, there will be masses of updates to re-apply.
I have quite a lot of aliases, to provide alternative paths to some files and folders: while these appear in the right places on the rebuild, they don’t “work” until they’ve been re-assigned.
And there is, of course, a lot of information that’s in places other than my well-defined data area. Mail was ok; I moved the Office Identities folder across, and it worked. I use Apple’s Address Book and Calendar, not Microsoft’s, so I can replicate with the mobile phone using iSync. Find and copy the Address Book files across, and everything works. Calendar, not so good; I had to carefully re-import data into the Macbook calendar, one Calendar Group at a time, to maintain the structure. Websites, on the Web Server: find and copy; that’s fine. Microsoft Office templates: I know where those are so that’s ok. There’ll be more; but that’s where I am right now.
Remember the wireless keyboard? I’ve switched it to working with the Macbook. And of course I’ve now connected the Time Machine drive to the Macbook too so I’m still being backed up.
Shifting to the new machine, when it arrives, will no doubt throw up new issues. But for the moment, I’ve got work to do!
Links: none, this time