Information theology

I gave a presentation to a church group this week about social media. This was part informative briefing, for a group of people mainly of my own generation who don’t have my links into that technology world; part discussion of some of the benefits and challenges; and part a suggestion how people of faith might approach these issues.

At the heart of the talk were two case studies. First, something familiar to, and accepted by, most people there: buying a book online. A process with several stages: browsing the website; making the choice; setting up an account for the purchase; clicking “Buy”; and expecting the book in the mail.

Two or three issues, even here. First, online, look carefully at the browse results. For the book which I illustrated, the cheapest choice was actually an audio CD. Obvious, in a bookshop; not so obvious online. And possibly not what you would want.

Then, you have to provide personal and credit card details. A name and address: for shipping, and for verification. An email address: so the marketing emails probably start. And credit card details. You probably trust Amazon (the example I used), but what about a company called Rogue Amoeba? I have, in fact, recently bought software online from them – something I’d been trialling for a while, so that’s how the trust was generated in this case. But trust is an issue. I’ve recently had a new card issued, because my card issuer detected fraudulent activity on the account.

And a third issue. Someone in the group had bought something through Amazon not realising it was actually being bought from a third party. When it didn’t arrive, Amazon had no liability. Watch who you are really buying from, is one lesson; but the other is that, on the web, it’s take it or leave it. You can’t negotiate terms and conditions. Yet, how many of us read them?

So: even a by-now common transaction had lessons for us. And there’s a fundamental principle here. We recognise privacy issues, and the sensitivity of our personal data, but: people trade privacy for benefit.

So we then looked at the world of social media: Facebook, Twitter, blogs, crowdsourcing. Human beings, as a species, communicate; and that’s how we use the Internet these days.


We communicate!

We communicate in a lot of ways, with different content. Sometimes – like with this blog – we intend a wide and unspecified audience and we write for public consumption. But not everyone realises that’s what they’re doing. Some people write in order to influence others: and that can be for good, or for not-so-good. And sometimes, people “loiter with intent”: that issue is, of course, most visible with concern about malign exploitation of children. It’s easy to pretend to be something you’re not.

The same principle applies: people trade privacy for benefit. In this case, it’s the benefit of keeping easily in touch with friends, with scattered families, with interest groups and so on. But in the online world it’s easier to make mistakes. The speed and scale make them potentially more dangerous. And something once published here can’t be recalled; someone, somewhere, will have it cached.

And it’s easier for someone to pretend to be what they are not.

I illustrated the various media using the second case study: the Equality Trust. This campaigning group offers a website/blog; a Facebook presence; and tweets. There is also a very conventional platform: a book (“The Spirit Level”, which I commend).

I showed how the same content appears in different ways on the different platforms. This is a positive example of the way they can be used, for an issue with which my audience could easily identify. But I also showed an example of the spam follower messages that arrive, which Twitter normally seems to catch pretty well: people are exploiting all of these platforms, and you must learn to recognise the clues (like: zero followers, only one tweet, and following lots of people).
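Clues like these can even be expressed as a crude rule of thumb. Here is a toy sketch in Python – the field names and thresholds are my own illustration, not anything Twitter actually uses:

```python
def looks_like_spam(account):
    """Toy heuristic for the clues above: zero followers, barely any
    tweets, and following lots of people. `account` is a plain dict;
    the fields and thresholds are illustrative only."""
    return (account["followers"] == 0
            and account["tweets"] <= 1
            and account["following"] > 500)

# A profile matching the pattern described in the talk:
print(looks_like_spam({"followers": 0, "tweets": 1, "following": 2000}))    # True
# An ordinary account:
print(looks_like_spam({"followers": 120, "tweets": 800, "following": 150}))  # False
```

No single clue is decisive, of course; it’s the combination that gives the game away.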

This, then, enabled us to discuss briefly the different styles of usage of these platforms. In my case, my Facebook account has few contacts, mostly scattered family. I use LinkedIn for professional networking, and we discussed the differences briefly, but some people use Facebook this way. I tweet irregularly, usually to promote something I’ve written or an event, and for me again this is a professional platform though I may mix in the occasional personal snippet.

Then we mentioned crowdsourcing. I looked at the Equality Trust on Wikipedia. We talked about the reliability of such sources, and also the ability of social media to react quickly: with the example of the Wikipedia entry about the London bombings, which was started, extended, structured and restructured while the mainstream media were still trying to phone in their stories. And with discussion about other topical areas (e.g. Iran, Burma or China, where the authorities find information increasingly hard to control).

We looked at the rise of commercial use of the web; and at the way Google changed the internet – not by its search engine, but by the advertising model of monetisation, which funds most “free” services which support, among other things, churches, charities and other interest groups.

Then it was time to consider what shapes people’s privacy attitudes. A short recent paper by Nov and Wattal is a good introduction to academic research in this area, for those who have access and aren’t put off by the language of formal sociological hypotheses. This formalises what we may expect: people’s attitudes are shaped not just by their personal inclinations but by community and peer group attitudes (such as the more open culture generally among younger people, leading to more open sharing on Facebook); and by personal experience.

I added a personal perception: Society at large does not understand the concept of risk. One disaster outweighs many benefits in public opinion; but people also believe it “won’t happen to them”, and this may be particularly true among younger people with less life experience.

And Nov and Wattal reach one interesting conclusion: they don’t think that it’s necessarily possible to draw conclusions about social computing privacy issues by inference from more general privacy. The two should be studied separately.

Now the theology. In the late 1970s I was part of a church study group which published a report deriving a Christian approach to issues of developing technologies: energy, IT (we called it electronics in those days) and biotechnology. We were not trying to predict the technological future – we gave some examples, but recognised that things would come along which we had no concept of. In 1982, when the report was published, the Web was still ten years away: we anticipated online shopping and information services, for example, but we attributed them to the Prestel service.

Fundamentally we asserted that technologies themselves are neutral: it is the uses to which they are put (by human beings) that create the possibility of good or evil. But we suggested six questions for people of faith to ask of any developing technology.

  1. Is [a technology implementation] likely to change society in a direction more compatible with God’s ultimate purpose?
  2. Does it give people opportunity for growth, or diminish [their] stature as a child of, and fellow-worker with, God?
  3. Does it benefit only one section of the world community at the expense of some other group?
  4. Does [it] increase the possibility of one individual or group … being able to dominate another?
  5. Does it add to the real quality of life or detract from it?
    • and for whom?
    • is this the same as “standard of living”?
  6. Does it make proper use of world resources, taking into account the needs of the whole world and of future generations?

We had a lively discussion, including issues such as

  • whether – or how – churches and other pro bono groups can/should use social media. There are already many online groups; online worship services and prayer lines; and, at this time of year, online Lenten studies and virtual pilgrimages
  • how these media will play in the forthcoming UK General Election: President Obama’s election campaign capitalised on these significantly, and UK political parties are already there
  • data retention by service providers: how long Google keeps your mail, even when you’ve “deleted” it (and service users expect backup, of course). Data mining: mobile phone companies can put together group profiles based on their records of where your phone’s been (are you a commuter, or a clubber?) and use this to make their service better fit your needs, but also market more to you.

A great evening and good discussion. These are ideas I want to go on developing; it was good to have a first chance to try them out.

Links and references:
• Nov, O., & Wattal, S., Social Computing Privacy Concerns: Antecedents and Effects. Proc. 27th International Conference on Human Factors in Computing Systems (CHI 2009), Boston. ACM, New York, 2009, 333-336
• Boyes, A.E. (ed.), Shaping Tomorrow. Methodist Church Home Mission Division, 1982 (not available online)
• The Equality Trust (link from here to blogs, tweets, etc)
• Online church: St Pixels (initially sponsored by the Methodist Church)

Tech watch update

I’ve just come off one of Gartner’s occasional complimentary webinars with Jackie Fenn and her team: Prepare for Disruptive Emerging Technologies Through 2020.

Jackie is a consummate professional and her webcasts are always well worth the time. Her aim was to illustrate a few of the areas where she and her team believe disruptive technologies are on the march: already deployed in some specialist areas, and likely to make considerable impact as they spread.

And the keyword is: disruptive. So, not just new ways of doing what we already do, but ways of replacing what we do now. And it doesn’t necessarily mean novel; some of these technologies have been around for a while. Jackie showed the first-ever Hype Cycle chart and, fifteen years on, Speech Recognition is still climbing to plateau!

You’ll be able, I believe, to get access to the webinar replay shortly. So, just a couple of examples. A lot of energy has gone, in recent years, into automating the fetch-and-pick process in warehouses. Fine, if you’re moving complete pallets. For individual items the human being is much more effective at picking – but much less efficient at getting rapidly from place to place! Solution: use the robots to bring the shelf unit to the picker, and take it back afterwards.

Then there are mobile phone companies using location data, via advanced data mining, to identify communities within their user populations. They already capture all the data … so they can target services (or marketing!) at the group who come into the city in the morning and go home in the evening: differently from those who come into the city in the evening, go to the smart restaurants and the theatre district: differently from those who also come into the city in the evening but go clubbing.
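The commuter/clubber distinction can be caricatured in a few lines of code. This is purely my own illustration – real operator models are far more sophisticated – but it shows how little data is needed to segment users by the hours their phone appears in the city centre:

```python
def classify(city_hours):
    """Crude segmentation from the hours of day (0-23) at which a
    phone was observed in the city centre. Thresholds and labels are
    illustrative only, not any operator's real model."""
    daytime = sum(1 for h in city_hours if 8 <= h < 18)
    evening = sum(1 for h in city_hours if 18 <= h <= 23)
    late = sum(1 for h in city_hours if h < 5)
    if daytime > evening + late:
        return "commuter"
    if late > 0:
        return "clubber"
    return "theatre-goer"

print(classify([8, 9, 12, 17, 8, 9]))   # commuter
print(classify([19, 20, 22]))           # theatre-goer
print(classify([22, 23, 0, 1, 2]))      # clubber
```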

And 3-D “printing”, that is, sending the instructions to “print” (manufacture) a part to the remote location rather than the part itself. Not so much for manufacturing supply chains; more for repair shops that need any given part rather infrequently. They stock raw material instead. And costs are tumbling by orders of magnitude.

Gartner clients should watch out for an update to the STREET emerging technology management process, due out shortly. Jackie spent the last few minutes discussing processes around the technology agenda, with some interesting statistics. A few technologies (like 3-D printers) are well down the list when it comes to widespread adoption; but where they have been brought in, the value delivered is very high.

Jackie also read out the verse I added to something posted on their Hype Cycle blog a little while ago, reproduced below. Thanks!

Footnote: for a different review of the tech horizon: Bill Chamberlin at Horizon Watching has just published a most useful list of links to his own blog coverage of tech trends papers from 70 or so sources, and thence back to the originals where possible. If you can follow the train of content described in that sentence! Well worth a look; great piece of work.

• Gartner webinar replays (I’ll update this link when today’s replay is posted)
• HorizonWatching: Index to 2010 Trends and Prediction Lists, 23 Feb 2010
• Mastering the Hype Cycle (Gartner blog, Jackie Fenn and Mark Raskino)

Here’s my verse. The original was in response to a post by Nick Jones (I think it was) on the Hype Cycle blog, but I don’t think it’s still there.

The strategists won’t allow hype to dictate:
It’s a servant to them, not a master.
They let it inform, and they factor the risk
Twixt competitive edge and disaster.

Or they hold for a while, till it’s over the peak –
Or even invest in the trough –
When the prices are low, and there’s knowledge around
To ensure the return is enough.

Hype can be a snare: but provides a great guide
If your buyer’s informed and is practical,
And knows when investment is for the long haul
Or when it’s short term, and is tactical.

AMR fallout spawns SMB Research

I got a Twitter link from Bob Eastman, alerting me to his and Miles Prescott’s new firm. SMB Research started its blog in December. Both are ex-AMR, looking to cover supply chain, ERP, Customer Relationship Management and Enterprise 2.0 (AMR’s classic research agenda) with a focus on supporting the smaller business.

I’ve an interest in this, as you’ll see from my presentation Can Web 2.0 run your Business?

Enterprise users with legacy AMR contracts: take note, and also see today’s other posting which discusses what Gartner are saying about the integration of AMR (and Burton).

SMBs: also take note. This is a great development. Good luck, guys.

• Visit SMB Research at
• SMB Research Blog at
Can Web 2.0 run your Business? (InformationSpan presentation, British Computer Society, London, Jan 2010)

Progress with Gartner-AMR/Burton

Gartner CEO Gene Hall and Tony Friscia of AMR have published an update on the AMR website outlining what’s happening with the AMR integration. This is behind a new link on the AMR home page, under the Gartner logo.

Gartner CEO Gene Hall and Jamie Lewis of Burton Group have published an update on the Burton website outlining what’s happening with the Burton integration. This is behind a new link on the Burton home page, with the Gartner logo.

Yes, those two paragraphs look remarkably similar. So do the postings on the two websites – at least, at first sight. But read on …

Gartner once again repeat that they plan to continue the current AMR and Burton product portfolio through renewals. The AMR and Burton “pricing models” will not change.

There are, though, pointers to changes. First, you may note that the phrase is “pricing model”, not “pricing”. No comment … But for Burton, there is also a commitment to limit price increases to a “nominal” market uplift. That’s missing from the AMR announcement.

Then: the headline AMR comment says “We have already started to invest in the AMR portfolio to enhance the depth and breadth of services that will be delivered to existing AMR clients starting in Q2 this year. This will increase the value you receive as part of your current entitlements.”

And the same thing for Burton, except that it’s Q3 instead of Q2. The agendas are beginning to change, as Gartner establish their hands on the wheel. It remains, of course, to be seen whether this will be judged beneficial by legacy clients – and that’s intended to be a straightforward comment, not a loaded one.

The more important differences come further down, in the FAQs which follow.

Here’s what it says for AMR: “Gartner will release a new generation of AMR research products. In research and advisory, we will introduce AMR Supply Chain Leaders. This will combine AMR’s research with Gartner’s supply chain management and manufacturing research on our new “My Gartner” personalization platform… We are also introducing AMR Enterprise Supply Chain Leaders …“.

This clearly shows AMR-based products being developed in Gartner style, and content from the two analyst teams being brought together and, presumably, aligned. Clients can, we think, expect persuasion from account managers that these new services be trialled in place of the legacy AMR (and legacy Gartner) products. They are named like Gartner products [slight change: I said “as” originally, that was a mistake]; they are explicitly being delivered in a Gartner way; and there’s every reason to suppose they will be at Gartner prices. We said that we think the older AMR access models have a limited life span. This certainly does not dispel that supposition.

By contrast, the strategy for Burton looks set to retain the Burton website as the primary means of access at least for the future envisaged by that document. There is a much stronger feel of Business As Usual in the Burton/Gartner announcement.

But it does look as if new clients of Burton will be offered something different. The wording is: “Specifically, Burton IT1 and individual research & advisory services will be renewable by existing clients” – my underline. Existing clients only, we note.

We advise clients to maintain a healthy scepticism about the longer term: it may simply be that Gartner have decided not to “Gartnerise” both acquisitions at once. And it’s interesting that AMR, with its more distinct client base and coverage, is being Gartnerised faster. Watch this space!

• AMR Research is a Gartner company, AMR website, 12 Feb 2010
• Burton Group is a Gartner company (there’s no heading to the actual webpage), Burton Group website, undated
• Gartner’s acquisition of Burton Group, InformationSpan/Lighthouse webinar report, Jan 2010.
• There is a range of other coverage on this blog; just search back

“Experton” comment on the Gartner-Burton report

We received the following comment from Luis Praxmarer of Experton. He offers an insight into the next level of taking the game to Gartner.

Dear Duncan and Tony,

First I like to thank you for the effort for setting up the Webinar and for discussing this topic as well as including Ashley, a former Burton Manager, into this call.

For the recommendations I would definitely advice clients to be more pro-active. Otherwise clients “stumble” into the Gartner Service without really reviewing their options.

Our take is and this is not very different for many M&A Activities:
1.    Analyze how the company has really used the service, what are the strength and weaknesses, and how does the demand change/expand/diminish within the organization.

2.    Get the Gartner Rep in and request a clear written statement on how long this exact service will be available (which usually they will not do) and request a positioning how Gartner would otherwise cover the service with their own offerings (including pricing).

3.    Request a free trial of the Gartner offering for a long enough time to really test it.

4.    Now compare what you had and what you see with what you have analyzed what your need is. This “need” might be addressed by several other companies in the market. Look at them and include them into the discussion at the right time based on the renewal date and commitment you get from Gartner.

5.    Now you should be well informed what is out there and what your need is for making the right decision on how to approach the negotiation and contracting. It also gives you time to include the real users if you are the coordinator / interface. Be aware of details in the old Burton contract as well as in Gartner as this could mean a very quick closing of the service. E.g. with the META Group takeover Gartner laid off an entire team and learned later they could not provide the service anymore to the clients. The contract usually protects the client only for the money back for the time he “discovers” the non-service (and only if he negotiates hard). Gartner will try to compensate this with their own offering and sometimes this might be an OK deal, but if the company requirements are not best fulfilled by Gartner, it is only a second or third best solution.

This is a much more pro-active approach and puts the client into the driver seat. You need to control the timing! Any professional VMO* would react in about this manner (if they follow the Best Practice advices). Of course now we could talk about the difference for Vendor and User clients but this is for another day ….

Best regards and have a good day

Luis Praxmarer
CEO & Global Research Director
Experton Group LLC, Dubai, UAE
Dubai Mobile: +971 50 62 54 980
International: +49 172 82 76 954
Munich Office: +49 89 92 333 10
FAX: +49 1212 5235 98 306

I certainly agree with Luis that a proper needs analysis is a prerequisite for moving forward. InformationSpan’s DRAKE methodology provides a systematic approach to this in a user enterprise (as Luis says, the conversation about vendor clients is different). You then have something you can assess against the range of marketplace offerings. Most clients know about only a handful of potential providers and don’t have time to look further; equally, few take the portfolio approach to services, which can save money and improve both the coherence and the effectiveness of advice.

I have a database now of over 400 such providers (not including 300+ others who do market research rather than technology insight), definitely at the service of any clients who wish to take a step back in this situation. Do get in touch!

*VMO = Vendor Management Office