Green 3: Andy Lawrence of 451 (1 Nov 2011). Posted by Tony Law in Impact of IT, Insight services, IT marketplace, ITasITis, Managing IT, Social issues, Tech Watch, Technorati.
Tags: Green IT Expo
Continuing my assessment of analysts I haven’t heard before: here at the Green IT Expo is Andy Lawrence of 451 Group, talking “Green Datacentres to Green Clouds”. Andy looks after data centre disruptive technologies, and eco-efficient IT, for 451. It’s the first time, again, that I’ve heard 451 directly.
He promises an overview including a European Union Framework project called Optimis. I've been involved in EU research in the past but didn't know about this one: that'll be interesting.
Here’s a sort-of hierarchy of energy efficiency for the data centre. At the base, five years’ work on reducing the power use of datacentre infrastructure: PUE, best practices, EU Code of Conduct. One tier up: work on lower power chips, efficient drives, virtualisation, power management etc. Above that again, the ability to look holistically at an application or service: for example, what’s the eco-impact of choosing 1 second response rather than 1.5 seconds?
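To make the base tier concrete: PUE (Power Usage Effectiveness) is simply total facility energy divided by the energy that actually reaches the IT equipment. A back-of-envelope sketch (the figures below are illustrative, not from the talk):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 is the theoretical ideal;
    values around 2.0 were typical of legacy data centres."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures: 1,500 MWh total facility draw, 1,000 MWh to IT kit
print(round(pue(1_500_000, 1_000_000), 2))  # 1.5
```

The gap between the PUE and 1.0 is the cooling, power-distribution and other infrastructure overhead that the "base tier" work targets.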
So: Cloud. Cloud should be more eco-efficient, and it's often asserted to be so. But is it? 451 believes the assumptions are largely unproven. [Private] cloud and virtualisation, as a matter of observation, seem (so Lawrence says) to show under-utilisation, so some of the eco gains are not realised.
We’re about to see another measurement framework. Here, there are four axes: economic, compliance, CSR (corporate social responsibility), operational effectiveness. Again, take a holistic look: e.g. what’s the energy cost of insisting backups are permanently online rather than powered-down (on tape, for example).
How do you measure resource efficiency? There are some proxy metrics; there are direct measures (i.e. actual measurements, not estimates: how much carbon now); and metrics (e.g. PUE). They are “good; but be careful: unreliable for business decisions”. We’re promised a tour of some cross-industry initiatives, and also a few highlights from individual companies.
The EU’s Optimis project provides a list for assessment: trust, risk, eco-efficiency, cost (TREC). The aim is to create an architectural framework that looks at all of these, and a development toolkit. Lawrence asserts the need for multiple metrics: “a lone metric never works”. The hard stuff is the effort to associate carbon with a cloud service, especially where the actual data are locked up in the provider’s data centre and they may well have no interest in providing the detailed data to feed into the models. It is, at the least, a hard problem.
Lawrence outlines an alternative proxy approach. It still relies on cloud providers doing the sums; but they may well already be gathering the data, and may well be more willing to deliver a category-based per-hour or per-VM footprint (kWh and carbon per VM hour, perhaps). Its accuracy needs to be similar to that of billing, neither much more nor much less.
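The proxy approach Lawrence describes reduces to simple arithmetic once a provider publishes a per-VM-hour energy figure: scale it by the facility overhead (PUE) and the grid's carbon intensity. A minimal sketch; all the input numbers are my own illustrative assumptions, not figures from the presentation:

```python
def vm_hour_footprint(kwh_per_vm_hour: float,
                      pue: float,
                      grid_kg_co2_per_kwh: float):
    """Return (kWh, kg CO2) attributable to one VM-hour, including
    the facility overhead captured by PUE."""
    kwh = kwh_per_vm_hour * pue
    return kwh, kwh * grid_kg_co2_per_kwh

# Illustrative: 0.05 kWh of IT load per VM-hour, PUE 1.6,
# grid intensity 0.45 kg CO2 per kWh
kwh, kg = vm_hour_footprint(0.05, 1.6, 0.45)
print(f"{kwh:.3f} kWh, {kg:.4f} kg CO2 per VM-hour")
```

Billing-grade accuracy, as Lawrence suggests, means getting the first input roughly right per VM category rather than metering each VM precisely.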
This presentation has given me an incentive to revisit what I know of 451 Group: perhaps the most encouraging aspect was Andy Lawrence's willingness to identify, and review, academic/industrial research projects which are easily overlooked by an insight market that tends to look only at vendors' own development pipelines. He admits that development of real, workable methodologies is some time away: Optimis, like all EU Framework projects, is pre-competitive research. But while the project itself may not deliver the ultimate solution, the ideas it generates will certainly inform future metrics and tools.
• The 451 Group and The Uptime Institute
• Optimis EU project: Optimized Infrastructure Services
• EU GAMES: Green Active Management of Energy in IT Service centres (similar, for high performance computing)
• (These projects are within the EU’s 7th Framework Programme; the CORDIS database holds information on these and all projects)
Green IT; encountering Connection Research (1 Nov 2011). Posted by Tony Law in Impact of IT, Insight services, IT is business, IT marketplace, ITasITis, Managing IT, Social issues, Tech Watch, Technorati.
Tags: Green IT Expo
Connection Research is an Australian insight service focussing on sustainability issues. I know of them – they’re in the InformationSpan database – but this encounter at the Green IT event is the first chance I’ve had to hear from a key person; in this case, William Ehmcke the CEO. It’s another META Group spin-off company; William, it appears, led META in Asia-Pacific until it was acquired by Gartner in 2004.
This is an as-it-goes blog, plus a bit of later tidying up.
Connection reckons to work from real data, determining metrics and developing benchmarks. Their areas are: communities; green IT; the built environment; and carbon/compliance (Australia is about to introduce carbon pricing, at around A$23/tonne).
Connection also recognises “green fatigue” and “greenwash”; but broader issues are gaining prominence: for PR, through regulation, or for financial reasons (direct, or indirect because of brand and reputation issues). There’s a perfect storm of issues: the rise of “big data” is increasing demand; transparency is being demanded; energy security is a rising issue (in Australia as in the USA, though not so much in the UK); and there’s simple cost.
Connection has helped to develop an ICT Sustainability framework and index, with academic partners, across: equipment lifecycle; end user computing; enterprise & data centre; and IT as a low-carbon enabler. Essentially this is the same distinction as in Simon Mingay’s presentation: doing IT green, and enabling green business by IT. Ehmcke recognises Bring Your Own plus mobility as a sustainability strategy: it creates fundamental savings and helps reduce the need for permanent facilities on the current scale.
The Fujitsu Global ICT Sustainability report, published Sept 2011, surveyed 80 different areas. It appears that results on the IT Sustainability Index (ITSx; see Connection’s website for more information) have generally regressed recently, and this isn’t a drag effect from emerging economies in China and India. Within the detail, it’s interesting that Government is ahead of the across-sector average index. Surprisingly, brand reputation is driving some “dirty” industry (e.g. mining) up the stack. Nationally, Canada is the leader and the UK second; regulation has been driving this market; and few markets excel in all the sectors.
Ehmcke highlights the major slip in the ITSx for Professional Services: odd, because these industries have only buildings, people and intellectual property. They ought easily to be able to excel; but they don’t, and have slipped relative to 2010, as has (more understandably) manufacturing.
In response to a question: an interesting national measure is GDP value per unit of carbon emission, where Japan leads the way (though not included in the Connection stats; the survey wasn’t done because of the tsunami). Ask how much carbon your enterprise uses per $million of revenue … the use and development of effective metrics is falling back and, without data, action is impossible. Over half the CIOs surveyed have no idea about their IT power consumption, for example.
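The metric Ehmcke suggests asking about is straightforward to compute once you have the two inputs, which is precisely the point: over half the CIOs surveyed couldn't supply them. A sketch with invented numbers, purely for illustration:

```python
def carbon_intensity_of_revenue(total_tonnes_co2: float,
                                revenue_usd: float) -> float:
    """Tonnes of CO2 emitted per $million of revenue."""
    return total_tonnes_co2 / (revenue_usd / 1_000_000)

# Illustrative: 12,000 tCO2 against $400m revenue
print(carbon_intensity_of_revenue(12_000, 400_000_000))  # 30.0
```

The national analogue Ehmcke mentions, GDP per unit of carbon, is just the reciprocal idea applied at economy scale.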
In response to another question: a point was made that sustainability, in many corporations, is handed to Risk Management (even where there’s a Sustainability Officer), because it’s seen as being about compliance and a holistic view isn’t taken.
A couple more questions, and then a quick outline of the Foundation for IT Sustainability, and the new Green IT Fundamentals course based on licensed training material from Connection, linked to CompTIA, and supported by the Global e.Sustainability Initiative. A useful presentation; the emergence of training, metrics, and certifications is important and the topic was expanded in a presentation from the BCS which I haven’t blogged.
• Connection Research
• ICT Sustainability: Global Benchmark Report Reveals a Lack of Visibility of the ICT Energy Bill Has Delayed Success, Fujitsu Press Release, 21 Sept 2011: headline summary, with link to obtain a copy of the full report
• Foundation for IT Sustainability (FFITS)
• Global e.Sustainability Initiative (GESI)
A Gartner perspective on Green IT (1 Nov 2011). Posted by Tony Law in Impact of IT, IT is business, ITasITis, Social issues, Tech Watch, Technorati.
Tags: Green IT Expo
I’m at Central Hall, Westminster – home territory for a Methodist! I’m here for an event and expo on Green IT; waiting for the keynote from Simon Mingay of Gartner. There’s connectivity, so this blog will get periodically updated. Links, as always, will get added later; probably tomorrow.
“What happened to the Green in Green IT?” Both aspects: “Greening of IT” and “Greening with IT”. Mingay’s perspective: Green isn’t the primary agenda; it’s always been about cost, and about saving resources (particularly energy); but the aims coincide. ICT brings together the business information to achieve the targets.
1 – IT organisations have to engage, don’t wait for “the business” to come to you.
2 – IT must innovate, as part of the enterprise’s wider innovation agenda
3 – investment in IT systems must connect to the business’s value generating aspects, not just the “corporate and social responsibility” (CSR) agenda; although CSR is good for profit, this issue goes further.
Some organisations are slipping backwards, believing they’ve ticked the box – this ties up with a later data-driven observation from William Ehmcke of Connection Research. Energy management is a new core competency; demand and prices are both increasing and the resulting pressure on costs is unsustainable. Mingay quotes Andrew Witty, CEO of GlaxoSmithKline: “if we don’t do something about it, we’ll be out of the game”. Tactical improvement is not enough!
Mingay highlights various aspects of the enterprise world: corporate initiatives (e.g. Unilever Sustainable Living); vendor acquisitions and partnerships; enhanced regulations (he mentioned Scope 3 and ISO 50000; see Links, below). The focus is moving beyond compliance to a “resource perspective on the organisation”: designed in, continuous (not a once-a-year report), and including the whole supply chain. Which isn’t easy!
Gartner offer a Strategic Planning Assumption – one of the tenets which shape their research: “By 2015, sustainability will be an economy-wide, top-five priority for major Western European and North American CEOs.” Though as a colleague at the event commented, this doesn’t identify which current top-five issue will give way to it!
Gartner offer three frameworks to assess:
- sustainability maturity: the more mature the performance, the higher the demand for information enablement
- sustainability value, in five domains varying from Enabling to Contributing (e.g. new business models, new products/services), linked to the run/grow/transform model, with separate scales for private and public sectors;
- solution domains for sustainable business systems: from compliance (low strategic priority) to growth, and from hindsight to foresight, segmented into (a) compliance, risk and governance; (b) enterprise efficiency; and (c) brand/reputation.
Building management is an obvious area where ICT can correlate and analyse the data from environmental monitoring and control, and deliver cost and eco benefit. Mingay isn’t the first to highlight the opportunities for FM and ICT to work together; we know about this one from a Leading Edge Forum Study Tour in, I think, 2007.
And guess what, there’s a Sustainability Hype Cycle … the key point is the very large number of technologies mapped on it. Energy-efficient IT is mainstream (“mostly”), he says. But sustainable IT is still stuck in a niche, considering aspects such as toxics and e-waste and pigeon-holed with these issues. Supply chain issues, and systemic energy efficiency (middleware, network, application) are at present still stuck in “academia”, he says – what this means is that the fundamental research on how to identify, measure and model these issues is still being done.
Three stages: optimisation (current); innovation (starting – lots of “adopted innovation” which isn’t really new, and not yet seeing attitude changes especially towards compromise on performance and availability); paradigm change (rare, as yet, but the shift to Cloud has the potential to be one). Examples: data centre infrastructure management (DCIM), treating the whole data centre as a system, with PUE modelling, active power management and so on. Gartner are bringing this topic into their Data Centre and Infrastructure/Operations events. He offered some perspectives on emerging DC design trends, in a modular “build small, build often” approach. There is a list of “ten things to think of next” – starting with measurement! The two key optimisation parameters are space, and compute power per kWh, and sustainability governance is essential for progress (with IT fully engaged).
If you think you’re done on Green IT, you haven’t understood the issues!
• Sustainable Living Plan, from Unilever, aims to “develop new ways of doing business which will increase the social benefits from Unilever’s activities while at the same time reducing our environmental impacts”
• There’s information on the ISO 50000 family of standards on the ISO Helpline (and in many other places!)
• Greenhouse Gas Protocol Corporate Value Chain Accounting and Reporting Standard, also known as Scope 3, from the World Resources Institute
• Hype Cycle for Sustainability and Green IT, 2011, Gartner, 28 Jul 2011 (available to subscribers only; if this link doesn’t work, search for document G00214739)
Brian Arthur on the Second Economy (6 Oct 2011). Posted by Tony Law in Impact of IT, ITasITis, Social issues, Technorati.
In my inbox today was a pointer to an article in McKinsey Quarterly by Brian Arthur, economist and technology thinker. In brief, he explores the idea that a “second economy” is developing apace “underneath” the visible one. He shows how IT has not only taken over doing individual mundane tasks: it is now handling the many connections through which information is created from data, and making new connections as needed.
Well, as IT people, we know that’s happening. But the use cases are worth reading through because of the inferences being drawn. This connectivity, says Arthur, is already to a large extent self-configuring, autonomous (that is, “human beings may design it but are not directly involved in running it”) and self-repairing. If the Industrial Revolution represents the economy as growing muscular power, the IT revolution represents it as growing neural capability. He carefully doesn’t call it “intelligence”, but it has the capability to react as, say, a bacterium can sense and swim towards a source of nutrition.
Two quotes caught my particular attention, for those of us concerned with enterprise IT.
“In any deep transformation, industries do not so much adopt the new body of technology as encounter it”. Think about what’s happening with consumerisation, with cloud, with the iPad (as I wrote earlier today in tribute to Steve Jobs).
Economic growth in the past has always worked through the creation of jobs (though it has rendered old ones redundant and created new ones). This may not hold in future; the jobs that most of us professionals prize are disappearing too. So “the main challenge of the economy is shifting from producing prosperity to distributing prosperity“. I’d guess the Spirit Level researchers would have something to say about that, since it’s clear that more equal societies are better not just for those at the bottom of the heap but for those at the top too. And that, too, implies redistribution.
Not a tech posting, this; but professionals, in any field, need to be stimulated to think about the consequences of what we do for those among whom we exist. Read it; it’s worth it.
• The Second Economy, W. Brian Arthur, McKinsey Quarterly, Oct 2011 [sign-up may be required]
• W Brian Arthur: External Professor, Santa Fe Institute, and Visiting Researcher, Intelligent Systems Lab, PARC
• You might want to look for a paper or digital copy of Arthur’s The Nature of Technology (ISBN: 9781416544050)
• The Spirit Level: Why Equality is Better for Everyone
My second webinar report today features a free Gartner seminar from Jeffrey Mann, who I knew well in his META Group days as a great application analyst. The topic isn’t “what’s available” but “how do you make decisions”: potentially much more useful.
First, he’s talking about Cloud for absorbing capacity demand peaks: the right definition. But, as he points out, the high-end integration requirements of a portal don’t necessarily suit Cloud infrastructure well. Security and confidentiality come into play as issues too.
Gartner’s definition of Cloud matches the one in my post last week in most elements, but I also included easy sign-up without long-term commitments. That matches the use case of absorbing capacity peaks; but for longer-term critical business functions (running your sales on salesforce.com, for example) most consumers will want some longer-term assurance.
Gartner also add two further levels to the established model of System (or Infrastructure), Platform and Application as a Service: Information (e.g. a Reuters feed) and Business Services. The shift in provision focus is from “capacity” to “capability”, and evaluation is outcomes-based. I like that.
Jeff “gets nervous” if cost saving is the only reason clients are moving to cloud services; cost reduction may be part of the outcome, but there are hidden costs (e.g. increased network capacity) and many disappointments. “Disconnect price from cost … reconnect price to value.”
And, perhaps closer to the meat of the theme: “Portals … will follow … The greater portal opportunity [for Cloud] lies largely with B2B” – strikingly close to Marc Benioff’s Cloudforce message I was listening to earlier. More on that later.
Early Cloud deployment: look for something that will work with the vanilla service (an “out of the box” requirement). And it’s easier to start greenfield than to migrate from on-premises services. Complexity (customisations, for example) militates against migration.
Jeff showed a self-assessment chart for issues such as data, compliance, policies and failure remediation – how complex is getting going again after a stop? Even with due diligence, it comes down to trust – usually lower for a pure-cloud solution. Users often prefer to be in control even of functions and processes they are not so good at.
What about best practices? Half a dozen use cases, for example capacity on demand (such as hiring lots of extra staff for a short time, I guess like Christmas postal deliveries) – Jeff calls this “Cloudbursting”. I’ve heard a presentation of this being done around a massive weekly sales promotion that, on its first outing, unexpectedly and grossly overloaded the company’s normal web servers. Cloud to the rescue!
Other use cases include: providing lower-end capabilities to segments of the staff population, such as floor staff in a store; secure extranets in an isolated environment (e.g. in M&A or restructuring when information needs to be kept confidential to a subset of staff but in more than one enterprise); or “splitting the stack”. Jeff proposes a small handful of hard-headed questions to help evaluate whether cost will really be saved (bearing in mind that he asserts that cost saving shouldn’t be the only target for a Cloud move). You must be able to identify where cost savings will come from, they’re not automatic!
And do, first, hold a mirror to your entire company and ask if – culturally – the enterprise is ready to make this kind of change (it’s trust, again, but legal issues such as e-discovery may be highly relevant here). Then be sure you understand why: examples might include flexibility, cost, staying current with the latest software versions more easily, reduced internal resource requirements, and so on. And you must have defined measures for a pilot to judge whether to move on.
In this, recognise the many constituencies in the business with different needs and expectations: not a new idea, but a useful categorisation of business interests on the chart. Think how to get them involved.
Gartner does expect that in the next few years all organisations will have some level of Cloud service: complete (few), or mixed (most).
In the brief Q&A, the issue of recent high profile outages (e.g. on Office 365 for some customers) was raised. Jeff’s view is to keep it in perspective: compare with internal capability, not with the ideal of zero outage.
I raised a question based on listening to Marc Benioff earlier. That presentation was put out via Facebook, and Benioff was strongly promoting “social technology” as the communication and collaboration platform of the future. As an analyst, Jeff sees very few innovations which totally replace what came before, and believes this will be a case in point. So social technologies are, indeed, very important; but the older platforms won’t go away.
Congratulations to Jeff for going beyond the technology of Cloud, and the (perhaps hyped) potential. This was a good counterbalance to Marc Benioff’s evangelistic case, and confirms that Jeff has lost none of his edge since the last time I had the chance to interact with him!
• Cloud Strategies for Portals, Content, & Collaboration Projects, Gartner webinar, 14 Sep 2011, replay (link to be added when available) – in the meantime see Gartner Webinars
• Marc Benioff at Cloudforce London, ITasITis, 14 Sep 2011
Marc Benioff at Cloudforce London (14 Sep 2011). Posted by Tony Law in Cloud, Impact of IT, IT is business, ITasITis, Social issues, Social media, Tech Watch, Technorati.
I’ve joined the online Cloudforce webcast to view Marc Benioff’s keynote. I’m not able to stay online for the whole two hours, but these are notes as far as I can go.
Benioff has pitched that a new revolution has happened: the role that social technology plays and the depth of its integration into society as a whole has changed in the last year. Interestingly, the broadcast is via Facebook, not one of the established Web Meeting platforms. No registration. Just “Like” the page to join the broadcast. And Twitter feeds for the speakers linked on the page at the time they’re on stage (not when they’re off). We’ll come back to that point.
In the preliminaries, at the point I joined, a key point from JP Rangaswami: businesses need to value relationships, not just customers. And now there is an enormous quantity of real data, cheap to collect, to back up research into online interactions. The emphasis is on learning and understanding what makes relationships really work.
A black screen while the broadcast switches to “Cloudforce London”. And a marketing video, pushing Salesforce Chatter but showcasing (at a headline level) how Salesforce is supporting a host of responsive apps to provide customers of banks, cars, coffee shops and more with immediate useful information. Where’s my nearest ATM? What’s my car’s engine temperature? And so on.
Here’s Benioff. A paean of praise to Thomas Watson, Ken Olsen and Michael Dell – guess which one is billed as a speaker? – also Steve Jobs and Mark Zuckerberg. But the theme is a new area of innovation, and the mix and impact of social technology in society is (he believes) new.
He cited the Arab Spring, which is certainly the current high profile example: not “hard power” or “soft power” but “social power”. And he asks: is there going to be a “Corporate Spring” with the end of in-enterprise dictatorships in a similar paradigm?
People have to respond. There are now more social network users than email users, and very nearly Facebook (and Twitter) *is* the Web. And people use mobile apps (smartphones, iPad) more than web browsers; the laptop is out of date for on-the-move information access. The current Forbes cover headlines: Social Power and the coming Corporate Revolution.
Moving into the message for business, he asks for (1) next generation social profiling for customers: they are, after all, on Facebook, Twitter, and wherever else. Then (2) create an employee social network and enable staff to use this information. Returning to this, Benioff talks about creating (a few years back) an internal Facebook-alike which, crucially, is integrated with their main platform. Salesforce Chatter is now available to customers, and is going through a major upgrade, due in a couple of weeks: presence added to IM, connection to other networks, filters, workflow (approvals). Customer groups (which sound a bit like Google Plus circles) extend the concept to external customers, including file sharing and all sorts of other things; it sounds like some major education will be needed to establish who can share what, and who can commit the company to what.
(3) I initially missed as I had to step out of the room: develop the next-generation sales cloud. Benioff highlighted Groupon as a fast-growing company; I’m not yet clear whether this means Salesforce is integrating Groupon. And then data.com, helping keep records up to date as people change their Facebook profiles, Twitter handles and so on.
I’d comment, though, that on my first business flight to California – twenty years ago – they were clearly already thinking that way although the information available was less. On my return flight there were some of the same crew as I’d had outbound. I’d swear I was remembered. And although the practical guess is that they’d “just” checked the database, the point is that they had done so. It’s not “just”.
I had to drop off the broadcast. On return, the webcast is towards the end of an extended case study of Toyota’s new Toyota Friend network which provides easy information about a car’s status, problems, service schedule, and so on – with the dealer able to schedule an appointment and communicate through the network. Not that any of this information hasn’t been available before; what’s added is the integration into a social framework (and, of course, driven by Salesforce).
I’ll see if I can catch up later, and tidy up some of this information – with a link to the recording if possible, but otherwise have a look at the US Dreamforce keynote. Perhaps the key point, if you take Benioff’s point about the rapid and revolutionary integration of social technologies, is that Salesforce is not only preaching the “social enterprise”; it’s becoming one, and the use of Facebook and Twitter explicitly to support this event is part of it.
• Salesforce Chatter
• Keynote from Dreamforce in the US
• Toyota Friend: Salesforce.com and Toyota Form Strategic Alliance to Build ‘Toyota Friend’ …, Toyota US Press Release, 23 May 2011; and Twitter feed (protected for approved members only)
Retrospective: culture and technology after 9/11 (11 Sep 2011). Posted by Tony Law in Impact of IT, ITasITis, Social issues, Tech Watch.
Ten years ago I was in Philadelphia, on a week’s visit to company offices there. A routine visit, about to become anything but. Before nine in the morning, I went down to briefly visit a friend in another group: the company’s Communications function. Naturally they kept a continuous eye on the news; and so I near enough saw the events in New York as they occurred. With colleagues in another office in Pittsburgh, we were anxious as news came in from there. And in a city itself iconic in American history, we were anxious for what might happen where we were.
Those events had consequences in business, and changes followed which the technology was just becoming able to support. Of course, top of the issues were the security implications for travelling staff. But it wasn’t just that.
The company was and still is multinational: managers with international and strategic responsibilities travelled as a matter of course. That September, the company’s most senior managers had to get involved with getting people home. There were a lot of staff out of place; and the numbers involved represented a significant and ongoing cost. Well, ICT support for remote collaboration was at a point where it could make much of that travel unnecessary. Travel decreased. And we discovered that we really didn’t need videoconferencing for most remote meetings: almost from then on, the normal method of working was by telephone conference, facilitated by good and increasingly well-structured shared databases. We had online asynchronous discussion, meeting and agenda management, and a culture where information shared beforehand didn’t need to be gone over in meetings. The technology got a kick forward, and the culture changed.
The other immediate learning was that the Internet lived up to its design specification. A big part of the transatlantic telecomms capability passed through the basement of the World Trade Center, apparently; and it was wiped out. Telephoning home was a real problem. But the Internet was designed to cope with major outage; it “anticipates damage and routes round it”. My email home may have gone three quarters of the way backwards round the world: but it got there.
Of course, security has influenced a whole lot of other technology considerations too. Today’s mobile phones (the technology barely existed ten years ago) have an “aeroplane mode” because designers need to avoid even the possibility of interference with increasingly fly-by-wire aircraft control systems. We have to take laptops out of our hand luggage at airport security – for as much longer as we still carry laptops, that is. People are aware of the potential of online community media (Facebook and so on) for coordinating both malign action and the public responses to it.
And – to spread this note away from technology – we have become more aware of each other in the global community. Donald Rumsfeld, in an early TV response to the disasters, asked “What have we done to make people hate us so much?” and my impression, then, was that it was a serious question. Of course, it rapidly became rhetorical with the implied answer “Nothing, of course.” But we do indeed need to understand where destructive actions like these come from. In the words of a much wiser man: “You can’t redeem what you don’t understand”. We need to listen and learn: from our colleagues, if we are fortunate to work with people from backgrounds not like our own. It’s too easy to create resentment unintentionally.
Technology, particularly today’s exploding social media environment, has the potential to bring people together. As tech people, we can be more aware of this than most. The insight is ours to share.
No links for this post. Purely a personal column. Being in the USA ten years ago was a moving experience and one I don’t regret; my thoughts today have been with many friends and colleagues with whom I was privileged to share the experience.
Did Social help the rioters – or the clear-up? (25 Aug 2011). Posted by Tony Law in Social issues, Social media, Technorati.
I’ve been out of the country, and largely out of contact, from a couple of days or so after the riots in Tottenham, in Hackney (near where we used to live), in Clapham (everyone travels through there!) and elsewhere. So I’m cautious about comments about causes and analysis, because I haven’t followed it.
But one thing stands out. Many politicians and the media have been lamenting the role supposedly played by Twitter, Facebook and BBM in encouraging people to join in the “fun” and “free stuff” looting. The PM started this ball rolling in Parliament, though there’s been some more considered comment from some quarters.
But one up to The Guardian. They’ve done the job properly and gone to look at what the data actually tell us. From a database of Twitter traffic over four days they’ve produced some clear initial results. Here’s a sample:
- there was far more Twitter traffic from people trying to stay away from the trouble than from people drawing others in
- the peak traffic mostly came after key trouble, not before it, suggesting people were trying to communicate what was going on rather than to incite it
- on the other hand, the Hackney trouble was preceded by a lot of traffic reporting stores closing and the build-up of police presence – but not inciting trouble
- the highest Tweet peak was indeed aimed at coordinating people: but it was the day after the last major trouble, and brought people together to clean up
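The timing argument in the findings above can be sketched as a simple before/after count: given timestamped tweets and the known start of an incident, compare the volume in the hour before with the hour after. This is a minimal illustration with invented timestamps, not the Guardian’s actual database or method.

```python
from datetime import datetime, timedelta

def before_after_counts(tweet_times, incident_start, window=timedelta(hours=1)):
    """Count tweets falling in the window before and after an incident's start."""
    before = sum(1 for t in tweet_times if incident_start - window <= t < incident_start)
    after = sum(1 for t in tweet_times if incident_start <= t < incident_start + window)
    return before, after

# Hypothetical timestamps, offsets in minutes around a made-up incident time
incident = datetime(2011, 8, 8, 20, 0)
tweets = [incident + timedelta(minutes=m) for m in (-50, -20, 5, 10, 15, 25, 40, 55)]

before, after = before_after_counts(tweets, incident)
print(before, after)  # prints "2 6": traffic peaks after the event, not before
```

If most tweets land after the incident starts, the traffic looks like reporting rather than incitement – which is the pattern the Guardian found.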
As the Guardian also points out, “… no politician would seek to switch off TV news or demand a newspaper blackout during a riot …”. This preliminary analysis suggests that switching off Twitter (and the other channels) would have been counterproductive, both in terms of the information being exchanged and in public relations.
This is a brief summary and the paper promises further, deeper analysis of its database in the days to come. Social media advocates and sceptics alike should follow this for an example of good use of data and an accurate, research-based picture of reality. Watch this space.
The Guardian published these articles in the paper copy of 25 August 2011 under the title Networks to stand firm over government calls for censorship. Online they are datelined 24 August.
• Twitter study casts doubts on ministers’ post-riots plan
• Twitter traffic during the riots, with interactive graphic (this may take some time to load/display)
• Riots database of 2.5m tweets reveals complex picture of interaction
More is available through the links in the sidebar to these stories, which have also been picked up elsewhere.
Tags: Google, schmidt, search
add a comment
Yesterday’s Guardian Media section has a spread trailing a lecture to be given on Friday by Eric Schmidt of Google, to a media audience at the Edinburgh International Television Festival. The MacTaggart Lecture is sponsored by Media Guardian.
Mainly, the article’s an analysis of the interaction between Google (or search more widely) and the content providers, charting the way the relationship has developed.
Google isn’t a content provider and, largely, has been able to move on from the “copyright busting” image promoted by content providers who targeted both its search business and the range of clips from shows posted on YouTube (owned by Google, of course). Content ID helps: Google’s search capability is harnessed to identifying “pirated” material on YouTube, and providers can have it removed, advertise against it, or capitalise on it in other ways. YouTube viewers are, after all, a self-generating fan club.
In more depth, the article reviews how the definition of “television” has changed: for many people, a lot of popular content is now viewed online from archive rather than at the time of broadcast. The BBC’s iPlayer, and other channels’ similar services, facilitate this. And if you watch a commercial channel’s online replay, adverts get interpolated into the stream. TV replay isn’t killing broadcast advertising; it’s facilitating it.
In the words of the article: “Google needs content creators in order to thrive. Good content drives search, and search drives advertising.” The lecture will be streamed live from 18.45 UK time on Friday: see the link below.
• ‘Google needs television industry’ will be message at Edinburgh, Media Guardian, 21 Aug 2011 (the printed copy Google: let’s make profits, not war was published 22 Aug)
• Dr. Eric Schmidt to deliver MacTaggart Lecture, Edinburgh International Television Festival. The list of past speakers is here.
• Relay of lecture, Friday 26 August, 18.45 BST; see http://www.youtube.com/user/mgeitf and click the link for the 2011 lecture (the link here is current, but may change)
• YouTube Content ID
Categorising knowledge: beyond phone numbers. 27 Jul 2011Posted by Tony Law in Cloud, Impact of IT, Social issues, Tech Watch, Technorati.
add a comment
Cody Burke, of Basex, blogged recently on “Overload Stories” about problems caused by the process I might call mechanisation of knowledge. Here’s his scenario:
Your cell phone runs out of battery power, and you need to make a call. A friend graciously offers to let you use his phone, but as you attempt to make the call you realize that you have no idea what the actual number is of the person you are trying to reach. Now flash back 15 years and try again. Odds are you would have had much better luck, because you would have had to memorize that number, instead of relying on the contact list in your phone.
Well I’m not sure. Fifteen years ago you’d certainly have had a handful of numbers you remembered, but the rest of those you wanted to have handy would have been in your address book. If you left that behind, you’d have been in exactly the same trouble. And for wider contacts, I had shelf space for a whole row of phone books.
Burke refers to an academic study testing knowledge retention, discussing and updating the concept of “transactive memory”. If I’ve understood it correctly, this is the way that memory operates when the datum being remembered is connected to a working group or shared task (if the idea catches your attention, follow the link to the original paper).
ITasITis always goes back to the original sources. Sparrow, Liu and Wegner, writing in Science, define transactive memory thus:
In any long term relationship, a team work environment, or other ongoing group, people typically develop a group or transactive memory [my italics], a combination of memory stores held directly by individuals and the memory stores they can access because they know someone who knows that information.
It’s murderously difficult to accurately summarise academic research, but this isn’t quite the impression I got from reading Burke’s summary. What’s interesting is Sparrow et al’s conclusion: they believe their careful statistically-based investigation provides
preliminary evidence that when people expect information to remain continuously available (such as we expect with Internet access), we are more likely to remember where to find it than we are to remember the details of the item. One could argue that this is an adaptive use of memory – to include the computer and online search engines as an external memory system that can be accessed at will.
Cody Burke is fighting against the concept that we’re becoming internet zombies – as if, somehow, the provision of vast online repository capability removes our human ability to recall. On the contrary, he says: we capitalise on it. There is “a natural (and uniquely) human tendency to learn where information resides and leverage that knowledge to be more effective”.
Or as Sparrow et al put it:
“people forget items they think will be available externally, and remember items they think will not be available … [They] seem better able to remember which computer folder an item has been stored in than the identity of the item itself … We are becoming symbiotic with our computer tools … [We] remember less by knowing information than by knowing where the information can be found”.
Two comments. When I was a student (we had computers, but not databases) I had a tutor who used to point to a row of folders on his book-case and say “There’s knowledge I know; and knowledge I know where to find”. Raymond Dwek predated Sparrow et al by some 45 years! And there was always, and still is, a third category: knowledge I know how to find. In the manual age this was the difference between referring to a specific article in a learned journal, and working through a whole range of likely sources to check for relevant information. Today, it’s the difference between a categorised index and a relatively unstructured search. I used Google Scholar to find the Sparrow et al article, by the way – I didn’t know where to find it, but I knew how.
Certainly, these days, we shift the content of those categories. Categorisation, whether it’s a database structure, email folders, or keyword search, expands “knowledge I know where to find” by providing new access routes. Search is about “knowledge I know how to find”. We may now not retain so many actual phone numbers in our heads, and may not even recall our own mobile number (after all, we never call our own phone!) but it’s probably in the address book on our Google account or in iCloud. Information no longer lives in just one place.
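The distinction between those two categories can be illustrated with a toy sketch (the names and entries are entirely invented): a categorised index is a keyed lookup – knowledge I know where to find – while search scans everything – knowledge I know how to find.

```python
# A categorised index: direct access by a known key ("where to find")
contacts = {
    "plumber": "Jo's Plumbing",
    "dentist": "High Street Dental",
}

def search(store, term):
    """Return entries whose text mentions the term, compared case-insensitively."""
    return [v for v in store.values() if term.lower() in v.lower()]

print(contacts["dentist"])        # "where": one step, but you must know the key
print(search(contacts, "street")) # "how": works without knowing any key
```

The index is faster but rigid; search is slower but open-ended – which is exactly why adding search routes expands what we can find without memorising where everything lives.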
And secondly: I don’t speak to a phone number. I speak to a person (or sometimes, to a service). The phone number isn’t information; it’s meta-information (so also, sometimes, is an email address). It’s the means to get to the item you really want, which is the person. And phone numbers were always artificial constructs; we’re gradually doing away with them. On my desk phone, my really-most-frequent contacts are stored by name. On Skype, or Twitter, or LinkedIn, or Facebook, you contact people by their name or some hash of it, or by some identifier they’ve chosen to describe themselves. Not “unable to remember phone numbers”; moving beyond their use!
Human beings are tool-users. In my first IT job I used to teach programming to postgraduate students (and others) and I always emphasised that the computer is a tool; it extends the power of the human brain in the same way that a crane, for example, extends the power of the human arm. We take advantage of new tools and concepts as they arrive; and our modern array of electronic tools is no different. But this research is a good reminder to be aware of how we are developing in our use of these tools, so that those whose responsibility is to develop the tools themselves can effectively support knowledge workers and facilitate their activities.
• Memory in the Age of the Internet – The More Things Change, The More They Remain The Same? Cody Burke, Overload Stories, 21 Jul 2011
• Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips. Sparrow, B., Liu, J., & Wegner, D.M., Science, 14 July 2011: 1207745; published online [DOI:10.1126/science.1207745]
• Google Effects on Memory: Interview with Betsy Sparrow: Science podcast, 15 Jul 2011
• How sweet to be iCloud, ITasITis, 16 Jun 2011