posted by Anni on October 23, 2011
I have always been interested in the symbiotic interplay between technology and human systems: how humans invent and use new technologies, and how those technologies in turn change or alter the human systems themselves.
As I thought about how to approach this next post, I found there were two streams I wanted to explore, and I have decided to do so in two separate, but inter-related, posts.
The first relates to digital anthropology, and will explore some of the current thinking around humans and technology on more of a personal scale. The second will explore aspects of the sociological context, bringing in group dynamics and the emerging concept of the digital tribe.
Digital anthropology can be defined as “the social and cultural dimensions of information technologies and digital media” and explores how human cultures are affected by the rise of digital technologies and networks. For humans, who are predominantly social beings and largely operate in groups, this is as much about how we operate and act as individuals as it is about how we operate and act in groups.
Someone who has been exploring this for many years is Genevieve Bell, Director of Interaction and Experience Research at Intel Labs. Bell’s work focuses on how different cultures around the world use technologies and, in her recent presentation at the Web 2.0 Summit, she explored the “life of data”. In it she asks
“what if we treated data as if it were a person?”
So, what would data be like if it had human-like qualities? Bell suggests that there are a number of characteristics to consider. Firstly, there are some things that will never be digitised; some data is forever analogue. Secondly, data is social: it likes to connect with others, but it also comes from a particular place and emanates from within a context. Data cannot always be controlled and can, in fact, “run wild” and be feral (i.e. out of control). An innate quality of data is that it is imbued with responsibilities in terms of telling the right story, and of doing so in the right spirit. Data wants to look good and can tell a story that is not always necessarily true. And finally, data is not necessarily meant to last; a lot of data is temporal and is meant to go away.
This is a fascinating and, in fact, hugely important question, because data is increasingly becoming something that is, and can be, treated as a thing in and of itself, in many ways separate from the entity with which it is ostensibly related. If we think about the emerging artificial intelligence technologies now on the horizon, with applications such as Siri and other types of semantic systems, the disconnect between data and the entity becomes even more evident.
It is fitting that Apple launched Siri the day before Steve Jobs died. Siri is a very impressive semantic application based on smart voice recognition combined with natural language processing, machine learning and the ability to link data between applications. It grew out of the DARPA-funded CALO project at SRI International, whose Artificial Intelligence Center was established in 1966 to develop computer capabilities for intelligent behavior in complex situations. In 2007 the project was spun out into a company, which Apple then bought in 2010.
I saw Siri at SemTech in 2009 and Tom Gruber, the founding CTO, was rightly proud of the system, which generated a good many conversations, many of them to do with the sociological aspects of data and the potential of semantics in everyday use. Having now been integrated onto one of the most popular IT platforms on the planet, Siri has the potential, according to numerous observers, to be a real game changer, and has gone a long way towards achieving DARPA’s original brief.
This is because Siri goes beyond “need” and focuses on “intent”. Its task is not necessarily to understand what someone wants, but why they want it. According to Azam Khan, Siri brings together three vital elements that enable a far more intuitive and natural experience for humans:
Siri is more than just a “nice to have” application; it has the potential to become the integrator of all integrators upon which millions will rely to live their daily lives and get things done.
So, what does this mean for data? Apple, like all of its competitors, has been collecting customer data for years, and the more we rely on our technologies to outsource our thinking and actions, the more data is collected. I, for one, no longer remember phone numbers or email addresses and, with GPS, I don’t even have to look up a map. The more I rely on the little screen that is permanently with me, the more data I give it, and it learns about me through my habits and idiosyncrasies more than anyone else except those immediately around me, predominantly my family. So, as I use Siri it will become my own personal digital assistant, and if anyone wants to know anything about me they will be able to just ask it. It will learn who they are and, from there, based on their relationship with me, it will determine what I want them to know. Herein lies the wicked problem.
At the minute I think I am in control of my data and I assume that, to be honest, probably not that many people are interested in me or what I do. But my data is valuable and, as Douglas Rushkoff said,
“on Facebook we’re not the customers. We are the product.”
If something is free then the price you are paying is your data.
Genevieve Bell’s comments about the feral nature of data are poignant here because, with so much data, it would seem that mine would be lost in the aggregation of everyone else’s. But with increasing computer power and big data analysis, coupled with really clever semantic and other systems, all of a sudden anyone can find my needle in the proverbial haystack built from all of our data. According to Amit Sheth, Director of the Kno.e.sis Center, it is through that data that we ourselves are now the “participant sensors”. We feed into the greater system by leveraging the power of the network (both social and technological), and in doing so we are helping to detect earthquakes, facilitate disaster relief, monitor traffic, and analyse environmental data.
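To make the needle-in-the-haystack point concrete, here is a small, purely illustrative sketch – the records below are invented, and this is not Bell’s or Sheth’s example – showing how two seemingly anonymous data sets can be joined on a handful of shared attributes to single out one individual:

```python
# Toy illustration: two "anonymous" data sets, joined on a few shared
# attributes (postcode, birth year, gender), are enough to single someone out.

# A public-ish data set: names with basic demographics (e.g. an electoral roll).
roll = [
    {"name": "A. Citizen", "postcode": "3000", "birth_year": 1975, "gender": "F"},
    {"name": "B. Someone", "postcode": "3121", "birth_year": 1982, "gender": "M"},
    {"name": "C. Example", "postcode": "3000", "birth_year": 1990, "gender": "F"},
]

# An "anonymised" data set: no names, just demographics plus sensitive detail.
usage_log = [
    {"postcode": "3121", "birth_year": 1982, "gender": "M", "search_history": "..."},
]

def reidentify(record, roll):
    """Return every person in the roll whose demographics match the record."""
    return [
        person["name"]
        for person in roll
        if (person["postcode"], person["birth_year"], person["gender"])
        == (record["postcode"], record["birth_year"], record["gender"])
    ]

for record in usage_log:
    matches = reidentify(record, roll)
    if len(matches) == 1:
        # A unique match: the "anonymous" record now has a name attached.
        print(f"Re-identified {matches[0]} from supposedly anonymous data")
```

With only a postcode, a birth year and a gender, the “anonymous” record acquires a name; scale that up to the data points a phone collects every day and the haystack stops offering much cover.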
All of this may be for the collective good, and it would be hard to argue against it. However, what about issues such as privacy and the question of choice? What about our ability to determine what we share and what we do not? And how much do we understand about what is going on around us?
Dion Hinchcliffe describes what he sees as the “Big Five” IT trends underpinning everything we have mentioned thus far. These are: mobile, social media, cloud computing, the consumerization of IT, and big data.
Combined, these five create an unprecedented information ecosystem which completely changes the dynamic for society itself. As Clay Shirky observes
“A revolution doesn’t happen when a society adopts new tools, it happens when a society adopts new behaviours.”
These behaviours are themselves manifestations of the crucial societal transformations driven by digital technologies, identified by Castells as:
So, back to the digital brand.
The digital brand is, as I have defined previously, the emergent outcome of the data that can be collected and analysed from all sources relative to a specific thing – a person, product, organisation, group or nation. Being emergent, it is organic in nature, and one could say that it is alive. It is both a process and a thing, and as we learn to manage data we need to learn to manage this brand, as it accompanies us wherever we go and is now an integral part of how we communicate with technologies as much as how we communicate with each other.
Castells sees that communication happens by activating the “Windmills of the Mind” to create shared meanings. And Toffler saw that
“The link between communication and character is complex, but unbreakable. We cannot transform all our media of communication and expect to remain unchanged as people. A revolution in the media must mean a revolution in the psyche.”
So, if we now have systems that are becoming contextually aware and inhabited by “intentionality”, what does this mean for us? How do we manage data that is separate from the entity with which it may originally have been associated, and trust that no harm will come to us as individuals?
The concept of the “personal data locker” has been around for a while and for those interested I recommend “The Support Economy” by Shoshana Zuboff and James Maxmin.
A good overview of the concept is provided by David Siegel, but essentially it is a system in which you, the individual, are in control of your data: it is up to you to determine who and what has access to it, whether it can be shared with others, how it is updated, and what use is made of it. It relies on a combination of trust, in terms of to whom you give permission, and transparency, in terms of everything being open to you.
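As a rough sketch only – the class, field and method names below are my own assumptions for illustration, not Siegel’s design or any real product – a personal data locker can be thought of as a store of personal facts plus explicit, revocable permissions, where every disclosure is logged so the owner can always see what has been shared and with whom:

```python
# A minimal sketch of a personal data locker: the owner grants (and can revoke)
# access to named fields, and every disclosure is logged so it stays transparent.

class PersonalDataLocker:
    def __init__(self, owner):
        self.owner = owner
        self.data = {}          # field name -> value
        self.permissions = {}   # requester -> set of fields they may read
        self.access_log = []    # (requester, field) recorded for every disclosure

    def store(self, field, value):
        self.data[field] = value

    def grant(self, requester, fields):
        self.permissions.setdefault(requester, set()).update(fields)

    def revoke(self, requester, fields=None):
        if fields is None:
            self.permissions.pop(requester, None)   # revoke everything
        else:
            self.permissions.get(requester, set()).difference_update(fields)

    def request(self, requester, field):
        """Release a field only if the owner has granted it, and record the access."""
        if field in self.permissions.get(requester, set()):
            self.access_log.append((requester, field))
            return self.data.get(field)
        return None  # no permission: nothing is disclosed


# Usage: the owner decides who sees what, and can review every access afterwards.
locker = PersonalDataLocker("Anni")
locker.store("email", "anni@example.com")        # hypothetical values
locker.store("location", "Melbourne")
locker.grant("family_calendar_app", {"location"})

print(locker.request("family_calendar_app", "location"))  # "Melbourne"
print(locker.request("advertiser", "location"))            # None - never granted
print(locker.access_log)                                    # [("family_calendar_app", "location")]
```

The point of the sketch is simply that permission and visibility sit with the owner: nothing leaves the locker unless it has been explicitly granted, and every release is recorded where the owner can see it.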
This idea drove much of our work on “eLetterbox” and, if Australia Post had understood it properly, their “PostZone” project could have completely repositioned them for the digital economy. The same could have been true for Xerox, with whom we also tried to work.
The idea of personal electronic mailboxes and personal identification management has been floated in the UK, and more recently in Australia, and it is now on the radar for marketers. The truth is that people prefer to be in control, but it needs to be made easy, something that is being proven with tablets. So, where are things headed?
The so-called digital generation are in theory more comfortable with the use of digital media, but they have neither the wisdom nor the experience to understand the potential consequences of sharing the minutiae of their personal lives (although as they age that is changing). The older generation have the wisdom, but even they post amazingly revealing things on their Facebook pages. For everyone to at least have a choice there needs to be an awareness-raising effort, one which society must both demand and facilitate. Organisations of all types need to integrate digital education into their capability development, and democratic governments need to proactively take a leadership role, as they have in areas such as sex education and cyber-bullying.
The work of digital anthropologists such as Genevieve Bell, Lucy Suchman, danah boyd, Danny Miller, Douglas Rushkoff and Clay Shirky, to name just a few, needs to be socialised and integrated into mainstream education. And business, first and foremost, needs to take a responsible position with regard to privacy and transparency.
Hopefully the legacy of Steve Jobs will be more than the products he shaped and brought to so many people, and may lead to a fundamental age of digital enlightenment. If it does, then we shall be blessed with a future to embrace rather than one to fear.
Creative Commons CC BY-NC-SA: This license allows reusers to distribute, remix, adapt, and build upon the material in any medium or format for noncommercial purposes only, and only so long as attribution is given to the creator. If you remix, adapt, or build upon the material, you must license the modified material under identical terms.