Forgetting and remembering in the digital age


In the past month I have participated in a number of events which have highlighted the tension between “remembering” and “forgetting”.

And 2014 is a rather important year on both fronts.

Firstly, 2014 is the tercentenary of the Longitude Act (1714), by which the British Government established a prize to solve the “Longitude Problem”. As part of these celebrations the National Maritime Museum has an exhibition, “Ships, Clocks and Stars – the story of Longitude”, which “tells the extraordinary story of the race to determine longitude (east-west position) at sea, helping to solve the problem of navigation and saving seafarers from terrible fates including shipwreck and starvation.”

The problem of longitude was a very real one as European powers expanded their interest in, and exploration of, the world. Each of the suggested solutions relied on the painstaking gathering of information: huge numbers of measurements taken by humans with various instruments and then meticulously entered into tables for ongoing reference.

As I wandered around the exhibition, what struck me was the enormity of the task not only of making the observations, but of compiling and continually updating the tables themselves. This was in stark contrast to the ease with which I later consulted Google Maps to find the location of a nearby pub, together with the timetable of the Thames Clipper, in order to navigate my own way home.

We who live in the Western world take so much for granted as we tap at our smartphones, and we very rarely think about how fragile our systems are until we are hit with power outages, such as those in North America during last year’s “polar vortex”, or last week’s slowing-down of the internet, to which most people were oblivious.

We forget how far we have come in a very short time, and it is useful to visit places like the Royal Observatory to reflect on these past accomplishments, as well as exhibitions such as “the Digital Revolution” at the Barbican, which not only documents some of the developments in film, architecture, design, music and game development, but also demonstrates the creative possibilities offered by augmented reality, artificial intelligence, wearable technologies and 3D printing. For those of us who have not only lived through, but take an active interest in, the evolution of digital information technologies, the exhibition was a wonderful reminder of human ingenuity and creativity; for many others it was a demonstration of what is now “real” and no longer “magic”, no longer “the future” but very much the present.

The notion of “presence” is hugely important when it comes to remembering and forgetting. As the world remembers the “War to end all Wars” by commemorating the centenary of the First World War, this is absolutely the right moment to ask some very important questions about what sort of world we want to create and what sort of future we want to live in, and to determine how best we are going to answer those questions for the benefit of ourselves and future generations.

The idea of “memory” is something we have traditionally approached from the human perspective. It has social and often creative aspects, and the “convenience” of forgetting is often as useful as the “right” to remember, depending on circumstance and objective. As Napoleon is supposed to have said:

“History is the version of past events that people have decided to agree upon.”

So, what happens when it is not “people” who decide, but machines? And how are those decisions made?

Much of the conversation around me at the moment is about the European Court of Justice’s recent ruling on the “right to be forgotten”. Last week I went to a debate on the subject hosted by the Central London Debating Society. What was most interesting was that those who had actually researched the EU legislation had a far more positive view of it than those who had not, regardless of which side of the debate they were representing.

The legislation itself is certainly getting people talking, even those who would not normally be interested in something like this. For those in the data and information worlds, it is presenting all sorts of unforeseen and complex challenges, and highlighting the need for legislators and policymakers to have far more developed digital skills and capabilities in order to govern in the digital age.

When it comes to articles that are “on the public record” and “in the public interest” I can see the case against this legislation, but when it comes to the situation where, according to Eric Schmidt, young people are going to have to change their names in order to “escape their cyber past”, I find this both disconcerting and, in fact, very sad.

It could well be that the concept of “privacy”, and, indeed, of the fallibility of human memory, is now a thing of the past, and that young people are no longer able to experiment with who and what they want to be (as many of us were able to do) because of the greed of large information companies whose business models feed on the information those young people often unwittingly provide.

I have written about information and privacy in previous posts (particularly on the digital brand), but it seems that these issues are now having an impact at a personal level, and my only hope is that at last ordinary citizens wake up to the fact that digital information presents them with very different choices to make in terms of how they interact both with organisations and with each other.

It may be that what will emerge is not just something like the “personal data store” but a whole new transparency in the relationship between those who supply data (as in individuals) and those that seek to use it (organisations). Perhaps a system will finally emerge whereby data itself has value as a source of currency and exchange, and the key elements of the digital brand will be more clearly articulated.

All of this, of course, requires people to have a greater understanding of data and information; a “digital literacy”.

In order to understand this on a personal basis I have just spent two days with Decoded, doing both their “Code in a Day” and “Data in a Day” courses. They were incredibly valuable days which not only demystified the whole concept of “coding” for me, but also gave me insights into the actual mechanics of data, and into the incredible array of tools and resources that are now easily, and often freely, available on the Web.

Whilst there are obvious benefits for people from both the advertising and the retail sectors, the key insight for me was the skills gap that exists between the people who develop and make policy and legislation and those who actually work with data and code on a daily basis.

For far too long “IT” has been seen as “rocket science”, and, whilst I am not going to understate the skills required to programme code artfully, I am going to say that those skills are absolutely teachable, and that they are as important as “reading, writing and ’rithmetic” … the traditional “Three R’s”.

Coding is a language and a state of mind. It takes patience and a certain aptitude for the “craft”, but it is logical and it can be taught to everyone, as the UK Government has already determined to do. Whilst we don’t all need to rush off and become “coders”, I absolutely believe that each and every one of us who lives a “digital life” needs to equip ourselves with at least the basic skills to understand what we are doing, so that we can analyse, understand and communicate with data more effectively.

Only then can we have any sort of reasonable debate about “remembering” and “forgetting”, and make a conscious decision as to whether history will be written by humans, by machines, or, hopefully, by both.