Wednesday, 26 November, 2014 – 00:00
When I began writing my new book, there was one primary focal point: big companies providing internet services profit from the crimes of others. That profit can be tracked and traced and internet companies can be made culpable in just the same way as banks, etc. who deal with criminals.
But as the book has developed (and this is why it’s going to miss its November slot and be published in January instead) the subject widened and became a full review of what companies do to ensure they profit.
Yesterday, Facebook announced its new terms and conditions. Its timing is astonishing. Yesterday was also the day that the UK review into the murder of a soldier, Lee Rigby, found that, several weeks before the attack, one of his murderers held a conversation on Facebook in which he said that he planned to kill a soldier and was egged on by a known extremist overseas.
Whether or not it knew that it knew is not the point: Facebook, in fact, knew.
Facebook’s new terms and conditions specifically explain that Facebook analyses users’ posts and acts upon what it finds. What Facebook is looking for is information that will enable it to better target advertising to users based on the data they add to the Facebook system.
In the new book, I point out that large companies that provide internet services collect and crunch massive amounts of data. They do it for profit.
Now we know that they do not use that data for good, to save lives.
Social media, the Facebook case shows, does not go hand in hand with social responsibility.
It must be made to do so. And, perhaps unsurprisingly given my long history of running into the wind while everyone else sails before it, I explain how we, those of us who demand that responsibility, can make it do so.
It’s not just Facebook – that’s today’s emotionally charged example. It’s all users of the web.
© 2014 Nigel Morris-Cotterill
All rights reserved