Last week, Facebook blocked a car insurance plan proposed by Admiral, which hoped to view young drivers’ Facebook profiles in order to help set their premiums.
The company wanted to use computer analysis of likes and posts to make a judgement about a driver’s level of risk. It built algorithms designed to weigh up personal attributes of young drivers, such as their level of organisation. According to the BBC, by Admiral’s logic a Facebook post inviting friends to meet up at a specific time and place suggested a more organised individual, and therefore less of a risk behind the wheel.
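Admiral never published how its model actually worked, but the BBC’s description suggests something like a simple heuristic over post text. Purely as an illustrative sketch (the pattern, keywords and scoring are all our own invention, not Admiral’s), a toy version of that “organisation” signal might look like this:

```python
import re

# Hypothetical sketch only: Admiral never released its model. This toy
# heuristic mimics the BBC's description -- a post that arranges a
# meet-up with a concrete time and place counts as "organised".
TIME_PATTERN = re.compile(r"\b\d{1,2}(:\d{2})?\s?(am|pm)\b", re.IGNORECASE)
PLACE_KEYWORDS = ("meet at", "at the", "outside", "cafe", "pub")

def organisation_score(post: str) -> int:
    """Return a crude 0-2 score: +1 for a specific time, +1 for a place cue."""
    score = 0
    if TIME_PATTERN.search(post):      # e.g. "3pm", "7:30 pm"
        score += 1
    if any(keyword in post.lower() for keyword in PLACE_KEYWORDS):
        score += 1
    return score

posts = [
    "Anyone fancy a coffee? Meet at the cafe, 3pm Saturday!",  # scores 2
    "ugh so bored",                                            # scores 0
]
scores = [organisation_score(p) for p in posts]
```

Even this crude sketch makes the privacy worry concrete: a couple of regexes and keyword lists are enough to turn throwaway posts into a "personality" number that could move a premium.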
On the day they were set to launch, however, Facebook deemed the use of such an algorithm inappropriate and blocked Admiral’s plan to determine discounts on the basis of Facebook posts and likes. So, while you may still be hiding your Facebook page from your employer, making sure they never set eyes on the photos from your lads’ holiday in Kavos two years ago or the incriminating photos from last night that suggest you’re not bedridden with the flu today, you’re safe from being judged by your car insurer for now…
On one level it all seems quite ridiculous that a person can be reduced to the sum of their activity on social media. But Dr David Stillwell of Cambridge University – who has developed similar technology – believes that Facebook posts offer a very accurate picture of someone’s personality.
In an interview with The Guardian, the leader of the Admiral project, Dan Mines, described the algorithm as “innovative, it is the first time anyone has done this”. But isn’t this what we, as marketers, do every day with ad targeting? Reducing people to their internet history allows us to target the individuals we deem most likely to need or want our products and services, and it’s great. Google AdWords does it, and Facebook does too, using the personal information shared by its users to help businesses target their ads.
So why does Admiral’s algorithm feel like a step too far? Why does it feel so much more intrusive to analyse users’ data in this way? Is it that we feel comfortable with people finding out about our work, lifestyle, tastes, likes and dislikes, but far from comfortable about judgements being made on the back of those things? Yes, let a computer determine that I like watching videos of baby sloths and show me more of them, that’s cool. But don’t use that data to make external assessments and judgements about my personality, my lifestyle and my ability to drive safely…
And hey, maybe you don’t think it’s a step too far. Certain industries will surely be excited by the development of such an algorithm. At the end of the day, with so much data flying around, it can be hard to know exactly where to draw the line. Facebook drew the line on Admiral here, but that’s not really surprising. After all, if Facebook were to become implicated in such weighty assessments of individuals, the company would surely risk discouraging users from sharing the sort of personal data its current ad-targeting business model relies on.
Rather foolishly, some might say, I asked Harry for his opinion on this subject and promptly lost 10 minutes of my life. But here’s what he had to say on the topic:
“It’s worth mentioning that Facebook often updates its privacy policy, the details of which we doubt anyone ever reads, and these changes may be increasing the amount of data available to companies. Now, personally, I don’t think that an algorithm trawling my Facebook posts is a big issue if I’ve allowed a company to view that data. It’s not as though there is a stranger looking through my holiday snaps. It does, however, become an issue if these algorithms are becoming more sophisticated and being used to make a judgement about your personality and behaviour – especially if this is then used to treat you differently, for example in how much you pay in insurance premiums. I think that Facebook has an obligation to educate its users on the capabilities of these algorithms and the potential implications of that access. So with the Admiral example, rather than the current message, which is something like ‘Allow Admiral to view your posts and likes’, it should have a message like ‘Allow Admiral to view your posts and likes and then use that data to make an assessment of your personal attributes, which may then affect your insurance premiums’. Doesn’t quite have the same ring to it, does it? My gut feeling is that they do say that somewhere, perhaps page 39, section 4, sub-paragraph 8 of their privacy policy… But we don’t actually care enough about our privacy to go in and read it?”
If you want to find out more about how to target your ads, get in touch with one of our digital marketing experts. We promise not to use a snazzy algorithm to determine whether you’re eligible for a discount on one of our packages (or do we?).