Not only does your phone know where you are, it knows where you are going to be. It may even know why you're going there. He calls it the "Interdependence and Predictability of Human Mobility and Social Interactions," but the algorithm that researcher Mirco Musolesi and his team recently tested in the UK stirs up thoughts reminiscent of Philip K. Dick's Minority Report, and all the moral trappings that come with it.
Spurred by a Nokia initiative, and by the 6,000-euro prize money, Musolesi began a mobility-tracking project using volunteers furnished by Nokia.
Everyday mobility, the daily routine, is quite predictable. Where Musolesi and his team ran into trouble was with unpredictable, unscheduled movement.
The team got by, though, with a little help from the volunteers' friends.
Parmy Olson, of Forbes, reports, “When the algorithm was simply tracking the volunteer, it could predict their future GPS coordinates to within roughly 1,000 square meters. When the prediction took into account additional information from a single friend, the error rate improved by several orders of magnitude.”
Indeed, with the addition of data from friends and acquaintances, Musolesi was able to predict where a target would be in 24 hours to within 20 meters.
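The intuition behind that improvement can be illustrated with a toy simulation. This is not Musolesi's actual algorithm, just a hedged sketch under an assumed model: if a target's movements co-vary with a friend's (shared commutes, shared hangouts), then folding the friend's position into the prediction shrinks the error far below what the target's own history alone allows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy model: the target's next position closely tracks a friend's
# position plus small noise, because their routines are interdependent.
n = 500
friend = rng.uniform(0, 1000, size=n)         # friend's coordinate (metres)
target = friend + rng.normal(0, 20, size=n)   # target co-moves with friend

train, test = slice(0, 400), slice(400, None)

# (a) Predict from the target's own history only: best constant guess.
solo_pred = target[train].mean()
solo_err = np.abs(target[test] - solo_pred).mean()

# (b) Add the friend's signal: a one-variable least-squares fit.
a, b = np.polyfit(friend[train], target[train], 1)
social_pred = a * friend[test] + b
social_err = np.abs(social_pred - target[test]).mean()

print(f"mean error, target alone:  {solo_err:.0f} m")
print(f"mean error, with a friend: {social_err:.0f} m")
```

In this contrived setup the solo prediction is off by hundreds of metres while the friend-informed one lands within tens, mirroring the kind of gap the researchers reported, though the real system works on GPS traces and social-interaction data, not a linear toy.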
The idea started with Nokia's "Dedicated Challenge," issued in early November 2011. The challenge called on researchers to write algorithms that would use existing data to infer certain personal details about phone users. Ostensibly, Nokia wants this information so it can, for example, sell Facebook a better ad experience for its users.
Don’t worry, Nokia just wants you to find a nice pair of shoes, honest.
Nokia's "Dedicated Challenge" was just that: 'dedicated' to three ends in particular:
– Semantic Place Prediction: 'Why' are you where you are? Is it a restaurant? A Tea Party meeting? Planning the next Zuccotti occupation?
– Next Place Prediction: Leaving that Scientology meeting to visit your male masseuse?
– Demographic Attributes: White? Wealthy? Possibly an actor?
The moral implications of algorithms that can infer your future location and its purpose, your age, race, gender, and your acquaintances and close friends are quite astounding.
Police are interested in the technology for crime-prevention purposes. I wouldn't be surprised if the NSA were interested in its national-security applications, and knowing the agency's history of loosely defining "threat" and "national security," I don't dare wonder how it would apply them.
I assure you, they're not interested in getting you a nice pair of shoes.