Automated Decision Making at Grindr
Are Grindr users subjected to Automated Decision Making (ADM)? What does that even mean? Automated decision-making is the process of making a decision by automated means, without human involvement. For example, an app or service might make a recommendation to a user, or personalize a feature, based on an algorithm (data + math) instead of a human decision. Sometimes these can be quite simple, like “people who like this often also like that.” Other times, they are more complex. The privacy world has recently been abuzz with news about ADM systems, which are sometimes lumped together with more advanced systems that embrace AI (Artificial Intelligence), and with questions about how apps like ours use them.
Many proposed state privacy laws in the US now engage with this topic, and a few weeks ago the European Commission released a 120+ page document covering its proposed rules for Artificial Intelligence, which it believes provide “[p]roportionate and flexible rules [that] will address the specific risks posed by AI systems and set the highest standard worldwide.” The rules focus primarily on high-risk AI systems that could have a significant impact on a person’s real-world movements and opportunities (employment, law enforcement, border control, etc.).
There are some powerful AI systems on the horizon. Self-driving cars that have to make very fast decisions based on what they can see or sense are one example. IBM’s Watson famously beat champions at Jeopardy!, and IBM’s earlier Deep Blue defeated the reigning world chess champion. And AI systems are getting more powerful, quickly. But for some time, almost every service you use will be running not the AI of science fiction but something more like AI “lite.”
For EU users, Article 22 of the EU’s GDPR (General Data Protection Regulation) states, “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” I really like the UK ICO’s approach to this part of the GDPR: it provides a detailed checklist a company can walk through to understand its responsibilities with respect to ADM.
Does this matter to Grindr and our users? Today, it largely doesn’t impact our users as we don’t engage in ADM outside of our security systems—definitely not to the point of affecting a user’s legal status or their legal rights, i.e., “producing a legal effect.” In the Grindr Privacy Policy we call out the following use of collected personal information: “For Automated Decision Making - for example, to detect and remove spammers, detect and remove non-compliant images, etc. through artificial intelligence.”
As described, we have automated security systems that try to identify and block those attempting to create spam accounts, and our systems identify and remediate accounts that are breaking our community guidelines. As with any automated system, we get some false negatives, which let some of the bad stuff through. We are grateful to our ever-vigilant users, who help report those accounts. Sadly, there is also the chance of false positives, meaning someone who didn’t do anything wrong is flagged as someone who did. Finding the balance between the two is difficult and complex, and we continually fine-tune our systems to navigate this ever-evolving challenge. Rest assured, we give users who fall into the “false positive” bucket a direct line to our Customer Experience team, who can remedy the situation quickly if our systems made an error.
I recently shared with Tom Quisel, our Chief Technology Officer, that many people think Grindr’s systems are more sophisticated than they really are. I appreciate that his teams deal with a ton of complexity, but how much of what we do at Grindr today reaches the bar of Automated Decision Making or Artificial Intelligence?
Tom shared this with us:
“The lines between Automated Decision Making (ADM) and Artificial Intelligence (AI) are blurry. Part of the confusion arises because AI is used as a catch-all term to refer to different concepts. AI can refer to the simulation of human intelligence, to computers that are capable of problem solving, to specific skills such as speech recognition or computer vision, or to computerized agents that perceive their environment in some way and attempt to achieve a goal, just to name a few. AI is sometimes conflated with Machine Learning (ML), the science of inferring rules from data without direct human involvement. AI can be as simple as a common thermostat: a device that perceives its environment by measuring the temperature and decides to turn the heat on or off to achieve its goal of holding a particular temperature. Many factors need to be considered when assessing the ethical implications of an ADM system. Among others, there's the complexity of a system, how it is used, how it can be overridden, the data used to create the system, any biases, alternative options, and the intentions behind it.
To focus on Grindr, our app provides a straightforward set of features that allows users to search for and chat with other nearby users who’ve used the app recently. The app puts the power to search, view, filter, and block in the hands of our users. When a user searches for others nearby, Grindr displays those who were online recently and applies the searching user’s filters (such as age, tribe, or relationship status), sorted by distance. Sometimes a little randomness is thrown in to keep results fresh. That’s it. There’s no recommendation algorithm to speak of on Grindr today. Grindr gets out of the way, and lets our users drive their own experience.”
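To make the simplicity of that flow concrete, here is a minimal sketch in Python of the kind of logic Tom describes: apply the searching user’s filters, sort by distance, and optionally add a little randomness. The names (`Profile`, `nearby_cascade`, `jitter`) are illustrative assumptions for this post, not Grindr’s actual code.

```python
from dataclasses import dataclass
import random

@dataclass
class Profile:
    # A hypothetical, pared-down profile for illustration only.
    user_id: str
    age: int
    distance_km: float
    relationship_status: str

def nearby_cascade(candidates, filters, jitter=0.0, seed=None):
    """Apply the searching user's filters, then sort by distance.

    `filters` maps a profile attribute name to a predicate, e.g.
    {"age": lambda a: 25 <= a <= 35}. A nonzero `jitter` perturbs
    the ordering slightly to keep results fresh.
    """
    rng = random.Random(seed)
    matches = [
        p for p in candidates
        if all(pred(getattr(p, attr)) for attr, pred in filters.items())
    ]
    return sorted(matches, key=lambda p: p.distance_km + rng.uniform(0, jitter))

profiles = [
    Profile("a", 30, 2.5, "single"),
    Profile("b", 45, 0.8, "single"),
    Profile("c", 28, 1.2, "partnered"),
]
result = nearby_cascade(profiles, {"age": lambda a: a < 40})
# "b" is filtered out by age; "c" and "a" are sorted by distance.
```

Note that nothing here ranks users by inferred preferences or predicted behavior; the user's own filters and plain distance drive the result, which is why it falls well short of the ADM that Article 22 is concerned with.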
While Grindr doesn’t leverage AI or ADM outside of security systems today, we do plan to build “smarter” product features in the near future (better cascade results, recommended tags for searches, etc.). As we explore ADM and AI for Grindr features, our ongoing privacy commitment to our users includes transparency, and, where appropriate, we’ll provide controls that let users turn off the automated systems or give input to help fine-tune them.
-Shane Wiley, Chief Privacy Officer | LinkedIn