The New York Times is currently running a series on technology called "The Privacy Project". I'd recommend that everyone who uses technology at least glance through the articles, and from time to time I may also point out pieces I think deserve particular attention. The first is "I Used Google Ads for Social Engineering. It Worked."
In this piece, Patrick Berlinquette discusses how he used Google Ads in a suicide-prevention campaign built around redirect ads.
A helpful ad on Google will match your keywords with a relevant landing page. But some ads provide countermessaging or alternative destinations that go against your search words. These are called “redirect ads.”
With redirection, marketers swerve your monetizable desperation. But we can also swerve something bigger: your beliefs, convictions and ideology.
In Mr. Berlinquette's campaign, he served special ads to people whose search terms indicated "suicidal intent," and boosted those ads for searchers physically located at or near the Golden Gate Bridge. His results were pretty astounding.
Nearly one in three searchers who clicked my ad dialed the hotline — a conversion rate of 28 percent. The average Google Ads conversion rate is 4 percent.
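The campaign described above boils down to two mechanical pieces: a location check around the bridge, and the conversion arithmetic behind that 28 percent figure. Here is a minimal Python sketch of both -- the geofence radius, function names, and sample numbers are my own illustrative assumptions, not details from Mr. Berlinquette's actual campaign or Google's Ads platform:

```python
import math

# Hypothetical sketch of location-boosted targeting plus conversion-rate
# arithmetic. The bridge coordinates are public; the 2 km radius is an
# assumption for illustration only.

GOLDEN_GATE = (37.8199, -122.4783)  # (latitude, longitude) of the bridge

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def near_bridge(point, radius_km=2.0):
    """Would a searcher at `point` fall inside the boosted-ad geofence?"""
    return haversine_km(point, GOLDEN_GATE) <= radius_km

def conversion_rate(conversions, clicks):
    """Fraction of ad clicks that led to the desired action (here, a call)."""
    return conversions / clicks

# Illustrative numbers only -- the article reports rates, not raw counts:
print(near_bridge((37.8078, -122.4750)))   # a point just south of the bridge
print(f"{conversion_rate(28, 100):.0%}")   # the 28 percent quoted above
```

Real ad platforms express this as declarative campaign settings (keyword lists plus a location radius) rather than code, but the decision being made is essentially the one sketched here.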
Mr. Berlinquette discovered a way to use Google Ads as a force for good--a way to positively influence the lives of those who are suffering.
However, the piece notes--almost in passing--that seemingly anyone can run these kinds of ads.
Google let me run the ads with no issue. It didn’t seem to care what the language on my website was, or what phone number I directed people to. There was no vetting process to become a redirector. I didn’t need qualifications to be a conduit of peoples’ fates. I expected the ads to get rejected, but they were not.
For every search conducted by an American who wanted to kill himself or herself, I saw the exact words he or she typed into Google before clicking my ad. And anyone who runs campaigns using the blueprint will have access to the same. It is a one-way mirror into the American psyche.
Click data can be used for harm by a redirector with bad intentions. If redirectors can use this technique to reach ISIS sympathizers, they can also use it to groom school shooters.
The internet has given us a tremendous capacity to reach out to those who are struggling, suffering, or in need. But the internet is an amoral system--it does not inherently know whether a search or a redirect is good or bad. As humans, choosing to ignore this amorality is itself a moral decision, and one we rarely consider.
National Suicide Prevention Lifeline