AI can be used for a number of tasks, up to and including being your wingman on dating apps. But the same qualities that let it interact with someone in real time also make this technology dangerous — including the possibility of AI being used to scam you in some way. Or, knowing scammers, it’s more likely that AI could be used to scam you in a large number of different ways. Either way, getting scammed by AI is a situation that most people — and the federal agency that regulates product sales and advertising — would like to avoid.
As Engadget reports, Michael Atleson — an attorney with the FTC’s Division of Advertising Practices — recently explained the agency’s areas of concern with respect to AI. Among his concerns were AI convincing people that it was, well, not AI. “People could easily be led to think that they’re conversing with something that understands them and is on their side,” he told Engadget.
Atleson also shed light on how the FTC would regard ads that use AI in a misleading way. “Companies thinking about novel uses of generative AI, such as customizing ads to specific people or groups, should know that design elements that trick people into making harmful choices are a common element in FTC cases, such as recent actions relating to financial offers, in-game purchases, and attempts to cancel services,” he said.
As Axios reported earlier this year, scammers are already using AI and chatbots to trick people into believing they’re getting help with tax preparation. The language of Atleson and his colleagues might dissuade some would-be scammers from getting involved with AI scams — but unfortunately, it seems like a new era of scams is already upon us.