A certain type of portrait has been appearing in my feed lately: soft lighting, a white prayer covering, wire-rim glasses, and an earnest half-smile that seems almost practiced. Her name is Melanskia. She has more than 300,000 Instagram followers, gently warns about “industrial waste” in the liver, and recommends a $50 drink mix called Modern Antidote. She also doesn’t exist. Only when you slow down and look closely do you notice the final detail: her eyes don’t quite follow the camera, and her hair sits a bit too perfectly under the bonnet.
The Amish, famously, do not use Instagram. That is both the hook and the first giveaway. Melanskia is not, in fact, Amish. She is a synthetic influencer, one of a handful of AI-generated personas built to promote supplements, and whoever runs the account is profiting handsomely from her “rural Amish” framing. It wraps a product that most likely lacks clinical evidence in an old-fashioned signal of honesty, simplicity, and distrust of modern industry. That’s the trick. The aesthetic is doing the hard work.
| Detail | Information |
|---|---|
| Topic | AI-generated “Amish” influencers on social media |
| Most-Cited Example | “Melanskia” – synthetic Instagram persona |
| Claimed Followers | 300,000+ on Instagram |
| Product Pitched | Modern Antidote (dietary supplement) |
| Retail Price | ~$50 per jar |
| Distribution | Amazon and direct-to-consumer |
| Disclosure Status | No AI disclosure on the account |
| First Major Coverage | New York Times, March 9, 2026 |
| Underlying Technology | Deepfake avatar creation tools, incl. MetaHuman Creator |
| Broader Category | Synthetic / virtual influencers |
| Regulatory Oversight (US) | FTC guidance on endorsements (AI disclosure still evolving) |
| Ethical Flag | Appropriation of religious/cultural aesthetics |
It’s worth pausing on how clever this is, in a sinister way. For years, the internet was dominated by influencers who were sophisticated, urban, and hyper-modern. Then trust began to erode. Viewers grew weary of the shine, the sponsorships, the filters. Something grounded started to work better. Creators in small towns, on farms, and in kitchens with actual wooden floors began pulling numbers the glossy accounts couldn’t. It was inevitable that someone would discover you could fake that, too.
The more important question is who “someone” really is. The New York Times linked the Melanskia account and a handful of similar synthetic personas to marketing campaigns pushing unproven dietary supplements. None of the accounts disclose that they are AI-generated. No CGI watermark. Just the face, the voice, the quiet authority, and the Amazon link. Whether this technically violates FTC endorsement rules remains unclear, because regulators haven’t kept up with AI-generated spokespeople, and an entire industry is expanding in that gap.

Watching this play out, it’s difficult not to feel a little uneasy. Deepfakes aren’t new, but this particular variety is especially deceptive. Plain-living communities in Pennsylvania and Ohio, such as the Amish and Mennonites, are being used as a kind of costume: real traditions, real communities, reduced to a feeling. These are people known for choosing not to use digital technology, which leaves them with no way to contest the synthesis and recirculation of faces modeled on their own. Inside the technological violation sits a cultural one.
There is also the harder truth of why it works. For years, researchers at organizations like the Pew Research Center and Liverpool John Moores University have warned that deepfake avatars would eventually reshape the online trust economy. Companies like Guildhawk argue that professional digital humans, such as multilingual corporate spokespeople, museum reconstructions, and cultural heritage projects, are genuinely useful when built with consent and clear ethics. Consent and disclosure are supposed to be the line between that and Melanskia. In practice, the average scroller can’t see that line.
The odd thing is that it most likely won’t slow down. Selling a $50 supplement through a fake influencer is absurdly profitable. The bonnet, the accent, and the handwritten-looking captions cost less than anything an advertising agency could produce. Until platforms enforce AI disclosure or the FTC issues a firm rule, we’ll probably see more of these faces. Quieter ones, too. The next generation may not even bother with the costume. It will just borrow a tone.
