AI-Generated Influencers Exploit Military Identity for Online Gain

An Instagram account featuring a supposed blonde Army service member named Jessica Foster gained over a million followers before being exposed as a fake. This scenario illustrates a growing phenomenon where AI-generated personas mimic military identities to amass followers and earn money online.

The watchdog group Military Phony, which monitors fake military claims, refers to this as “digital stolen valor.” It equates to wearing unearned medals and using false credentials for respect or financial gain, though these actions might not always meet legal definitions under the Stolen Valor Act.

AI-generated personas often pair identity-driven messaging with viral content to increase engagement and visibility. (Photo credit: screengrabs from Emily Hart’s social media.)

AI-generated influencers are increasingly adopting military identities and other trusted professions, such as nursing. These personas leverage the credibility of those roles to gain followers, boost engagement, and sometimes make money. Advances in artificial intelligence are making the synthetic identities more convincing and harder to detect.

The ‘Emily Hart’ Account

One notable case is the “Emily Hart” account, which amassed a following through political and lifestyle content, eventually leading followers to paid adult subscriptions. A Wired report revealed the persona was produced by a 22-year-old medical student, known as “Sam,” who used AI to generate images and cater to a conservative audience. Sam used Google’s Gemini AI to refine the persona, targeting financially engaged and loyal groups like older men in the U.S.

An AI-generated influencer persona, “Emily Hart,” uses lifestyle imagery and political messaging to build a following and drive engagement on social media. (Photo credit: Screenshot via social media)

The Department of Defense declined to comment, directing inquiries to federal law enforcement and noting that impersonating military personnel is illegal. The FBI has not responded to requests for comment.

Legal Precedent

Legal experts note that the line between protected speech and punishable conduct turns on intent and profit. Falsely claiming military service online may be protected speech unless it involves fraud or financial gain. Eugene Volokh, a law professor, explained that fraudulent claims made for money or other things of value could carry legal repercussions, citing United States v. Alvarez.

The Alvarez ruling acknowledged that fabricating military service for profit can be punishable, potentially resulting in civil or criminal penalties, regardless of whether the persona is real or AI-generated.

Platforms Struggle to Keep Pace

Despite platform rules requiring disclosure of AI-generated content, enforcement is inconsistent. Accounts often go unlabeled or are removed only after gaining considerable traction, allowing them to build large audiences. Meta, Instagram's parent company, has disclosure policies but has not detailed how it enforces them.

An AI-generated image used by a social media account posing as a U.S. Army service member highlights how synthetic identities can mimic military imagery to attract followers online. (Photo credit: Screenshot via social media)

Watchdogs like Military Phony emphasize the difficulty of identifying these accounts, since AI-generated images can obscure the details that would normally reveal fraud. These personas often combine military imagery with targeted messaging to quickly establish authenticity and resonate with specific audiences. The appeal, then, often lies in a persona that reflects shared values or beliefs, not necessarily in its authenticity.