Conference Context
EEEU 2024
Trust in AI-generated influencers is conditional. Credibility and relevance can increase purchase influence only when disclosure is clear and persona consistency is maintained.
This study outlines trust drivers that can be prioritized for Gen Z virtual-influencer programs. The key decision is how to sequence credibility, relatability, and creative novelty without crossing ethical boundaries.
Online Publication: June 18, 2025
Source Type: Springer chapter
Figure Tier: Reported evidence
Verifiable citation: Springer chapter record (DOI: 10.1007/978-981-96-4116-1_9).
Source note: Derived from Cao et al. (2025), Journal of Theoretical and Applied Electronic Commerce Research, 20(2), 150.
Build transparent disclosure architecture first, reinforce credibility signals second, and amplify visual novelty only after trust stability is confirmed across touchpoints.
Prioritize credibility cues before aesthetic amplification, then use relatability as a scaling lever for behavioral intent. Conversion risk can rise when persona consistency weakens across posts, platforms, or sponsorship contexts.
Persona redesign should be a late intervention, triggered only when trust decay persists after disclosure clarity and relevance adjustments have already been applied.
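The sequencing above amounts to an escalation ladder: apply the cheapest trust repair first and reserve persona redesign for persistent decay. A minimal sketch of that decision rule, using hypothetical metric names (the source does not define concrete signals; `disclosure_clear`, `credibility_ok`, and `trust_decay_persists` are illustrative placeholders):

```python
from dataclasses import dataclass

@dataclass
class TrustSignals:
    """Hypothetical campaign-monitoring flags; names are illustrative."""
    disclosure_clear: bool      # synthetic-persona status is labeled on every post
    credibility_ok: bool        # sponsorship markers present, claims verifiable
    trust_decay_persists: bool  # trust metric still falling after prior fixes

def next_intervention(s: TrustSignals) -> str:
    """Escalation ladder: cheapest fix first, persona redesign last."""
    if not s.disclosure_clear:
        return "fix disclosure"          # step 1: transparent disclosure architecture
    if not s.credibility_ok:
        return "reinforce credibility"   # step 2: credibility signals
    if s.trust_decay_persists:
        return "persona redesign"        # late intervention, only after steps 1-2 held
    return "amplify novelty"             # trust is stable; safe to scale creative novelty

print(next_intervention(TrustSignals(True, True, False)))  # -> amplify novelty
```

The ordering encodes the paper's claim that persona redesign is costly and should trigger only when disclosure and credibility adjustments have already been applied without arresting trust decay.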
Operational scaling should proceed only with explicit safeguards that protect informed consent and reduce manipulation risk.
Clearly identify synthetic persona status to prevent hidden manipulation and maintain informed audience consent.
Avoid exploitative persuasion patterns with younger cohorts, who are highly sensitive to social conformity.
Use explicit sponsorship markers and verifiable claims to reduce trust erosion after campaign exposure.
Gen Z audiences should receive clear disclosure and manipulation-risk protections so AI-enabled influence does not create asymmetric information disadvantages.