I've been covering social media for a long time, more than a decade in fact, and there's one thing I've learned about Adam Mosseri, the head of Instagram: he's a corporate drone with little personality or passion for anything in particular, and he'll push company policy no matter what, saying whatever's best for Instagram and Meta.
Mosseri has never shown any true creative passion or pursuit. Ask him what his favorite IG feature is, and he'll tell you it's the one he just released. Ask him who his favorite artist is, and he'll name whoever's most popular on IG at the moment (or his brother). Ask him what he's into, and he'll list off some random IG trends.
Sure, he started wearing cardigans and chains after becoming head of IG, but Mosseri was never a creative person, and he doesn't seem to understand what creative people really want or need. All he has to do is take a message from Zuck, justify the logic behind it, and broadcast it to the Instagram community. Which is why I'm always skeptical of posts where he shares his personal thoughts and opinions in the app.
Because, honestly, I don't think he has any, and through that prism it's easy to see the corporate messaging he's trying to dress up as a reasonably intelligent screed.
This week, Mosseri shared his thoughts on the year ahead for IG in a lengthy post on the future of content, with the key premise that “authenticity is becoming infinitely reproducible” in an AI-powered world.
Across the 20 text-only slides of his IG carousel post (could text posts become a new IG post option at some stage?), Mosseri explains that:
- AI tools now allow anyone to replicate creators' work
- AI content is also improving, and will soon be indistinguishable from human-made content
- People no longer share personal content to the main Instagram feed (they share via DM instead)
- Creators are leaning into less polished content to counter AI fakery and signal that their work is real
- But AI tools will soon replicate that aesthetic too, deepening skepticism about what's real and what's fake
- Instagram is working to highlight AI content through labeling, but the volume will soon become so overwhelming that IG won't be able to label everything
- As a countermeasure, Mosseri says Instagram will seek to verify authentic content and highlight original creators
- Instagram also plans to do more to surface information about who is behind each account
So what does this mean?
Now, based on the above overview of Mosseri's motivations, I'd say the driving force here is pretty clear. Rather than working to protect creators, Mosseri is trying to legitimize the influx of AI content and give users more choice in what they see in the app.
Meta has spent hundreds of billions of dollars developing AI tools, so it's only natural that it wants users to create with them, cementing its position as the AI market leader. More people creating with AI is a good thing for Meta, so Mosseri is basically waving the white flag, telling creators they have to get better at producing original content if they want to keep up with AI fakes.
This is despite the fact that more and more platforms are exploring anti-AI options as people start to get overwhelmed by AI slop. Users become less likely to share a post because they don't know whether it's real or fake, which erodes their trust. That runs counter to the original promise of social media: letting people share their own perspectives with the world. AI is eroding that promise, and Meta has actively accelerated it by encouraging users to create with AI tools at every turn, so Mosseri's veiled concerns about all of this are disingenuous, to say the least.
Mosseri knows that Meta actually wants more AI-generated junk, because it means more content flowing through the system and more opportunities to keep people engaged, which is why the company is actively building AI tools that better replicate the work of real people. So while Mosseri frets about how hard it is to highlight real creative work by real human creators, it's Meta itself that's handing people the tools to drown it out.
And there's no doubt that the solution here will also benefit Meta. Meta will no doubt encourage more creators to sign up for Meta Verified so their content can rank higher, since Meta then knows it comes from an actual human creator. And while Mosseri says that not all AI content can be flagged, Meta could significantly counter the flood by adding its own built-in digital tagging (and by partnering with other platforms to detect the various forms of AI tags), along with a simple flagging option for users to indicate when they believe a post is AI-generated (IG could then add AI content tags where the signals suggest it is).
There are ways to combat this, but Mosseri is trying to justify the notion that AI content will become so good that creators will need to adapt their approach.
But they don't need to.
Yes, some AI-generated content is actually good, but because it's so easy to create, anyone can spew out AI-generated garbage, and they're doing it at such scale that the vast majority of AI material is pure slop.
You know what kind of AI-generated content does succeed? Content with a good concept, a human-originated idea at its core. AI tools still can't come up with human ideas, and that remains a key differentiator. Nor can AI tools connect with an audience the way the top online creators do.
Human connection remains key, and while Mosseri may want to downplay it to justify the influx of AI content, it remains fundamental to every popular creator and every piece of creative work that resonates.
That doesn't change. AI tools may improve our ability to create derivative works, but they will always be derivative, and they only resonate on the strength of the ideas and concepts behind them. It's hard to come up with great ideas and concepts, and even harder to come up with them consistently. Very few people in the world have a personality that comes across on screen, resonates with a wide audience, and can build a viable, valuable online community, and fewer still have the work ethic and commitment to make it happen.
This is why, despite the promise of a “creator economy,” only a small percentage of online creators actually make money from their work. This isn't a realistic “career” for 99% of people, but the platforms want you to believe that if you keep creating content consistently, feeding more material into their systems to show users and keep them engaged, you too can become the next online millionaire.
It's chasing the dragon, and that chase is what keeps their profits high. So the natural answer becomes, "Creators need to come up with better angles to stay ahead of the game."
However, no matter how good AI content gets, it's meaningless if the concept is crap. The ideas that resonate are human ones, about people, and the ability to come up with them is a skill in itself.
The real value is there, always has been, and that won't change even as we have more tools to create more crap.
