Generative AI for Content Marketing: The Reality After Six Months
Earlier this year, we ran an experiment. Could generative AI meaningfully improve our content marketing efficiency? After six months of testing across multiple clients and dozens of content pieces, I have answers.
Some of them surprised me.
The Test Setup
We tested AI-generated content across several categories:
- Blog posts (like this one – wait, this one is human-written, I promise)
- Social media content
- Email marketing copy
- Product descriptions
- Case study first drafts
- Sales collateral
For each category, we compared:
- Time to produce content
- Quality ratings from editors
- Engagement metrics (where measurable)
- Client satisfaction
We used multiple AI tools including ChatGPT, Claude, and Jasper.
Where AI Worked Well
Social Media First Drafts
AI excelled at generating quick variations of social media posts. Give it a topic, some key points, and a tone guide, and it produces 10 options in seconds.
The catch: about 60% needed significant editing. But even so, starting with 10 flawed options is faster than starting from zero.
Time savings: roughly 40% for social content production.
Product Descriptions
For straightforward product descriptions – the kind where you have specs and need readable prose – AI was genuinely good.
Feed it: product features, target audience, brand voice guidelines.
Get back: serviceable descriptions that needed light editing.
We produced 200+ product descriptions in a fraction of the usual time. Quality was consistent enough for e-commerce use.
Time savings: roughly 60-70%.
Email Subject Lines and Variations
AI is excellent at generating A/B test variants. “Give me 20 variations of this subject line” produces useful options quickly.
The best lines still came from humans, but AI significantly expanded the testing pool.
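If you want to fold this into a pipeline, the mechanical part — asking for a numbered list of variants and de-duplicating what comes back — is only a few lines. A minimal sketch; the prompt wording, helper names, and parsing are illustrative assumptions, and the actual model call is left out:

```python
import re

def build_prompt(subject: str, n: int = 20) -> str:
    """Ask the model for n numbered variations of a subject line."""
    return (
        f"Give me {n} variations of this email subject line, "
        f"one per line, numbered:\n{subject}"
    )

def parse_variants(response_text: str) -> list[str]:
    """Strip the model's numbering, drop blanks and duplicates."""
    seen, variants = set(), []
    for line in response_text.splitlines():
        # Remove a leading "1." or "1)" marker, if present.
        text = re.sub(r"^\s*\d+[.)]\s*", "", line).strip()
        if text and text.lower() not in seen:
            seen.add(text.lower())
            variants.append(text)
    return variants

# Feeding a mock model response through the parser:
mock = "1. Save 20% today\n2. Your 20% offer ends tonight\n3. Save 20% today"
print(parse_variants(mock))
```

The de-duplication step matters in practice: models often return near-identical variants, and the useful testing pool is smaller than the raw count suggests.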
Research and Outline Generation
Asking AI to research a topic and propose an outline proved valuable as a starting point. It surfaced angles we might not have considered and identified common questions readers have.
This doesn’t replace proper research, but it accelerates the early stages.
Where AI Disappointed
Long-Form Blog Content
Here’s the uncomfortable truth: AI-generated blog posts were identifiable and unsatisfying.
They hit all the structural expectations – introduction, sections, conclusion. They included relevant information. They avoided obvious errors. But they lacked:
- Genuine insight or original perspective
- Specific examples from real experience
- Distinctive voice or personality
- The willingness to take positions or be controversial
Every AI-written blog felt… corporate. Safe. Forgettable. The kind of content that exists but doesn’t matter.
We tested AI blogs against human-written blogs for engagement. Human content outperformed by 40-60% on time-on-page and performed significantly better on comments and shares.
Case Studies and Thought Leadership
For content that needs to demonstrate expertise, AI fell flat. Case studies need specific details that AI doesn’t have. Thought leadership needs genuine thought.
AI can structure a case study template and suggest what to include. It cannot write a compelling narrative about a specific client engagement.
Content Requiring Brand Voice
Despite extensive prompting and examples, AI consistently struggled to maintain distinctive brand voices. It could approximate a tone, but the result was a generic version of that tone, not the specific brand identity.
For brands whose voice is their differentiator, AI-generated content undermined rather than supported the brand.
The Quality-Speed Trade-off
Here’s the framework we developed:
| Content Type | AI Contribution | Quality vs Human |
|---|---|---|
| Social media variations | High | 75% |
| Product descriptions | High | 85% |
| Email subject lines | Medium-High | 80% |
| Email body copy | Medium | 70% |
| Blog first drafts | Low-Medium | 60% |
| Thought leadership | Low | 40% |
| Case studies | Low | 50% |
The pattern: AI adds the most value for high-volume, lower-stakes content where consistency matters more than distinctiveness.
The Workflow That Worked
Rather than replacing content creation, we found AI works best integrated into a human workflow:
1. Human defines topic, angle, key messages, and brand requirements
2. AI generates research summary, outline, and first draft
3. Human substantially rewrites for voice, insight, and specific examples
4. AI assists with variations, headlines, social snippets
5. Human reviews and finalises everything
In this workflow, AI accelerates but doesn’t replace human judgement. The time savings are real but more modest than the “10x productivity” claims suggest.
Realistic time savings with this workflow: 20-30% for content production overall.
The Unexpected Problems
Homogenisation
As more companies use AI for content, everything starts sounding the same. The same phrases appear everywhere. The same structures. The same safe takes on topics.
For SEO-focused content, this is particularly concerning. Search engines are getting better at identifying AI-generated content, and distinctiveness matters for standing out.
Fact Checking Overhead
AI confidently generates plausible-sounding nonsense. Statistics are invented. Sources are fabricated. Claims are made that don’t hold up.
The time saved generating content is partially eaten by the need to verify everything. For B2B content where accuracy matters, this overhead is significant.
Quality Drift
There’s a temptation to lower standards when content is “AI-assisted.” The thinking: it’s good enough, and we can produce more volume.
This is a trap. More mediocre content doesn’t help. It might actually hurt as audiences tune out generic material.
Our Conclusions
After six months, our position:
Use AI for: research acceleration, outline generation, product descriptions, social variations, and as a brainstorming partner.
Don’t use AI for: thought leadership, case studies, or any content where distinctiveness is the point.
Never skip: human review, fact-checking, and substantial editing of anything AI produces.
Accept that: AI content tools are useful but not transformational. They’re productivity aids, not replacements for good writers and clear thinking.
Final Thought
The AI content hype suggested we were months away from automated content factories. The reality is more nuanced. AI is a useful tool that makes some tasks faster. It’s not a strategy and it’s not magic.
The companies that will win at content aren’t the ones producing the most AI-generated material. They’re the ones using AI to free up human capacity for the work that still requires humans – the insight, the perspective, and the genuine connection with audiences.
That’s harder to automate than the vendors want to admit.