Forums
What do you measure when AI bots inflate your devtool traffic?
My team was pretty happy in December.
Direct traffic jumped 150% for one of our cloud devtools.
Then we actually looked at the data.
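If anyone wants a starting point for separating this out, here is a minimal sketch, assuming a standard combined-format access log and a hand-maintained list of AI crawler user agents (adjust both to whatever actually shows up in your own logs):

    import re
    from collections import Counter

    # Hand-maintained list of AI crawler user-agent substrings (an assumption:
    # extend it with whatever bots actually appear in your logs).
    AI_BOT_PATTERNS = re.compile(
        r"GPTBot|ClaudeBot|PerplexityBot|Google-Extended|CCBot|Bytespider",
        re.IGNORECASE,
    )

    def split_traffic(log_lines):
        """Tally requests into AI-bot vs. other buckets based on user agent."""
        counts = Counter()
        for line in log_lines:
            # Assumes combined log format, where the user agent is the last quoted field.
            parts = line.rsplit('"', 2)
            ua = parts[-2] if len(parts) == 3 else ""
            counts["ai_bot" if AI_BOT_PATTERNS.search(ua) else "other"] += 1
        return counts

    with open("access.log") as f:  # hypothetical log path
        print(split_traffic(f))

Even a rough split like this changes which numbers you report, which is the real question behind the thread.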

These are my 3 favorite developer marketing examples (paid and organic)
Let's turn this thread into a place to share our favorite developer marketing examples. I'll start with a few that stood out to me.
Creator and influencer partnerships. Railway has grown reach through creator-led YouTube content that focuses on problems adjacent to what Railway solves. Tutorials like How to Setup Auto Deployment From GitHub walk through common deployment and CI workflows developers already search for, with Railway appearing naturally as part of the solution rather than the starting point. I liked this approach because it meets developers where they are already looking for answers, shows the product in context, and allows evaluation through existing workflows instead of a direct product pitch.
-
Billboard distribution. Snowflake ran high-visibility out-of-home campaigns around major tech hubs and conference moments, using short category messaging around the Data Cloud. The placements appeared on major routes such as US-101 in San Francisco and in high-traffic locations across New York, with copy rotated to stay relevant during events like Snowflake Summit. I liked this approach because it focused on category ownership rather than features, made the message easy to recognize at a glance, and created moments that people naturally photographed and shared online, extending distribution beyond the physical locations.
-
GitHub SEO and repository-led discovery. Prisma uses GitHub as a primary distribution channel by making its repositories the default destinations for high-intent, product-led searches. Repos like prisma/prisma and prisma/prisma-examples consistently appear for queries such as Prisma examples, Prisma migration, and Prisma schema, where developers are already evaluating how to implement the tool. Instead of sending traffic to marketing pages, these searches land developers directly on working code, example projects, and active issues, making GitHub the evaluation surface and allowing distribution to compound over time through stars, forks, and ongoing usage.

If you want to see more developer marketing examples like this, broken down by channel along with the thinking behind why they work, you can read the full post here.
Now, what are yours?
What 100 Developers Told Us About Technical Videos
We ran a survey to understand what developers look for in technical videos.

We shared a short questionnaire with developers and received 100 responses across backend, frontend, DevOps, and data roles. The goal was to understand video length preferences, discovery channels, pacing, and what makes a video useful.
AI search behaves very differently for developer content vs non-developer content
We have been digging into how AI search behaves across different types of content, and we noticed one interesting pattern we wanted to share with y'all.
AI search performs well when the guidance is interpretive. Advice that a human can adapt, contextualize, and apply flexibly tends to survive summarization and generation. Minor inaccuracies do not invalidate the outcome.
Developer workflows are different. Most developer queries require instructions that must execute correctly in a specific environment. Versions, configs, tooling choices, and project conventions matter. When AI search retrieves common patterns and smooths over missing context, the answer often looks correct but fails when applied.
This explains why AI search feels reliable for explanations and fragile for real-world implementation. The system optimizes for what is most repeated, not what is most precise.
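A concrete illustration of that gap, using SQLAlchemy as the example: the most-repeated execution pattern online is still the 1.x connectionless style, which was removed in 2.0, so an answer built from it looks plausible but fails on a current install. The 2.0-compatible version is just a few lines:

    from sqlalchemy import create_engine, text

    # engine.execute("SELECT 1") is the widely repeated 1.x pattern an AI answer
    # often surfaces; it was removed in SQLAlchemy 2.0 and raises AttributeError.
    engine = create_engine("sqlite:///:memory:")
    with engine.connect() as conn:
        result = conn.execute(text("SELECT 1"))  # 2.0 requires text() for raw SQL strings
        print(result.scalar())

The fix is trivial once you know the version, which is exactly the context that gets smoothed over.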
“Product Hunt is about consistency”
That's what @fmerian, one of the most active and successful hunters on Product Hunt, shared with us while discussing how developer tool launches work today.
Product Hunt works as a repeatable surface when teams launch early and continue returning with progress. An early launch creates visibility, feedback, and a baseline presence on the platform. Each subsequent launch builds on that foundation.
Early adopters anchor this process. An initial launch brings the first group of users into the product. As the product evolves, those users provide context during future launches by sharing how they use the tool and what has changed since the last release.
@Supabase followed this approach. Their first Product Hunt launch happened when the product was still in alpha. They kept shipping, gathering feedback, and launching again with meaningful updates. Over time, this built familiarity and momentum, leading to stronger outcomes in later launches.

