I'm working on toran, a live API inspection tool that works with just a URL swap. No SDK, no proxy config, no cert setup.
The problem: I couldn't see what my code was actually sending to third-party APIs. Debugging meant console.logs everywhere or messing with Charles/Proxyman certs.
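To picture what "just a URL swap" means in practice, here's a minimal sketch; the proxy hostname and URL format below are made up for illustration, not toran's real scheme:

```ts
// Hypothetical sketch: route traffic through an inspecting proxy by swapping
// the base URL. "inspect.toran.example" is illustrative, not the real format.
const INSPECTING = process.env.NODE_ENV !== "production";

const BASE_URL = INSPECTING
  ? "https://inspect.toran.example/api.stripe.com" // proxied: requests get recorded
  : "https://api.stripe.com";                      // direct: normal traffic

const res = await fetch(`${BASE_URL}/v1/charges`, {
  headers: { Authorization: `Bearer ${process.env.STRIPE_KEY}` },
});
```

Everything else about the request stays the same, which is the whole point: no SDK, no proxy config, no cert setup.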
I'm curious how people here think about a calmer, signal-first feed in practice.
If you've tried Trace already: What felt immediately useful? What felt missing or confusing? What would make it something you'd actually open every day?
If you haven't tried it: What would you need to see before giving a feed like this a real shot? What would make you bounce?
I'm early and still shaping this, so honest feedback (good or bad) is genuinely helpful.
Step 150 of debugging why a payment doesn't get saved to the database. Two days on this one bug, and there are plenty more. If you can build something that will do the back-and-forth, the "now try this and tell me if it... no? Okay, let's do this then, and this, and that..." Do what Claude Opus 4.5 is telling me to do, the tens of hours, to get to the solution. Automate that and you have a winner. Because yes, there are 100K full-stack devs who will do all this more efficiently themselves, but there are 10M non-developers who love what they built and are getting killed in the debugging, the last 5%.
Hey everyone! We're launching VibrantSnap updates today and celebrating the New Year with an exclusive 20% launch discount.
Before launch, we'd love your thoughts on a feature we're considering next: AI voice-over.
The idea: Instead of recording a voice-over separately, you'd speak naturally while recording your screen and VibrantSnap would automatically reformulate and generate a clean, polished AI voice-over from your original speech.
Proposed flow (rough sketch below):
1. You record as usual and talk naturally
2. AI cleans up phrasing + tone
3. The final video gets a clear, professional voice-over
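In code terms, the pipeline would look roughly like this; `cleanTranscript` and `synthesizeSpeech` are placeholders for whatever models would actually do the work, not VibrantSnap internals:

```ts
// Hypothetical sketch of the proposed flow: raw speech -> cleaned script -> voice-over.
type VoiceOverResult = { script: string; audio: Uint8Array };

async function cleanTranscript(raw: string): Promise<string> {
  // Placeholder: a real implementation would call an LLM to fix phrasing and tone.
  return raw.replace(/\b(um|uh|like,)\s*/gi, "").trim();
}

async function synthesizeSpeech(script: string): Promise<Uint8Array> {
  // Placeholder: a real implementation would call a TTS model.
  return new TextEncoder().encode(script); // stand-in bytes
}

async function makeVoiceOver(rawSpeech: string): Promise<VoiceOverResult> {
  const script = await cleanTranscript(rawSpeech); // step 2: AI cleans phrasing + tone
  const audio = await synthesizeSpeech(script);    // step 3: generate polished voice-over
  return { script, audio };
}
```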
We're currently building a new capability in SuperIntern: turning real meeting conversations into MCP-powered automation.
The idea is simple: SuperIntern listens to the meeting, understands what people say, and then uses Model Context Protocol (MCP) to orchestrate other tools and agents.
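For the curious, here's a minimal sketch of the MCP side using the official TypeScript SDK (@modelcontextprotocol/sdk); the tool name, its arguments, and the server command are made up, and this isn't SuperIntern's actual code:

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Connect to an MCP server that exposes, say, a ticketing tool.
const transport = new StdioClientTransport({
  command: "node",
  args: ["ticket-server.js"], // hypothetical server
});
const client = new Client({ name: "superintern-sketch", version: "0.1.0" });
await client.connect(transport);

// Once the meeting transcript yields an action item, call the matching tool.
// "create_ticket" and its arguments are hypothetical.
const result = await client.callTool({
  name: "create_ticket",
  arguments: { title: "Follow up with vendor", assignee: "sam" },
});
console.log(result.content);
```

The orchestration layer's job is mapping "what people said" onto calls like that one, across whatever servers and agents are connected.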
I've been building a simple tool for makers who want quick, frictionless user feedback directly from their site.
It s lightweight, fast, and drops easily into any stack.
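To make "drops easily into any stack" concrete, here's roughly the shape of the integration; the endpoint and payload below are placeholders, not the tool's real API:

```ts
// Hypothetical sketch: the entire client-side surface of a drop-in feedback button.
async function sendFeedback(message: string, rating: number): Promise<void> {
  await fetch("https://feedback.example.com/api/v1/feedback", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      message,
      rating,
      page: location.pathname, // where on the site the feedback came from
    }),
  });
}
```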
I'd love to know: What's your biggest struggle with collecting feedback? What channels work best for you? Does a 10-second feedback button sound useful? What features matter most to you as a founder/dev?
Sharing to learn, not just to promote. Honest feedback would really help me shape the roadmap.
Hey everyone, sharing a small but meaningful milestone.
BeamUp finally got its first paid user, 5 months after launch. What made this really special is that the user came in organically, started using BeamUp with Google Drive, and upgraded on their own, without me reaching out or changing any messaging beforehand.
BeamUp is a no-code upload portal that lets people receive large files directly into their cloud storage, no servers, no backend, no retention.
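For anyone wondering how "no servers" can work at all, here's one standard pattern, sketched with Google Drive's documented resumable upload flow; this is illustrative, not necessarily how BeamUp does it:

```ts
// Sketch: the browser streams file bytes straight to Google Drive via a
// short-lived resumable session, so no intermediate server ever holds the file.
async function uploadToDrive(file: File, accessToken: string): Promise<void> {
  // Step 1: start a resumable session; Google returns the session URI in Location.
  const init = await fetch(
    "https://www.googleapis.com/upload/drive/v3/files?uploadType=resumable",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ name: file.name }),
    }
  );
  const sessionUri = init.headers.get("Location")!;

  // Step 2: PUT the file bytes directly to the session URI.
  await fetch(sessionUri, { method: "PUT", body: file });
}
```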
Here's what surprised me: Even though someone understood BeamUp well enough to upgrade, I realized many visitors weren't actually understanding the core value from the landing page. The concept is simple once it clicks, but unfamiliar at first glance.
Yesterday I launched something weirdly simple but surprisingly powerful: a resume that never dies.
You download the PDF once and it keeps updating itself forever. Projects, skills, links, everything stays alive.
The launch went way better than I expected (we even hit #6 on the charts), and I'm insanely grateful to everyone who checked it out, messaged, upvoted, or just got curious for a second.
Haimeta now supports Google's latest Nano Banana 2 Pro. Jump in and try it free today.
We believe the future of creativity is atomic: ideas broken into tiny units that can be endlessly remixed and reimagined. With Haimeta + Nano Banana 2 Pro, creation becomes real-time remixing, fast iteration, and playful experimentation.
Here's something uncomfortable I've learned building AI agent systems:
AI rarely fails at the step we're watching.
It fails somewhere quieter: a retry that hides a timeout, a queue that grows hour by hour, a memory leak that only matters at scale, a slow drift that looks like variation until it's too late.
Most teams measure accuracy. Some measure latency.
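As a concrete illustration of the "retry that hides a timeout" failure mode, here's a sketch of a retry wrapper that at least counts what it swallows, so the hidden failures show up in metrics even when the overall call succeeds; all names are illustrative:

```ts
// Sketch: a retry wrapper that records swallowed errors instead of hiding them.
const retryStats = { attempts: 0, swallowedErrors: 0 };

async function withRetry<T>(fn: () => Promise<T>, maxAttempts = 3): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    retryStats.attempts++;
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      retryStats.swallowedErrors++; // the failure the dashboard never sees
      await new Promise((r) => setTimeout(r, 2 ** attempt * 100)); // backoff
    }
  }
  throw lastError;
}
```

If `swallowedErrors` climbs while success rate stays flat, something upstream is quietly degrading, which is exactly the kind of signal accuracy and latency dashboards miss.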
We all know data can be super powerful, but sometimes a story from a customer or in the news will catch me and just remind me how very true that really is. In my day-to-day work I love the little stories: an hour saved here and there, an insight that wouldn't have been possible without tools and techniques at your fingertips. A friend recently used Querri to quickly diagnose why conversion rates for his product had dropped. Long story short, everything was fine: they'd had a big uptick in visitors from a lower-converting segment that they were pushing to get more of. Everything was right with the world and he was able to move on with his day knowing their strategies were working. What's a time data changed your trajectory, in a big way or a small but meaningful way?