Why security tools are failing developers and what needs to change [Q&A]

The needs of development and security teams often seem to be in conflict. Developers want speed of delivery while security wants stability and safety.
These competing interests can be made worse by the security tools that developers are expected to use. We spoke to Mackenzie Jackson, developer advocate at Aikido Security, to discuss why security tools are failing developers and what can be done about it.
BN: Is there a fundamental disconnect between development and security teams' understanding of what the other does?
MJ: Yes, and it's costing companies more than they realize. Developers are under pressure to ship features yesterday, while security teams are tasked with preventing the next breach.
This creates a fundamental gap in priorities. For example, security teams want to be alerted to all threats, whereas developers loathe any alert that isn't 100 percent confirmed as a threat. The result is that developers view the security team as slowing them down, while security teams view developers as reckless cowboys who don't care about security.
On top of this, there is a massive communications gap between those two teams.
The security industry has created this impenetrable language that even seasoned pros struggle with. I was recently at a conference where a book of 10,000 security acronyms was being shown around -- I literally don't know what 90 percent of them mean, and I work in security! Yet somehow we expect developers to translate this alphabet soup into their daily work.
We keep telling everyone that 'security is everyone's responsibility' while simultaneously making it completely inaccessible to anyone outside the security bubble. It's like handing someone a book in a foreign language and then getting frustrated when they can't read it.
BN: What is it about most security tools that means they don't work for developers?
MJ: Most security tools are built by security people who've never had to ship a product on a deadline. They create these tools thinking about security problems, not developer workflows.
The typical scenario plays out like this: A developer integrates a security scanner that immediately floods them with hundreds of alerts. They have no clue which ones actually matter, which are false positives, or how to fix them efficiently. So what happens? The tool becomes 'shelfware' -- just another tab they never open. When a tool slows down developers, they become remarkably creative in finding ways not to integrate it into their workflows.
The data shows that up to 85 percent of security alerts are noise. When you're a developer trying to ship features, and your security tool is constantly crying wolf, you eventually just tune it out. And that's precisely when the one real vulnerability slips through.
I’m not saying that developers don't care about security -- they absolutely do. But when faced with a choice between meeting a deadline or wading through an ocean of possibly-meaningless security alerts, priorities become pretty clear.
BN: Is the answer different tools or a different approach to using them?
MJ: It's both, but the approach matters more than the tool itself. We need to stop creating this false choice between speed and security. No developer wants to ship vulnerable code, but our current approach forces them to choose.
What we need is security that integrates seamlessly into developer workflows. Security should feel like a helpful co-pilot, not a gatekeeper slamming on the brakes. That means tools need to drastically reduce false positives, provide explanations in human language (not security-speak), and offer actionable fixes.
The best tools focus on protection without disruption. For instance, runtime protection tools that work inside applications can automatically block threats without developers having to do anything. But they need to be designed for minimal false positives and maximum context awareness; otherwise, they're just another source of noise.
The reality is that security tools need to earn their place in the development pipeline by providing real value without slowing things down. If your security tool is just creating busywork and frustration, it's actually making your security worse, not better, because developers will find ways around it.
There's also this whole concept of ‘shift left’ in security that's become a bit of a running joke among developers. Yes, catching issues earlier is great, but just throwing more security tools at developers without fixing the fundamental usability problems isn't helping anyone.
BN: Does AI have a role to play here and if so what?
MJ: AI has a role, but it's not the magic wand that vendor marketing would have you believe. I wrote a piece called 'AI is failing in Cyber' because there's a lot of hype with very little substance behind it.
Where AI actually shines is in reducing the tedious parts of security work. Think about using it to analyze code changes at scale, prioritize vulnerabilities based on actual risk, or generate fixes that developers can quickly review and apply. These are real problems that AI can help solve.
Our research found that 67 percent of software vulnerabilities are silently patched and never disclosed -- that's a massive blind spot for security teams. AI can help identify these gaps by analyzing code changes across millions of repositories in a way humans simply can't.
The most promising application is using AI to bridge the security-developer divide -- translating security concepts into developer language and automating the remediation process. Developers want to build, not spend hours fixing security issues or deciphering cryptic vulnerability reports.
But AI will amplify whatever you feed it. If your security approach is already noisy and full of false positives, AI will just generate more noise faster. The companies getting this right are focusing AI on specific, high-value problems rather than trying to use it for everything.
Also, AI isn't replacing the need for security expertise or developer judgment. The goal should be freeing up humans to focus on the creative, complex parts of security and development that machines still can't handle.
Image credit: .shock/depositphotos.com