AI-generated code could increase developer workload and add to risk

Artificial intelligence is supposed to make things easier, right? Not for developers, it seems, as AI-generated code is set to triple developer workloads within the next 12 months, according to software delivery platform Harness.

This could also mean that organizations are exposed to a bigger 'blast radius' from software flaws that escape to production systems.

Nine in ten developers are already using AI-assisted coding tools to accelerate software delivery. As adoption continues, the volume of code shipped to the business is increasing by an order of magnitude. It's therefore becoming difficult for developers to keep up with the need to test, secure, and remediate issues in every line of code they deliver. If they don't find a way to reduce developer toil in these stages of the software delivery lifecycle (SDLC), it will soon become impossible to prevent flaws and vulnerabilities from reaching production. As a result, organizations will face an increased risk of downtime and security breaches.

"Generative AI has been a gamechanger for developers, as eight-week projects can suddenly be completed in four," says Martin Reynolds, field CTO at Harness. "However, as the volume of code being shipped to the business increases, so does the 'blast radius' if developers fail to rigorously test it for flaws and vulnerabilities. AI might not introduce new security gaps to the delivery pipeline, but it does mean there’s more code being funneled through existing ones. That creates a much higher chance of vulnerabilities or bugs being introduced unless developers spend significantly more time on testing and security. When the Log4J vulnerability was discovered, developers spent months finding affected components to remediate the threat. In the world of generative AI, they’d have to find the same needle in a much larger haystack."

Harness suggests that the answer to the problem of AI is more AI, used to automatically analyze code changes, test for flaws and vulnerabilities, identify the risk impact, and ensure deployment issues can be rolled back in an instant.

Companies should be looking to integrate security into every phase of the SDLC; automate the processes needed to monitor and control open-source software components and third-party artifacts, such as generating a Software Bill of Materials (SBOM) and conducting SLSA attestation; and use generative AI to help remediate issues.
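To make that kind of automation concrete, here is a minimal sketch of one such pipeline step: a script that asks Syft (an open-source SBOM generator) for a CycloneDX SBOM of a built container image and fails the build if any component violates a simple policy. The image name, deny-list, and policy check are illustrative placeholders for this article, not a Harness-prescribed workflow, and the sketch assumes the Syft CLI is installed on the CI runner.

```python
#!/usr/bin/env python3
"""Minimal sketch of an SBOM gate for a CI pipeline.

Assumptions (not from the article): the Syft CLI is on PATH, the build
produces a container image tagged IMAGE, and DENY_LIST stands in for
whatever policy a real organization would actually enforce.
"""
import json
import subprocess
import sys

IMAGE = "registry.example.com/my-app:latest"   # hypothetical image name
DENY_LIST = {"log4j-core"}                      # illustrative policy only


def generate_sbom(image: str) -> dict:
    """Ask Syft for a CycloneDX JSON SBOM of the image and parse it."""
    result = subprocess.run(
        ["syft", image, "-o", "cyclonedx-json"],
        check=True, capture_output=True, text=True,
    )
    return json.loads(result.stdout)


def check_components(sbom: dict) -> list[str]:
    """Return components that violate the (illustrative) deny-list policy."""
    violations = []
    for component in sbom.get("components", []):
        if component.get("name") in DENY_LIST:
            violations.append(
                f"{component['name']}@{component.get('version', 'unknown')}"
            )
    return violations


if __name__ == "__main__":
    sbom = generate_sbom(IMAGE)
    bad = check_components(sbom)
    if bad:
        print("SBOM policy violations:", ", ".join(bad), file=sys.stderr)
        sys.exit(1)   # fail the pipeline so the flaw never reaches production
    print(f"SBOM clean: {len(sbom.get('components', []))} components checked")
```

Run as a step after the image build, the non-zero exit code blocks deployment automatically, which is the kind of guardrail the article describes rather than relying on developers to review every generated line by hand.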

"The whole point of AI is to make things easier, but without the right quality assurance and security measures, developers could lose all the time they have saved," adds Reynolds. "Enterprises must consider the developer experience in every measure or new technology they implement to accelerate innovation. By putting robust guardrails in place and using AI to enforce them, developers can more freely leverage automation to supercharge software delivery. At the same time, teams will spend less time on remediation and other workloads that increase toil. Ultimately, this reduces operational overheads while increasing security and compliance, creating a win-win scenario."

