Nearly a fifth of software scanned during the past year has a serious security flaw, according to a new report from application security company Veracode, released this morning.
The study draws on scans of 759,000 applications that Veracode customers conducted with the company's platform during the past 12 months. Overall, 74% of the scanned applications had at least one flaw, and 19% had an issue deemed "high or critical severity."
The report defined a flaw as "an implementation defect that can lead to a vulnerability."
"When you download an application onto your computer, almost 20% of the time you're getting a high-risk [flaw] that at some point in the future will eventually be discovered and a patch will probably be issued," Veracode CTO and founder Chris Wysopal told Government Technology.
The high proportion of software bearing flaws is not unusual – such figures align with Veracode's findings from previous years, Wysopal said.
Even so, the figures may be an undercount, said the company's chief research officer, Chris Eng. The report only captures data about software that Veracode customers deemed important enough to scan. Any issues in lower-priority applications would not be reflected.
"There are all the applications that are not business critical enough – or they don't have scanning in their budget, or for whatever reasons there's not visibility at this level – they're not being scanned at all, perhaps," Eng said. "They're generating even more security attack vectors than these ones are."
Software life cycle: The 4-year mark
Fixing flaws isn't a one-and-done affair: applications need active maintenance throughout their life cycles, but they aren't always getting it, the report found.
Organizations appear to more readily address vulnerabilities in new software, according to the report. Roughly 30% of applications showed flaws when first scanned. But ensuing scans made "shortly after" turned up issues in only 22%, suggesting organizations acted to fix the problems. Similarly, a Veracode press release states that "nearly 80% [of applications] do not take on any new flaws at all for the first 1.5 years."
That diligence appears to drop off, however. Scans showed new flaws in 30% of four-year-old applications and 35% of four-and-a-half-year-old ones, per the report.
"We only know about this because they are scanning, and we are finding the risk – like, we are finding flaws for them," Wysopal said. "It's just they're... doing less about it than they were earlier in the life cycle."
The study doesn't uncover why organizations are taking less action to repair older software, Wysopal said, but possible reasons might include changes in "the makeup of the team or how the company values the app." Changeover among the developers on a project could also be a factor, Eng suggested: incoming developers may make mistakes as they try to understand and work with code written by someone else.
Either way, the result is that older applications may present greater security issues, Wysopal said. It's typically impractical for organizations to try to retire software once it hits several years of age, so they need other strategies for maintaining security throughout the application's life cycle.
Organizations can reduce risks by conducting frequent, automated scans to help catch issues promptly. That makes it more likely flaws are detected while the project team still includes people familiar with the code – or even the engineer who introduced the mistake – who can make adjustments without having to refamiliarize themselves, Wysopal said.
"If engineers step away from code for a few months, they have to relearn it," Wysopal said.
Secure development training
Training in secure code development can also help head off issues by reducing the chance that flaws are introduced in the first place.
Such training isn't always included in formal computer science education: Forrester's review of "the top 50 undergraduate computer science programs as ranked by U.S. News for 2022" found that "none require[d] a course in code or application security for majors."
But more attention has turned to the issue. In the aftermath of Log4j, the federal Cyber Safety Review Board issued recommendations for improving the cybersecurity landscape; these included making secure software development part of computer science education.
Formal degree programs aren't the only opportunity for intervention, however, and organizations may provide hires with on-the-job training in security principles. Eng believes interactive trainings, in which employees test out ideas in hands-on lab environments, are more effective than passive training, in which employees read informational slides.
The report found measurable impact from the company's interactive trainings once developer teams had completed 10 of them: "When you had developers using this interactive-style training, you had... like a 2% reduction in the probability of flaws being introduced, but then like a 12% reduction in the number of flaws introduced," Eng said.
Open source support
The Log4j incident also drew heightened attention to the security of open source code. Veracode research probed how some of these code bases are maintained, raising questions for future study.
The report looked at open source libraries on GitHub, examining the roughly 30,000 it saw actively being used in customers' scanned software, Eng said.
Overall, about half of the code repositories listed contributions from fewer than 10 developers, and about half had seen no activity in the past year. Some outliers emerged, however. About one in 10 repositories had more than 72 developers, while some languages leaned heavily on libraries maintained by just one or two people. For example, nearly 92% of JavaScript libraries came from a single developer who'd been inactive over the past year, per the report. (At the same time, the report found fewer flaws in JavaScript applications, compared to applications in .NET or Java.)
The finding that a code library has only a few contributors is "not saying necessarily anything about them being more vulnerable. But it certainly does mean that it's a little bit more fragile," Eng said. "What happens if a vulnerability in that library is discovered – are they going to be slower to patch? Are they going to be faster? What if one of those developers gets run over by a bus? What does that mean for the maintenance of the project?" – Government Technology/Tribune News Service