Comment by ruuda 2 months ago

There is a growing culture of microdependencies, where one project can depend on hundreds or thousands of libraries. Combined with automated "vulnerability" tracking, this means projects constantly receive notifications about issues in libraries deep in the dependency tree, most of the time in a part of the library that the top-level application does not even use. It's no surprise that "security" is eating up more and more time.

pornel 2 months ago

And every CVSS score is 9.8, because it's designed to never underestimate potential risk, no matter how absurdly unlikely, rather than be realistic about the actual risk.

  • clwg 2 months ago

    CVSS is not really meant to measure risk; it primarily measures the severity of technical vulnerabilities. It should be used in conjunction with other factors, such as system exposure and threat sources, to determine the probability of exploitation. This should then be combined with impact and costing data to fully assess the risk.

    Regulatory requirements also need to be contextualized similarly. If they become burdensome, efforts should focus on reducing the exposure of your systems to those risks.

    That said, patch and configuration management should be second nature and performed continuously, so that when a real issue arises you're prepared, not worried about whether your environment will fall over in response to an update or whether your backups will restore properly (those are risks as well).

    I saw more than a few organizations struggle with log4j because they only patched server systems once a vulnerability was publicly exposed and a Metasploit exploit was available.
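
    The severity-versus-risk distinction described above can be sketched in code. This is an illustrative toy model, not an official CVSS or risk formula; the weights and factor names (exposure, threat activity, impact) are assumptions chosen to show why the same "critical" CVE can carry very different risk in different environments:

    ```python
    # Toy risk model: CVSS base score captures technical severity only.
    # Actual risk also depends on likelihood of exploitation (exposure,
    # threat activity) and on business impact. All factor names and
    # weightings here are illustrative assumptions, not a standard.

    def likelihood(exposure: float, threat_activity: float) -> float:
        """Rough probability-like estimate of exploitation; inputs in [0, 1]."""
        return exposure * threat_activity

    def risk_score(cvss_base: float, exposure: float,
                   threat_activity: float, impact: float) -> float:
        """Combine severity (CVSS 0-10), likelihood, and impact (0-1)."""
        return (cvss_base / 10.0) * likelihood(exposure, threat_activity) * impact

    # The same "9.8 critical" CVE in two contexts:
    # an unused code path on an internal, low-value host...
    internal = risk_score(9.8, exposure=0.1, threat_activity=0.2, impact=0.3)
    # ...versus an internet-facing system handling sensitive data.
    public = risk_score(9.8, exposure=0.9, threat_activity=0.8, impact=0.9)
    assert internal < public
    ```

    The point of the sketch is only that likelihood and impact multiply the severity: a 9.8 in an unreachable dependency can still rank near the bottom of a realistic triage queue.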