About the Project Vulnerability Report

 

The Project Vulnerability Report is a two-part report that shows two scores we invented at the Black Duck Open Hub.  We created the Project Vulnerability Report (PVR) to present quickly understandable information about the security vulnerabilities reported against a project, much as the Project Activity Indicator summarizes project activity.  Like the Project Activity Indicator, the PVR is an opinionated assessment of factual data.

Security Landscape

A project’s security landscape is largely built from what is known about vulnerabilities within the project.  The National Vulnerability Database, MITRE’s Common Vulnerabilities and Exposures (CVE) database, and VulnDB are examples of sources of project vulnerability data.  By reviewing the data in these databases, one can get a feel for how secure a project is from the number and severity of the vulnerabilities reported against it.  However, constantly fetching that information and laying it out so that the data are meaningful and informative is a non-trivial amount of work.  So, naturally, we want to do it for you!

The Open Hub’s Project Vulnerability Report is a quickly digestible graphic to help you get a handle on the security landscape of an Open Source Software project.

An Opinionated Approach

We took a stance in creating the PVR.  The following opinions are inherent in it:

  • More recently reported vulnerabilities are more important than those that were reported quite a while ago.
  • Vulnerabilities reported against a more recent major version are more important than those reported against previous major versions.
  • Vulnerabilities reported against more recent releases are more important than those reported against older releases.
  • The frequency with which vulnerabilities are reported is also important; essentially, “no reported vulnerabilities is good”.
  • The raw score value is important.

To implement these opinions, we sum the reported vulnerabilities against each major and minor version and weight those scores with an inverse logarithmic scale.  In this manner, for example, a 10-point vulnerability reported last month will have a higher weighted score than if it had been reported six months ago.  We do some math and, voila, two scores!

Security Confidence Index

The Security Confidence Index is a top-down proportional value showing the sum of the potential weighted values minus the ratio of the weighted vulnerability scores. A higher Security Confidence Index is preferable and values increase to the right. Projects with fewer recently reported vulnerabilities against the most recent major versions and releases will have higher scores. Hover over the score indicator to see the calculated value.

Combine this information with the Project Activity Indicator to paint a quick picture of the level of activity and the security stability of the project. For example, a high Security Confidence Index on an Inactive project tells us one thing, such as “no vulnerabilities recently reported because there is nothing new”, while a high Security Confidence Index on a project with Very High Activity tells us something entirely different.

Vulnerability Exposure Index

The Vulnerability Exposure Index is the weighted score of all the vulnerabilities.  We start with the vulnerability scores for each vulnerability reported against the project within the last five years and then apply the weighting system for each of the variables: age of report, major version, and minor version.  The result is the weighted score of vulnerabilities.  Then, we inform you of the weighted score’s rank amongst all the projects for which we have a Vulnerability Exposure Index.

Unlike the Security Confidence Index, a lower Vulnerability Exposure Index is preferable, since the index grows as more and higher-scored vulnerabilities are reported. Hover over the score indicator to see the full score as well as the rank.

Algorithm Details

If you would like more details about exactly how the Security Confidence Index and Vulnerability Exposure Index are calculated, they are below – but be warned: it can get complicated!

Firstly, not all versions are considered. For the purposes of this algorithm, we take the following approach (a small sketch of these rules follows the list):

  • Versions that begin with non-numeric characters will not be considered.
  • Version identifiers containing any of the following words will be treated as non-production releases and will not be considered:
    • Alpha, beta, pre, rc, candidate, build
    • Word matches will be case insensitive.
  • Following Semantic Versioning, the major version will be the first numeric identifier before the first decimal point.
  • The minor version will be the first numeric identifier after the first decimal point.
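
A minimal sketch of these rules in Python, assuming simple dot-separated version strings; the helper names (is_considered, parse_major_minor) are ours for illustration and not part of the Open Hub code:

  import re

  # Keywords that mark a non-production release (matched case-insensitively).
  NON_PRODUCTION_KEYWORDS = ("alpha", "beta", "pre", "rc", "candidate", "build")

  def is_considered(version):
      """Return True if a version string would be considered by the algorithm."""
      # Versions that begin with non-numeric characters are not considered.
      if not version or not version[0].isdigit():
          return False
      # Versions containing a non-production keyword are not considered.
      lowered = version.lower()
      return not any(word in lowered for word in NON_PRODUCTION_KEYWORDS)

  def parse_major_minor(version):
      """Extract the major and minor versions around the first decimal point."""
      numbers = re.findall(r"\d+", version)
      major = int(numbers[0])
      minor = int(numbers[1]) if len(numbers) > 1 else 0
      return major, minor

  print(is_considered("2.4.1"), parse_major_minor("2.4.1"))  # True (2, 4)
  print(is_considered("5.0-rc1"))                            # False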

Algorithm Components

Major Version Weight (MVW)

Release versions will be weighted based upon their position in the “release tree” so that, for example, version “4.x” is ranked higher than version “2.x”.  This release-tree ranking depends first upon the total number of major versions present in the 5-year examination period: 100% is divided evenly by the number of major versions, and each sequential version is assigned an increasing, accumulating weight.

For example, suppose a project has releases in versions 3, 4, 5, and 6 within the 5-year period.  There are 4 major versions, so each version gets 25% allocated to it in an accumulating manner: version 3 gets 25%, version 4 gets 50%, version 5 gets 75%, and version 6 gets 100%.
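
A minimal sketch of that allocation (the function name major_version_weights is ours):

  def major_version_weights(major_versions):
      """Divide 100% evenly across the major versions and assign each
      sequential version an accumulating weight, oldest to newest."""
      ordered = sorted(set(major_versions))
      share = 1.0 / len(ordered)
      return {version: share * (i + 1) for i, version in enumerate(ordered)}

  # The example above: major versions 3, 4, 5 and 6 within the 5-year period.
  print(major_version_weights([3, 4, 5, 6]))  # {3: 0.25, 4: 0.5, 5: 0.75, 6: 1.0}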

Minor Release Sequence Weight (MRSW)

Release versions will be weighted based upon their relative position within the 5-year period and the number of minor releases within the major release. This value is calculated like the MVW.

For example, consider a project which has had 12 releases.  The penultimate release would be 11/12 of the way through the release cycle and be weighted at 91.667%.  The 6th release would be 6/12 of the way through the release cycle and therefore be weighted at 50%.
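
A minimal sketch, assuming releases are numbered 1 through N in chronological order (the function name is ours):

  def minor_release_sequence_weight(position, total_releases):
      """Weight a release by its 1-based position in the chronological release sequence."""
      return position / total_releases

  # The example above: 12 releases in the examination period.
  print(round(minor_release_sequence_weight(11, 12), 5))  # 0.91667 (penultimate release)
  print(minor_release_sequence_weight(6, 12))             # 0.5 (6th release)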

Age Weight (AW)

The Age Weight is the value of the decay formula evaluated at the number of months in the past at which the version was released (a small sketch follows the definitions below).  The formula for the Age Weight is:

AW = C * e^(-k*t)

Where:

  • C = The Initial Value, which would be 100%, or full weighting
  • e = Euler’s number, the base of the natural logarithm
  • k = Decay constant, which is 0.05 for our desired curve
  • t = The number of months ago
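
A minimal sketch of the decay calculation (the function name age_weight is ours):

  import math

  def age_weight(months_ago, C=1.0, k=0.05):
      """Exponential decay: AW = C * e^(-k*t), where t is the age of the release in months."""
      return C * math.exp(-k * months_ago)

  # A release from one month ago keeps about 95% of its weight;
  # one from five years (60 months) ago keeps about 5%.
  print(round(age_weight(1), 3))   # 0.951
  print(round(age_weight(60), 3))  # 0.05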

Release Version Weighting (RVW)

RVW = AW*MVW*MRSW*SUM(CVE_SCORES), where each CVE_SCORE is determined by the National Vulnerability Database and is the CVSS v2 Base Score.
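
Here is a minimal sketch of that calculation; the AW, MVW, MRSW, and CVSS values below are hypothetical:

  def release_version_weight(aw, mvw, mrsw, cve_scores):
      """RVW = AW * MVW * MRSW * SUM(CVE_SCORES), using CVSS v2 Base Scores."""
      return aw * mvw * mrsw * sum(cve_scores)

  # A hypothetical release about 10 months old (AW ~0.607) on the newest major
  # version (MVW 1.0), mid-way through the release cycle (MRSW 0.5), with two
  # vulnerabilities scored 7.5 and 4.3:
  print(round(release_version_weight(0.607, 1.0, 0.5, [7.5, 4.3]), 3))  # 3.581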

Month Credit (MC)

MONTH_CREDIT = AW – (SUM(RVWs) / SUM(CVE_SCORES)), for the AW of that month, and where the sums are of the RVWs and CVE_SCORES only for releases in that month. If there are no releases in any given month, the MONTH_CREDIT will be the AW.
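
A minimal sketch of the Month Credit calculation, assuming the month’s releases are supplied as lists of their RVWs and raw CVE scores (the function name month_credit is ours):

  def month_credit(aw, rvws, cve_scores):
      """MC = AW - (SUM(RVWs) / SUM(CVE_SCORES)) for one month's releases.
      A month with no releases, and therefore no vulnerability scores, keeps its full AW."""
      if not cve_scores:
          return aw
      return aw - (sum(rvws) / sum(cve_scores))

  # A month roughly 10 months in the past (AW ~0.607) with one release whose RVW
  # is 3.581 and whose raw CVE scores total 11.8:
  print(round(month_credit(0.607, [3.581], [7.5, 4.3]), 3))  # 0.304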

Algorithm Formulas

Security Confidence Index

The Security Confidence Index is calculated from the formula:

SUM(MCs) / SUM(AWs) * 100

All MCs and AWs in the 5-year examination period are included in the sums.

Vulnerability Exposure Index

The Vulnerability Exposure Index is the sum of all of the Release Version Weights within the 5-year examination period: SUM(RVWs).
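
Putting the pieces together, a minimal sketch of both indices, assuming the per-month MCs and AWs and the per-release RVWs have already been computed as in the sketches above (the function names are ours):

  def security_confidence_index(month_credits, age_weights):
      """SCI = SUM(MCs) / SUM(AWs) * 100 over the 5-year examination period."""
      return sum(month_credits) / sum(age_weights) * 100

  def vulnerability_exposure_index(rvws):
      """VEI = SUM(RVWs) over the 5-year examination period."""
      return sum(rvws)

  # Toy data for three months: two vulnerability-free months keep their full AW as the MC.
  aws = [0.95, 0.86, 0.607]
  mcs = [0.95, 0.86, 0.304]
  rvws = [3.581]
  print(round(security_confidence_index(mcs, aws), 1))  # 87.5
  print(vulnerability_exposure_index(rvws))             # 3.581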

Summary

We hope that the Project Vulnerability Report, with its two scores, the Security Confidence Index and the Vulnerability Exposure Index, becomes a helpful metric on the Open Hub.