    7 Recommendations to Improve SBOM Quality

    January 20, 2026


    A software bill of materials (SBOM) provides transparency into the elements of an integrated software product. Such transparency is critical to identifying system vulnerabilities and thus mitigating potential security risks. There is growing interest in using SBOMs to support software supply chain risk management. In September 2024, Army leaders signed a memorandum requiring SBOMs for vendor-supplied software. More recently, the Department of Defense (DoD) Chief Information Officer, through its Software Fast Track Program, has begun requiring that software vendors submit their SBOMs, as well as those from third-party assessors, to enable detection of variances between SBOMs for the same software.

    Different SBOM tools should produce similar records for a piece of software at a given point in its lifecycle, but this is not always the case. The divergence of SBOMs for individual pieces of software can undermine confidence in these important documents for software quality and security. This blog post outlines our team’s recent findings on why SBOMs diverge and recommends seven ways to improve SBOM accuracy.

    SBOM Harmonization Plugfest

    The SEI’s 2024 SBOM Harmonization Plugfest project, sponsored by the Cybersecurity and Infrastructure Security Agency (CISA), aimed to uncover the root causes of SBOM divergence, such as imprecise definitions or standards, how uncertainty is addressed, or other implementation decisions. The SEI brought together SBOM tool vendors, standards producers, and others in the SBOM community to produce sample SBOMs for analysis. The recently released report, Software Bill of Materials (SBOM) Harmonization Plugfest 2024, on which this post is based, outlines our team’s findings, analysis, and recommendations to help SBOM producers generate more consistent and reliable SBOMs.

    We asked Plugfest participants to generate and submit SBOMs based on nine software targets chosen as a representative sample of various programming languages as seen in Table 1 below.

    Table 1: The nine software targets on which Plugfest participants based their submitted SBOMs.

    The SEI gained approval from most participants to make their submissions public. Those SBOMs that were approved for release are now available at SEI’s GitHub site.

    Overview and Analysis of Submitted SBOMs

    We received 243 SBOMs from 21 Plugfest participants. To ensure anonymity and to prevent any bias in our review, we anonymized participant names by assigning alphanumeric codes to each. One participant, who was assigned the code Y2, submitted many more SBOMs (102) than all the others (Figure 1). Y2 generated and submitted SBOMs in every format their tool supported (i.e., source and binary analysis as well as enriched and non-enriched).

    Figure 1: SBOMs Submitted per Target (bar chart of submissions by software target)

    Analysis

    To ensure an objective analysis, we first determined evaluation criteria for our review of the SBOMs. We then determined automated approaches to extract information from the SBOMs to facilitate our development of software tools for analysis as well as our generation of baseline SBOMs, which we used for comparison purposes.

    Evaluation Criteria

    Assessing the consistency of the minimum elements of the submitted SBOMs was a critical component in determining their completeness and accuracy. A list of minimum elements specifies the baseline that SBOMs should meet and facilitates information sharing. The criteria we used for minimum elements are those required for documenting a software product’s primary component and its included components, as outlined in CISA’s Framing Software Component Transparency: Establishing a Common Software Bill of Materials (SBOM):

    • SBOM Author Name
    • SBOM Timestamp
    • SBOM Type
    • SBOM Primary Component
    • Component Name
    • Component Version String
    • Component Supplier Name
    • Component Cryptographic Hash
    • Component Unique Identifier
    • Component Relationships
    • Component License
    • Component Copyright Holder
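
    To make this concrete, a presence/absence check for these elements might look like the sketch below for a CycloneDX JSON document. The field locations (for example, metadata.lifecycles for SBOM Type) are assumptions based on the CycloneDX 1.5 JSON format, not the exact logic of our analysis notebooks, and SPDX documents would need a separate mapping.

```python
# Minimal sketch: flag missing CISA minimum elements in one CycloneDX JSON SBOM.
# Field locations are assumed from the CycloneDX 1.5 JSON format.

DOCUMENT_LEVEL_PATHS = {
    "SBOM Author Name":       ("metadata", "authors"),
    "SBOM Timestamp":         ("metadata", "timestamp"),
    "SBOM Type":              ("metadata", "lifecycles"),
    "SBOM Primary Component": ("metadata", "component"),
}

COMPONENT_LEVEL_FIELDS = {
    "Component Name":               "name",
    "Component Version String":     "version",
    "Component Supplier Name":      "supplier",
    "Component Cryptographic Hash": "hashes",
    "Component Unique Identifier":  "purl",
    "Component License":            "licenses",
    "Component Copyright Holder":   "copyright",
}


def missing_minimum_elements(bom: dict) -> dict[str, bool]:
    """Return a map of minimum element -> True if that element is missing."""
    report = {}
    for element, path in DOCUMENT_LEVEL_PATHS.items():
        node = bom
        for key in path:
            node = node.get(key) if isinstance(node, dict) else None
        report[element] = not node
    components = bom.get("components", [])
    for element, field in COMPONENT_LEVEL_FIELDS.items():
        # Flag the element if any listed component omits it.
        report[element] = any(not c.get(field) for c in components)
    report["Component Relationships"] = not bom.get("dependencies")
    return report
```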

    Analysis Tools

    Because of the large number of submissions, we developed tools to automate ingesting and processing SBOMs to collect, collate, and export data about them. Participants submitted SBOMs in SPDX and CycloneDX formats in a variety of encodings, including JSON, XML, and YAML.

    We wrote code for processing SBOMs using Python within Jupyter computational notebooks hosted on an SEI internal Bitbucket repository, which also contained a copy of SBOM Plugfest submissions. We used two primary notebooks for analyzing SBOM submissions: one for CycloneDX and one for SPDX. We sought to extract the following from each SBOM:

    • information related to the presence or absence of minimum elements
    • information about software components, including their relationships to one another and with the target software

    In each notebook, we collected information from each SBOM by doing the following:

    • traversing the directory of SBOM submissions, importing JSON SBOM files, and decoding the JSON files so that data could be extracted
    • extracting minimum elements from each SBOM where the data existed and noting where data was missing
    • constructing a dependency tree based on the dependencies listed in each SBOM (These dependency trees contained information about software components and the types of relationships among those components as listed in the SBOM.)
    • collating data from each SBOM into two common data structures: one for information related to minimum elements and the other for component information

    We analyzed the data structures using Python data science packages, or we exported them as comma-separated value (CSV) files for further analysis. We used information about the presence or absence of minimum elements to generate summary statistics for each software target and each SBOM type (source/build). We used dependency graph information to analyze the presence or absence of components and assess the depth of the SBOMs.
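
    For illustration, the dependency-tree and collation steps might be sketched as follows for CycloneDX JSON input, whose top-level dependencies array lists ref/dependsOn pairs. This is a simplified stand-in rather than the code we actually used in the Plugfest notebooks, and the directory layout is hypothetical.

```python
import json
from pathlib import Path

import pandas as pd


def dependency_graph(bom: dict) -> dict[str, list[str]]:
    """Adjacency map built from a CycloneDX 'dependencies' array (ref -> dependsOn)."""
    return {d["ref"]: d.get("dependsOn", []) for d in bom.get("dependencies", [])}


def depth(graph: dict[str, list[str]], ref: str, seen: frozenset = frozenset()) -> int:
    """Number of nodes on the longest dependency chain starting at ref; cycles are cut off."""
    if ref in seen:
        return 0
    children = graph.get(ref, [])
    return 1 + max((depth(graph, c, seen | {ref}) for c in children), default=0)


def summarize(submission_dir: Path) -> pd.DataFrame:
    """Collate per-SBOM component and edge counts for comparison or CSV export."""
    rows = []
    for path in sorted(submission_dir.glob("*.json")):
        bom = json.loads(path.read_text(encoding="utf-8"))
        graph = dependency_graph(bom)
        root = bom.get("metadata", {}).get("component", {}).get("bom-ref", "")
        rows.append({
            "sbom": path.name,
            "components": len(bom.get("components", [])),
            "edges": sum(len(v) for v in graph.values()),
            "depth": depth(graph, root) if root in graph else None,
        })
    return pd.DataFrame(rows)


# Example: summarize(Path("submissions/target-1")).to_csv("target-1-summary.csv", index=False)
```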

    Baseline SBOMs

    We selected three prominent open source tools, Syft, Trivy, and Microsoft’s SBOM Tool, to create baseline SBOMs for each of the nine software targets. The baseline SBOMs served as initial examples of what we might expect to see submitted by Plugfest participants. The baseline SBOMs also allowed us to develop analysis tools early in the project so we could start analyzing participants’ SBOMs as soon as they were submitted.
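
    Driving open source generators like these can be done from a small script. The sketch below shows one way to invoke Syft and Trivy for a directory target; the command-line flags reflect those tools' documented output options but may vary by version, and the target path is hypothetical. Microsoft's SBOM Tool has its own generate command with a different argument set.

```python
import subprocess
from pathlib import Path

TARGET = Path("targets/example-project")   # hypothetical checkout of one software target
OUT = Path("baselines")
OUT.mkdir(parents=True, exist_ok=True)

# Syft: scan a directory source and emit CycloneDX JSON on stdout.
with open(OUT / "example-syft.cdx.json", "w", encoding="utf-8") as fh:
    subprocess.run(["syft", f"dir:{TARGET}", "-o", "cyclonedx-json"],
                   stdout=fh, check=True)

# Trivy: filesystem scan writing CycloneDX JSON directly to a file.
subprocess.run(["trivy", "fs", "--format", "cyclonedx",
                "--output", str(OUT / "example-trivy.cdx.json"), str(TARGET)],
               check=True)
```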

    Findings from SBOM Analysis

    The following are notable findings from our research on the SBOMs submitted for the Plugfest. These findings, ordered from the trivial to the more complex, explain the types of variances in the SBOMs as well as their causes.

    1. Component number, content, and normalization. We found significant variance in both the number of components and the content of the minimum required elements in SBOMs from different participants for the same software at the same lifecycle phase. Some variance in SBOM content is due to the lack of normalization; the same content was simply being written differently (e.g., a software version detailed as v 2.0 by one tool and as 2.0 by another; see the sketch after this list).
    2. Software versions. Another cause for variance in SBOM content is that some software specifications allow for a range of possible software versions, but SBOMs allow only a single version to be documented for each dependency. This results in SBOMs having various versions listed across different participants for each target that allowed version ranges.
    3. Minimum elements. Some variance in SBOM content is due to differences in whether participants included minimum elements or not, which may be due to the somewhat artificial nature of generating SBOMs for a research project.
    4. Use cases. SBOMs have diverse use cases, which lead to different types of SBOMs. The wide variety of possible use cases is an additional cause for the lack of harmonization across SBOMs for the same target. If we had specified a use case, participants may have taken a more harmonized approach to how they generated, enriched, or augmented their SBOMs for that use case.
    5. Build and source SBOMs. Participants used different approaches to generate their build and source SBOMs, which led to differences in the discovered components. Some participants used a container build process to generate their build SBOM, and others built a standalone executable for their chosen runtime environment using the target’s language or build-framework-specific process. Build SBOMs also varied based on the environment and tool configurations each participant used. Source SBOMs capture dependencies declared or inferred from source code. Some participants used additional information from external locations, such as the artifact repositories referenced by dependencies or the contents of platform toolchain libraries, to infer additional dependencies.
    6. Dependency interpretation. A review of submitted explanatory readme files and discussions with participants indicated some differences in the interpretation of dependency. Some submissions included dependencies of first-party components that are not typically deployed, such as target documentation build tools, CI/CD pipeline components, and optional language bindings.
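
    Much of the variance described in finding 1 is cosmetic and could be removed by a light normalization pass. The function below is a hypothetical illustration, not part of the Plugfest tooling; it also does nothing for version ranges (finding 2), which reflect a real mismatch between what a manifest declares and what a single SBOM version field can record.

```python
import re


def normalize_version(raw: str) -> str:
    """Collapse cosmetic differences such as 'v 2.0', 'V2.0', or '2.0 ' into '2.0'."""
    cleaned = raw.strip()
    cleaned = re.sub(r"^[vV]\s*", "", cleaned)   # drop a leading 'v'/'V' prefix
    return re.sub(r"\s+", "", cleaned)           # collapse any remaining whitespace


assert normalize_version("v 2.0") == normalize_version("2.0") == "2.0"
```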

    7 Recommendations for Improving SBOM Quality

    The following recommendations, based on our research and analysis, will improve the quality of SBOMs and help ensure consistent content in SBOMs for the same target.



    1. Emphasize inclusion of the following minimum elements:

      • SBOM Type. Include the SBOM Type to document the lifecycle phase for which this SBOM was generated (e.g., Source, Build). We recommend that this attribute be required rather than optional.

      • Component Version String. Emphasize the importance of reporting the version exactly as provided by the supplier. This reporting minimizes the need for normalization due to data being inconsistently reported (e.g., one SBOM reports v 2.0 and another reports 2.0).

      • Component Supplier Name. Include the name of the entity that provided the contents of the software being described. This helps users of the SBOM understand which third parties were part of the supply chain. A common registry of component suppliers would help normalize this entry. For open source software components, which do not have a traditional supplier, a direct reference or link to the project repository should be provided.

      • Component Cryptographic Hash. SBOM guidance should clearly state what components are being hashed when a cryptographic hash is included. Make it more straightforward for SBOM users to know how to verify the hash value. Alternatively, when supplying cryptographic hashes, SBOM creators should be explicit about what was hashed.

      • Component License. Emphasize the need to provide licensing information or to note that the license information is not known or was not included.



    2. Improve normalization of SBOM elements.

      Much divergence in SBOMs is due to lack of normalization (e.g., version numbering as mentioned earlier, or date/time, which may be written as 2025-06-15 or simply as August 2025). Standardize on using the term supplier for a primary supplier and the term manufacturer for a secondary supplier.


    3. Document how the term dependencies is interpreted in the SBOM generation process.

      Develop guidance to distinguish dependencies by category (e.g., runtime, tests, docs).


    4. SBOM generators should document their approach to generating SBOMs.

      This will help consumers better understand potential differences in SBOMs for the same software. Also document the use case for which the SBOM is being generated. Different use cases may require differences in SBOMs.

    5. Use the appropriate tool for the environment.

      SBOM creators and users should ensure they are using an appropriate SBOM tool for their specific environment. SBOM tools typically focus on a subset of the programming languages and build environments.


    6. Support developer community SBOM efforts.

      Some developer communities are working to include SBOM generators in language tools and build frameworks to make it much easier for projects using those languages and frameworks to generate SBOMs as upstream suppliers. These efforts have an outsize impact because they lower the barrier for creating SBOMs and push the SBOM generation further upstream to project maintainers who have detailed knowledge of their own source code and build processes.


    7. Develop and validate SBOM profiles.

      To help stakeholders communicate more effectively, they could develop and validate SBOM profiles, each profile being a well-defined restriction placed on one or more SBOM standards to clarify the meaning and allowable values for each field, its cardinality, and structural aspects. The profiles feature of the OWASP Software Component Verification Standard (SCVS) BOM Maturity Model is an example. Another approach would be to define a JSON schema that extends the existing schemas for CycloneDX and/or SPDX and adds the necessary clarifications and restrictions for a profile.
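
    To make the JSON-schema idea concrete, the sketch below defines a small, hypothetical profile that tightens a few CycloneDX fields and validates a submission against it with the jsonschema package. The specific constraints are invented for illustration; a real profile would be derived from the CISA minimum elements and would extend the full CycloneDX (or SPDX) schema rather than restate a fragment of it.

```python
import json

from jsonschema import validate  # pip install jsonschema

# Hypothetical "build SBOM" profile: require a timestamp and force every
# component to carry a name, a version string, and a package URL.
BUILD_SBOM_PROFILE = {
    "type": "object",
    "required": ["bomFormat", "metadata", "components"],
    "properties": {
        "bomFormat": {"const": "CycloneDX"},
        "metadata": {"type": "object", "required": ["timestamp"]},
        "components": {
            "type": "array",
            "items": {"type": "object", "required": ["name", "version", "purl"]},
        },
    },
}

with open("submission.cdx.json", encoding="utf-8") as fh:
    bom = json.load(fh)

# Raises jsonschema.ValidationError if the submission violates the profile.
validate(instance=bom, schema=BUILD_SBOM_PROFILE)
```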

    Future Work on Ensuring SBOM Quality

    SBOMs are of growing importance to safeguarding the security of all software systems, including DoD and critical infrastructure systems. As more organizations require the use of SBOMs, there will be a greater need to ensure their quality and completeness, including providing transparency for undeclared dependencies. Decisions to keep SBOM elements opaque may be rethought if third-party SBOMs can provide the needed transparency. This research project is part of a continuing SEI effort to improve the quality of SBOMs.


