
Vulnerabilities explained: how NIST detects and manages flaws globally

What is a vulnerability?

A vulnerability is a weakness in a computer system that potential attackers can exploit to reduce its information assurance.

A vulnerability can occur at any "place" in a computer system: it can be a flaw at the network level in the exchange of protocol messages, an application bug that discloses private information, it may even reach the legal sphere of copyright, and much more. Nowadays computer systems drive most human activities, from agriculture to health through aerospace, automotive, the Internet of Things, mainframes, micro-services and any modern keyword buzzing over the Internet.

Security is an important aspect of any modern computer system, since it represents an endless battle against attackers. No system is 100% safe and secure. The level of security a computer system can achieve depends on many factors; to mention a few:
  • Money: the "bigger" your investment, the more secure your system can be. Poor products and solutions will only give you false confidence in a computer system that is simply insecure. Don't do it.
  • Skills: spending money is not sufficient. You must know how to exploit every single cent of your investment. Purchasing the best software on the market will be a completely wasted effort if you have no idea how to use it. Computer security is a very difficult subject, and highly qualified professionals need to be involved in security assessments.
  • Culture: take it seriously. Sometimes security is treated as a necessary evil. This is not a good approach: consider it an integral part of your project.
    Would you live in a beautiful house where the electric sockets could shock you at any time?
    A computer system is like a house. You have to feel comfortable in it before you can call it home.
"NIST promotes U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology in ways that enhance economic security and improve our quality of life."
It is a world-wide reference for security and - among other activities - it promotes analysis and classification of vulnerabilities.

Classification

An operational classification of vulnerabilities is useful: it gives us a common language to talk about flaws, and a machine-readable format makes it viable to automate vulnerability management.

NIST adopts the concept of metrics: sets of indicators describing how a vulnerability affects a computer system. These indicators are organised in the metric groups of the Common Vulnerability Scoring System (CVSS); a sketch of the base-score computation follows the list below.



  • Base metrics: the computer-system components directly impacted by a vulnerability and how they affect adjacent components.
  • Temporal metrics: the average time for an attacker to exploit a vulnerability, the time required to remediate it (workaround, patch) and the time to resolve the issue.
  • Environmental metrics: the vulnerability's impact on confidentiality, integrity and availability, and the scale of that impact with respect to an organisation.
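
As a minimal sketch of how these metrics become a machine-readable score, the snippet below parses a CVSS v3.1 base vector string and computes its base score with the weights and formula from the published CVSS v3.1 specification. It covers the base metrics only, not the temporal or environmental adjustments.

```python
import math

# CVSS v3.1 base-metric weights, as published in the specification.
WEIGHTS = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20},
    "AC": {"L": 0.77, "H": 0.44},
    "PR": {"N": 0.85, "L": 0.62, "H": 0.27},    # Scope: Unchanged
    "PR_C": {"N": 0.85, "L": 0.68, "H": 0.50},  # Scope: Changed
    "UI": {"N": 0.85, "R": 0.62},
    "C": {"H": 0.56, "L": 0.22, "N": 0.0},
    "I": {"H": 0.56, "L": 0.22, "N": 0.0},
    "A": {"H": 0.56, "L": 0.22, "N": 0.0},
}

def roundup(x: float) -> float:
    # Smallest number with one decimal place >= x (the spec defines
    # "Roundup" with integer arithmetic; this approximation suffices here).
    return math.ceil(x * 10) / 10

def base_score(vector: str) -> float:
    """Compute the CVSS v3.1 base score from a vector string."""
    m = dict(part.split(":") for part in vector.split("/")[1:])  # skip "CVSS:3.1"
    changed = m["S"] == "C"
    pr = WEIGHTS["PR_C" if changed else "PR"][m["PR"]]
    exploitability = (8.22 * WEIGHTS["AV"][m["AV"]] * WEIGHTS["AC"][m["AC"]]
                      * pr * WEIGHTS["UI"][m["UI"]])
    iss = 1 - ((1 - WEIGHTS["C"][m["C"]]) * (1 - WEIGHTS["I"][m["I"]])
               * (1 - WEIGHTS["A"][m["A"]]))
    impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15 if changed else 6.42 * iss
    if impact <= 0:
        return 0.0
    raw = 1.08 * (impact + exploitability) if changed else impact + exploitability
    return roundup(min(raw, 10))

# A network-reachable flaw, no privileges required, full impact: scores 9.8 (Critical).
print(base_score("CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H"))
```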

Workflow

Detected flaws are catalogued as Common Vulnerabilities and Exposures (CVE) entries, which NIST collects and analyses in its National Vulnerability Database (NVD).
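
Entries in the database can also be retrieved programmatically. Below is a minimal sketch that fetches a single entry through the NVD REST API 2.0; the endpoint and JSON field names follow the public NVD documentation, but treat them as assumptions and verify against the current docs before relying on them.

```python
import json
import urllib.request

# NVD REST API 2.0; the cveId parameter narrows the result to one entry.
URL = "https://services.nvd.nist.gov/rest/json/cves/2.0?cveId=CVE-2021-44228"

with urllib.request.urlopen(URL, timeout=30) as response:
    data = json.load(response)

for item in data.get("vulnerabilities", []):
    cve = item["cve"]
    # First English description, if present.
    desc = next((d["value"] for d in cve.get("descriptions", [])
                 if d.get("lang") == "en"), "n/a")
    print(cve["id"], "-", cve.get("vulnStatus", "unknown"))
    print(desc[:120], "...")
```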

Anybody who detects a vulnerability can report it to NIST.

Once received, each vulnerability is queued pending analysis. As soon as possible, security experts examine the submitted vulnerability description to investigate its causes and, whenever possible, its resolution.


Analysed vulnerabilities can be reworked over time, whenever an internal review or additional information comes along. Externally collected information can also restart the process if it has a meaningful impact on the reported analysis.
In some cases vulnerabilities are deferred or rejected, for example when the vulnerability cannot be confirmed.
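
NVD tracks this workflow by tagging each entry with a status such as Received, Awaiting Analysis, Undergoing Analysis, Analyzed, Modified, Deferred or Rejected. As an illustration only, the sketch below encodes the transitions described above; the transition table is a simplification for this article, not NVD's internal logic.

```python
from enum import Enum

class Status(Enum):
    RECEIVED = "Received"
    AWAITING_ANALYSIS = "Awaiting Analysis"
    UNDERGOING_ANALYSIS = "Undergoing Analysis"
    ANALYZED = "Analyzed"
    MODIFIED = "Modified"   # reworked after review or new information
    DEFERRED = "Deferred"
    REJECTED = "Rejected"

# Simplified transition table for the workflow described above.
TRANSITIONS = {
    Status.RECEIVED: {Status.AWAITING_ANALYSIS},
    Status.AWAITING_ANALYSIS: {Status.UNDERGOING_ANALYSIS},
    Status.UNDERGOING_ANALYSIS: {Status.ANALYZED, Status.DEFERRED, Status.REJECTED},
    Status.ANALYZED: {Status.MODIFIED},            # internal review / new info
    Status.MODIFIED: {Status.UNDERGOING_ANALYSIS}, # external info restarts analysis
    Status.DEFERRED: {Status.UNDERGOING_ANALYSIS},
    Status.REJECTED: set(),
}

def advance(current: Status, target: Status) -> Status:
    """Move an entry to a new status, enforcing the allowed transitions."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.value} to {target.value}")
    return target

s = advance(Status.RECEIVED, Status.AWAITING_ANALYSIS)
s = advance(s, Status.UNDERGOING_ANALYSIS)
print(advance(s, Status.ANALYZED).value)  # -> Analyzed
```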

Conclusion

NIST's NVD is a huge contribution to the worldwide community's effort to improve the security of computer systems and beyond. The NVD is a reference database for many tools providing DevSecOps solutions for software composition analysis and static code analysis.
