CISSP - Asset Security Tutorial

Welcome to the second domain of the CISSP tutorial (part of the CISSP Certification Training).  

This domain provides an introduction to Asset Security. Let us explore the objectives of this domain in the next section.


After completing this domain, you will be able to:

  • Classify information and supporting assets

  • Determine and maintain ownership of assets

  • Identify ways to protect privacy

  • Ensure appropriate data retention

  • Determine data security controls

  • Establish asset handling requirements

Let us begin with a scenario to highlight the importance of Asset Security in the next section.

Importance of Asset Security

Recently, a hacker broke into one of the Nutri Worldwide servers by taking advantage of an application vulnerability. The server held various types of information at different levels of criticality.

The information on the server was secured with controls matching its assigned classification. Although the hacker gained access only to the information with the lowest level of protection, the breach had a huge impact on the organization. It was later found that a flaw in the classification process had left even sensitive information with very little protection.

Let us discuss the need for information classification in the next section.

Need for Information Classification

There are several good reasons to classify information. Not all data has the same value to an organization.

Some data is valuable to the people who make strategic decisions because it aids them in setting long-term or short-term business direction.

Some data, such as trade secrets, formulas, and new product information, is so valuable that its loss could create a significant problem for the enterprise in the marketplace, whether through public embarrassment or a loss of credibility. For these reasons, information classification has a clear, enterprise-level benefit.

Information can have an impact on the business globally, beyond the business unit or line-operation level. The primary purpose of classification is to enhance confidentiality, integrity, and availability and to minimize the risks to the information.

Also, by focusing the protection mechanisms and controls on the information areas that need them most, you achieve an efficient cost-to-benefit ratio, that is, generating maximum benefit from the available resources and budget.


Information Classification Objectives

The objectives of information classification are discussed below.

The objective of an information classification scheme varies from sector to sector. In general, information classification is done to minimize the risks to sensitive information.

Information classification has the longest history in the government and military sectors. In these sectors, information classification is used primarily to prevent the unauthorized disclosure of information and the resulting loss of confidentiality.

A commercial or a private sector company might wish to employ classification to maintain a competitive edge in a tough marketplace. There might also be other sound legal reasons for a company to employ information classification, such as to minimize liability or to protect valuable business information. Information classification can also be employed to comply with privacy laws or to enable regulatory compliance.

Government or Military Sector Classification

The Government or Military sector classification is described in this section.

The information classification scheme followed by the Government or Military sector has five levels.

  1. Top Secret

  2. Secret

  3. Confidential

  4. Sensitive but Unclassified or SBU

  5. Unclassified

Top Secret

Top Secret is the highest level of information classification. The unauthorized disclosure of top secret information would cause exceptionally grave damage to the country’s national security.


The next level is Secret. This is information designated to be of a secret nature. The unauthorized disclosure of this information could cause serious damage to the country’s national security.


The third level is Confidential. This is information designated to be of a confidential nature. The unauthorized disclosure of this information may cause damage to the country’s national security. This level applies to documents that fall between Sensitive but Unclassified and Secret in sensitivity.


The fourth level is Sensitive but Unclassified (SBU). This is information designated as a minor secret whose disclosure may not create serious damage to the country’s national security. However, such material would cause undesirable effects if it became publicly available.


The lowest level is Unclassified. This is information designated as neither sensitive nor classified. The public release of this information does not harm the country’s national security.
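The five levels above form a strict ordering, which mandatory access control models rely on when comparing a subject's clearance with an object's label. One illustrative way to capture this ordering is a Python `IntEnum`; the `can_read` check below is a simplified "no read up" sketch, not a full access-control implementation:

```python
from enum import IntEnum

class GovClassification(IntEnum):
    """Government/military levels, ordered from least to most sensitive."""
    UNCLASSIFIED = 0
    SBU = 1          # Sensitive but Unclassified
    CONFIDENTIAL = 2
    SECRET = 3
    TOP_SECRET = 4

def can_read(subject_clearance: GovClassification,
             object_label: GovClassification) -> bool:
    """Simplified 'no read up' rule: a subject may read an object
    only if cleared at or above the object's level."""
    return subject_clearance >= object_label

# A Secret-cleared subject can read Confidential but not Top Secret data.
print(can_read(GovClassification.SECRET, GovClassification.CONFIDENTIAL))  # True
print(can_read(GovClassification.SECRET, GovClassification.TOP_SECRET))    # False
```

Because `IntEnum` members compare as integers, the ordering of the levels is enforced directly by the enum values.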

Commercial or Private Sector Classification

We will discuss commercial or private sector classification in this section.

The information classification scheme followed by commercial or private establishments has four levels:

  1. Confidential

  2. Private

  3. Sensitive

  4. Public


The highest level is Confidential. This classification applies to the sensitive business information that is intended strictly for use within the organization. The unauthorized disclosure of such information can seriously and adversely affect the organization, its stockholders, business partners, or customers.

For example, information about new product development, trade secrets, and merger negotiations is considered confidential.


The next level is Private. This classification applies to the personal information that is intended for use within the organization. The unauthorized disclosure of such information can seriously and adversely affect the organization or its employees. For example, medical information and salary levels are considered private.


The third level is Sensitive. Information that requires a higher level of protection than normal data can be termed sensitive.

Unauthorized disclosure of this information could affect the company. This information is protected from a loss of confidentiality as well as from a loss of integrity, due to unauthorized alteration. This information requires a higher-than-normal assurance of accuracy and completeness.


The lowest level is Public. This is similar to unclassified information; all of a company’s information that does not fit into any of the other categories can be considered public.

While its unauthorized disclosure may be against policy, it is not expected to seriously or adversely affect the organization, its employees, or its customers.

Information Classification Criteria

Information classification criteria will be discussed in this section.

Once the scheme is decided upon, the government agency or the company must develop the criteria to decide what information goes into which classification.

Several criteria may be used to determine the classification of an information object, such as conditions, elements, limitations, and procedures.

Classification can be decided based on certain conditions which the information satisfies, such as value, age, useful life, and personal associations.

Value is the most commonly used criterion for classifying data in the private sector. If the information is valuable to an organization or its competitors, it needs to be classified.

Age states that the classification of information might be lowered if the information’s value decreases over time.

Useful life states that if the information has been made obsolete by new information, substantial changes in the company, or other reasons, it can often be declassified. Personal association means that if the information is personally associated with specific individuals or is addressed by a privacy law, it might need to be classified.

For example, investigative information that reveals informant names might need to remain classified.
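The age and useful-life conditions above lend themselves to a simple automated check that flags information for reclassification review. This is a minimal sketch; the ten-year default threshold and the function name are assumptions, not values from this tutorial:

```python
from datetime import date, timedelta
from typing import Optional

def should_review_for_declassification(classified_on: date,
                                       review_after_years: int = 10,
                                       today: Optional[date] = None) -> bool:
    """Flag information whose value may have decreased with age,
    making it a candidate for a lower classification.
    The 10-year default is a hypothetical policy value."""
    today = today or date.today()
    return today - classified_on >= timedelta(days=365 * review_after_years)

# Information classified in 2005 is overdue for review by 2020;
# information classified in 2018 is not.
print(should_review_for_declassification(date(2005, 1, 1), today=date(2020, 1, 1)))  # True
print(should_review_for_declassification(date(2018, 1, 1), today=date(2020, 1, 1)))  # False
```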

While implementing information classification, you will need to consider and implement appropriate practices related to its authorization, custody, reproduction, logging, labeling, filing, etc.

You must also take into account certain limitations, such as the expertise available, the ethics of the custodian, and incompatible activities of the administrator, while classifying the information.

You also need to specify certain procedures for controlling the use of information and labeling of information.

Data Classification Considerations

When classifying data, a security practitioner takes the following into consideration:

  • Appropriately defined data access privileges or roles

  • Data retention requirements of the organization: Certain regulatory requirements make it mandatory for organizations to retain the data for specific periods of time

  • Data security requirements: Depending on the type of data and regulatory requirement, the appropriate level of protection is determined

  • Appropriate disposal of data and the method of disposal

  • Data encryption requirements

  • Appropriate use of data

  • Regulatory or compliance requirements

In the following section, let us identify who classifies data.

Role Responsible for Data Classification

In an organization, the data owner makes decisions regarding data classification, as he or she is most familiar with the data. The data owner also has the best knowledge of the value of the data to the organization.

As the sensitivity of the data may change over a period of time, the data needs to be appropriately classified and reviewed annually by the data owner.

Deviations, if identified, are documented, and corrective action is taken by the organization.

Depending on the organization’s retention policies, which are based on the laws and regulations governing the industry, the data is retained for a certain period of time. After completion of the retention period, the data is destroyed securely.
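The retention step above can be sketched as a small helper that computes when a record becomes due for secure destruction. The seven-year figure in the example is a hypothetical policy value; actual retention periods come from the laws and regulations governing the industry:

```python
from datetime import date

def destruction_due(created: date, retention_years: int) -> date:
    """Return the date after which a record should be securely destroyed,
    per the organization's retention policy."""
    try:
        return created.replace(year=created.year + retention_years)
    except ValueError:
        # Record created on Feb 29 of a leap year; fall back to Feb 28.
        return created.replace(year=created.year + retention_years, day=28)

# A record created on 2015-03-01 under a hypothetical 7-year policy:
print(destruction_due(date(2015, 3, 1), 7))  # 2022-03-01
```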

Let us discuss a business scenario in the next section to understand data classification.

Business Scenario

The Network Firewalls division at Nutri Worldwide Inc. creates and maintains a lot of information. Kevin, a Security Administrator, recognizes that not all of this information is critical, and he would like to do a high-level classification of the information accessible to him.

Question: What are the different information classification levels Kevin has to use?

Answer: The information classification levels for any commercial organization are: Confidential, Private, Sensitive, and Public.

In the next section, let us discuss Data Management.

Data Management

Many organizations need to manage large quantities of information and computer resources. A good data management plan and strategy will help in organizing and managing data.

Data management involves managing the information lifecycle needs of an enterprise in an effective manner by developing and executing architectures, policies, procedures, and practices. The process of data management involves many activities ranging from administrative to technical aspects of handling data.

Organizations require data management to ensure that their data complies with standard classifications and to ensure data validity, integrity, and consistency. It also helps to secure and maintain data.

Now that we have looked at the need for data management, let us discuss the best practices for data management in the next section.

Best Practices for Data Management

Best practices for data management are:

  • Create a data management policy which will guide the overall data management program in the organization.

  • Clearly define roles and responsibilities for managing data such as data providers, data owners, and custodians

  • Audit effectiveness of controls, processes, and practices for data management

  • Create procedures for quality control and assurance

  • Establish processes for verifying and validating the accuracy and integrity of the data

  • Document specific data management practices and descriptive metadata for each dataset

  • Follow a layered approach to data security that will enhance the protection of the data

  • Have in place clearly defined criteria for data access

Let us discuss Data Policy in the next section.

Data Policy

For an effective data management program in any organization, the first step is to create a data policy. A data policy is a high-level document created by senior management that defines long-term strategic goals for data management throughout the organization.

A data policy guides the framework for data management and addresses issues related to data access, legal matters, custodian duties, data acquisition, data handling, and other issues. It should be dynamic and flexible so that it can be adapted to a variety of situations and challenges.

A security practitioner should address the following elements while creating a data policy:

  • Data privacy requirements based on the type of data and the existing laws and regulations

  • Ownership of data

  • Cost considerations such as the cost of providing data or access to the user

  • Sensitivity and criticality of data

  • Policies and processes of managing data

  • Existing laws and regulations

  • Legal liability of the organization in case of data mishandling.

Let us now discuss Data Ownership in the next section.

Data Ownership

When information is created or acquired in the organization, it is important to assign ownership to it. An information or data owner can be an individual or a group who has created, acquired or purchased the information and is directly responsible for it. As discussed earlier, the data owner creates the data classification.

The responsibilities of a data owner are:

  • Determining how the organization’s mission and strategic goals will be impacted by the information

  • Determining the cost of replacing the information

  • Understanding the requirement of entities, within and outside the organization, for the information, and the conditions under which it can be shared

  • Recognizing when the information reaches the end of its lifecycle and destroying it

In the next section, let us look at some of best practices data owners can follow.

Data Ownership—Best Practices

To adopt best practices, data owners must establish and document the ownership and intellectual property rights of their data. They are also responsible for creating and documenting policies for securing data, and other controls relevant in acquiring, handling, and releasing data.

The data owner should ensure data compliance with laws and regulations. This is very important as far as carrying out business activities is concerned.

Data owners should also draft and finalize agreements for data usage by customers or users. These can take the form of signed agreements, non-disclosure agreements, or a contract between the owner and users.

Let us now discuss responsibilities of a data custodian in the next section.

Data Custodians

Data custodians are responsible for the safe custody, storage, and transportation of data, implementing the business rules, and technical environment and database structure.

Some other important responsibilities of data custodians are to:

  • Allow only authorized and controlled access to the data

  • Ensure that no unauthorized access is granted

  • Maintain versions of Master Data and the history of changes

  • Identify data stewards for every dataset

  • Ensure data integrity is maintained in technical processes

  • Ensure security controls safeguard data

  • Audit data content and changes

  • Maintain consistency with common data models while adding data to datasets

  • Maintain the database with change management practices

Another role in addition to, and supporting, data custodians is that of the data steward. Data stewards are responsible for the content, the context, and the associated business rules for the data and information stored in a data field. Let us identify the various roles associated with data custodianship in the next section.

Data Custodians (contd.)

A single role or entity in the organization most familiar with a dataset's content and associated management criteria is generally best suited for data custodianship.

Many roles are suitable for custodianship, including:

  • Data Manager

  • Project Leader

  • Database Administrator

  • Geographic Information System Manager

  • IT Specialist

  • Application Developer

Let us now discuss the concepts of data quality in the next section.

Data Quality

Quality, as applied to data, has been defined as fitness to serve its purpose in a given context. When data is fit for its anticipated uses, such as in planning or decision making, it is said to be of high quality.

High-quality data is consistent, complete, and accurate. Many data quality principles apply when dealing with various types of data. These principles are involved at all stages of the data management process, from data collection to its final usage.

Data quality has to be maintained throughout the lifecycle of the data. Otherwise, loss of quality can directly impact data usage.

The different stages of data lifecycle are:

  • Data collection or capturing

  • Recording

  • Identification

  • Metadata recording

  • Storage and archiving of data

  • Presentation and dissemination of data

  • Analysis and manipulation of data

Data applicability and usefulness are greatly reduced if data quality is lost.

Let us discuss aspects of data quality in the following section.

Data Quality—Aspects

Major aspects of data quality standards are:

  • Reliability

  • Accuracy

  • Completeness

  • Precision

  • Consistency across data sources

  • Reproducibility

  • Resolution

  • Timeliness

  • Repeatability

  • Appropriate presentation

  • Currency

  • Relevance

  • Ability to audit

In the next section, we will continue the discussion on data quality and focus on Data Quality Control and Quality Assurance.

Data Quality Assurance and Quality Control

Quality Assurance, or QA, is defined as the assessment of quality based on standards external to the process. QA involves reviewing activities and quality control processes to ensure the final products meet predetermined standards of quality.

Data Quality Control, or QC, is defined as an assessment of data quality based on internal standards, processes, and procedures established to control and monitor quality. It is the process of discovering data inconsistencies and correcting them; QA is normally performed after QC.

While Quality Assurance maintains quality throughout all stages of data development, Quality Control monitors or evaluates the resulting data products.

Let us discuss Data Documentation in the following section.

Data Documentation

Documentation is key to good data quality. Identification and documentation of all datasets are very important, as this helps manage and use the data throughout its lifecycle. It also helps avoid duplication of data, which wastes effort and storage in the organization.

The objectives of Data Documentation are to:

  • Ensure data durability

  • Facilitate the re-use of data for multiple purposes

  • Facilitate user understanding of data requirements

  • Ensure data exchange

  • Facilitate the discovery of datasets

  • Facilitate dataset interoperability

Let us discuss Data Documentation Practices in the next section.

Data Documentation Practices

The following are widely accepted documentation practices for data entry into electronic systems:

Dataset titles and corresponding file names should be descriptive and may contain information such as the project title or name, type of data, location, and year. These datasets may be accessed in the future by people unaware of the details of the project or program. As a standard, file names must not exceed 64 characters.

Lowercase characters are preferred as they are more platform and software independent.
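The naming conventions above (descriptive names, lowercase preferred, at most 64 characters) can be enforced with a small validator. This is a sketch under stated assumptions: the exact allowed-character set below is an assumption, not something this tutorial prescribes:

```python
import re

MAX_NAME_LEN = 64  # limit stated in this section

def valid_dataset_filename(name: str) -> bool:
    """Check a dataset file name against the documented conventions:
    lowercase, no spaces, and no longer than 64 characters.
    The allowed-character set (letters, digits, '_', '-', '.') is an assumption."""
    return (len(name) <= MAX_NAME_LEN
            and re.fullmatch(r"[a-z0-9._-]+", name) is not None)

print(valid_dataset_filename("nutri_soil_samples_2016.csv"))  # True
print(valid_dataset_filename("Soil Samples 2016.CSV"))        # False (uppercase, spaces)
```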

To make data usage easier, the dataset contents have to be understood by users. This requires appropriate file content information, including the data file name, dataset title, author, date of creation, last-modified date, and companion file names. This can form part of the document header.

When the dataset is large and complex, additional information must be provided.

These include:

  • Parameters, which must be short, unique, and descriptive of the parameter contents

  • Coded fields with values defined for uniform use

  • Missing values, indicated uniformly by a blank or a designated code

  • Metadata for identification, quality, and other data attributes

Let us now discuss Data Standards in the following section.

Data Standards

Data has to be organized and managed according to defined protocols and rules. Data standards are documented agreements on the format, representation, definition, structuring, tagging, manipulation, transmission, use, and management of data. These standards become important when data and information are to be aggregated or shared.

The benefits of using data standards are:

  • Efficient data management

  • Enhanced data consistency

  • Efficient data updates and improvement in security

  • Increased data sharing

  • Improved documentation

  • Higher quality data

  • Increased data integration

  • Improved understanding of data

Let us discuss the Data Control Lifecycle in the next section.

Data Control Lifecycle

Carefully managing the entire data lifecycle is a best practice for data management.

In this, some of the activities required are:

  • Correct data specification and modeling

  • Database maintenance

  • Continuous data audit, which indicates the effectiveness of the existing data

  • Data storage and archiving, which helps in maintaining data with regular backups

  • Data security

Let us discuss each of these activities in the following sections, beginning with Data Specification and Modeling in the next.

Data Specification and Modeling

Efficient database planning requires first understanding user requirements and then performing data modeling. Databases must be designed to meet user requirements, from data acquisition to data entry, reporting, and long-term analysis. Data modeling is the methodology used to identify the path to meeting user requirements.

The project goals and objectives must be achieved keeping the data model and structure as simple as possible. The data model is created in the conceptual design phase of the information lifecycle.

Let us discuss Database Maintenance in the next section.

Database Maintenance

Database maintenance is an important activity in an organization. With the change in hardware, software, file formats, or media, datasets have to be migrated to new environments. For efficient data management, a well-defined procedure for updating the database must be created.

Versioning also plays a vital role in database management, especially in a multi-user environment. A good database management practice is ensuring daily system administration. Database administrators should also employ processes for threat management.

Let us focus on Data Audit in the next section.

Data Audit

A data audit refers to reviewing data to assess its quality or utility for a specific purpose. Data audits that monitor the continued effectiveness and use of existing data are part of data management best practices.

A data audit involves profiling the data and analyzing the data requirements of the organization. It also involves assigning levels of importance to the requirements identified. Data audits also involve identifying and analyzing gaps, duplications, inefficiencies, and assessing the impact of poor quality data on the organization's performance and profits.

Let us discuss Data Storage and Archiving in the next section.

Data Storage and Archiving

Data storage and archiving addresses those facets of data management which are related to the housing of data. Problems may arise if data storage and archiving are not planned and implemented carefully. The data can become outdated and possibly unusable as a result of inadequate management and storage.

Efficient data storage and archiving have many advantages. If primary copies and backups are corrupted, storage and archiving ensure data is maintained effectively. Periodic snapshots of data also allow rolling back to previous versions, if required.

Some important requirements for physical dataset storage and archiving for electronic or digital data are:

  • Appropriate understanding of the existing network infrastructure

  • Server software and hardware

  • Dataset size and format

  • Database maintenance

  • Database updating

  • Backup and recovery

Let us focus on Data Security in the next section.

Data Security

Data security means protecting data from harmful entities and unauthorized users. Database security involves safeguarding the confidentiality, integrity, and availability of data.

Addressing security concerns requires systems, policies, and processes to protect a database from unintended activities. Security must be implemented using a Defense in Depth, or layered, approach. Several controls can be used, such as data encryption, backups, incident response, disaster recovery, clustering, and others.

The security controls implemented must be regularly tested to check for effectiveness. Weaknesses or gaps identified have to be mitigated. A security practitioner can also apply risk management principles to maintain acceptable levels of risk.

Let us discuss Data Access, Sharing, and Dissemination in the next section.

Data Access, Sharing, and Dissemination

Data and information must be readily available to those who are granted access privileges. As discussed earlier, the data owner takes decisions regarding access to data.

Some of the issues related to data access and sharing are policy and data ownership issues, liability issues, and legal or jurisdictional issues unique to the geography.

A security practitioner should also weigh the cost of providing access to data against the cost of sharing data. Other important aspects include understanding the format of data required by the end user, as well as user needs and privileges.

Security considerations are also important when dealing with issues related to data access and sharing. Organizations need appropriate policies in place to address data security and protect sensitive information.

Let us discuss the concept of Data Publishing in the next section.

Data Publishing

Data management solution implementation requires addressing the need for data publishing and access.

Attention to detail helps ensure that the published data makes sense, and the people accessing the data find it usable. These details include providing descriptive data headings, legends, metadata or documentation, and checking for inconsistencies. Documentation helps users to better understand the data contents.

Let us discuss data handling requirements in the next section.

Data Handling Requirements

Data handling encompasses three activities: data or information asset handling, storage media handling, and records retention. Data handling requirements include marking, storing, handling and destroying sensitive information.

The best practice for information handling requires all information assets to be clearly marked and labeled. Information classification helps in the proper handling of information assets.

Media storing sensitive information requires both physical and logical controls. These controls include marking, storing, and handling based on the information classification, which provides methods for the secure handling of sensitive media.

Organizations must have policies in place regarding the marking and labeling of media. Storage media should have a physical label identifying the sensitivity of the information contained.

Sensitive media should only be handled by designated personnel, and sensitive information must be securely stored to prevent any unauthorized access.

The organization should also devise policies for records retention. These policies indicate how long the information and data are to be retained by the organization. Information must only be retained as long as it is required by the organization. Retention policies must also take into account legal and regulatory requirements.

Media Resource Protection

Media resource protection can be classified into two areas:

  1. Media security controls

  2. Media viability controls

Media Security Controls

Media security controls are implemented to prevent any threat to confidentiality, integrity, and availability (CIA) caused by the intentional or unintentional exposure of sensitive data. They prevent the loss of sensitive information when the media is stored outside the system.

The elements of media security controls are:


The use of media provides accountability. Logging also assists in physical inventory control by preventing tapes from getting lost and by facilitating their recovery process.

Access Control

Physical access control to the media is used to prevent unauthorized personnel from accessing the media. This procedure is also a part of physical inventory control.

Proper Disposal

Proper disposal of the media after use is required to prevent data remanence. The process of removing information from used data media is called sanitization. Sanitization can be done by overwriting, degaussing, and destruction.

Wiping or Overwriting

Wiping, also called overwriting, writes new data over each bit or block of file data. One of the shortcomings of wiping is that physical damage to a hard disk prevents complete overwriting.
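A minimal sketch of single-file overwriting follows, assuming a regular file on storage where in-place writes actually reach the same physical blocks. As the shortcoming above suggests, this is illustrative only: wear leveling on SSDs and copy-on-write or journaling file systems can leave remnant copies that an overwrite never touches:

```python
import os

def overwrite_file(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents in place with random bytes, then delete it.
    Illustrative sketch only: on SSDs or copy-on-write file systems,
    remnant copies may survive, so overwriting alone is not sufficient."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # random data, one full pass
            f.flush()
            os.fsync(f.fileno())       # push the pass to the device
    os.remove(path)
```

The `fsync` call after each pass asks the operating system to flush the written data to the device rather than leaving it in cache.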


Degaussing

By introducing an external magnetic field with a degausser, the data on magnetic storage media can be made unrecoverable. A degausser destroys the integrity of the magnetization, making the data unrecoverable.

Physical Destruction

Physical destruction, when carried out properly, is considered the most secure means of media sanitization. Common means of destruction include incineration and pulverization.


Shredding

Shredding refers to the process of making data printed on hard copy, or stored on smaller objects such as floppy or optical disks, unrecoverable.

Media Viability Controls

Media viability controls are implemented to preserve the proper working state of the media, particularly to facilitate timely and accurate restoration of the system after a failure.

Many physical controls should be used to protect the viability of the data storage media. The goal is to protect the media from damage during handling and transportation, during short-term or long-term storage. Proper marking and labeling of the media are required in the event of a system recovery process.

The elements of media viability controls are marking, handling, and storage.

Marking

All data storage media should be accurately marked.

The labels can be used to identify media with special handling instructions or to log serial numbers or barcodes for retrieval during system recovery. It is important not to confuse this physical storage media marking for inventory control with the logical data labeling of sensitivity classification for mandatory access control.


Handling

Proper handling of the media is important. Handling issues include keeping the media clean and protecting it from physical damage during transportation to the archive sites.


Storage

The storage of the media is very important for both security and environmental reasons. A clean storage environment, free from excess heat and humidity, should be provided for the media. Data media are sensitive to temperature, liquids, magnetism, smoke, and dust.

Let us discuss the concept of Data Remanence in the next section.

Data Remanence

Data remanence, an important aspect of data security, is the residual representation of digital data that remains even after attempts to erase or remove the data have been made. Security practitioners must be familiar with the different technologies employed in storage devices to deal with issues of data remanence.

For example, when you format a hard disk drive or HDD, the data appears to be erased, but it can often be retrieved using data recovery tools.

Some of the countermeasures for dealing with data remanence are:

  • Purging, which involves permanent removal of sensitive information from the memory or storage device.

  • Clearing involves removal of sensitive information from a storage device so that reconstructing the data requires special recovery software or tools.

  • Destruction refers to physically destroying the storage device so that the data cannot be recovered from it.

  • Overwriting, one of the common methods employed to counter data remanence, involves overwriting the data on the storage device several times so that the original data cannot be reconstructed.

  • Degaussing is a technique used for destroying data on magnetic storage media. It uses a box-like device known as a degausser, which works by changing the magnetic field on the media, effectively destroying the data. Degaussing can also be used to erase the contents of a magnetic hard drive or floppy disk; it is not effective on flash-based media such as USB thumb drives or smartphone storage.

  • Storing data on media by encrypting it before storage is an effective countermeasure against data remanence. If the encryption is strong and the encryption keys are kept secret, it is difficult to get unauthorized disclosure of information from the media.
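The overwriting countermeasure described above can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not certified sanitization tooling; the file name and pass count are invented, and on SSDs wear leveling may leave remnant copies, which is one reason purging or destruction is preferred for flash media.

```python
import os

def overwrite_file(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents in place with random bytes,
    then remove it. Illustrative only -- not a substitute for
    certified media-sanitization tools."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # push each pass down to the device
    os.remove(path)

# Example: create a throwaway file, then sanitize it
with open("secret.tmp", "wb") as f:
    f.write(b"PII: 123-45-6789")
overwrite_file("secret.tmp")
print(os.path.exists("secret.tmp"))  # False
```

Note that this only addresses the file's primary copy; backups, swap space, and filesystem journals can still retain remnants, which is why overwriting is combined with the other countermeasures above.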

Let us look at a business scenario in the next section to better understand data protection.

Business Scenario

With the rapid expansion in the collection and storage of digitized personal information of customers at Nutri Worldwide Inc., the issue of privacy has gained significance.

As the General Manager of IT security, Hilda Jacobs is concerned as there are very stringent legal and regulatory requirements for the protection of privacy and data. She decides to implement a data management process in the organization.

Question: What is the first step Hilda must undertake to kick-start the data management process at Nutri Worldwide Inc.?

Answer: As the first step in the data management process, Hilda has to create a data policy.

Let us discuss Asset Management in the next section.

Asset Management

To understand the concept of asset management, it is important to first look at inventory and configuration management.

Inventory Management

Inventory management involves capturing details about the assets, their location, and owners. IT assets can be both hardware and software. IT Asset Management or ITAM combines financial, inventory and contractual functions to support lifecycle management of IT assets and strategic decision making for the IT environment.

Configuration Management

Configuration management is the practice of systematically handling changes in a way that ensures the integrity of the asset or system over time.

It can be implemented through appropriate policies, procedures, techniques, and tools. These are used to manage and evaluate proposed changes, track the status of proposed changes, and maintain an asset or system inventory and supporting documentation as the system changes.

A Configuration Management Database or CMDB is a database containing all relevant information on the information system components used in an organization's IT services and the relationship between those components.
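The core idea of a CMDB, components plus the relationships between them, can be illustrated with a toy in-memory structure. All names below are invented for illustration; real CMDBs in ITSM products are far richer.

```python
# Toy CMDB: configuration items (CIs) and their relationships.
# Component names, owners, and relationship types are invented.
cmdb = {
    "web-server-01": {"type": "server", "owner": "IT Ops"},
    "payroll-app":   {"type": "application", "owner": "Finance"},
    "oracle-db-01":  {"type": "database", "owner": "DBA team"},
}
relationships = [
    ("payroll-app", "runs_on", "web-server-01"),
    ("payroll-app", "depends_on", "oracle-db-01"),
]

def impacted_by(ci: str) -> list:
    """Which components directly depend on a given CI?
    Useful when evaluating a proposed change to that CI."""
    return [src for src, _, dst in relationships if dst == ci]

print(impacted_by("oracle-db-01"))  # ['payroll-app']
```

Capturing the relationships, not just the inventory, is what lets configuration management answer impact questions when a change is proposed.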

Let us discuss Software Licensing in the next section.

Software Licensing

An important IT asset in any organization is licensed software. Software has to be protected from malicious users who may create illegal copies, resulting in copyright infringement. To prevent this, the organization must secure the original copies of the licensed software.

The organization must also take steps to prevent users from creating and installing illegal copies of software. IT administrators must identify unauthorized software installations on the company’s network. Licenses must be managed properly and must not exceed the permitted limits. A software or media librarian must be responsible for controlling media and software assets.
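The two license checks above, spotting unauthorized installations and staying within permitted limits, can be sketched as simple set and count comparisons. All product names and counts here are invented for illustration.

```python
# Hedged sketch of a license audit: compare software discovered
# on the network against the approved list, and installed counts
# against licenses owned. All data below is hypothetical.
approved = {"office-suite", "antivirus", "vpn-client"}
licenses_owned = {"office-suite": 50}

discovered = {"office-suite", "antivirus", "torrent-client"}
installs_seen = {"office-suite": 63}

# Software present on hosts but never authorized
unauthorized = discovered - approved

# Software deployed beyond the permitted license count
over_deployed = {name for name, count in installs_seen.items()
                 if count > licenses_owned.get(name, 0)}

print(sorted(unauthorized))   # ['torrent-client']
print(sorted(over_deployed))  # ['office-suite']
```

In practice the "discovered" data would come from an endpoint inventory agent, which is why software licensing sits naturally alongside inventory management.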

Let us discuss the Equipment Lifecycle in the next section.

Equipment Lifecycle

All IT equipment in an organization has a finite useful life. A security practitioner must carry out appropriate security activities throughout the lifecycle of the IT equipment.

These activities are:

Define security requirements

This involves ensuring security specifications are considered while acquiring or developing IT assets and that appropriate funds are allocated for the security function.

Acquire and Implement

Additional security features must be acquired and implemented if required.

Operations and Maintenance

The security practitioner must ensure operations and maintenance of security features. He or she must ensure that the security features are operational throughout the asset’s lifecycle. Any vulnerability identified for the IT asset must be mitigated.

Disposal and Decommissioning

Finally, the security practitioner must ensure secure disposal and decommission of the IT asset once it reaches the end of its life.

Let us discuss protection of privacy in the next section.

Protecting Privacy

Protecting privacy means safeguarding the confidentiality of personal information. Worldwide, laws for the protection of privacy have been adopted.

The laws on privacy date back to 1361, when the Justices of the Peace Act was enacted in England. Since then, many acts, laws, and regulations have been enacted globally.

A security professional must be aware of privacy requirements for compliance with laws and regulations.

Let us discuss important factors regarding personal information that a security practitioner must be aware of.

One of the fundamental requirements states that personal information must be obtained fairly and legitimately. It must be used only for the originally specified purpose and not for any other.

The information collected must be relevant, adequate, accurate, and up to date. The information must be accessible to the subject, kept secure, and destroyed after its purpose is completed.

As the European Union and the United States have different privacy laws, American companies have found it difficult to do business in Europe. The U.S. Department of Commerce, in consultation with the European Commission, developed a "safe harbor" framework.

It was created to:

  • Bridge the differences between U.S. privacy laws and EU Council Directive on Data Protection

  • Provide a streamlined and cost-effective means for U.S. organizations to satisfy the Directive’s “adequacy” requirement.

Let us discuss Data Retention in the next section.

Ensuring Appropriate Retention

Every organization has different types of data; each type has a different set of requirements.

A holistic data retention strategy requires:

  • The involvement of all stakeholders,

  • Setting up of appropriate accountability and responsibilities,

  • Monitoring, reviewing and updating existing data retention policies.

The steps to build a sound retention policy are:

  • Evaluate regulatory requirements, business needs, and legal obligations

  • Classify assets based on their value to the organization

  • Determine asset retention periods and destruction practices

  • Create record retention policy

  • Train the staff on retention policy requirements

  • Regularly audit practices for record retention and destruction

  • Review the retention policy periodically

  • As a best practice, maintain documentation of policy, implementation, training, and audits
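The "determine retention periods" step above often reduces to a simple schedule lookup. The sketch below illustrates the mechanics; the record classes and periods are hypothetical, since real values come out of the regulatory, business, and legal evaluation in the first step.

```python
from datetime import date, timedelta

# Illustrative retention schedule -- class names and periods are
# invented; actual values are set by regulatory and legal review.
RETENTION_DAYS = {
    "financial-record": 7 * 365,
    "email": 2 * 365,
    "system-log": 90,
}

def destruction_date(record_class: str, created: date) -> date:
    """Earliest date a record may be destroyed under the policy."""
    return created + timedelta(days=RETENTION_DAYS[record_class])

print(destruction_date("system-log", date(2021, 1, 1)))  # 2021-04-01
```

A real implementation would also suspend destruction for records under legal hold, which is one of the legal obligations the policy must account for.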

Let us discuss Data Security Controls in the next section.

Data Security Controls

There are different security controls for stored data, or data at rest, and for data moving over the network, or data in transit.

Data at Rest

The protection of data at rest or stored data is a fundamental requirement for an organization.

Sensitive information stored on backup tapes, off-site storage, password files, and other types of data storage has to be protected from disclosure or undetected alteration. This can be achieved by implementing security controls such as encryption, hashing, compressing, use of strong passwords, labeling, marking, storage, and documentation.

Examples of encryption tools are self-encrypting USB drives and file and media encryption software.
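Hashing as an integrity control for data at rest can be shown with Python's standard `hashlib` module: record a digest when the data is stored, recompute it later, and any undetected alteration becomes detectable. The sample data is invented.

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Hex digest of the SHA-256 hash of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# Record a baseline digest when the data is stored
stored = b"quarterly backup contents"
baseline = sha256_of(stored)

# Later: recompute and compare to detect alteration
tampered = stored + b"!"
print(sha256_of(stored) == baseline)    # True
print(sha256_of(tampered) == baseline)  # False
```

Note that hashing detects alteration but does not prevent disclosure; for confidentiality it is paired with encryption, as the section describes.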

Data in Transit

Another important requirement is to protect sensitive information moving over the network, also known as data in transit. This information can be protected using security controls including cryptographic functions such as encryption, hashing, and others.

Encryption can be done in the following ways:

End-to-End Encryption

In this type of communication, the data is encrypted, but the routing information remains visible. It is generally used by end users within the organization: the data is encrypted at the sender’s end and decrypted only at the receiver’s end.

Link Encryption

In this type of communication, also known as tunneling, the data, as well as the routing information, is encrypted.
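The contrast between the two modes can be made concrete with a toy example. The XOR "cipher" and message format below are purely illustrative, not real cryptography: the point is only that end-to-end encryption leaves the routing header readable while link encryption hides it too.

```python
# Toy contrast between end-to-end and link encryption.
# XOR with a fixed key is NOT real cryptography -- it is used
# here only to make the two modes visible and deterministic.
def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"example-key-0123"          # fixed key for a repeatable demo
header, payload = b"TO:10.0.0.5|", b"secret report"

end_to_end = header + xor(payload, key)  # routing info stays visible
link = xor(header + payload, key)        # routing info hidden as well

print(end_to_end.startswith(b"TO:"))  # True
print(link.startswith(b"TO:"))        # False
```

This visibility of routing information is exactly why end-to-end encryption still leaks traffic patterns, while link encryption conceals them at the cost of decrypting at each hop.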

Let us discuss best practices for securing data in transit in the next section.

Data in Transit—Best Practices

Some of the best practices for securing data in transit include the following:

  • Use of Secure Sockets Layer or SSL, or its successor Transport Layer Security or TLS

  • Message encryption before transmitting emails

  • Use of Pretty Good Privacy or PGP and Secure/Multipurpose Internet Mail Extensions or S/MIME for email encryption

  • Using end-to-end encryption for intranet communication

  • Internet Protocol Security or IPsec is recommended for secure Virtual Private Network connectivity

  • Secure Shell or SSH for the administration of network devices

  • Wireless networking secured with Wi-Fi Protected Access 2 or WPA2, the current security standard in wireless networking

  • Using a baseline, which is the minimum set of security controls implemented to protect the IT systems in the organization. The baseline helps maintain the security posture of the organization and must be documented.
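The TLS recommendation above can be illustrated with Python's standard `ssl` module, which applies secure defaults, certificate verification and hostname checking, and lets you set a floor on the protocol version. This is a configuration sketch, not a complete client.

```python
import ssl

# Build a client-side TLS context with secure defaults.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols

# create_default_context() enables certificate verification and
# hostname checking, two of the settings that make TLS effective.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```

Such a context would then be passed to a socket or HTTPS client; the key point is to rely on the verified defaults rather than disabling checks for convenience.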

Let us discuss the concepts of scoping and tailoring in the next two sections.

Scoping and Tailoring

Let us first understand Scoping.

Scoping ensures an adequate level of protection by identifying the security requirements based on the organization’s mission and business processes as supported by the information system.

Scoping guidance is a method which provides an enterprise with specific terms and conditions on the applicability and implementation of individual security controls.

Many considerations can potentially impact how baseline security controls are applied by the enterprise. System security plans must clearly define which security controls employ scoping guidance, and include a description of the considerations taken into account.

For an information system, the authorizing official must review and approve the applied scoping guidance.

Let us discuss the concept of tailoring in the next section.

Scoping and Tailoring (contd.)

Let us now discuss the concept of tailoring. The tailoring process involves the customization of the initial security control baseline. The baseline is adjusted to align security control requirements more closely with the actual information system and/or operating environment.

Tailoring uses the following mechanisms.

  • Scoping guidance, which defines specific terms and conditions on the applicability and implementation of specific security controls.

  • Compensating security controls, which includes management, operational, and technical controls implemented instead of the security controls identified in the initial baseline.

  • Organization-defined parameters, which are applied to portions of security controls to support specific organizational requirements and objectives.

The security practitioner must understand the impact of scoping and tailoring on information security.

Let us discuss some important standards in the next few sections.

Standards Selection—US DoD

A security professional must be aware of different security standards available and the entities or organizations responsible for them.

U.S. Department of Defense or DoD policies include:

United States National Security Agency or NSA Information Assurance or IA Mitigation Guidance provides guidance on Information Assurance security solutions so that customers can benefit from NSA’s unique and deep understanding of risks, vulnerabilities, mitigations, and threats.

Department of Defense Instruction or DoDI 8510.01 establishes the Defense Information Assurance Certification and Accreditation Process or DIACAP for authorizing the operation of DoD Information Systems, for managing the implementation of IA capabilities and services, and for providing visibility of accreditation decisions regarding the operation of DoD Information Systems.

The National Institute of Standards and Technology or NIST Computer Security Division focuses on providing measurements and standards to protect information systems.

NIST Publications include:

Federal Information Processing Standards or FIPS provide official guidance on topics such as minimum security requirements, standards for security categorization for federal information and information systems, personal identity verification and digital signature standards, among others.

Special Publications or the SP 800 Series provides documents of general interest to the computer security community, including research reports and guidelines. Some of the publications are SP 800-37, Guide for Applying the Risk Management Framework to Federal Information Systems; SP 800-53, Security and Privacy Controls for Federal Information Systems and Organizations; and SP 800-60, Guide for Mapping Types of Information and Information Systems to Security Categories.

Let us continue to discuss standards in the next section.

Standards Selection—International Standards

From NIST we have:

Risk Management Framework which provides an effective framework for selecting the appropriate security controls for an information system.

National Checklist Program or NCP which provides detailed low-level guidance on setting the security configuration of operating systems and applications.

International standards include:

  • ‘Cyber Security Strategy of the European Union’ represents the EU’s comprehensive vision on how best to prevent and respond to cyber disruptions and incidents.

  • ‘10 Steps to Cyber Security’ offers practical actions organizational leaders can direct to improve the protection of networks and the information carried by them.

  • ‘National Cyber Security Strategies: An Implementation Guide’ was developed by the European Network and Information Security Agency or ENISA and introduces a set of concrete actions which, if implemented, will lead to a coherent and holistic national cybersecurity strategy.

  • International Organization for Standardization or ISO standards include:

    • ISO/IEC 27001

    • ISO/IEC 27002

  • International Telecommunication Union Telecommunication Standardization Sector or ITU-T standards include:

    • Recommendations X.800–X.849 define a security baseline against which network operators can assess their network and information security status.

    • Recommendation X.1205 provides a definition of cybersecurity and a taxonomy of security threats from an organizational point of view.

Let us discuss the National Cyber Security Framework Manual in the next section.

Standards Selection—National Cyber Security Framework Manual

This framework, from the NATO Cooperative Cyber Defence Centre of Excellence, gives detailed background information and in-depth theoretical frameworks to help the reader understand the various facets of National Cyber Security, according to different levels of public policy formulation.

The four levels of government are:

  • Political

  • Strategic

  • Operational

  • Tactical/Technical

Each has its own perspective on National Cyber Security, and each is addressed in an individual section within the Manual.

Additionally, the Manual gives examples of relevant institutions in National Cyber Security, from top-level policy coordination bodies down to cyber crisis management structures and similar institutions.

Let us discuss the Center for Strategic and International Studies in the next section.

Standards Selection—Center for Strategic and International Studies

The Center for Strategic and International Studies or CSIS 20 Critical Security Controls initiative provides a unified list of twenty critical controls identified through a consensus of federal and private industry security professionals as the most critical security issues seen in the industry.

The CSIS team includes officials from the NSA, U.S. Computer Emergency Readiness Team or US-CERT (Read as US-Cert), DoD JTF-GNO or Department of Defense Joint Task Force on Global Network Operations, the Department of Energy Nuclear Laboratories, Department of State, DoD Cyber Crime Center, and the private sector.

The five “critical tenets” of the CSIS initiative, as listed on the SANS website, are: Offense Informs Defense, Prioritization, Metrics, Continuous Monitoring, and Automation.

Let us discuss Critical Security Controls in the following section.

Standards Selection—Critical Security Controls

The current version of the Critical Security Controls, Version 5, includes:

  • Inventory of Authorized and Unauthorized Devices

  • Inventory of Authorized and Unauthorized Software

  • Secure Configurations for Hardware and Software on Mobile Devices, Laptops, Workstations, and Servers

  • Continuous Vulnerability Assessment and Remediation

  • Malware Defense

  • Application Software Security

  • Wireless Access Control

  • Data Recovery Capability

  • Security Skills Assessment and Appropriate Training to Fill Gaps

  • Secure Configurations for Network Devices such as Firewalls, Routers, and Switches

  • Limitation and Control of Network Ports, Protocols, and Services

  • Controlled Use of Administrative Privileges

  • Boundary Defense

  • Maintenance, Monitoring, and Analysis of Audit Logs

  • Controlled Access Based on the Need to Know

  • Account Monitoring and Control

  • Data Protection

  • Incident Response and Management

  • Secure Network Engineering

  • Penetration Tests and Red Team Exercises

Let us discuss the Security Content Automation Protocol in the next section.

Standards Selection—Security Content Automation Protocol

The Security Content Automation Protocol or SCAP, developed by NIST, is a suite of specifications that standardize the format and nomenclature by which information on software flaws and security configurations is communicated, both to machines and to humans.

SCAP version 1.2 comprises eleven component specifications in the following five categories:

  • Languages provide standard vocabularies and conventions for expressing security policy, technical check mechanisms, and assessment results.

  • Reporting formats provide the necessary constructs to express collected information in standardized formats.

  • An enumeration defines a standard nomenclature and an official dictionary or list of items expressed using that nomenclature.

  • Measurement and scoring systems refer to evaluating specific characteristics of a security weakness, such as software vulnerabilities and security configuration issues, and based on those characteristics generating a score that reflects their relative severity.

  • Integrity helps to preserve the integrity of SCAP content and results.

Let us discuss the Framework for Improving Critical Infrastructure Cybersecurity in the next section.

Framework for Improving Critical Infrastructure Cybersecurity

This framework, released by NIST, was created through collaboration between industry and government and consists of standards, guidelines, and practices to promote the protection of critical infrastructure.

The prioritized, flexible, repeatable, and cost-effective approach to the Framework helps owners and operators of critical infrastructure to manage cybersecurity-related risks.

The Framework is a risk-based approach to managing cybersecurity risk and is composed of the following three parts:

  • The Framework Core is a set of cybersecurity activities, desired outcomes, and applicable references that are common across critical infrastructure sectors.

  • The Framework Implementation Tiers provide context on how an organization views cybersecurity risk and the processes in place to manage it.

  • The Framework Profiles represent the outcomes based on business needs that an organization has selected from the Framework Categories and Subcategories.

Let us look at a business scenario in the next section.

Business Scenario

Hilda Jacobs, General Manager – IT Security at Nutri Worldwide Inc., was given the responsibility of selecting appropriate data security controls as part of asset security.

Hilda selected the controls according to the organization's different requirements for the data at rest and data in transit based on the existing risk. She also created a best practices document by referring to available standards for data security.

Question: For implementing Information Security Management System, Hilda Jacobs should refer to which standard?

Answer: ISO 27001



Let us summarize the topics covered in this lesson:

  • Asset security covers different requirements including the concepts, principles, and standards to secure assets.

  • It addresses the collection, handling, processing, and securing of information throughout the IT lifecycle.

  • It highlights the use of various controls to provide different levels of confidentiality, integrity, and availability of all IT services throughout the organization.

  • Security practitioners must understand and implement security controls for both data at rest and data in transit.

  • Security professionals have to be familiar with leading security standards and the bodies responsible for them.


This concludes the domain ‘Asset Security.’ The next lesson will focus on the domain ‘Security Engineering.’
