Sensitive Data Exposure (Fuzzing)

Fuzzing is a technique in which invalid, random, or unexpected data is fed to a system to either produce unexpected states or gain access to hidden features.
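
As a minimal sketch of the idea, the snippet below randomly mutates a valid JSON document and feeds each variant to Python's standard JSON parser, recording any exception the parser is not documented to raise. The seed document, the byte-flipping mutation strategy, and the function names are illustrative, not taken from any particular fuzzing tool.

```python
import json
import random

def mutate(data: bytes, n_flips: int = 3) -> bytes:
    """Flip a few random bytes to create an invalid variant of the input."""
    buf = bytearray(data)
    for _ in range(n_flips):
        buf[random.randrange(len(buf))] = random.randrange(256)
    return bytes(buf)

def fuzz_json_parser(seed: bytes, iterations: int = 1000) -> list:
    """Feed mutated inputs to json.loads and collect unexpected exceptions."""
    unexpected = []
    for _ in range(iterations):
        sample = mutate(seed)
        try:
            json.loads(sample)
        except ValueError:
            pass  # expected: the parser rejects invalid input cleanly
        except Exception as exc:  # anything else hints at a parser bug
            unexpected.append((sample, exc))
    return unexpected

crashes = fuzz_json_parser(b'{"user": "alice", "admin": false}')
print(f"unexpected exceptions: {len(crashes)}")
```

Real fuzzers (AFL, libFuzzer, and similar) add coverage feedback and smarter mutation, but the loop above captures the core technique: generate invalid input, observe how the target misbehaves.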

Security Assessment

CVSS Vector: AV:N/AC:L/PR:N/UI:N/S:U/C:L/I:N/A:N

Vulnerability Information

Sensitive data exposure is associated with how teams handle security controls for certain information. Missing or poor encryption is one of the most common vulnerabilities that lead to the exposure of sensitive data. Cybercriminals typically leverage sensitive data exposure to get a hold of passwords, cryptographic keys, tokens, and other information they can use for system compromise. Some commonly known flaws that lead to the exposure of sensitive data include:

Lack of SSL/HTTPS Security on Websites

As web applications gain mainstream use in modern enterprises, it is important to keep users and visitors protected. SSL certificates are used to encrypt data between websites/applications and web servers. Organizations with misconfigured SSL/HTTPS security risk compromising their users' privacy and data integrity, since unencrypted traffic can easily be intercepted in transit.
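
From the client side, a misconfigured server is often visible as a failed TLS handshake. The sketch below uses Python's default SSL context, which verifies the certificate chain and hostname, so an expired or self-signed certificate or a hostname mismatch raises an error instead of silently connecting; the `check_tls` helper is a hypothetical name for illustration.

```python
import socket
import ssl

def check_tls(hostname: str, port: int = 443) -> dict:
    """Connect with full certificate and hostname verification enabled."""
    context = ssl.create_default_context()  # verifies chain and hostname
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return {
                "protocol": tls.version(),   # e.g. "TLSv1.3"
                "cipher": tls.cipher()[0],
            }

# check_tls("example.com") raises ssl.SSLCertVerificationError
# when the server's certificate cannot be validated.
```

Never disable verification (`CERT_NONE`) to "fix" such errors; that reintroduces exactly the interception risk described above.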

SQL Injection Vulnerabilities in Databases

Without proper security controls, attackers can inject malicious SQL statements to retrieve the contents of a database. Crafted statements let them perform a wide variety of database administration actions. Hackers can retrieve sensitive information, such as user credentials or application configuration details, which they then use to penetrate and compromise the system further.
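
The difference between vulnerable and safe query construction fits in a few lines. The sketch below uses an in-memory SQLite database purely for illustration; the same pattern applies to any SQL driver that supports parameterized queries.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password_hash TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'hash1')")

def find_user_unsafe(name: str):
    # VULNERABLE: attacker-controlled input is concatenated into the SQL.
    # name = "' OR '1'='1" turns the WHERE clause into a tautology.
    return conn.execute(f"SELECT * FROM users WHERE name = '{name}'").fetchall()

def find_user_safe(name: str):
    # Parameterized query: the driver treats the value as data, never as SQL.
    return conn.execute("SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # dumps every row in the table
print(find_user_safe("' OR '1'='1"))    # [] - no user has that literal name
```

Parameterized queries (or an ORM that uses them internally) are the standard defense; input sanitization alone is brittle.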

Guides

The following section outlines the best practices and tools that can be used to prevent sensitive data exposure. 

  1. Identify and Classify Sensitive Data 

    It is important to determine which data is sensitive and therefore requires extra security controls. This data should then be classified by level of sensitivity and secured with the appropriate security controls.
  2. Apply Access Controls

    Security teams should focus their energy on authentication, authorization, and session management by provisioning a robust Identity and Access Management (IAM) mechanism. With the right access controls in place, organizations can ensure that only the intended individuals can view and modify sensitive data.
  3. Perform Proper Data Encryption with Strong, Updated Protocols

    Sensitive data should never be stored in plain text. It is important to ensure that user credentials and other personal information are protected using modern cryptographic algorithms that address the latest security vulnerabilities. 
  4. Store Passwords Using Strong, Adaptive, and Salted Hashing Functions

    Given the advancement of security controls, attackers have also devised clever ways to retrieve passwords. For instance, a hacker can use a rainbow table of precomputed hashes to crack a password file that uses unsalted hashes. Salted hashes enhance password security by adding a random input to the hash function, producing a unique output for each password, and are thus recommended over unsalted hashes.
  5. Disable Caching and Autocomplete on Data Collection Forms

    While caching and autocomplete features help improve the user experience, they carry security risks that attackers can exploit. Hackers may rely on a user's browser to log in to an account easily, since the autocomplete feature fills in stored credentials.

    Caching stores sections of web pages for faster loading on subsequent visits, which allows attackers to use it to map out a user's movements. Attackers also use cached data to tailor malware. As a best practice, caching and autocomplete on forms should be disabled by default and only activated as needed.

  6. Minimize Data Surface Area

    Security teams should reduce the system's data attack surface through careful API design, ensuring that only the bare minimum amount of data is included in server responses. While doing so, it must also be ensured that server responses do not expose information about the system's configuration. Random testing and data filtering should also be performed on the server side to reduce the risk of attackers intercepting sensitive data in unfiltered traffic in transit.
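
Step 4 above can be sketched with Python's standard library alone. PBKDF2-HMAC-SHA256 is a salted, adaptive derivation function: the iteration count can be raised as hardware improves (600,000 reflects recent OWASP guidance for this algorithm; adjust to your threat model). The function names are illustrative.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt=None):
    """Derive a salted hash with PBKDF2-HMAC-SHA256."""
    if salt is None:
        salt = os.urandom(16)  # unique random salt per password
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Memory-hard alternatives such as Argon2 or bcrypt (available as third-party packages) are also widely recommended for password storage.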
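
For step 5, the controls are a response header and a form attribute rather than application logic. The header and attribute values below are standard HTTP and HTML; the surrounding Python names are illustrative and framework-agnostic.

```python
# Headers to attach to any response carrying sensitive form data
# (illustrative constant names; the header values are standard HTTP).
NO_STORE_HEADERS = {
    "Cache-Control": "no-store",  # forbid caching the response anywhere
    "Pragma": "no-cache",         # legacy HTTP/1.0 clients
}

# Credential form with autocomplete disabled (standard HTML attributes).
LOGIN_FORM = """
<form method="post" action="/login" autocomplete="off">
  <input type="text" name="user" autocomplete="off">
  <input type="password" name="pass" autocomplete="new-password">
  <button type="submit">Sign in</button>
</form>
"""
```

`Cache-Control: no-store` prevents intermediaries and the browser from retaining the page, and `autocomplete="off"` (or `"new-password"` on password fields) asks the browser not to fill in stored credentials.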
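
Step 6 largely boils down to allowlisting response fields on the server rather than trusting clients to ignore extras. A minimal sketch, with a hypothetical user record:

```python
# Hypothetical server-side user record (illustrative field names).
user_record = {
    "id": 42,
    "name": "alice",
    "email": "alice@example.com",
    "password_hash": "redacted",   # must never leave the server
    "internal_flags": {"beta": True},
}

PUBLIC_FIELDS = {"id", "name"}  # the bare minimum the client needs

def to_response(record: dict) -> dict:
    """Keep only explicitly allowlisted fields in the API response."""
    return {k: v for k, v in record.items() if k in PUBLIC_FIELDS}

print(to_response(user_record))  # {'id': 42, 'name': 'alice'}
```

An allowlist fails safe: a new sensitive field added to the record stays out of responses until someone deliberately exposes it, whereas a denylist leaks anything nobody remembered to block.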

For more information about Crashtest Security visit crashtest-security.com