Monitoring

Continuous, automated testing of your security controls


The Monitoring page provides a single place to view the status of your security controls.

Prerequisite

Admins, Information Security Leads, DevOps Engineers, Workspace Managers, and Control Managers can access this section within Drata.

Access Monitoring page

Select Monitoring from the left navigation menu to access this page.

At the top of the Monitoring page, you can view an overall summary that includes the percentage of tests passed, the number of failed tests, and the number of passed tests.

You can also filter your tests by test result, category, test type, test status, ticket status, and whether the test has exclusions.
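To make the summary figures and filters concrete, here is a minimal sketch in Python. The record fields and test names are illustrative only, not Drata's actual schema or API:

```python
# Hypothetical test records; field names and values are illustrative, not Drata's schema.
tests = [
    {"name": "MFA on Identity Provider", "result": "Passed", "category": "Identity Provider", "status": "Enabled"},
    {"name": "Load Balancer Used", "result": "Failed", "category": "Infrastructure", "status": "Enabled"},
    {"name": "Policies Acknowledged", "result": "Error", "category": "Policy", "status": "Enabled"},
]

# Overall summary, as shown at the top of the Monitoring page.
passed = sum(1 for t in tests if t["result"] == "Passed")
failed = sum(1 for t in tests if t["result"] == "Failed")
percent_passed = round(100 * passed / len(tests))
print(f"{percent_passed}% of tests passed ({passed} passed, {failed} failed)")

# Filtering mirrors the page's filter controls: each argument narrows the list.
def filter_tests(tests, result=None, category=None, status=None):
    return [
        t for t in tests
        if (result is None or t["result"] == result)
        and (category is None or t["category"] == category)
        and (status is None or t["status"] == status)
    ]

print(filter_tests(tests, result="Failed"))
```

Note that a test with an Error result counts as neither passed nor failed in the summary, which is why the passed and failed counts need not sum to the total.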

Test Result filter description

The following list describes the filter for test results.

  • Failed: Drata has captured all necessary data and determined the required conditions are not being met.

  • Passed: Drata has captured all necessary data and determined the required conditions are being met.

  • Error: A blocker is preventing Drata from capturing all necessary data, so Drata cannot determine whether the required conditions are being met.

Category filter description

The following list describes the filters for Category.

  • Policy: Filter for monitoring tests that automatically verify the required policy’s existence, review, or acknowledgment. External policy management and HRIS connection types can use some of these tests.

  • In Drata: Filter for tests within Drata.

  • Device: Filter for tests that automatically monitor device compliance.

  • Infrastructure: Filter for tests that automatically monitor infrastructure-related compliance requirements like Load Balancer Used or Availability Zone Used. This category can also overlap with the Observability filter.

  • Identity Provider: Filter for tests that automatically monitor authentication (access) and authorization compliance requirements like MFA usage.

  • Version Control: Filter for tests that automatically monitor code base-related requirements like the code review process.

  • Ticketing: Filter for tests that can be powered by a ticketing integration like Jira. An example is the Security issues are prioritized test.

  • Observability: Filter for monitoring tests that automatically verify the correct configuration of infrastructure. These tests can be powered by infrastructure monitoring systems such as Cloud Security Posture Management (CSPM) tools that look for misconfigurations or potential risks in your infrastructure, vulnerability scanners that detect vulnerabilities, and more.

Test Status filter description

The following list describes the filters for Test Status.

  • Enabled: The integration on which this test needs to run is connected, or policy drafts have been started - DOES impact control readiness.

  • Disabled: The test was ready to run or was running, but the customer deemed the test unnecessary for their business - DOES NOT impact control readiness.

  • Unused: The integration on which this test needs to run is not connected, or no policy drafts have been started - DOES NOT impact control readiness.

  • New: The test was newly made available by Drata and the integration on which this test needs to run is connected - DOES NOT impact control readiness until status is updated to enabled.
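The readiness rules above reduce to a simple check: of the four statuses, only Enabled counts toward control readiness. A short sketch (the function name is illustrative; the status strings come from the list above):

```python
# Per the status list above, only Enabled tests count toward control readiness.
# New tests start counting only once their status is updated to Enabled.
def impacts_control_readiness(status: str) -> bool:
    return status == "Enabled"

for status in ("Enabled", "Disabled", "Unused", "New"):
    print(f"{status}: {'DOES' if impacts_control_readiness(status) else 'DOES NOT'} impact control readiness")
```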

Test table overview

In the table, each row represents a different test and includes several icons or information to help you understand the test details at a glance:

  • Last Tested: Displays when the test was last run.

  • Test Type icon: Displays an icon that indicates the type or category of the test. Hover over this icon to see the category name.

  • Test Status: Displays the status of the test.

    • A green checkmark means the test passed.

    • A Fix Now button means the test failed.

      • If you select the Fix Now button, a drawer will open with more details on why the test failed and instructions on how to fix it.

      • Note: If you have multiple accounts connected to a test, you can differentiate between them by the logos on the left side of the accounts.

    • A yellow caution sign means there is an error.

Test History

When you scroll down to the Test History section, you can select the See Raw Test Evidence button. This takes you to Event Tracking, where you can drill into the specific results and the JSON file from each time the test has run. Learn more about examining the specific results of a test in Event Tracking.
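As a rough illustration of working with raw test evidence, the sketch below loads a JSON payload and pulls out the most recent run. The structure shown is hypothetical; the actual Event Tracking payloads will differ:

```python
import json

# Hypothetical raw-evidence payload; field names are illustrative only,
# not the actual shape of Drata's Event Tracking JSON.
raw_evidence = """
{
  "test": "MFA on Identity Provider",
  "runs": [
    {"checkedAt": "2024-05-01T00:00:00Z", "result": "Passed"},
    {"checkedAt": "2024-05-02T00:00:00Z", "result": "Failed"}
  ]
}
"""

evidence = json.loads(raw_evidence)

# ISO 8601 timestamps sort lexicographically, so max() finds the latest run.
latest = max(evidence["runs"], key=lambda r: r["checkedAt"])
print(evidence["test"], "->", latest["result"])
```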
