💡 Still using the classic Drata experience? Refer to Monitoring for the original UI.
Monitoring helps you review test results and maintain audit readiness across frameworks such as SOC 2, ISO 27001, and HIPAA. It provides visibility into the status of every test in your workspace so you can track readiness and take corrective action when needed.
Mapped vs unmapped tests
Each test in Monitoring can be either mapped or unmapped to a control.
Mapped tests
Tests that are connected to one or more controls. Their results impact control readiness and any frameworks linked to those controls.
Example: An MFA test mapped to SOC 2 CC6.1. If the test fails, the control is not ready and your SOC 2 readiness score reflects the gap.
Unmapped tests
Tests that are not connected to any control. Their results do not affect readiness scores but still provide visibility into risk areas you may want to map later.
Prerequisites
You must have one of the following Drata RBAC roles:
Admins
Information Security Leads
Workspace Managers
Control Managers
DevOps Engineers
Users with these roles may be assigned read-only access by an admin. If you have read-only access, you can view Monitoring data but cannot make changes.
Navigate to Monitoring
From the main navigation, go to: Compliance → Monitoring
The Monitoring page has two tabs: Production and Codebase.
Production vs Codebase
Each tab shows a different set of compliance tests in your workspace.
| Tab | Description |
| --- | --- |
| Production | Shows tests that run against your connected business systems. These often include identity providers (Okta, Azure AD), cloud infrastructure (AWS, GCP, Azure), security tools, and HR platforms. |
| Codebase | Shows compliance tests that scan connected code repositories using infrastructure-as-code (IaC) analysis. These detect misconfigurations, missing guardrails, and policy violations early in development. |
In general, use Production to monitor your live environment and Codebase to prevent issues earlier in development.
Monitoring summary (Production)
At the top of the Production tab, a summary of test results across your connected systems is displayed. Draft tests (tests that are not published) are not included in these metrics.
The following metrics are displayed:
| Metric | Description |
| --- | --- |
| % of tests passed | Percentage of tests that passed their last run |
| Passed tests | Tests that are currently compliant |
| Failed tests | Tests that require remediation |
| Error tests | Tests that could not complete their last run |
This summary helps you quickly understand your compliance readiness.
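If you want to reproduce a pass-rate figure like this outside of Drata, for example in your own reporting, the sketch below shows one way to compute it from raw counts. It assumes errored tests count toward the denominator, which may not match Drata's exact calculation.

```python
# Illustrative only: assumes the pass rate is computed over all
# non-draft tests, counting errored tests in the denominator.
# Drata's exact calculation may differ.

def pass_rate(passed: int, failed: int, errored: int) -> float:
    """Return the percentage of tests that passed their last run."""
    total = passed + failed + errored
    if total == 0:
        return 0.0
    return round(passed / total * 100, 1)

# Example: 180 passed, 15 failed, 5 errored -> 90.0
print(pass_rate(passed=180, failed=15, errored=5))
```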
Monitoring summary (Codebase)
The Codebase tab displays security and compliance test results from your connected code repositories. These tests use infrastructure-as-code (IaC) rules to detect misconfigurations early in development.
At the top of this tab, the following metrics are displayed:
| Metric | Description |
| --- | --- |
| Repositories monitored | Number of connected repositories being scanned |
| Failed tests | Number of code-level tests that failed |
| Passed tests | Number of successful tests |
These results help engineering and security teams identify risks before code is deployed.
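To make the idea of an IaC rule concrete, here is a minimal sketch of the kind of check a codebase scan performs: it reads Terraform plan JSON and flags S3 buckets configured with a public ACL. The rule, file name, and attribute checked are illustrative assumptions; Drata's actual Codebase tests are not shown here.

```python
# Simplified illustration of an IaC rule, not Drata's actual Codebase test.
# Reads Terraform plan JSON (terraform show -json plan.out > plan.json)
# and flags S3 buckets whose "acl" is set to a public value.
import json

PUBLIC_ACLS = {"public-read", "public-read-write"}

def find_public_buckets(plan_path: str) -> list[str]:
    with open(plan_path) as f:
        plan = json.load(f)
    resources = plan.get("planned_values", {}).get("root_module", {}).get("resources", [])
    findings = []
    for resource in resources:
        if resource.get("type") == "aws_s3_bucket":
            acl = resource.get("values", {}).get("acl")
            if acl in PUBLIC_ACLS:
                findings.append(resource.get("address", "unknown"))
    return findings

if __name__ == "__main__":
    for address in find_public_buckets("plan.json"):
        print(f"FAILED: {address} allows public access via its ACL")
```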
Explore the test table
Note: The Codebase tab uses the same table and actions.
The Monitoring table lists all of your tests and includes the following information:
| Column | Description |
| --- | --- |
| Name | Test name, with a Draft label if applicable. |
| Result | Latest test outcome (Passed, Failed, Error). |
| Findings | Number of issues identified for the test. If the test is currently running, a dash is displayed instead. |
| Status | Whether the test is Enabled, Disabled, or Testing… |
| Category (Production tab only) | Type of test (Device, Identity Provider, Infrastructure, Policy, etc.). |
| Active connection (Production tab only) | Integration connected to the test (for example, Okta, AWS, GitHub). |
Example image of the Monitoring table
Take bulk actions
Use bulk actions to manage multiple tests at once. Select multiple tests, then apply one of the following actions:
| Action | Description |
| --- | --- |
| Test now | Run the selected tests immediately. |
| Enable | Turn on the selected tests so they run daily with Autopilot. |
| Disable | Turn off the selected tests if they are not relevant. |
| More > Download CSV | Export test details for record-keeping or auditor review (see the example after this section). |
Steps:
1. Select one or more tests using the checkboxes in the test table.
2. Choose the action you want to perform.
Example image of the Monitoring table bulk action
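If you download the CSV for record-keeping or auditor review, a short script can summarize the export. This is a minimal sketch that assumes the file contains Name and Result columns; check the header row of your actual export and adjust the column names accordingly.

```python
# Minimal sketch for reviewing a downloaded Monitoring export.
# Column names ("Name", "Result") are assumed; verify them against
# the header row of your actual CSV before running.
import csv
from collections import Counter

def summarize_export(csv_path: str) -> None:
    with open(csv_path, newline="") as f:
        rows = list(csv.DictReader(f))

    results = Counter(row.get("Result", "Unknown") for row in rows)
    print("Result counts:", dict(results))

    failed = [row.get("Name", "Unknown") for row in rows if row.get("Result") == "Failed"]
    for name in failed:
        print("Needs remediation:", name)

summarize_export("monitoring_export.csv")
```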
Filter and search tests
You can narrow down tests by:
New: Highlights newly added tests so you can review and enable them proactively.
Result: Passed, Failed, Error.
Status: Enabled, Disabled, Unused, Testing.
Category (Production tab only): Device, Identity Provider, Infrastructure, Policy, etc.
Type (Production tab only): Drata, Custom Draft, Custom Published.
Exclusions: Filter tests that have exclusions.
Connection (Production tab only): Select a connection.
Control: Select a control to filter which tests are mapped to that control.
Framework: Select a framework to filter which tests are mapped to that framework.
Tickets: In progress or done.


